In May 2021, I co-authored a post with Sahdya Darr titled “Is government preparing to censor discussions about migration?”, where we fleshed out how the then-Home Secretary was leveraging the draft Online Safety Bill for explicitly political purposes.
Not all of my work ages well, but I gotta say, she and I nailed that one.
At the time, we were both working for Open Rights Group, so that post was written in our then-professional capacities. But we did not write it “in character” because our day jobs demanded it. We wrote it from our personal convictions, which we still hold, and we stand by every word to this day.
Fast forward two years and we take no pleasure in noting that government is, indeed, planning to use the Online Safety Bill to censor discussions about migration. This came in the form of an amendment added in January:
Clause 11, page 10, line 25, at end insert—
“(3A) Content under subsection (3) includes content that may result in serious harm or death to a child while crossing the English Channel with the aim of entering the United Kingdom in a vessel unsuited or unsafe for those purposes.”
Member’s explanatory statement
This amendment would require proportionate systems and processes, including removal of content, to be in place to control the access by young people to material which encourages them to undertake dangerous Channel crossings where their lives could be lost.
This may look like a minor amendment on a minor topic, but in the Bill’s wider context, you have to see it as the legislative equivalent of a beta test. Start small with one topic, iterate the response, expand to other areas.
Scared yet? You should be.
oooh look here I am thinking about the children
Before we get into the big problem with the amendment, first we have to deal with the obvious, and that is the fact that this amendment has been drafted “about children” but not the ones this Bill has allegedly been created to help.
You see, the only reason a British child would be thinking of crossing the Channel in a rubber dinghy would be because they’d taken an overdose of Enid Blyton. They would not be planning to come here, from over there. They would be planning to go over there.
In other words, the amendment does not target British children on British soil. It targets foreign children on foreign soil. After all, “the sun will never set on British online enforcement” and its exceptional world-leadingness.
The UK has turned its local racism problem into a irrelevant discussion of global internet regulation than can harm others outside of their narrow world view. I don’t know what other way to describe this than digital colonialism.
— mahsa alimardani (@maasalan) July 12, 2021
But one suspects that other countries might have something to say about their children and their adults having their internet browsing surveilled, by the British government, on the assumption that they were reading about Channel migration in order to plan a shopping trip to Decathlon.
Yes, surveilled. Here’s what I mean by that, and this is where we bring in yet another unintended consequence of the “duty of care” model of platform regulation:
How will a service in scope of the Online Safety Bill know that A Child is reading content about Channel crossings?
Easy: they will know who is a child by age-checking all users, as the Bill requires. Platforms, by law, will have to know who is viewing all content – not just porn, not just the nasty stuff which is illegal already, all content – and determine that by deploying either formal age verification or informal age assurance (such as AI or phrenology).
So everyone who has the agency to access an internet-connected device, whether aged 2 or 102, will have to go through an age-verification process to keep them from reading,
content about innocent people, not criminals and not “illegal” humans, who are fleeing from something so horrific that entering a cold dark sea at gunpoint is safer than what they have left behind.
And with a simple information request, government will be able to demand that tech companies provide them with a list of exactly who has been trying to access that content.
In other words, before the law is even out of committee processes, the mandatory age-gating regime is already being expanded from preventing precious British children from accessing porn, terrorism, and bullying to preventing anyone on the planet from accessing legal content, as determined by subjective UK government policy, via a mandatory identity layer enabling mass data collection.
So it’s time to stand up.
Because it seems to me that if you are in favour of the Bill, and its stated aims to “keep children safe online”, you should be standing in firm opposition to this misuse of the Bill as a politicised distraction from the greater aim.
It seems to me that if you are in favour of children’s rights, you should be standing in firm opposition to the amendment’s warping of the concept of “age-appropriate” content moderation, even if this forces you to consider the tragedies of poor brown foreign children who have filthy parents with the same compassion you grant to the tragedies of middle-class white English children who have impeccably media-trained parents.
It seems to me that if you are in favour of age-gating to protect children, whether your motivations are social or financial, you should be standing in firm opposition to this misuse of the technology by politicians who care far less about safeguarding the needs of vulnerable children than they do about keeping the votes of racist pensioners.
And it seems to me that if you are in favour of restrictions on migration, whether your motivations are compassionate or hateful, you should be standing in firm opposition to this beta-test run of politicised government censorship of subjective legal opinions on the topic – including your own.
And what after that?
When we wrote that blog post for work in 2021, Sahdya and I teased out more unintended consequences, since identifying them is the first and foremost job of any policy professional:
One wonders where this would logically end. Would platforms be required to remove content from charities like the Red Cross which provide emergency medical help to asylum seekers, including children? Would the unforgettable images of children who have not survived the journey be banned for being subjectively harmful, despite being journalistic content? Would an order go out to social media companies to delete the accounts of charities and civil society groups working in the migration sector – including Open Rights Group – for discussing and thereby “glamourising” the issue?
All worthwhile questions which the January amendment places in a new light.
The Bill, and the amendment, are currently being batted around the Lords. Let’s hope that in their dark and musty chamber, they pause to consider those unintended consequences and let that light in.