Your cut-and-keep guide to the “unregulated wild west internet”

UK policy

At least once a month, the policy sphere has to put up with another round of politicians banging the table about the internet being an “unregulated wild west.” The trope prevails because many groups deliberately adopt it as a campaigning tactic. They find a sympathetic politician, whisper in their ear that the internet is an “unregulated wild west” and tell them that they – THEY – could be the swashbuckling sheriff riding in to be the hero. Grandiose statements are made. Political egos are stroked. Campaigns meet their KPIs. Rinse and repeat.


The wild west internet/hero sheriff fantasy trope is a clever one, because it automatically sets up a good guy/bad guy dichotomy. The swaggering sheriffs are the ones ridin’ into town to shoot a few baddies, and the rest of us who have more than a vague idea about what we’re actually talking about? We’re just a bunch of filthy accessories to child abuse and terrorism who don’t even have gossamer wings, or a second home, or a PR firm.

That rhetoric has now reached a very dangerous point. See, for example, this thread, where I discuss Conservative politicians actively proposing retroactive criminal sanctions for developers who have deployed end-to-end encryption for security or privacy purposes, because encryption is child abuse and someone wants to lay down the law. Sheriff fantasies have real-world consequences.


It’s a general fact of life that making a factually incorrect and morally disingenuous statement, such as “the internet is an unregulated wild west”, automatically invalidates everything the person says after that point. But the people who say it, whether or not they believe or understand it, have the money and the power in these debates. So I thought it would be helpful to bring in some cold hard facts. In this case, the facts come from the UK’s online harms/Online Safety Bill process, but they provide a useful model for readers to follow in future.

The unregulated wild west internet: the laws of the land

The next time a politician invokes the Wild West Internet trope, point them here. These are the cold hard laws that they either don’t know exist, or do know and don’t care because they like the way the sunlight shines on their sheriff star. Have them take this information on board, and ask them how they intend to reconcile their table-thumping demands with the fact that the laws they want to lay down already exist. Ask them why they want to create complex, hostile new regulations rather than improving existing ones.

Most important of all, ask them if they’re willing to commit to making the web better without chilling free speech, imposing a privatised law enforcement and surveillance state, or creating an online-offline legal dichotomy that might just require a whole new sheriff to ride in and teach them a thing or two.

If you get through that process and find that the hero sheriff/guardian angel fantasy is the whole point of the exercise for them, I can promise you there are other policymakers waiting in the wings who actually understand this stuff and know what they’re talking about. You need only ask.

The tables below were compiled by Mark Leiser of eLaw/Leiden Law School and Edina Harbinja of Aston University for their excellent paper, CONTENT NOT AVAILABLE: Why The United Kingdom’s Proposal For A ‘Package Of Platform Safety Measures’ Will Harm Free Speech. You should read it; it’s very good.

I have merely taken the tables out of PDF and into HTML, and all credit for this work should go to them.

I may add links to the legal texts in the future on a rainy day.

Online Harms with a clear legal definition

These harms are already illegal both offline and online, and there is no need to introduce new laws for them.

Harm Status Criminal law provision
CSEA Criminal offence Sections 47-51 of the Sexual Offences Act 2003
Terrorist content and activity Criminal offence Section 58 of the Terrorism Act 2000; Section 3 of the Counter-Terrorism and Border Security Act 2019
Organised immigration crime Criminal offence Modern Slavery Act 2015; Section 1 of the Asylum and Immigration (Treatment of Claimants, etc.) Act 2004; Sections 57 to 59 of the Sexual Offences Act 2003.
Modern slavery Criminal offence Modern Slavery Act 2015
Extreme pornography Criminal offence Section 63 of the Criminal Justice and Immigration Act 2008
Revenge pornography Criminal offence Section 33 of the Criminal Justice and Courts Act 2015
Harassment and cyberstalking Criminal offence Sections 2, 2A, 4 and 4A of the Protection from Harassment Act 1997; Section 1 of the Malicious Communications Act 1988.
Hate Crime Criminal offence Public Order Act 1986; Racial and Religious Hatred Act 2006; Criminal Justice and Immigration Act 2008; Criminal Justice and Public Order Act 1994. For England, Wales, and Scotland, the Crime and Disorder Act 1998 makes hateful behaviour towards a victim based on the victim’s membership (or presumed membership) in a racial group an “aggravating factor” for the purpose of sentencing in respect of specified crimes. Sections 2, 2A, 4 and 4A of the Protection from Harassment Act 1997 also apply to racially and religiously aggravated offences of harassment and stalking, putting people in fear of violence, and stalking involving fear of violence. Finally, there are communications offences under section 127(1) of the Communications Act 2003 and section 1 of the Malicious Communications Act 1988, with enhanced sentencing where there is hostility towards one of the five protected characteristics.
Encouraging or assisting suicide Criminal offence Sections 2 and 2A of the Suicide Act 1961
Incitement to violence Criminal offence Sections 44 and 45 of the Serious Crime Act 2007
Sale of illegal goods/services such as drugs and weapons on the open internet Criminal offence Section 1(1) of the Criminal Law Act 1977; Section 46 of the Serious Crime Act 2007; Fraud Act 2006; Misuse of Drugs Act 1971; Firearms Act 1968
Content illegally uploaded from prisons Criminal offence Section 40D(3A) Prison Act 1952
Sexting of indecent images by <18s Criminal offence Section 45 of the Sexual Offences Act 2003

Online Harms with a “less clear” legal definition

Harm Status Criminal law provision
Cyberbullying and trolling Potentially criminal in some instances, but vaguely defined and with serious implications for free speech. Potentially a subset of communications offences under sections 2, 2A, 4 and 4A of the Protection from Harassment Act 1997 and section 1 of the Malicious Communications Act 1988, but application depends on definitions that are vague and overlapping
Extremist content and activity Criminal in many instances, but vaguely defined and difficult to apply uniformly Potentially Section 58 of the Terrorism Act 2000 and Section 3 of the Counter-Terrorism and Border Security Act 2019, but this ground is already covered by terrorist content, so it is unclear why extremist content is needed as a “new harm.”
Coercive behaviour Vaguely formulated Potentially sections 2, 2A, 4 and 4A of the Protection from Harassment Act 1997 and section 1 of the Malicious Communications Act 1988, but vague and dependent on the definition. This harm could also be confused with existing offences such as harassment. A further offence is found in section 76 of the Serious Crime Act 2015, but it relates only to domestic abuse cases.
Intimidation Potentially criminal, but also vaguely defined. Already illegal under sections 4 and 4A of the Protection from Harassment Act 1997, with serious fear-of-violence offences in sections 4 and 4A of the Public Order Act 1986, so unnecessary as a vaguely defined and subjective harm here. Its vagueness could mean that the harm captures legitimate free speech.
Disinformation Vague; regulation of “fake news” is in progress. Potentially covered by section 127(2)(a) or (b) of the Communications Act 2003 and section 1 of the Malicious Communications Act 1988, with ongoing law reform in the area. Section 51 of the Criminal Law Act 1977 covers bomb hoaxes; hoaxes involving noxious substances or things are covered under section 114(2) of the Anti-Terrorism, Crime and Security Act 2001; giving a false alarm of fire is an offence under section 49(1) of the Fire and Rescue Services Act 2004; impersonating a police officer is an offence under section 90 of the Police Act 1996; and section 106 of the Representation of the People Act 1983 makes it an offence to make or publish any false statement of fact in relation to the personal character of a candidate prior to or during an election.
Violent content Vaguely defined and problematic – any violent content online, including artistic speech, could be deemed harmful. It is unclear how this differs from harassment, fear of violence, threats, stalking, extreme pornography, and other existing criminal offences, as noted above. Does it include artistic speech, video games, and films, and what implications could this vaguely defined harm have for free speech?
Advocacy of self-harm Dangerous precedent; blurs the lines between free speech and ‘advocacy’, and could sweep in self-harm support groups on social media. Not illegal, but the UK government has threatened to introduce legislation if platforms do not remove content promoting self-harm. The Law Commission notes that “[publicizing] or glorifying self-harm is not ostensibly criminal either.” However, the offence of causing grievous bodily harm with intent, contrary to section 18 of the Offences Against the Person Act 1861, could be used here, provided that the victim caused herself serious harm with intent; a person assisting or encouraging such behaviour could then be guilty of an offence under sections 44 to 46 of the Serious Crime Act 2007.
Promotion of FGM Criminal offence The Female Genital Mutilation Act 2003 makes the act itself illegal, but there is no offence relating to its promotion. See, however, sections 44 to 46 of the Serious Crime Act 2007: intentionally encouraging or assisting an offence; encouraging or assisting an offence believing it will be committed; and encouraging or assisting offences believing one or more will be committed.

Underage exposure to legal content

Harm Status Criminal law provision
Children accessing pornography Service providers’ liability for making pornographic content available to persons under 18 Digital Economy Act 2017, s 14, requires providers to prevent children from accessing pornography under threat of financial penalties (implementation has been delayed)
Children accessing inappropriate material Vague, undefined, and problematic – who decides what is ‘inappropriate’, and who decides whether a child can access it? What is the role of parents and education in helping kids understand what is appropriate for them to engage with? Do we really want parents determining what sexual health content a child may appropriately access? There are existing provisions preventing children from accessing pornographic, obscene, and other prohibited materials, as noted above. ‘Inappropriate’ as a category is extremely vague and open to interpretation; it is not certain whether it includes harmful online advertising, for example. It could also affect the free speech of adults and children, as well as other rights such as privacy.
Under 13s using social media and under 18s using dating apps Already the rule; however, rarely enforced; moral panic. This is a question of adequate enforcement and age verification, as noted above
Excessive screen time Moral panic Evidence that the risks outweigh the benefits is not conclusive, and the real harm is often overestimated by the media and advocacy groups

Yeehaw.

The Author

We are people of enormous power and influence over the open web. I empower digital professionals to use that power wisely. I advocate for an open web built around international standards of human rights, privacy, accessibility, and freedom of expression. This is my personal site, and does not reflect the work or opinions of my employer.