Your compliance obligations under the UK’s Online Safety Bill; or, welcome to hell


Estimated reading time: 15 minutes
Parliament, 13 June 2022. It was actually a beautiful day but I'm loyal to the black and white aesthetic

Last month I wrote a post about the UK’s “world-leading” vision for age-gating the open web. It got a bit of attention. That post, sadly, encompassed only one aspect of your compliance obligations under the Online Safety Bill. In this post, I’m going to tell you about the rest.

I apologise in advance.

Author’s note: this post was written in July 2022, before the Online Safety Act became law. It is now law, and this post is therefore outdated, and you should not be using it as a definitive reference. Ofcom is currently consulting on its implementation of the law, which proposes many of the details I describe below, though I will not be blogging about that.

As with the previous post, this is tremendously long: 4100 words. There’s really no way to make it shorter. You’re going to need something slightly stronger than coffee. Please drink as responsibly as a Conservative MP with poor impulse control at the Commons bar in the middle of the day.

Preamble from the room where it happens

“This is fine”

This post was tricky enough to write, but on top of that, last week I found myself in the Palace of Westminster, representing my (now former) employer at a roundtable of small tech businesses and startups who stand to be collateral damage in the UK’s determination to regulate the internet around Facebook, via this Bill. That meeting gave me a chance to work through some of these concerns and to sharpen others.

Or it would have, if the nine MPs across three parties who were scheduled to attend had actually shown up.

Only one did. 

And he’s a good ‘un who “gets it”, but who also has skin in the game about being on the receiving end of the most horrific online abuse and wants us to help him and people like him.

That is the kind of person we should be working with, and doing business with: someone we can help, and whose constituents we can help, without throwing everyone and everything else under a bus. Me, I will sit down and work with representatives like him any time.

Where was everyone else? Well, you could assume that they were off playing real-life Game of Thrones; Boris Johnson’s time was up, you see, so they were all elsewhere drawing up factions and sharpening knives.

Or you could be cynical and say that a meeting about small businesses like yours held no interest for them; there’s no headline, no PR, no wild west hero sheriff fantasy, no “big tech crusader” mantle to claim for anyone sitting in a room with the likes of me and, by representative extension, the likes of you.

So the roundtable was a real-life version of the “this is fine” meme, as everyone who was in attendance sipped their coffee and nibbled their patisserie and chatted amiably while the room was on fire.

Still, being on one side of the building while the government was literally collapsing on the other side of the building, and then heading up the street to have a giggle fit at the party in front of No 10 (link NSFW), made it a meeting I’ll never forget. I mean, what are the chances that I walk into Parliament for the first time in two and a half years and the government implodes?

Whoopsie.

Now let’s amble.

I want to help you understand what the UK’s draft Online Safety Bill will mean for you and your work on the open web. This post is my attempt to explain the compliance obligations, as they’ve been drafted, and how they will hit you.

By “you”, I mean anyone working in a company or a project which will fall in scope of the Bill, whether that’s your paid work, your software community, or your personal hobby.

And by “you”, I mean a company which is Not Big Tech, or as I’ll call it for the purposes of this post, NBT. Big Tech has its compliance departments, legal teams, and policy specialists. You don’t.

“You” also means an NBT whose product or service does not engage in legal and consensual adult content or conduct. That means porn, whether as your main business model or as some of the content on your service; because if that’s you, you’ve got specialist compliance experts for that too.

It goes without saying that this is not legal compliance advice. It also goes without saying that this is a draft Bill, not quite “the law” yet, so anything I write here is subject to change.

Additionally, I present this post for information only: so don’t shoot the messenger. Not all of these ideas are necessary, proportionate, or even feasible. I’m presenting them to give you the facts you need to work with.

How to read this post

This post reflects the second draft version of the Online Safety Bill, plus amendments, as published on 28 June 2022. Unfortunately, it is only available in PDF (that link opens up the document, which is 230 pages).

When you see a numbering system following an excerpt from the Bill, that’s my shorthand for the Part, Chapter, Clause, and Paragraph it came from within the draft Bill text. So for example, 3/2/8/(5) refers to Part 3, Chapter 2, Clause 8, Paragraph 5. I wouldn’t have to do this if legislators published legislation in open text formats rather than PDFs. Pfft.

As you read this post, you should also hold everything in it in the light of these two questions.

The first is:

How will my having to do this address online harms and make the web a better place?

And the second is:

Why is this government throwing me, and my team, and my project, under a bus, with these compliance requirements and obligations, in order to get back at “big tech”?  

If you can’t come up with an answer to either of these questions, that itself is the answer.

I have divided this guide into six areas:

  1. Is your work in scope?
  2. Compliance assessment obligations
  3. Administrative obligations
  4. General monitoring obligations
  5. Compliance costs
  6. What can you do?

Is your work in scope of the UK’s Online Safety Bill?

Is it possible for your site, service, or app, which allows content to be shared and/or people to communicate with each other, to be accessed by any adult or any child within the UK?

Then you’re in scope.

NB “accessed” doesn’t necessarily mean that a user can set up an active account on your service. If a British adult can merely download your app on the app store, the app is in scope. If a British child could merely type your URL into a browser, the site is in scope.

This Bill has been aggressively promoted as being about “big tech” “social media” “tech giants”, but it is not, and it never was. The ongoing line that it’s here to “rein in the tech giants” is, and always has been, bullshit. In fact, I’m going to start being really hardline about this by saying that anyone – be it politicians, media, or civil society –  who still discusses it as being about “big tech” and “social media” and “tech giants” is spreading disinformation.

And you folks need to stop that.

So the bottom line is that it’s easier and safer to assume that you and your work are in scope, than to assume that you are not.

And don’t forget that this regulatory regime is expected to be extraterritorial. If you are not in the UK but your site, service, or app can be accessed by anyone in the UK, you’re fair game.

Compliance assessment obligations

First let’s talk about the paperwork. As you’ll recall from the previous post, the government’s digital regulation strategy is to scrap EU bureaucracy and paperwork and red tape in order to, erm, make way for UK bureaucracy and paperwork and red tape. Bumf, but British bumf!

First and foremost of these are the risk assessment obligations you will be required to devise and produce to keep Ofcom, as the online harms regulator, happy. This is a result of the Bill’s attempt to transfer the offline “health and safety” model to the open web, in the belief that online harms are a series of trip hazards which can be nailed down with proper risk assessments.

I’ve had a good few years to reflect on this model of internet regulation, and it finally dawned on me that the “trip hazard model” was a Freudian slip that gives the game away. The intention is not to prevent companies from laying “trip hazards”. The intention is to use the legislation to lay trip hazards in front of companies, in the form of these impossible risk assessment compliance processes, which exist solely to create the paperwork needed to set you up to fail.

It’s these assessments that are the trip hazards, on purpose.

Sometimes risk assessments can be good. However much everyone rags on GDPR, the privacy impact assessment is a priceless opportunity to ask open-ended questions, and to follow where they lead, to prevent problems from ever happening down the road and to mitigate the ones already in play. Those questions, of course, are based in international standards and human rights principles. What’s on the table here, by contrast, doesn’t seem to be open-ended questions or the upholding of international principles. This is table-pounding which demands: prove yourself, or else.

So what are these assessments? Your NBT will have to go through these, at the very minimum. Their specific shape, size, and requirements are yet to be determined – that’s for Ofcom down the road, as with so much else of this legislation – but what is about to be hammered into law is the following:

  1. An illegal content risk assessment
  2. Your duties towards that illegal content
  3. Your duties towards the reporting of content
  4. Your duties towards establishing complaints procedures
  5. Your duties about freedom of expression and privacy (this is all the Tory culture war BS landing on your doorstep)
  6. Your duties about record-keeping and review of your content moderation and takedown policies
  7. Your child access assessment (the age gating)
  8. A child risk assessment
  9. Your duties about children’s safety
  10. Transparency reports

Let’s zero in on just two of those: the illegal content risk assessment and the child safety risk assessment.

Risk assessment hell

For these two risk assessments, you will be required to identify the following, in writing:

  1. Your user base
  2. The level of risk to users of encountering each kind of illegal content (generally terrorism and CSAM)
  3. The level of risk to users of encountering other illegal content
  4. The level of risk of harm to individuals presented by illegal content of different kinds
  5. The number of children accessing the service by age group
  6. The level of risk to children of encountering each kind of primary priority content (this means porn and content relating to self-harm, suicide, and eating disorders)
  7. The level of risk to children of encountering each kind of priority content (meaning online abuse, cyberbullying, harassment, harmful health content, or content depicting or encouraging violence)
  8. Each kind of primary priority or priority content which is harmful to children or adults, with each separately assessed
  9. The presence of any non-designated content which is nevertheless harmful to children
  10. The level of risk of harm to children, by age group, presented by different descriptions of harmful content
  11. The level of risk of functionalities allowing users to search for other users, including children
  12. The level of risk of functionalities allowing users to contact other users, including children
  13. The level of risk to adults of encountering other content that is harmful
  14. The level of risk of functionalities of the service facilitating the presence or dissemination of illegal content, identifying and assessing those functionalities that present a higher risk
  15. The different ways in which the service is used, and the impact that has on the level of risk of harm that might be suffered by individuals
  16. The nature and severity of the harm that might be suffered by individuals by the above, including age groups
  17. The design and operation of the service, including the business model, governance, and other systems and processes that might reduce or increase the risks of harms

I probably missed something in there, but you’re probably curled up in a ball crying as is.

Oh, see that last one? Every time you make a change to your business model, governance, or systems or processes which might theoretically increase the risk of any subjective online harm, you’ll be expected to check first with Ofcom, as the regulator.

Having fun yet? There’s more!

Administrative obligations

In addition to the risk assessments, you will have administrative compliance obligations to Ofcom as your content regulator. These include:

  1. The requirement to register with them as a service provider in scope of the law
  2. The requirement to pay them an annual fee
  3. The requirement to respond to information notices (e.g. requests from them for clarification on anything we’ve discussed in this post)
  4. The requirement to designate a person to prepare a report to Ofcom when they come knocking
  5. The requirement to assist the person in preparing that report
  6. The requirement to cooperate with them in an investigation
  7. The requirement to attend an interview with them when they physically call you in
  8. The requirement to make a public statement in certain specific and extreme circumstances
  9. The requirement to report any CSAM/CSEA on your service to the National Crime Agency

No, we’re not done yet.

General monitoring obligation

You probably don’t want to get into this today, but if you weren’t aware, I’ll just cut and paste this from a previous post:

The draft Bill’s explanatory notes provide a reminder (see page 12) that

Article 15 of the eCD also contained a prohibition on the imposition of requirements on service providers to generally monitor content they transmit or store, or to actively seek facts or circumstances indicating illegal activity. […] there is no longer a legal obligation on the United Kingdom to legislate in line with the provisions of the eCD following the end of the transition period on 31 December 2020.

Having swept 25 years of intermediary liability into the bin, the draft Bill text then goes on to establish a general monitoring obligation for both illegal content and legal content, which of course, means anything conveniently stuffed into the rubric of “children’s safety”.

In other words, the UK is ditching the prohibition on a general monitoring obligation – its own equivalent of the US Section 230 – specifically because it came from the EU and must be thrown out with the bathwater. In its place it becomes the first western nation to impose a general monitoring obligation over both illegal and legal and subjective content. Yay taking back control!

And that’s gonna cost you.

Compliance costs

In addition to the costs of your time and labour in complying with all of the above, how much is this regime going to run you, out of pocket?

Well, your NBT in scope of the Online Safety Bill is facing at least four kinds of ongoing compliance costs.

Regulator fees

The first cost you’ll incur is paying an annual fee to the regulator (Ofcom) for the privilege of being regulated by them. (6/71)

This is obviously intended to be similar to the annual data protection fee you pay to the ICO. However, that system is about working with the regulator to uphold your users’ fundamental right to privacy, while the OSB system is about working with the regulator to decimate your users’ fundamental right to privacy.

So we’re off to a flying start, then.

The amount of the fee has not yet been determined; it’s down to Ofcom to ensure that it is “justifiable and proportionate” (6/75/2/b).

However, my guiding rule is that you shouldn’t put too much trust in the words “justifiable and proportionate”, not when twelve years of Conservative attitudes about the internet have worked from the default position that you people are all filthy vermin who are complicit in child abuse and terrorism, and the burden of proof is on you to demonstrate otherwise.

(Remember VATMOSS? This makes that look like primary school.)

Clause 73 also discusses the establishment of a threshold figure. The explanatory notes state that OFCOM will be funded via fees from providers of regulated services whose qualifying worldwide revenue is equal to or greater than the specified threshold, as determined by clause 73. This clearly means that not every business in scope will be required to pay an Ofcom fee, but does that mean that businesses which don’t have to pay the fee will still face the compliance requirements regardless? And can somebody find out?

Compliance help

The second cost you’ll incur, as an NBT, is the legal advice which you will need to take to understand your compliance obligations: in other words, the expert advice you’ll need to bring in to help you know what it is you’re expected to do, lest you get shouted at that you are failing to meet your duty of care to Britain’s children.

Paragraph 124 of the Impact Assessment projects that cost as follows, and I have included the screen cap so that you understand that I am not making this up:

one regulatory professional at an hourly wage of £20.62 is expected to read the regulations within each business […] the explanatory notes are approximately 52,000 words and would therefore take just over four hours based on a reading speed of 200 words per minute.

In other words, your NBT is expected to locate the mythical regulatory compliance professional who

  • Understands your NBT from the top of its business model to the bottom of its codebase;
  • Understands how the Online Safety Bill will work in practice;
  • Understands how the Online Safety Bill will link up with every aspect of your NBT; and
  • Is willing to do all of this, from start to finish, in half a day, for around ninety quid.
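
If you want to check where “half a day” and “around ninety quid” come from, here is the Impact Assessment’s arithmetic as a quick sketch. The only inputs are the figures quoted above; turning the result into “half a day” and “ninety quid” is my own rounding.

```python
# Back-of-envelope arithmetic behind the Impact Assessment's estimate.
# The inputs are the figures quoted above; everything else is rounding.

WORDS_IN_EXPLANATORY_NOTES = 52_000   # "approximately 52,000 words"
READING_SPEED_WPM = 200               # "a reading speed of 200 words per minute"
HOURLY_WAGE_GBP = 20.62               # "an hourly wage of £20.62"

reading_minutes = WORDS_IN_EXPLANATORY_NOTES / READING_SPEED_WPM  # 260 minutes
reading_hours = reading_minutes / 60                              # ~4.3 hours ("just over four hours")
cost_gbp = reading_hours * HOURLY_WAGE_GBP                        # ~£89, i.e. "around ninety quid"

print(f"{reading_hours:.1f} hours of reading, costed at £{cost_gbp:.2f}")
# -> 4.3 hours of reading, costed at £89.35
```

That, on the government’s own modelling, is the entire budgeted cost of your business understanding this law.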

Wow. Okay.

Let me put it this way, folks: as a regulatory professional, if you want me to even look at you, you need to move the decimal point in £20.62 one digit to the right. If you want me to actually listen to you, you need to move it another digit to the right.

Per day.

Mandatory age-gating software

The third compliance cost for your NBT will be implementing a third-party age verification or age assurance system, to identify the ages of everyone who accesses your service, even if they are not and never become actual users of your service, because if you don’t, YOU HATE THE CHILDREN.

We discussed this in the previous post, the 3000-word, top-of-Hacker-News one.

If you need to pause now and go find someone to give you a hug, I won’t hold that against you.

General monitoring software

The fourth compliance cost will be the scanning and monitoring services to detect both illegal as well as legal and subjective content, as we discussed above.

Because unless you spend every waking minute of your life in God-mode, monitoring and reading every keystroke your users type in every interaction with your service and every pixel they exchange with other humans, and rush in to correct or edit or censor anything that might be legal but subjectively harmful, you’re going to need to install some sort of automated screening software.

You’re not going to do that because you’re a horrible nosey internet stasi. You’re going to do that because your compliance obligations say you have to do that, or else duty of care the children etc etc blah blah; or if you’re lucky enough to have a senior management position in your organisation, it’s your personal freedom and your bollocks on the line, as you could well face criminal charges for being uncooperative.


At this point I want to share an email I received from a follower in response to last month’s post:

It seems to me that the Bill (like most attempts to “regulate big tech”) will actually reinforce the position of the incumbents, by further raising the bar for any smaller entity that wants to compete. The cost of implementation is likely to show big economies of scale – so favour the already big – and any attempt to negotiate on what the obligations are will require lawyer-time on a scale that only the big can afford. Littler people will be welcome to use the big platforms, I’m sure. So a Bill that is being widely sold as “beating up Facebook” will actually strengthen them. Whether that is the intent, I’ve no idea, but I think a lot of people are going to be surprised and disappointed.

He’s absolutely right. There was a time, about a year ago, that I made this meme as a joke:

Roll Safe meme: "can't have online harms if you don't have online services"

What I’ve come to realise since then is that it’s not a joke. That’s the intention. Make it too prohibitive, risky, or impossible for public discourse to flow on smaller platforms and services; require the larger ones to become speech police and societal monitors; politicise the hell out of it and threaten tech workers with jail until they comply.

It’s not that they don’t realise they’re throwing you and your work under a bus.

It’s that they do.


So what can you do?

The Bill is in its report stage this week, just before Parliament heads off for summer recess, and any legislative programme – bad or good – takes a firm backseat to Tory Game of Thrones.

And I hate to break it to you, but this is starting to look like a lost cause.

As I learned myself last week, politicians aren’t interested in small businesses or projects like yours. If there’s no angle that will allow them to present themselves as crusading heroes “taking on the tech giants”, they’re not interested.

Those of you who have engaged with your representatives have tended to receive response letters full of stock messages about social media, and online harms. Remember what I said at the beginning of this post, about how this Bill has been aggressively promoted as social media legislation? Most elected representatives think that’s what it is, because that’s what they have been told too. They aren’t aware of the nuances or the complexity or the scope or the collateral damage either.

And the people who do understand those complexities – meaning the money machine that has driven this Bill, the one that only communicates with the public through paywalled, adtech-riddled, PR agency-drafted op-eds in right-leaning broadsheets – have far more power and influence than all of you put together ever will.

So what can you do? As the past month has shown, you could choose to let the people responsible for this Bill divide and conquer among themselves, taking the Bill with them. This Conservative Bill, for example, has rent the Conservative party into bitter factions. The Telegraph, whose idea this Bill was in the first place and who gleefully crusaded for it for years, is not only backtracking on its own Bill but is pretending like we all haven’t noticed. And even the civil society organisations who have supported this Bill are souring on it as they realise they’ve been used too. A little nudge might go a long way.

You could, also, stage your own campaign. Announce that you’re blocking UK users. Announce that you’re pulling out of the UK. Announce that you will not partner with the UK government in identifying all your users and surveilling their conversations, on the assumption that they are all deviant criminals. Send out a note to your users warning them that you may have to terminate their accounts if this Bill passes, and let them run with their anger. Take a stand. Defend your users’ rights. Be at the table, rather than on the menu.
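
And if “blocking UK users” sounds abstract, here is a minimal sketch of what it could look like in practice. It assumes your CDN or reverse proxy stamps each request with a two-letter country code header (Cloudflare’s CF-IPCountry is one example, when its geolocation feature is switched on); the framework choice and the wording of the notice are mine, not a recommendation.

```python
# Minimal sketch of a UK geo-block, assuming an upstream CDN or reverse
# proxy sets a two-letter country code header such as CF-IPCountry.
from flask import Flask, request

app = Flask(__name__)

BLOCK_NOTICE = (
    "This service is not available in the United Kingdom because of the "
    "compliance obligations created by the Online Safety Bill."
)

@app.before_request
def block_uk_traffic():
    country = request.headers.get("CF-IPCountry", "").upper()
    if country == "GB":
        # 451 Unavailable For Legal Reasons (RFC 7725)
        return BLOCK_NOTICE, 451
    return None  # fall through to normal request handling

@app.route("/")
def index():
    return "Hello, rest of the world."
```

Whether you would actually want to do that, and whether a geo-block achieves anything against a determined VPN user, is a separate argument; the point is that the technical bar for walking away from the UK market is very, very low.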

But whatever you choose to do, you can’t just oppose what’s wrong. You have to come up with an alternative plan to put it right. Because your opposition to this Bill, and what it’s going to do to your work and to your users, will be taken as an endorsement of the status quo as well as a confession of guilt that you are complicit in those things. That’s the tactic, you see: any opposition to this Bill is either lobbying or “intransigence” or collusion with Big Tech.

And you are not guilty of any one of those things. So come prepared for battle, because just by being yourself, you have more ammunition than you think.

I have my own ideas – a creative exercise, if you will – about how this Bill could (and should) be scrapped and restarted from scratch, on a far better footing.

But you’ve read enough for today.

Header image by me, 6 July 2022: the room where it happens, snapped on my way to “The Room Where It Happens”. It was actually a very lovely summer’s day, but I’m committed to the black and white aesthetic.

The Author

I’m a UK tech policy wonk based in Glasgow. I work for an open web built around international standards of human rights, privacy, accessibility, and freedom of expression. The content and opinions on this site are mine alone and do not reflect the opinions of any current or previous team.

23 Comments

  1. Noel Burgess says

    A fascinating article, giving food for thought for weeks to come. Thank you – I like your style.

  2. J Lamont says

    Easier to block the UK subnets and let blighty become a tech blackhole than to comply. More business outside than in.

  3. A Roberts says

    This was frankly quite a dreadful read given your unpleasant tone. Besides your commentary, none of the proposed legislations are significantly far off from GDPR or COPPA. You seem to have interpreted the law in the most negative, unforgiving way possible, cherry-picked the worst bits, and added your own commentary on top. Maybe you don’t care about privacy as a small business owner, but have you considered that privacy is a right, and these laws (which are reasonably similar to compliance with the GDPR) are similarly

    • Hi A Roberts. Aside from the fact that I’m not a small business owner, you seem to have cherry picked your angle of reply. So I’ll direct you to the “Book” link above, where I discuss the book I’ve written, being published soon, which explains privacy for web professionals from the perspective of privacy as a right. It’s 47,000 words or so, drawing on 25 years of experience across code, policy, commerce, digital rights, and law. I hope you enjoy reading it.

    • Adrian Midgley says

      In reply to A Roberts.

      You’ve been reading something else.

  4. A guy says

    Would slapping a “We can’t show you this because your government wants to make us do terrible things to you so hey, tough luck” banner on any UK request be enough?

  5. Thomas Oakcroft says

    Can you confirm that this article would be illegal under the proposed Online Safety Bill?

    From memory, the maximum penalty is 12 years in jail; is this correct?

    • The article itself would not be illegal. As the Bill has been drafted, it would be possible for the content on it – or on any subjective but legal topic – to fall within a code of practice devised by the Secretary of State for DCMS for political purposes. That currently being one N. Dorries.

      As for jail time, I think you’re conflating this with something else. There is a senior management liability regime but prison terms have not yet been determined. Even if they had been, a person facing a 12 year prison sentence for a blog post published on their service is known as a martyr.

  6. Anon says

    These regulations create a barrier to entry that allows only big tech access to large parts of the internet.

    This is not regulation of big tech. This is a favor to big tech.

  7. percival says

    idfk what the hell i’m supposed to do.
    most of my friends are on the internet and thus not in the UK and like. i can’t do shit abt this fuckin shitty bill bc if it does pass then i only have my birth cert. and not any other kind of identification (i.e: driving license, passport) and idk if that’s going to be enough.
    i find it hard enough to make friends and keep them as it is. i’m not sure what i’m supposed to do.

  8. M Palmer says

    I guess lots of people will end up using VPNs & Tor & have to pretend they’re not British to access content abroad. Though the government might still know as client side scanning was added as a “last minute” amendment.

  9. T. Frutuoso says

    I’m struggling to find words to comment after reading this… It’s like the politicians are putting a rop### CENSORED BECAUSE OF THE CHILDREN ### necks of British tech startups, handing every bit of market to foreign actors… Oh well, at least the UK had Brexit.

    Great article, masterfully written. Greetings and best wishes from Portugal.

  10. Bob says

    That’s pretty horror show. I’ll assume that since I run a webserver on a Raspberry Pi glued to the side of my Desktop computer to share cat pictures with my friends and family that I will fall in scope. Would it be foolish for me to ask if you could write a suitably short message encompassing your/our concerns that I could place on my server as a 404 error redirect. Others might wish to do the same and also include a link on their pages to the same words to be included with the page links to privacy/cookie notices. Perhaps also a link to a webdevlaw page that expands on the simple message to give more detail but not running to 4500 words.

    • Thanks, but I have no interest in establishing myself as an internet intermediary because of this Bill.

      (Funnily enough, this blog still gets weekly hits from a mail order business which linked to a long-deleted blog post of mine in lieu of compliance.)

  11. Jason Trower-Rundle says

    Not to sound nihilistic or anything but how can any of you summon any chagrin for shit like this in the UK anymore? Fuck the UK
    Once my grandmother dies I’m selling everything I own and moving to southern italy to start an agrivoltaic vineyard.

  12. Anon. says

    This is just so utterly depressing. The nerds who care about this sort of thing are in a minority. I moved to using an always-on VPN after the snooper’s charter came out – and I thought _that_ would be the high water mark. I have no idea if it really provides any anonymity – correlation analysis is a powerful thing – but at the very least the fact that MI5 / MI6 are complaining that E2EE makes their life hard is probably a good sign that it is working. What it does do is send a massive “sod you” to dragnet intel gathering, and also prevents your ISP from (easily) seeing your DNS requests – and forwarding them to all and sundry who request them.

    I really hope the current political circus fixes some of the problems with this bill, but, to be honest, I doubt they will.

  13. Christopher Graham Yapp says

    The issue for me is professional indemnity insurance for SMEs, voluntary bodies and freelancers. Given the wide scope, how will the ABI react? If you can’t get insurance because of the wide scope and ill-defined harms, this could kill many enterprises. A simpler example in the non-digital world from years ago: all my local country pubs had children’s playgrounds in the early 1990s. They all vanished in 5 years. One local faced a 1500% increase in insurance costs for potential litigation despite not having had an incident for over a decade.

    • There’s insurance and there’s also the linked issue of procurement compliance. In my early web design days, I had a client drop me as a service provider because I didn’t produce a health and safety assessment of my “premises” for them, even though my business was me working from a laptop at home. Their procurement rules required the health and safety assessment from suppliers. So I was off their list. Proof positive, as if we ever needed it, that attempting to transfer the health and safety culture of the physical world to the digital world merely creates more bureaucratic points of failure, regardless of actual risk or harm.

  14. Russ says

    Such a sobering read, but thanks for the cohesive listing of compliance obligations in all their gruesomeness. On which subject, for anyone who missed it, compliance costs didn’t get a single mention during the Commons debate on the Bill’s Report Stage this afternoon. Not even by Her Majesty’s Opposition, or should I say Her Majesty’s Non-Opposition, or maybe even Her Majesty’s Let’s Propose Hundreds of Even More Complicated Layers of Amendments. Fortuitously, some of the proposed Opposition amendments were from Planet Zog, and got rightly defeated. Some of the proposers of these Amendments forgot what they were talking about, or windbagged their way off-topic, and it often took confused members on the Government benches a while to steer the debate back on course, although I can’t honestly say it was ever on course at any point during the day.

    So maybe the transfer of the Bill upstairs to the Lords is a good thing – the Bill still has a huge amount of work needed on it, but seems to have exhausted the efforts of the MPs, at least for the time being.

  15. I guess this means that my piddly little blog reviewing roast dinners in London, with a side of slagging off the government and the occasional picture of men in lingerie is doomed?
