Book excerpt: Developing for privacy in the pandemic

The following is a draft of a postscript I’ve written for my upcoming book, and it probably shouldn’t wait until publication: it’s what you should know about developing to safeguard user privacy in the age of coronavirus.


As I finished this book – or so I thought – the Covid-19 pandemic swept across the globe. Me, I’ve gone from having Europe at my feet to working out of a £10 folding chair in my Scottish back garden. My coworkers are now birds and beasts rather than parliamentarians and policymakers. But I know I’m one of the lucky ones: I’ve stayed healthy, the people I love are well, work keeps me busy, my cupboard is full, and I have the amazing NHS at my back. I have no right to complain about anything. That obliges me to look after others who aren’t so lucky, and so follows this postscript on what the pandemic means for user privacy and the work you do to protect it.

In addition to the changes we have all had to make in our professional work and the difficulties now present in our everyday family lives, the Coronavirus outbreak has brought the privacy issues I have discussed in this book into focus like never before. Privacy is real now. Some of those issues have been hard lessons learnt for all of us.

Let’s take the video conferencing platform Zoom, for example. Originally conceived as a business-to-business tool, its on-the-spot adoption for everything from family contacts to school lessons skyrocketed in the first weeks of the pandemic, to the point where “Zoom”, like “Google”, became a verb. As it did, a raft of privacy and security issues hit the headlines.

Consider each of these issues in the context of what you’ve learned in this book:

  • Zoom’s dense, legalistic privacy notice was found to state that personal data was being collected about meeting attendees, data which the company reserved the right to sell;
  • Its iOS app used Facebook’s SDK to send data about users to Facebook, even if – as is the case with all deployments of the Facebook SDK – the user did not have a Facebook account;
  • Attendees could be tracked by meeting hosts without their knowledge;
  • The macOS installation process did not disclose all the modules it initialised;
  • A zero-day exploit allowed remote code execution on Windows machines;
  • Meeting hosts, again without attendees’ knowledge or consent, were able to use a data mining feature to display their LinkedIn profiles;
  • A data leak disclosed thousands of users’ email addresses;
  • The use of numerical meeting IDs following an easily discernible pattern led to “Zoombombing”, the coordinated hijacking of meetings with grossly offensive content (see the sketch after this list);
  • The company claimed the platform used end-to-end encryption, which it did not;
  • Its DIY encryption was poor;
  • Calls were routed through third countries with questionable approaches to privacy, such as China, without user knowledge or consent; and finally,
  • Features such as unverified users’ abilities to share their screen and change their names mid-meeting were turned on by default, leading to the Zoombombing of meetings – including one which I had the misfortune to be in – with stomach-churning videos of child sexual abuse.
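To make the meeting-ID problem concrete, here is a minimal sketch in Python of the difference in search space between a short numeric ID and a high-entropy random token. The function names are my own illustration and do not reflect Zoom’s actual implementation.

```python
# Illustrative only: comparing the guessability of short numeric meeting IDs
# with high-entropy random tokens. Nothing here is Zoom's real code.

import secrets

def numeric_meeting_id(digits: int = 9) -> str:
    # A 9-digit numeric ID has at most 10**9 possibilities: a space small
    # enough for attackers to scan ("war-dial") for open meetings.
    return str(secrets.randbelow(10 ** digits)).zfill(digits)

def high_entropy_meeting_id(nbytes: int = 16) -> str:
    # 16 random bytes give 2**128 possibilities, far beyond brute-force
    # scanning; even so, an unguessable ID is no substitute for a passcode
    # and a waiting room.
    return secrets.token_urlsafe(nbytes)

print(numeric_meeting_id())       # e.g. "042519377"
print(high_entropy_meeting_id())  # e.g. "Xq3mJ9vR0aZC5wH2kN8pLg"
```

An unguessable identifier only raises the cost of discovering a meeting; it does not authenticate the people who join it, which is why the permissive defaults in the final bullet above mattered just as much.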

Any one of these errors would be the end of a startup, and all of them together, in normal times, would be the subject of a Parliamentary enquiry if not a criminal investigation; were it not for the essential role the platform is playing during the pandemic, privacy regulators and public prosecutors would have eaten it alive. In response to the fully warranted bad publicity, the company’s CEO told the Wall Street Journal that “[W]e need to slow down and think about privacy and security first…that’s our new culture.” Yet all of those issues, which occurred on his watch, were completely unnecessary and totally preventable, and a true privacy-and-security-first approach would have made sure of that.

(Note carefully how the decision-maker ultimately responsible for the mistakes gets to ‘ethics wash’ them into a personally inspirational leadership journey, while those of us on the receiving end of the mistakes struggle to unsee the horrific imagery they permitted. ‘Twas ever thus.)

It should not have to be that way for anyone else. While most of you will not be working on platforms of that scale, or on applications which will become household names overnight, you will all have to consider the privacy implications of the work you create, the work you rely on, or both.

For once, the law is actually quite clear here: privacy and data protection regulations have always made provision for data sharing in the interest of public health or a substantial public interest, such as a national emergency, and few would argue we are not in one. It is what happens with that data outside clinical settings that poses the problem. Protecting our vital health data from scope creep, which could easily see it exploited for mass surveillance, discrimination against the vulnerable, or the creation of a protected class with special privileges, will be the true test of your commitment to privacy.

Your challenge lies in the systems, software, and applications you will create, contribute to, and use for:

  • Contact tracing to alert people that they have been near someone with an active infection;
  • Testing and responder capacity to make sure the right resources reach the right people;
  • Early warning and surveillance to see the second and third waves coming;
  • Quarantine and social control to safeguard the most vulnerable; and, of course
  • Research towards a vaccine and a cure.

The first category is perhaps the most personally relevant. For most of you, contact tracing apps and health monitoring systems will become not just a part of your daily routines, but essential to your ability to travel and work. Many of you will be given no choice about using these apps and uploading your personal data, as well as data about your closest family and friends. The rights you are given in exchange for that data will depend on algorithms created by private companies with questionable track records on privacy. And the potential consequences of the abuse of your data and rights, whether you’re disabled and living in isolation or going to work sick because you have no health insurance, are unfathomable.

If the privacy risks which vulnerable people live with every day have never hit home for you, consider this a teachable moment: we are all vulnerable now. Your health, and your human rights, can change in a matter of hours, and control over them can be taken out of your hands.

So how do we, as the makers of the web, respond to a challenge which not all of us will survive?

Developing for the Coronavirus

The pandemic, by all scientific consensus, will be with us for the foreseeable future. Out of that chaos comes opportunity, so we must try to look at the pandemic as a means of advancing user-centric privacy technology in the right way. There is a balance to be struck between protecting our health, safeguarding the vulnerable, resourcing our health care systems, and keeping our societies running smoothly, and striking it should not mean trading away our privacy, our civil liberties, or our human rights. And if you are called upon to participate in the creation of pandemic technology, whether you are a project manager, a developer, or a designer, you have a vital role to play in achieving that balance.

The approach you take to ensuring user privacy during the pandemic will depend largely on the legal context which exists around you. If you live in a country which views privacy as a legally upheld human right, you will take a broader view of the issues in play than someone working in a country where privacy is a function of contractual terms and conditions.

Likewise, if you live in a country with a stable tradition of the rule of law, you may be inclined to develop based on personal presumptions about the way society works, presumptions which people living in other countries have never been able to make. Your challenge, as I have said throughout this book, is to do everything you can to safeguard user privacy and maintain a healthy regard for users’ human rights, regardless of the presence or absence of privacy legislation, and to do so with an understanding of how high the stakes are if you get it wrong.

If you do not have a rights-based privacy law to shape the constraints of your work, or if you work in an environment where the coronavirus is being exploited as a means of targeting vulnerable groups for mistreatment, your approach to developing for the pandemic should, at the absolute minimum, adhere to these guidelines, which we discussed at the beginning of this book:

  • Data minimisation: the data collected by the application or system must be limited to the smallest amount of data possible and not aggregated with any other information. When data about an infected person is shared with a contact, that data must be the minimum amount possible, it must never identify the person, and the data given must never put the person at risk of retaliation or harm.
  • Purpose minimisation: the data collected should only be used for the purpose of managing public health during the pandemic and not shared with third parties for any reason, whether that is marketing (even for medical treatments) or, and for heaven’s sake please don’t do this, a social media SDK.
  • Lifecycle limitation: the data must be deleted as soon as it is no longer needed, even if the pandemic is ongoing. Any public health authority with access to that data must also delete it when it is no longer necessary or useful. No personally identifiable data should be retained by any third party for future uses. And when this nightmare is over, every scrap of data held within the apps and in the cloud should be deleted, along with the apps themselves.
  • Information, technical, and security measures: the applications and systems should be built with the highest consideration for security. Decentralisation is key. De-identification, random identifiers, and end-to-end encryption should also be considered (see the sketch after this list). Careful consideration must be given to the need to balance privacy and security; for example, biometric identifiers should not be used to verify a person’s identity within a contact tracing app if that data could be aggregated with other data within the app to identify them to others. Thought should also be given to how these systems can, and will, be abused, such as through false positives, scams, or DDoS attacks, and how to mitigate those possibilities as much as possible.
  • Transparency and notice: all applications and systems, and the health services which provide them, must provide full disclosure of what data is collected, how it is used, who has access to it, how long it will be held, and what rights the person has over it. When data is shared with any party for any reason, it must be within the scope and constraints of the rule of law, and this basis must be made clear to the person in non-threatening language they can understand. Additionally, the applications and systems themselves should be open source, and should be hosted in projects with fully transparent governance, including full disclosure of who funds the projects and who makes decisions about them.
  • Choice, control, and consent: the data about a person’s health should be owned by the person and kept under their control. They should be given a choice about where that data lives, which in this context means decentralisation: if they want their data never to leave their device, that wish must be respected. The person’s data must not be shared with people they know, or people they don’t know, without consent. The data collection, and the applications themselves, should be used voluntarily and with full consent, which must be active, opted into, and given with full understanding. And it must be possible, as with all good data protection practice, for the user to revoke their consent for any or all uses of their data at any time.
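To make those principles concrete, here is a minimal sketch in Python of how a decentralised contact-tracing client might apply them: identifiers are random and rotate frequently, everything stays on the device, data is purged after a fixed retention window, and revoking consent wipes it all. The class, method, and constant names are my own illustration, loosely modelled on published decentralised proposals rather than on any particular protocol or vendor’s API.

```python
# Illustrative only: a decentralised contact-tracing client applying the
# principles above. Names and structure are hypothetical, not a real protocol.

import os
import hashlib
from datetime import datetime, timedelta

RETENTION_DAYS = 14          # lifecycle limitation: keep nothing longer than needed
ROTATION_SECONDS = 15 * 60   # rotate identifiers so observers cannot link them

class ContactTracingClient:
    def __init__(self) -> None:
        # All state lives on the device; nothing is uploaded by default.
        self.daily_keys = {}   # date -> random 16-byte key (data minimisation)
        self.observed = []     # (timestamp, identifier) pairs heard nearby

    def _daily_key(self, now: datetime) -> bytes:
        # The only "identity" is a random key, generated locally each day.
        key = self.daily_keys.get(now.date())
        if key is None:
            key = os.urandom(16)
            self.daily_keys[now.date()] = key
        return key

    def current_identifier(self, now: datetime) -> bytes:
        # Derive a short-lived broadcast identifier from the daily key and the
        # current time slot, so individual broadcasts cannot be linked together.
        slot = int(now.timestamp()) // ROTATION_SECONDS
        material = self._daily_key(now) + slot.to_bytes(8, "big")
        return hashlib.sha256(material).digest()[:16]

    def record_contact(self, identifier: bytes, now: datetime) -> None:
        # Store only what is needed to warn the user later: a time and a token.
        self.observed.append((now, identifier))

    def purge_expired(self, now: datetime) -> None:
        # Lifecycle limitation: delete everything older than the retention window.
        cutoff = now - timedelta(days=RETENTION_DAYS)
        self.daily_keys = {d: k for d, k in self.daily_keys.items() if d >= cutoff.date()}
        self.observed = [(t, i) for t, i in self.observed if t >= cutoff]

    def revoke_consent(self) -> None:
        # Choice and control: revoking consent wipes all local data immediately.
        self.daily_keys.clear()
        self.observed.clear()
```

In a design like this, the daily keys would leave the device only if a user who tests positive explicitly consents to publish them; other devices could then recompute the corresponding identifiers locally and compare them against what they have heard, so no central register of who met whom ever needs to exist.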

As helpful as our fundamental privacy values may be, those principles alone cannot safeguard the rights and freedoms of the people in the data, nor can they provide them with the confidence that public health systems are working in their best interest.

That trust matters, and it works both ways.

After all, if technology is meant to be our way out of the pandemic, then the data people share through it must be provided with honesty and confidence. People will not share their health data if they fear consequences, punishment, or discrimination for providing the “wrong” answers, or if the systems are patronisingly presented as a game to see who gets a trophy for sharing the most information.

Likewise, if citizens view mandatory apps as a cynical data grab for private health insurance companies, marketing firms, brand influencers, and funeral plan vultures, their confidence in both the apps and the public health systems providing them will plummet.

And without a caring regard for people who can’t afford the latest smartphones, or who are not able to use the ones they do have, making their rights dependent on technology they have no ability to access crosses the line from digital exclusion to social cleansing.

Protecting the people in the data

Clearly, we all need to get this right for everyone, including our families, our friends, and ourselves. So if you are able to commit to further safeguards for user privacy throughout the pandemic, whether that is through your own work on an everyday level or on a wider stage through proposing national legislation, a comprehensive model exists to show you the way.

As of this writing, a draft Coronavirus Safeguards Bill has been proposed in the UK to ensure the conditions which must exist to protect the civil liberties and human rights of users, above and beyond the existing privacy safeguards discussed above, in the event that contact tracing, early warning, or quarantine apps are made mandatory. The safety net within its suggested provisions integrates privacy law, data protection best practices, human rights principles, and, sadly, the hard lessons learnt from actual experiences around the world during the early weeks of the pandemic.

These proposed safeguards, which are currently falling victim to politics, include:

  • No sanctions: Nobody should be sanctioned for failing to install a contact tracing app, use it, or have their phone on their person at all times. This provision could include a ban on sanctions such as civil penalties, criminal charges, being fired from employment, losing the right to vote, or ineligibility for public assistance. This provision would also safeguard people who do not own a smartphone, cannot afford one compatible with the app, cannot afford mobile data, or do not wish to have Bluetooth turned on at all times.
  • DPIAs: Any contact tracing app must have a full Data Protection Impact Assessment (DPIA), which must be made public for consultation and feedback.
  • No gamification and the right to privacy: There must be no requirement for people to install a contact tracing app, read the messages on it, contribute data to it, or keep it on their phones (that is, not delete it). Gamification has no place in public health. If installations are to be made mandatory, there must be a strict privacy and human rights basis, within the rule of law, established in the regulations which mandate them. Additionally, children over 13 should have the right to veto parental insistence on their using the app. Privacy, after all, is the right to be left alone, and most teenagers are experts at this already.
  • Not a business opportunity: Any personal data collected and used by a contact tracing app, whether it is about the person or the people they have come into contact with, must not be used, shared, or processed by any party for any purpose other than the health system’s management of the pandemic.
  • No travel passports: Immunity certificates – app-based or otherwise – must not be made a mandatory condition for leaving the home, using public spaces, or taking public or private transport, and businesses must not be permitted to demand them. (As dark as the thought is, the draft Bill’s authors have also been mindful here of the potential physical risks to a person known to be walking around freely with a coveted immunity certificate on an expensive smartphone.)
  • Protected status: Because of the risks of discrimination and abuse, “coronavirus status” must be made a protected condition – in privacy terms, a special category of data – akin to sexual orientation or religion. Coronavirus cannot create a caste system of worthy and unworthy people.
  • Oversight: Finally, there should be a new independent regulatory body, similar to an equal rights commission, to conduct oversight, provide guidance, and receive complaints about violations of privacy and human rights related to the coronavirus. Systems are only as good as the people who build them, the leaders who direct them, and the watchdogs who keep them in check. No one’s health, or their human rights, should depend on a black box accountable to no one.

The UK’s draft Coronavirus Safeguards Bill is a way to make sure we all do better through common-sense provisions protected by the rule of law. Its model should be followed in more ways than one. For just as the pandemic will accelerate years’ worth of technological progress into a few short months, it should also accelerate years of stalled progress on privacy legislation, particularly in the United States. There could be no worse time for American developers – who, unlike their European counterparts, lack both an omnibus privacy law as well as centralised national public health systems – to be making up the rules as they go. Nor is this a time for the tech giants who are developing coronavirus apps to become America’s de facto privacy regulators.

So as stressful as these times are, do not let the pandemic stop you from fighting for a better web, for both good times and for bad. If you are in a position to contribute to the debates about Federal privacy legislation, or to seek to introduce provisions similar to those proposed in the draft Coronavirus Safeguards Bill, don’t wait to be asked. Do it, and do it today.

And if a service or application you are requested to use does not meet the safeguards I’ve discussed above, regardless of the presence or absence of a legal framework, be brave enough to call them out. You owe it to yourself, and you owe it to the people in the data who need your voice to protect them.

The Coronavirus pandemic is the darkest time that many of us will ever know, and it will change all of us forever. The technology we build in response must shine a light on our way out of the darkness, and not pull us further into it. None of us have a map showing how to get there. But I hope this book provides a torch.

9 thoughts on “Book excerpt: Developing for privacy in the pandemic”

  1. Heather, thank you for sharing this torch. Hope even non-web-savvy people will take note of it and try to read at least this excerpt. I am definitely sharing… 🤞
    Looking forward to the full version of the book. Thanks for all your hard work getting this info out there.

      • Just the thought of not seeing you at a WC again makes me super sad. But will definitely let you know – one day when I’m in your neck of the woods – for a quick catch up! 🙂

  2. Hi Heather,

    Thanks for sharing, I look forward to the book. I will be sharing this far and wide; it covers a huge number of things I seem to be talking to people about at the moment. It is the people who are on ‘lists’ at the moment who don’t really understand how they might get the chance to remove themselves from them; some of the more aggressive lists have been made without any kind of permission and are affecting people I know in fairly horrible situations. Anyway, I will share. Thank you.

    • Thank you friend. My personal experiences with being put on “lists” because of other people’s actions (you’ll remember them) have informed a lot of the energy I’ve put into this book. Perhaps that’s why it’s been such an ordeal to write.

  3. Hi Heather, most thought provoking, I agree with your decision to publish it now, before the book, and circulate it widely. I just hope it’s seen and makes an impact. You stay safe now, this virus is no less virulent than it’s ever been.

  4. Hi Heather. Thank you for this excerpt and I look forward to purchasing the book when it is published. It is particularly useful as I am writing a dissertation on privacy as part of my studies towards an LLM in international law. Would it be OK with you if I use some of your material, properly cited, as part of my work?

    • Hi Les, Thank you for the kind words. You can certainly cite the material; you can always update the citation with the book’s formal details once it’s published.
