Vaccine Passports Could Be a Dystopian Nightmare
What seems like a sensible measure could be a slippery slope to a social credit system.
by James Meadway
19 March 2021
With the vaccine rollout well underway and summer fast approaching, the government’s attention has shifted towards the implementation of vaccine passports.
Under pressure from countries like Greece, which are desperate for tourist revenues, the European Commission is set to bring forward legislation for the introduction of a “digital green certificate” vaccine passport to allow travel inside the continent this summer. A growing number of countries have either brought in or are set to introduce versions of the same tool, whilst British Airways is amongst the travel companies set to introduce their own version of passporting in May. The UK government is funding eight different research projects on vaccine passporting, and, more recently, health secretary Matt Hancock has indicated that it is “working with international partners” on the development of certification.
Despite the push to implement this new technology, over 200,000 people have signed a petition urging the government not to introduce a vaccine passport, triggering a parliamentary debate earlier this week. The anti-lockdown Reform Party, a mutation of the Brexit Party, is campaigning against its introduction in Wales. Meanwhile, in France, President Macron has strongly implied he will look to block an EU-wide certification process.
While some of this opposition is, no doubt, motivated by anti-science vaccine scepticism – which can and should be robustly dismissed – concerns about the use of certificates, rather than the vaccine itself, should be taken seriously. Indeed, there is a real danger that embracing this new technology without taking the time to seriously consider the problems it poses could lead to profound infringements on our freedom far beyond this pandemic.
With vaccine uptake varying across different parts of the population – not to mention that some people have been advised not to take the vaccine at all on health grounds – the introduction of the passport risks creating a twin-track society: one part certified and allowed to participate, the other not. Similarly, if a smartphone is required for certification, the 16% of the population who do not own one – a group that is overwhelmingly older and poorer – risk exclusion from social life.
A new report from the Ada Lovelace Institute suggests that the potential dangers from vaccine certification are far more widespread. It states that we simply do not know enough about the impact of vaccines on the transmission of the virus. This means that we don’t know if one person being vaccinated reduces risk for others, or by how much. The report also flags the significant dangers to privacy and security that could arise from cobbling together such a system so quickly.
If transmission is not substantially reduced by the vaccines, or if (as is more likely) we enter a sort of vaccine arms race against further mutations of the virus, it is unlikely that we will be able to return to much of the pre-coronavirus world any time soon. This creates a strong public health incentive to try to regulate people’s personal behaviour to stop cases from rising – as we’ve seen over the last 12 months, with constant social distancing measures and, when cases surge, the imposition of lockdowns.
Since we now have a better idea of the levels of risk different kinds of activities carry, there is also a public health incentive to control the specific things we do. This is reflected in England’s lockdown exit plan, which starts by allowing lower-risk meet-ups outside – well before, for example, the opening of bars and restaurants, which carry a higher risk. In this way, there is a solid medical incentive to log behaviour.
There is already a “backwards-looking” version of this in contact tracing – where someone known to be infected has their social contacts traced so they can be told to self-isolate. Vaccine passports, by contrast, introduce a technology that logs one kind of behaviour in advance, for medical purposes – in this case, taking a vaccine. Instead of a medical characteristic being logged (like an immunity certificate flagging an antibody count), the vaccine passport flags a behavioural characteristic.
Of course, the logging of a vaccine receipt is not too far away from logging any other kind of behaviour. What’s more, there are plausible medical reasons for wanting to do so. For example, given what we now know about the virus and how it is transmitted, it is not too hard to envisage a system flagging some forms of risky activity – visiting a nightclub, say – and factoring them into a personalised risk rating system, providing a personal score for your current (assumed) Covid-19 risk. That rating could be altered to reflect what you, personally, were known to have been doing and where you had been.
Researchers at Oxford University have already developed a beta version of a personalised coronavirus risk calculator, which gives an individual estimate of a person’s chance of dying from Covid-19 in the next 90 days based on their personal characteristics – like pre-existing health conditions, or age. NHSX, the NHS’s in-house innovation arm, was working on a “health status” addition to its contact tracing app before abandoning the app’s development in favour of the framework built by Apple and Google.
A behavioural extension of a personalised risk rating system would see a person’s activities logged over the course of, say, 28 days, after which a score would be calculated based on the presumed riskiness of their behaviour. This, in turn, could give someone providing a service – whether running a pub or a theatre – a guide to the risk involved in allowing a specific individual to participate, backing up the use of mass testing.
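To make the mechanics concrete, here is a minimal sketch (in Python) of what such a rolling-window behaviour score could look like. Everything in it is invented for illustration – the activity categories, the weights, the 28-day window and the linear recency decay come from no real passporting scheme or epidemiological model.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Invented riskiness weights per activity type -- purely illustrative,
# not drawn from any real scheme.
ACTIVITY_RISK_WEIGHTS = {
    "outdoor_meetup": 1,
    "restaurant": 3,
    "gym": 4,
    "nightclub": 8,
}


@dataclass
class LoggedActivity:
    day: date
    kind: str  # expected to be a key of ACTIVITY_RISK_WEIGHTS


def behaviour_risk_score(activities, today=None, window_days=28):
    """Sum the weights of activities logged in the last `window_days`,
    counting recent activity more heavily than older activity."""
    today = today or date.today()
    cutoff = today - timedelta(days=window_days)
    score = 0.0
    for activity in activities:
        if activity.day < cutoff:
            continue  # outside the rolling window, ignored
        age_days = (today - activity.day).days
        recency = 1.0 - age_days / window_days  # 1.0 today, ~0 at the window's edge
        score += ACTIVITY_RISK_WEIGHTS.get(activity.kind, 0) * recency
    return round(score, 2)


# A venue or insurer could, in principle, compare this number to a threshold.
log = [
    LoggedActivity(date.today() - timedelta(days=2), "nightclub"),
    LoggedActivity(date.today() - timedelta(days=10), "restaurant"),
    LoggedActivity(date.today() - timedelta(days=40), "gym"),  # too old, ignored
]
print(behaviour_risk_score(log))
```

The point is not the arithmetic, which is trivial, but that once activities are logged centrally, producing and acting on a score like this takes only a few lines of code.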
While this may seem unsettling, in practice we already accept a version of this behaviour-scoring mechanism, primarily through the credit rating system, which aims to provide lenders with a summary indicator of a person’s creditworthiness based on their past behaviour. Credit is regularly denied to those assumed to be high-risk. And the practice is spreading in Britain: the Canopy app creates a “RentPassport” that logs tenant behaviour that aligns with landlords’ interests (such as paying rent on time), allowing tenants to build up a score over time. The problems arise when people are unable, through no fault of their own, to meet the behavioural standards of the app: credit-scoring systems are notoriously discriminatory.
The principle in personalised health risk scoring would be similar: using personal history as a guide to the current risk posed by a specific person. This health rating can also become financialised. While some insurers already offer reduced fees for regular gym-goers, a risk-scoring system could be marketed as providing privileged access to some activities. For coronavirus, it could even mean that those whose behaviour is deemed healthier are offered reduced insurance fees, given the known long-term consequences of the virus in some cases – the illness that has become known as “long Covid”, for example, has debilitating (and costly) effects.
As a result, there is a significant commercial incentive for insurers and similar financial companies to try to personalise risk ratings, since doing so creates a far bigger market for insurance products: if the risk of a specific individual can be evaluated, it is possible to insure against those risks at a profit. The more individuals evaluated, and the more comprehensively they are assessed, the more insurance and related financial products can be sold.
But where digital risk certificates would differ from conventional credit scoring is in their immediate use of the government as a validation method. Instead of having a private operator construct an indicator – as credit scorers do – the expectation with vaccination certificates is that only the government can be sufficiently trusted to provide a valid pass. Because the government is trusted, the value of a government-registered score is far higher than that of even a fairly sophisticated private register, like existing credit scores. And because the government can sit in the middle of many different databases, it can choose to integrate them.
This is precisely the point at which similarities to something more like a social credit system – whereby particular kinds of behaviour are logged by a government-backed scheme – start to appear. China’s social credit systems are well-known. In 2014, eight pilot schemes were launched by private providers working with the Chinese government, with the aim of creating a nationally-integrated system by 2020 – a goal that has yet to be achieved. Meanwhile, Ant Financial’s opt-in Sesame Credit scheme extends the idea of credit scoring into other parts of life: playing video games marks your score down, for instance, whilst being a parent marks it up. (One popular dating app allows users to display their score on their profile; 90 million have opted to do so.)
Companies and government bodies are also covered by the systems, which, according to figures from the People’s Bank of China, “covered 1.02 billion individuals and 28.34 million companies and organisations by the end of 2019”. Sanctions can be applied through the system: individuals with low scores may find they are barred from buying train tickets, whilst government officials from low-scoring institutions can be blocked from “buying tickets above economy class on flights… [and] prohibited from racking up large bills at high-end hotels, night clubs and golf courses.” More seriously, there are reports of critical journalists finding themselves quietly blacklisted from buying travel tickets.
While it is true that we don’t know how the emerging infrastructure of digital certification will be used, as the Ada Lovelace report warns, “the pandemic is creating the conditions for an acceleration of individualised risk scoring… deliver[ing] significant power to the controllers of that infrastructure that will be hard to dissolve or dismantle post-crisis.” The danger is that we are creating both the incentives and the opportunities to allow its use in new ways.
How it started: ‘Once powers are yielded to the state at moments of crisis or emergency, it’s very rarely the case that the state hands them back.’
How it’s going: @michaelgove to lead #vaccinepassport review.
📨Email your MP to #StopVaccinePassports! https://t.co/PVWIyjDPro
— Big Brother Watch (@BigBrotherWatch) March 9, 2021
We have all already taken part in a decade-long experiment in trusting new data collection agencies with our personal data. And judging by how much information we’ve handed over, “function creep” – when the introduction of a technology for one purpose leads gradually to its use in others – looks highly likely in coronavirus data collection. Big tech business models pretty much demand the accumulation of ever greater volumes of personal data; function creep is all but built into how they operate.
And whilst governments, on the whole, may not operate to quite such obviously commercial considerations, function creep is a significant possibility for digital vaccine certification once the principle of officially validated behaviour is opened up. Most proposals for vaccine certification, including the one put forward by the European Commission, already move to include additional health information alongside vaccination status. This is, according to the Commission, in order to guard against the risk of discrimination, but clearly, some function creep is already happening.
Throw in the likelihood that the task of developing the system will be subcontracted to the private sector – as we are already seeing with quarantine certification in the UK – with external companies both operating the scheme and, potentially, processing the data collected, and the commercial incentive to expand data collection becomes disturbingly apparent.
Even so, the practical benefits of a vaccine certificate, even an imperfect one, are likely to generate demand for its use – both from the public, who want a return to some kind of pre-coronavirus normality, and from those trying to run commercial operations that struggle under social distancing and other health measures. The Music Venue Trust, for example, is among those already lobbying for certification.
If a vaccine certificate allows some social situations to be de-risked enough to operate, then we are very likely to end up in a situation where these certificates not only exist, but where the principle of behaviour monitoring is readily accepted as a fact of social life.
All previous experience with data technology suggests that people will very happily trade their privacy for a more varied social existence. Indeed, Facebook is only able to exist because of where most people sit on this trade-off: heavily leaning towards allowing faceless corporations a great deal of access to their personal data. Moreover, Covid-19 has seemingly imposed a very stark choice: do we want to return to some semblance of normality at the expense of surveillance, or would we rather carry on with this half-life? It’s pretty easy to predict what people would choose given the options.
In the last fifteen years, we have collectively sleepwalked into a new set of technologies built around the collection and processing of personal data, with strikingly little thought as to the consequences. Finding ourselves now at a similar juncture, it is therefore absolutely vital that we heed the Ada Lovelace Institute’s recommendation for a thorough public discussion of public health certification.
Beyond that, given that there are at least some incentives for function creep in vaccine passporting, implementing robust controls on the collection and use of data will be essential. This would, ideally, mean not only public consent, but a clear legal framework – providing, for example, for legally-protected and collectively-managed “data trusts”.
While choosing the quick-fix solution to lockdown is no doubt tempting, the implications of doing so could be huge. We must therefore fight to retain the maximum possible control over what data is collected, and for what purposes. Our freedom depends on it.
James Meadway is an economist and Novara Media columnist.