As we move into the next stages of dealing with the current crisis, NHSX has developed a COVID-19 contact-tracing app that is supposed to form an integral part of the government’s new ‘test, track and trace’ strategy.

The decision to go with a bespoke centralised model rather than work with Apple and Google on a decentralised one has raised a fair amount of concern, with particular attention paid to the infringements on individual privacy it brings.

On the surface this model may seem rational, though not ideal. However, when taken in a wider context, this app is the latest in a line of inefficient technological initiatives from the government which do nothing but expand the surveillance state.

The main point of contention around the app comes down to the decision to use a centralised database rather than the decentralised model championed by Apple and Google and adopted by the majority of European countries.

In a centralised system, should you report symptoms (the UK has chosen self-diagnosis rather than test results as the key input), all the data your app has collected, namely the identification numbers of other app-carrying phones you have been in contact with, is sent to NHSX (the digital arm of the NHS), where an algorithm is run to determine whether anyone you have come into contact with is at risk and needs to be alerted.

In a decentralised model, no contact data is sent to the NHS. Instead, should you report symptoms, only your phone’s own anonymous identifiers are published; the matching and risk assessment are carried out locally on each device, and notifications are sent based on those calculations.
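For readers who want to see the difference concretely, here is a minimal sketch of where the exposure matching happens in each model. It is illustrative only: all names and data structures are invented for this example, and the real apps use rotating Bluetooth identifiers, cryptographic keys and far more sophisticated risk scoring.

```python
# Illustrative sketch only: invented names and data structures.
# Real apps use rotating Bluetooth identifiers and cryptographic keys.

# --- Centralised model (NHSX-style) ---
# A symptom report uploads the phone's whole contact log, so the
# server sees the contact graph and decides who to alert.
central_db = []

def report_symptoms_centralised(contact_log):
    central_db.append(set(contact_log))   # server now knows who met whom
    return set(contact_log)               # server-side: these IDs get alerted

# --- Decentralised model (Apple/Google-style) ---
# A symptom report publishes only the reporter's own anonymous IDs;
# each phone downloads the list and does the matching itself.
published_ids = set()

def report_symptoms_decentralised(my_ids):
    published_ids.update(my_ids)          # no contact graph leaves the phone

def check_exposure_locally(contact_log):
    # Matching happens on-device; the server never learns your contacts.
    return bool(set(contact_log) & published_ids)

# Example: Alice broadcasts ID "a1" and later reports symptoms.
report_symptoms_decentralised({"a1"})
print(check_exposure_locally({"a1", "c3"}))   # True: Bob is alerted locally
```

The privacy difference falls straight out of the sketch: in the centralised version the server accumulates a social graph of who met whom, while in the decentralised version it only ever sees the anonymous identifiers of people who have reported symptoms.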

Here, the general trade-off is between efficacy and privacy. A centralised model is supposed to enable more thorough epidemiological study, but requires surrendering more personal information. However, this observation doesn’t do justice to many of the issues facing NHSX’s latest technological development.

The first and simplest problem with this system is that it is not anonymous, despite what the government may say. An analysis of the app carried out by Michael Veale, a lecturer in digital rights and regulation, showed that it is not compatible with data protection laws, as the identification numbers can still be linked to individual devices, and ‘a centralised system is always a tiny step away from identification’.

The second and more worrying problem is the potential for mission creep. One of the few benefits of a centralised system is that it is easier to expand beyond its original purpose. The Chief Executive of NHSX, Matthew Gould, has already said it ‘should give people the opportunity to offer more data if they wish to do so’, such as the location at which two phones came into contact.

Location data would initially be opt-in, but should this data prove useful, would it stay that way? Not to mention that, with the app requiring an uptake of around 60% to work optimally, societal pressure could make the option to refuse difficult.

Or, worse still, what if your employer says you cannot return to work without the app? Even if the app never develops beyond its current structure, there are still fears that the infamous Investigatory Powers Act could allow ‘people with symptoms or a diagnosis of Coronavirus to be tracked without notice’.

India has already demonstrated this kind of labour-market intrusion in action. Its equivalent tracing app, known as Aarogya Setu, has been made mandatory for citizens of many states, with hospitals even denying treatment to patients who refuse to download it. Given the high level of uptake required for a successful test and trace system, governments may see this kind of disciplining effect as necessary to guarantee success.

Thirdly, the NHSX app simply is not effective enough to justify an invasion of our privacy. The app faces a number of issues: it cannot work internationally (giving it poor ‘interoperability’), and it must keep pace with phone operating system updates. Most difficult of all, the app must be constantly running in the foreground, meaning it needs an unlocked phone with the app open in order to work.

Not only is having your phone unlocked whilst outside risky, but for the app to be effective it also requires a shift in the way we all use our phones. This is not a problem for a decentralised model, which could be built as a background process in the phone’s operating system. Such a level of ineffectiveness has prompted the government to actually look into adopting the Apple-Google model.

This is not the only tech-focused expansion of state surveillance that is not effective enough to justify its existence. In the last year, the use of live facial recognition has begun, for example at the Stratford Centre in London. These systems work by scanning every passer-by and checking them against a database of facial structures.

Whilst there are warning signs that this is taking place, you must already be in the cameras’ view in order to read them. A justifiable case could be made for this invasion of privacy were it to work, but a study from the University of Essex found live facial recognition to be verifiably accurate in just 19% of cases. Worse still, US research shows that black faces suffer the highest false-match rates, exacerbating an existing problem of trust between minority communities and the police.

Ineffectual tech surveillance goes back further still. Take the Investigatory Powers Act 2016, a piece of legislation pushed through whilst Theresa May was Home Secretary and dubbed the ‘Snoopers’ Charter’.

This Act gave a whopping 48 government bodies, including such curious choices as the Food Standards Agency, the ability to obtain your internet connection records from your internet service provider without a warrant (unless you happen to be an MP or a Lord).

The problem with this sort of sweeping legislation is that it pushes the criminal targets of the act onto the practically untraceable dark web, leaving only the average citizen at the mercy of potential technological abuse in the future.

With the terrifying fallout of the pandemic fresh in our minds, many of us may willingly hand over our human experiences in the form of data in exchange for ‘security’. After all, we never know when the next pandemic will come. It is this fragility that may be exploited by companies and states wishing to obtain personal data for reasons other than our own protection.

Whether we can tolerate intrusions into civil liberties in the name of collective protection is a huge and separate debate. But if we are to give up some of our privacy, or any other right, let us at least give it up for something that works.
