Why trust is key to the user adoption of the NHS contact-tracing app

  • 9 June 2020
Using Bluetooth, the app will work continuously in the background of a person’s phone

In his latest column, our cyber-security expert, Davey Winder, delves into the NHS contact-tracing app, examining the issue of user adoption and why trust is key.

The technology of the NHSX tracking app appears pretty simple if you strip it back to bare basics: Bluetooth detects the other devices you are near, and so the people using them.

This information is kept on your phone, and theirs, for 28 days. If a user then inputs data suggesting they have Covid-19 symptoms, the information is sent to the NHS and a ‘risk algorithm’ determines who, if anybody, needs to be notified about having been in contact with you.
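
To make that flow concrete, here’s a minimal sketch in Python of how a contact log and notification step of this kind might hang together. It is purely illustrative: the 28-day pruning mirrors the retention period above, but the class names, signal-strength threshold and scoring rule are my own stand-ins, not the NHSX implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import List

RETENTION = timedelta(days=28)  # contacts older than this are discarded

@dataclass
class Contact:
    ephemeral_id: str   # anonymised ID broadcast by the other phone
    seen_at: datetime
    rssi: int           # Bluetooth signal strength, a rough proxy for proximity

@dataclass
class ContactLog:
    contacts: List[Contact] = field(default_factory=list)

    def record(self, contact: Contact) -> None:
        """Store a sighting of a nearby device; everything stays on the phone."""
        self.contacts.append(contact)

    def prune(self, now: datetime) -> None:
        """Drop anything older than the 28-day retention window."""
        self.contacts = [c for c in self.contacts if now - c.seen_at < RETENTION]

    def submit_if_symptomatic(self, has_symptoms: bool) -> List[Contact]:
        """Nothing leaves the device unless the user reports symptoms."""
        return list(self.contacts) if has_symptoms else []

def risk_algorithm(contacts: List[Contact]) -> List[str]:
    """Server-side placeholder: flag contacts whose signal suggests close proximity."""
    return [c.ephemeral_id for c in contacts if c.rssi > -65]
```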

Importantly, device IDs are anonymised from the NHSX system’s perspective, so they cannot be used to link a person with a place or a device.

IP addresses aren’t seen by the NHSX system itself, but by a separate ‘front end’ that deals with the cybersecurity side of things and has all the security controls you’d imagine attached.

Perfect world

One of the key security elements is encryption. Because ephemeral encrypted IDs are generated within the app, not on the central server, the security posture remains relatively strong: a server compromise doesn’t then automatically mean the privacy gig is up.
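
As a rough illustration of why on-device generation matters, here’s a minimal sketch of one way ephemeral IDs can be derived locally from a secret that never leaves the handset. The actual NHSX design involves an authority public key and differs in detail, so treat this as the general idea rather than the real protocol.

```python
import hashlib
import hmac
import os
import time
from typing import Optional

ROTATION_SECONDS = 24 * 60 * 60   # assume the broadcast ID rotates daily

# The long-term secret is created on the phone and is never uploaded.
device_secret = os.urandom(32)

def ephemeral_id(secret: bytes, now: Optional[float] = None) -> str:
    """Derive the value broadcast over Bluetooth for the current time window."""
    window = int((now if now is not None else time.time()) // ROTATION_SECONDS)
    digest = hmac.new(secret, window.to_bytes(8, "big"), hashlib.sha256).digest()
    return digest[:16].hex()   # a short, rotating identifier with no obvious linkage

print(ephemeral_id(device_secret))
```

Because only the derived values are ever broadcast or uploaded, a compromised server alone gives an attacker nothing to link them back to a particular handset, so the posture holds.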

Or at least it should, in a perfect world. However, a world where a complex application is designed and coded in necessarily double-quick time, and yet will be rolled out on a scale that few app developers could ever imagine, is far from perfect.

Security researchers have already raised a number of concerns about the integrity of the cryptographic protocol to be used, among them that its security is undermined by encrypted and unencrypted data being transmitted side by side.

The most damning report found that “in the presence of an untrusted TLS server, the registration process does not properly guarantee either the integrity of the authority public key or the privacy of the shared secrets established at registration,” which impacts upon resistance to spoofing attacks for example.

It also means that the recovery of installation IDs without access to the private key might be facilitated thanks to “the storing and transmitting of unencrypted interaction logs,” the researchers reported.

That key recovery was also said to be made possible by the eight-second interval used to monitor interactions, which might create ‘signatures’ that could be combined with unencrypted submissions. The researchers were also concerned that the length of time ‘broadcast values’ persist could reveal lifestyle attributes about users. GCHQ’s National Cyber Security Centre (NCSC) seems to be taking these issues seriously, or has at least acknowledged them and hinted that some will be fixed while others are reviewed.

User adoption

Although this is by no means an ordinary development environment in these extraordinary times, Paul Farrington, EMEA CTO at Veracode, points out that the healthcare industry takes the longest time of any to fix security bugs, “with a median of 131 days going by until the issue is resolved.” And even if they were all addressed, Farrington says, “concerns will remain about the architecture being less conducive to protecting user privacy and the legal limits on agents of the Government misusing the data that is collected. That may prove to be a drag on user adoption, which is not in the interests of public health.”

Key to returning to normalcy?

And there lies the rub, and rub is a good word as there’s going to be a lot of friction here methinks. What we, as a population, are being asked to do is trust that everything will be secure. Simply saying ‘trust us’ really isn’t good enough when it comes to security and privacy issues.

It’s certainly not good enough when it comes to something as central to the return to some semblance of post-lockdown normality as the track and trace app has to be.

Especially as the UK isn’t following the decentralised model for such an app, but rather a centralised one. Simply put, the stuff gets stored in a central database rather than staying put on the device itself.
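
The difference is easiest to see side by side. The sketch below is a deliberately simplified comparison, with made-up function names and data structures, of where the matching happens in each model.

```python
from typing import Dict, Set

def decentralised_match(my_contacts: Set[str], published_positive_ids: Set[str]) -> bool:
    """Decentralised model: the phone downloads the IDs of confirmed cases and
    checks its own contact log locally; the log never leaves the device."""
    return bool(my_contacts & published_positive_ids)

def centralised_match(uploaded_contacts: Set[str],
                      registry: Dict[str, Set[str]]) -> Set[str]:
    """Centralised model: a symptomatic user's contact log is uploaded, and the
    server, holding a registry of everyone's broadcast IDs, decides who to notify."""
    return {user for user, their_ids in registry.items() if their_ids & uploaded_contacts}
```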

Without some legislation that dictates how this stuff, this data, can be used further down the road, things start getting sticky from the trust perspective.

As David Grout, CTO for EMEA at FireEye, says: “With concerns surrounding the usage of the data in the app, and what will happen to that data even after the pandemic, there needs to be an agreed time restriction on how long the data is collected for, and deletion rights which align with current data privacy regulations. Citizens should be made to feel in control of their data and reserve the right to have data deleted from the record once the crisis is over.”

Trust is key

A new study by researchers at Anomali does not make for easy reading when it comes to trust.

While it focused mainly on the potential for cybercriminals to use uncertainty about the app to launch phishing attacks (some 43% of those asked were concerned about this), it also confirmed what I’ve already said: trust is key.

A third of those asked were concerned about the government being able to track their whereabouts using the app, and more than a third also had concerns about the government collecting data about them.

OK, so if two-thirds were alright on the trust equation there’s no huge problem, right? Wrong. For the app to be effective there has to be a very high take-up amongst the UK population.

Anything that eats into that willingness to download, install and use the app goes beyond just being a public relations disaster; it undermines the whole point of the app in the first place: to control the spread of the disease and save lives.

“Experts suggest that for the UK as a whole, about 60 percent of the population needs to install and use the software for it to live up to its full potential,” David Grout says.

“The government is relying on a public buy-in for the project to work.”
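
Put those figures together and you can see how little headroom there is. The back-of-the-envelope sums below assume, pessimistically, that everyone with trust concerns simply declines to install the app.

```python
required_uptake = 0.60   # the threshold experts cite for the app to reach its potential
wary_share = 1 / 3       # roughly a third of Anomali's respondents had trust concerns

# If everyone with trust concerns sits it out, the best-case ceiling on uptake is:
ceiling = 1 - wary_share
headroom = ceiling - required_uptake
print(f"ceiling ~ {ceiling:.0%}, headroom over the 60% threshold ~ {headroom:.0%}")
# ceiling ~ 67%, headroom over the 60% threshold ~ 7%
```

Even on that crude reckoning, a modest further loss of trust pushes realistic uptake below the level the experts say is needed.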

1 Comment

  • Try this pair of syllogisms:
    1. To be trusted you have to be trustworthy.
       The NHS is demonstrably not trustworthy.
       Therefore the NHS will not be trusted.
    2. For the NHS App to be effective, the NHS needs to be trusted.
       The NHS will not be trusted.
       Therefore the NHS App will not be effective.

    So what is plan B?
