Google’s DeepMind building health data audit tool to counter transparency fears

9 March 2017
Google DeepMind's office in King's Cross, London.

DeepMind Health has moved to counter data privacy concerns by introducing an audit trail for the company’s access to NHS patient data.

Google’s artificial intelligence offshoot announced on Thursday that it will develop a digital ledger called ‘verifiable data audit’, which will give trusts the ability to see how the data is being processed in real time.

Mustafa Suleyman, DeepMind’s co-founder, told Digital Health News that the technology “should bring a level of transparency and oversight to the use and processing of health data that will improve the level of trust and accountability”.

The mechanism, which will be developed this year, will allow trusts to see not only when data is accessed, but also how and why that data is used.

In the blog post announcing the plan, Suleyman and Ben Laurie, DeepMind’s head of security and transparency, wrote: “we want to make that verifiable and auditable, in real-time, for the first time”.

DeepMind is forming an expanding list of partnerships with NHS trusts, both through its clinical alerting app, Streams, and its artificial intelligence research.

However, the company’s involvement in the NHS has been criticised by privacy advocates concerned about both the scope of, and the transparency around, DeepMind’s access to NHS patient data.

A New Scientist investigation in May last year reported that the agreement between the Royal Free London NHS Foundation Trust and DeepMind involved information on 1.6 million patients over five years.

This led to national media attention and a still-ongoing Information Commissioner’s Office investigation.

Notwithstanding these concerns, Royal Free extended the partnership in November last year, signing a new five-year deal, and Imperial College Healthcare NHS Trust signed up for the app in December.

DeepMind’s blog post compared the forthcoming audit mechanism to a blockchain, as the ledger would be append-only and allow third parties to verify that no one has tampered with the data.
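DeepMind has not published implementation details, but the properties it describes, an append-only record that third parties can verify, are typically achieved with a hash chain, in which each log entry includes the hash of the entry before it. The following is a minimal sketch of that general idea rather than DeepMind’s design; the actor, action and purpose values are hypothetical.

```python
import hashlib
import json

class AuditLog:
    """Append-only log in which each entry commits to the hash of its
    predecessor, so editing any historical entry breaks every later hash."""

    def __init__(self):
        self.entries = []

    def append(self, actor, action, purpose):
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        record = {"actor": actor, "action": action,
                  "purpose": purpose, "prev_hash": prev_hash}
        record["hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()).hexdigest()
        self.entries.append(record)
        return record

    def verify(self):
        """Recompute every hash and confirm the chain links up: the check
        a third party could run to detect tampering."""
        prev_hash = "0" * 64
        for record in self.entries:
            body = {k: v for k, v in record.items() if k != "hash"}
            if record["prev_hash"] != prev_hash:
                return False
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if record["hash"] != expected:
                return False
            prev_hash = record["hash"]
        return True

# Hypothetical usage: a clinical app logs an access, then tampering is caught.
log = AuditLog()
log.append("streams-app", "read:blood-test-results", "acute kidney injury alert")
assert log.verify()
log.entries[0]["purpose"] = "rewritten"   # simulated tampering
assert not log.verify()
```

Because each entry’s hash covers the previous entry’s hash, changing any historical record invalidates every hash after it, which is what makes retrospective tampering detectable.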

The blog post identified three technical challenges in building the technology: ensuring the log has no blind spots, making it able to answer different groups’ needs, and keeping the log complete even though different systems store the data.

On the latter, Suleyman and Laurie said: “this doesn’t mean that a data processor like DeepMind should see data or audit logs from other systems”.

“Logs should remain decentralised, just like the data itself. Audit interoperability would simply provide additional reassurance that this data can’t be tampered with as it travels between systems.”
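The post does not say how this cross-system assurance would work. One plausible pattern, sketched here using the AuditLog class from the sketch above, is for each organisation to record only the opaque “head” hash of a peer’s log: no entries are shared, yet any rewrite of the peer’s history contradicts the recorded checkpoint. The checkpoint naming and parties are hypothetical.

```python
def chain_head(log):
    """Opaque commitment to the whole log so far: just the latest entry hash."""
    return log.entries[-1]["hash"] if log.entries else "0" * 64

def checkpoint_holds(committed_head, log):
    """A previously committed head must still appear in a chain that verifies.
    Appending new entries preserves it; rewriting history removes it."""
    return log.verify() and any(e["hash"] == committed_head
                                for e in log.entries)

trust_log, processor_log = AuditLog(), AuditLog()
processor_log.append("deepmind", "read:observations", "Streams alerting")

# The trust records only the head hash, never the processor's entries.
head = chain_head(processor_log)
trust_log.append("royal-free", "checkpoint:" + head, "audit interoperability")

processor_log.append("deepmind", "read:blood-test-results", "Streams alerting")
assert checkpoint_holds(head, processor_log)        # later appends are fine

processor_log.entries[0]["purpose"] = "rewritten"   # history rewrite is not
assert not checkpoint_holds(head, processor_log)
```

This keeps the logs decentralised, as the quote demands: the trust learns nothing about the processor’s entries beyond a single hash, but that hash is enough to catch tampering.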

Suleyman told Digital Health News that he saw the technology as a “step change in the way we store and process large data sets”.

Jim Killock, executive director of Open Rights Group, said in a statement that it “seems like a very interesting attempt to improve auditing of the way the data is stored, copied and used”.

DeepMind Health is also overseen by a panel of Independent Reviewers, who meet four times a year but have yet to publish their first annual report on the company.

George Danezis, professor of security and privacy engineering at UCL, said in a statement that “enhancing such audit logs with high-integrity cryptographic controls, inspired by blockchains, provides a higher level of assurance that mistakes or violations of policy will be found, and unauthorized parties cannot hide their trails”.

DeepMind is also working with University College London Hospitals NHS Foundation Trust on a research project into head and neck cancer, announced in September.

DeepMind is a London-based AI company that Google bought for £360 million in 2014.



1 Comment

  • For primary and first-use data extracts this is a good solution – it would help allay some of the concerns about big-data companies who want to use their technologies for the benefit of the NHS.

    I’d argue that it does create the potential for a closed system: if you carry this blockchain requirement through to all data-use transactions, then every kind of data manipulation and interrogation has to take place in an environment that does not create a secondary copy of the data to work with. Take Excel, for example: how many clinicians, when managing large cohorts of patients, rely on this ubiquitous tool to slice and dice the data to suit their needs?

    A layperson might assume from this initiative that patient data sets exist entirely within the managed realms of EMRs, EPRs and other formal applications, and that recording transactions in the manner described will therefore ensure that *all* patient records are so protected. That assumption needs to be challenged.

    The counter argument might be that yes, you *do* need to keep all such secondary data extracts and resulting analysis tools within the confines of the application ecosystem that has these protective measures in place. This is something I’d agree with, but I must offer a note of caution – my experience with clinicians is that they’re always coming up with new ways to ask questions and are (seemingly) never satisfied with the reporting and analysis tools available in the here and now. Can we be sure that whoever creates this wonderful new ecosystem is going to allow all who use it to play nicely?

    (Sorry for the wall of text – I’m a clinical information analyst, if you hadn’t guessed 🙂)
