Royal Free and Google DeepMind trial did not comply with DPA
- 3 July 2017
The Information Commissioner’s Office has found a high-profile trial between a London NHS trust and Google’s DeepMind artificial intelligence arm did not comply with the Data Protection Act.
In a keenly anticipated ruling, the ICO announced today that the Royal Free NHS Foundation Trust failed to comply with the Data Protection Act when it provided details on 1.6m patients to Google DeepMind.
The ICO investigation found several shortcomings in how the data was handled, including that patients were not adequately informed that their data would be used as part of the test.
The Information Commissioner observed “the price of innovation does not need to be the erosion of fundamental privacy rights”.
The information exchange saw personal data from about 1.6 million patients transferred to DeepMind to test an alerting system for acute kidney injury, named Streams. A media investigation questioned the scale of the patient data, its use and patients’ knowledge of how their data was being used.
A letter, sent to the Royal Free from the ICO Commissioner, Elizabeth Denham, said that the exchange “did not fully comply with the requirements of the Data Protection Act 1998” and listed a number of “shortcomings” with the data processing deal.
Denham said: “Our investigation found a number of shortcomings in the way patient records were shared for this trial. Patients would not have reasonably expected their information to have been used in this way, and the Trust could and should have been far more transparent with patients as to what was happening.”
Denham added: “The Royal Free did not have a valid basis for satisfying the common law duty of confidence and therefore the processing of that data breached that duty”.
The ICO investigation, begun last May, found that the means to inform patients about how their data was being used were not adequate, and that as a result patients could not opt out.
“The evidence presented to date leads the commissioner to conclude that data subjects were not adequately informed that the processing was taking place and that as result, the processing was neither fair nor transparent.”
“Put plainly, if the patients did not know that their information would be used in this way, they could not take steps to object.”
Royal Free has been asked to agree to a set of changes to allow the data sharing to continue.
In her letter, Denham said that the processing of patients records by DeepMind “significantly differs from what data subjects might reasonably have expected to happen to their data when presenting at the Royal Free for treatment”.
For example, a patient presenting at A&E would not expect their data to be accessible to a third party for the testing of a new mobile application.
The ICO also found the number of patient records shared to be “excessive”, unnecessary and disproportionate.
DeepMind welcomed the report, but said mistakes were made.
In a blog post on DeepMind’s website, Mustafa Suleyman, co-founder and head of applied AI and Dominic King, clinical lead, said “in our determination to achieve quick impact when this work started in 2015, we underestimated the complexity of the NHS and of the rules around patient data, as well as the potential fears about a well-known tech company working in health.”
“We were almost exclusively focused on building tools that nurses and doctors wanted, and thought of our work as technology for clinicians rather than something that needed to be accountable to and shaped by patients, the public and the NHS as a whole.”
“We got that wrong, and we need to do better.”
Denham said in a statement that: “There’s no doubt the huge potential that creative use of data could have on patient care and clinical improvements, but the price of innovation does not need to be the erosion of fundamental privacy rights.”
She said the ICO has asked the trust to commit to changes to address these concerns, and added that “the Data Protection Act is not a barrier to innovation, but it does need to be considered wherever people’s data is being used”.
To comply with data regulations the trust has been asked by the ICO to:
- Establish a “proper legal basis” under the Data Protection Act for the DeepMind project
- Set out how it will comply with its duty of confidence to patients in any future trial involving personal data
- Complete a privacy impact assessment
- Commission an audit of the trial, the results of which will be shared with the ICO
The shortcomings found by the ICO breached the following data protection principles:
- Principle One: Personal data shall be processed fairly and lawfully
- Principle Three: Personal data shall be adequate, relevant and not excessive
- Principle Six: Personal data shall be processed in accordance with the rights of data subjects
- Principle Seven: Appropriate technical and organisational controls shall be taken – this includes the need to ensure that appropriate contractual controls are in place when a data processor is used
The ICO investigation, concerns about how Streams handles patient consent, and disquiet about the large-scale transfer of patient records to a Google company, have not stopped DeepMind signing up other trusts.
Deals are now in place with University College London Hospitals NHS Foundation Trust, Moorfields Eye Hospital NHS Foundation Trust, Imperial College Healthcare NHS Trust, and Taunton and Somerset NHS Trust. Each of the trusts will likely want to review their approach to patient consent and data transfers in light of the ICO’s report on the Royal Free.
The Royal Free said in a statement: “We accept the ICO’s findings and have already made good progress to address the areas where they have concerns. For example, we are now doing much more to keep our patients informed about how their data is used.
“We would like to reassure patients that their information has been in our control at all times and has never been used for anything other than delivering patient care or ensuring their safety.”
In its response, DeepMind also said it is looking to address transparency concerns through its independent review panel and a transparency audit.
However, the first report from the independent reviewers has still not been published, with DeepMind saying it will be out “soon”.
Nicola Perrin, head of Understanding Patient Data, praised the ICO’s findings:
“Key lessons – the need for transparency, public engagement and proportionate use of data – must be learnt, so that everyone can have confidence that patient data is being used responsibly. It is good that both DeepMind and Royal Free have recognised that mistakes were made, and are now taking steps to address the concerns. The ICO ruling makes clear that data protection and innovation can work together for the benefit of patients.”
11 Comments
I’m afraid it’s clear from reading the Taunton DeepMind PIA that lessons have not been learned – the common law duty of confidentiality is not referenced in that document at all!
I understand that Google were offered the professional services of key individuals in the healthcare sector information governance field and did not even acknowledge their offer. Google has contempt for patient information and is not just unfamiliar with the health sector but woefully unaware of UK data protection laws.
Trusts should be very careful as data controllers about going along with a known company that claims that they have done this sort of thing before… which is what I come across time after time.
My comment was actually about the very different treatment meted out by the ICO to a Trust which followed all the correct procedures for safe destruction of hard drives, but was still held to account when there was a criminal misuse of the hard drives by a sub-contractor, and that afforded to Royal Free and DeepMind, where the disregard of the law – DPA – appears to have been deliberate.
But I do agree that the NHS /DH do appear to have an ingrained attitude that medical records are their property, regardless of any legal or ethical constraints.
I also agree that if patients were asked, the vast majority would give consent – especially if there was some clarity about the uses to which their records would be put, who would have access and how the data would be managed.
See Fair Shares for All – http://www.bcs.org/upload/pdf/fair-shares-for-all.pdf for the PHCSG thinking on this.
There does seem to be a change of approach to basic breaches of the DPA by the ICO recently.
Not long ago a Trust gave its old hard drives to a certified contractor for destruction – and they ended up on eBay. The Trust – as Data Controller – was given a massive fine.
The Royal Free knowingly breached the DPA – and is asked to give an undertaking to change its procedures for sharing patient records with DeepMind in future, with apparently no penalty and no requirement to retrieve the data already passed to DeepMind – or for DeepMind to delete that data.
Are we reaching a stage where “innovative” use of patient identifiable data is a defence in what seems to be either deliberate or wilfully incompetent interpretation of the DPA?
Leaking data on eBay, mixing up patient records and gifting 1.2 million records to Google is not innovative. It is illegal.
I’d say we are reaching a stage where it is becoming crystal clear that the state enforced healthcare system has a cavalier approach to data protection and a blasé attitude to the rights and privacy of the patients who have no choice but to use it.
We have already seen the fallout from care.data. This will soon spill out all over the NHS and then where will we be?
Where next? Repeal the data protection act and have done with it? Sorry you have to use this healthcare service and guess what, when you do you belong to us.
The sad thing is that if the NHS had the basic courtesy of talking to patients and obtaining consent, most people would share their data for the greater good. I guess the NHS just doesn’t care enough to bother.
The media have an agenda to feed you an opinion based on a few elements of data protection, but not all of them.
Does red tape get in the way of real advances in health care?
How many lives have been improved or saved as a result of DeepMind and their app being used within The Royal Free? Do people want to remove this and reverse the good which hasn’t been publicised?
Most people would agree it is good to share data and want to share it for the greater good, but only where there is trust. Data protection law aside a basic courtesy would be to ask.
Otherwise, we might as well just do away with the data protection act, GDPR, etc. and just say your health data belongs to the state and you have no say.
However, research will depend on way more than the trivial data locked up in healthcare records. We have to enable a much greater relationship with individuals to understand them as people, their phenotypic data, their genome, etc. if we are to truly understand aetiology of diseases. This will require a lot of trust.
With the NHS taking such a cavalier attitude to data sharing people will lose trust and that is what will do more damage than the short term gains made here.
And if nothing else, until they repeal data protection legislation for health data, they broke the law.
Tony – I agree with your points on this, and it makes me wonder if lessons have been learned, as the reassurance from The Royal Free comes across as being pretty defensive. Given the recommendations, would the NHS as a whole benefit from providing a data framework for these kinds of collaborations, rather than individual hospitals entering them without the required oversight? GDPR kicks in next year. C’est la vie.
It’s quite sad that a Trust attempting to do something innovative and creative has got its fingers burned in this way – although clearly they have to take responsibility for their flawed interpretation of the rules.
In my experience many Trusts have the opposite problem – a highly conservative approach to IG that sometimes stifles legitimate work.
So it would be sad but entirely understandable if this case makes Trusts even more reluctant to try anything new.
The Royal Free statement in response to the ICO says – “We would like to reassure patients that their information has been in our control at all times and has never been used for anything other than delivering patient care or ensuring their safety.” This is nonsense. Have they read the ICO’s findings?
After all that, it sounds like DeepMind still have access to the data – so it’s not clear what’s changed? Sorry DeepMind – “We got that wrong, and we need to do better.” is just not good enough. You know you were hedging your bets when you did this, and have come out of it relatively lightly. Everyone in this business knows what’s involved with consent and patient participation – you don’t need to look too far from care.data for that. Messrs Suleyman and King – looks like you got what you wanted, had your cake and ate it.
The problem with this and the NHS as a whole is that patients have a tendency not to complain. Hospitals, GPs, etc. can get away with most things because there is never really an outcry, it’s “our beloved NHS” after all and somehow it’s the government’s fault.
It’s not like you can shop around, you are just stuck with whatever state enforced healthcare service happens to be in your area and with the ICO, PHSO, etc. having few or no teeth, local NHS organisations are rarely held to account.