Earlier in 2016, DeepMind started a campaign to regain the public’s trust.
Shortcomings in the handling of patient data
The ICO concluded that not enough was done to inform patients that their information was being processed by DeepMind during the testing phase of the app.
There was also a lack of transparency about how patient information would be used to test the new app, which meant patients could not exercise their statutory right to object to the processing of their information.
Changes to be made
Following the ICO investigation, the Trust has been asked to:
- Establish a proper legal basis under the Data Protection Act for the Google DeepMind project and for any future trials;
- Set out how it will comply with its duty of confidence to patients in any future trial involving personal data;
- Complete a privacy impact assessment, including specific steps to ensure transparency; and
- Commission an audit of the trial, the results of which will be shared with the Information Commissioner, and which the Commissioner will have the right to publish as she sees appropriate.
The Information Commissioner has also published a blog looking at what other NHS Trusts can learn from this case. Among the lessons are that the reported shortcomings in the handling of patient data were avoidable, and that a privacy impact assessment should be an integral part of any innovation involving patient data.
Trust can continue using Streams app
The Trust has co-operated fully with the ICO’s investigation, which began in May 2016. ‘It is helpful to receive some guidance on the issue about how patient information can be processed to test new technology. We also welcome the decision of the Department of Health to publish updated guidance for the wider NHS in the near future.’
The Trust has accepted all of the ICO’s findings and has made good progress in addressing the areas of concern. ‘For example, we are now doing much more to keep our patients informed about how their data is used. We would like to reassure patients that their information has been in our control at all times and has never been used for anything other than delivering patient care or ensuring their safety.’
The Trust says it is committed to its partnership with DeepMind, which it entered into in November 2016 and which incorporated much of the learning from the early stages of the project. ‘We are determined to get this right to ensure that the NHS has the opportunity to benefit from the technology we all use in our everyday lives. We must embrace the opportunities which come from working with a world-leading company such as DeepMind to ensure the NHS does not get left behind.’
What the investigation was about
The investigation focused on the Royal Free London as the data controller. The ICO raised concerns about whether the Trust could have done more to inform patients that their information was being processed to test the safety of the Streams app, and about the amount of information that was processed.
Instant alert app
Each year, many thousands of people in UK hospitals die from conditions such as sepsis and acute kidney injury that could have been prevented, simply because warning signs aren't picked up and acted on in time. Streams integrates different types of data and test results from a range of existing IT systems used by the hospital.
Because patient information is contained in one place – on a mobile application – it reduces the administrative burden on staff and means they can dedicate more time to delivering direct patient care. It is currently being used by clinicians at the Royal Free London to help identify patients at risk of acute kidney injury. The app has been through a rigorous user testing process and has been registered with the Medicines and Healthcare products Regulatory Agency (MHRA) as a medical device.
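The article does not describe the detection logic itself, but as a rough illustration of the kind of rule an alerting app like this could apply to incoming test results, here is a minimal Python sketch. It assumes a simplified creatinine-to-baseline ratio check; the data structures, field names and threshold are illustrative assumptions, not the actual Streams implementation or the NHS algorithm.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class CreatinineResult:
    """A single serum creatinine test result pulled from a hospital IT system."""
    patient_id: str
    value_umol_l: float      # serum creatinine in micromol/L
    taken_at: datetime

def flag_possible_aki(current: CreatinineResult,
                      baseline_umol_l: Optional[float],
                      ratio_threshold: float = 1.5) -> bool:
    """Flag a result for clinical review if it has risen sharply above baseline.

    Illustrative rule only: real AKI detection in the NHS involves staging,
    absolute rises and time windows, none of which are modelled here.
    """
    if baseline_umol_l is None or baseline_umol_l <= 0:
        return False  # no usable baseline, so a relative rise cannot be assessed
    return current.value_umol_l / baseline_umol_l >= ratio_threshold

# Example: a result of 180 umol/L against a baseline of 100 umol/L is flagged.
result = CreatinineResult("patient-123", 180.0, datetime(2017, 7, 3, 9, 30))
print(flag_possible_aki(result, baseline_umol_l=100.0))  # True
```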
DeepMind draws its own conclusions
The ICO’s undertaking also recognised that the Royal Free has stayed in control of all patient data, with DeepMind confined to the role of “data processor” and acting on the Trust’s instructions throughout. No issues have been raised about the safety or security of the data.
Although the findings are about the Royal Free, DeepMind believes it needs to reflect on its own actions too. 'In our determination to achieve quick impact when this work started in 2015, we underestimated the complexity of the NHS and of the rules around patient data, as well as the potential fears about a well-known tech company working in health. We were almost exclusively focused on building tools that nurses and doctors wanted, and thought of our work as technology for clinicians rather than something that needed to be accountable to and shaped by patients, the public and the NHS as a whole. We got that wrong, and we need to do better.'
Improvements to DeepMind's transparency
It was a mistake not to publicise the work when it first began in 2015, so DeepMind has proactively announced and published the contracts for its subsequent NHS partnerships.
In the initial rush to collaborate with nurses and doctors to create products that addressed clinical need, not enough was done to make patients and the public aware of the work or to invite them to challenge and shape priorities. 'Since then we have worked with patient experts, devised a patient and public engagement strategy, and held our first big open event in September 2016 with many more to come.'