DeepMind, the Royal Free and the ruling
One month ago, the UK Information Commissioner’s Office ruled that the patient data sharing agreement between the Royal Free London NHS Foundation Trust and Google DeepMind was non-compliant with data protection law in a number of areas.
What can we learn from this high-profile case?
- The Commissioner identified four key actions for future eHealth projects
- DeepMind identified areas for their own improvement
- The Royal Free London NHS Foundation Trust has committed to a number of undertakings required by the Information Commissioner’s Office.
Dare we let the dust settle?
Myhealthapps continues to receive feedback from patient groups (and the patients they represent) about privacy risks to patients, whether from the terms and conditions people accept when signing up to a health app, or from their data becoming a commodity in the big data market.
The UK Information Commissioner’s ruling last month against Google DeepMind and the Royal Free London NHS Foundation Trust was a clear shot across the bows. Although enforcement action is rare, it signals what can happen when any eHealth project is not compliant with data protection law.
Occasionally, we still meet members of the development community who dismiss patient fears around privacy: “Oh, they don’t care about that.” In a sense they are right. Many people are completely unaware of the risk. Informed consent is not really taking place, because individual patients may have little idea of what their rights are and what is at stake.
It’s undeniable that protection of patient data is a hugely complex area, and even Google DeepMind, responding to the ruling, said:
“…we underestimated the complexity of the NHS and of the rules around patient data, as well as the potential fears about a well-known tech company working in health. We were almost exclusively focused on building tools that nurses and doctors wanted, and thought of our work as technology for clinicians rather than something that needed to be accountable to and shaped by patients, the public and the NHS as a whole. We got that wrong, and we need to do better.”
– DeepMind, “What we’ve learned”
Four key lessons
For her part, reflecting on the issues raised by the case, Elizabeth Denham the Information Commissioner, identified four key lessons for the healthcare sector:
“1: It’s not a choice between privacy or innovation
…The price of innovation didn’t need to be the erosion of legally ensured, fundamental privacy rights.”
2: Don’t dive in too quickly
…Carry out your privacy impact assessment as soon as practicable, as part of your planning for a new innovation or trial.
3: New cloud processing technologies mean that you can, not that you always should
…consider whether the benefits are likely to be outweighed by the data protection implications for your patients.
4: Know the law and follow it
…obtain expert data protection advice as early as possible in the process.”
– Elizabeth Denham, UK Information Commissioner
What impact did the case have on DeepMind?
In their statement reacting to the ruling, DeepMind identified a number of areas for their own improvement.
One of the key ones, which we reported on recently, suggests a possible approach for engaging patients transparently throughout eHealth projects:
“In our initial rush to collaborate with nurses and doctors to create products that addressed clinical need, we didn’t do enough to make patients and the public aware of our work or invite them to challenge and shape our priorities. Since then we have worked with patient experts, devised a patient and public engagement strategy, and held our first big open event.”
Data protection, what happens next?
This specific case will not simply go away; reviews will be published from all sides. DeepMind has already published the first annual report of its independent review panel, in July.
Although the independent panel does not yet have any obvious patient group members, its report does recommend a strategy for patient and public involvement, written by Rosamund Snow.
Rosamund campaigns for patients’ involvement in healthcare, and her recommendations to DeepMind were underpinned by the principle that “at every level where clinicians have influence, ensure patients do too.”
So, for example, she recommends:
“Appoint a patient lead with the same level of influence as the clinical lead.
Resource an entirely patient-led AI project.
Appoint Patient Advisors alongside the clinical advisory team…ensuring the scope is not just defined by doctors and nurses.
Create a patient panel to hold DeepMind to account from the patient point of view.
Develop patient testing groups equivalent to clinical testing groups.”
– Rosamund Snow, patient campaigner.
Looking back on the case, the optimistic view is that the enforcement action may trigger some healthy changes. As Europe enters a period of uncertainty around the impact of the General Data Protection Regulation, seeing some best practice from the industry would be a source of hope.