This article was first published in ICLG – Digital Health


In the midst of the lingering COVID-19 pandemic, governments and private businesses across the United Kingdom (UK) and Europe alike have embarked upon unprecedented data-driven digital innovation and transformation initiatives, regulated, and at times challenged, by ever-evolving data protection and security rules. 

During the last two years of the pandemic, swift and unprecedented developments in health technology in both the public and the private sector have enabled the UK to rise to the urgency of unexpected healthcare demands.  Contact tracing apps that could quickly be made accessible to the public to shore up defences against COVID-19 and protect healthcare infrastructure; accelerated research and development across multiple organisations and jurisdictions, and expedited clinical trials, in the development and roll-out of vaccinations; the growth of online healthcare providers where premises were closed or inaccessible to patients; and the increase in sales of smart healthcare devices allowing patients greater involvement in and control over their own health and health data have all been benefits arising out of this crisis. 

However, to gain user support for these advances, developers must keep an eye on their data protection obligations under the UK General Data Protection Regulation (UK GDPR): ensuring that they provide adequate processing information to users, have a lawful basis for their processing, make full use of data protection impact assessments so they can consider the risks inherent in their products, and embed data protection and security at the design stage and by default.  As highlighted by the Information Commissioner’s Office (ICO), the effectiveness of data-driven technology relies in part on public trust, and transparency is essential to developing and maintaining that trust. [1]

Data Sharing in Healthcare: Contact Tracing

The NHS Covid-19 App

As the COVID-19 outbreak has prompted a wide range of responses from governments around the world, contact tracing apps have emerged as a double-edged digital weapon: a containment measure on one side and a privacy challenge on the other.  In the UK, the NHS COVID-19 app (the official contact tracing app for England and Wales, and a vital part of the NHS Test and Trace service in England and the NHS Wales Test, Trace, Protect service) has been fraught with privacy concerns since its launch.  Clearly, the access to and use of the health and location data of millions of individuals by the NHS, over a sustained period since the start of the COVID-19 outbreak, has raised legitimate concerns of equally unprecedented mass surveillance of society at large.

Contact tracing has become a key tool in the battle against COVID-19, alerting people that they have come into contact with someone who has tested positive for the virus, checking symptoms, booking or ordering tests, and counting isolation days, all in an attempt to slow the spread of the virus.  The hasty development of the NHS COVID-19 app, and the initial absence of the data protection impact assessment (DPIA), released late in August 2020 by the UK Department of Health and Social Care (DHSC) and criticised for lacking transparency, both undermined public trust and negatively influenced perceptions of the app’s efficacy.  Since then, lessons have been learned, and the DHSC has regularly updated the publicly available DPIA as new functionalities have been added to the app.  In particular, the use of the app’s QR scanner to check into places like restaurants, pubs and venues in the tourism and hospitality sector, but also into close-contact businesses such as barbers, tailors or beauticians, raised serious privacy concerns that information about staff, customers and visitors, which constitutes personal data under the UK GDPR, might not be stored or used for contact tracing purposes only, creating another occurrence of mission creep.  Whilst collecting such information was a formal legal requirement for some venues prior to 19 July 2021, businesses may still be able to continue data collection by relying on legitimate interests as the legal basis for the processing. 

ICO Guidance

On 4 May 2020, the ICO released its guidance “COVID-19 contact tracing: data protection expectations on app development”, which confirmed the paramount importance for developers of contact tracing apps of performing a DPIA prior to implementation, given that the processing is likely to result in a high risk to the rights and freedoms of individuals. [2]  Further, DPIAs should be continuously reviewed and updated while the contact tracing technology is in use. 

In addition, on 2 July 2020, the ICO published its guidance on “Maintaining records of staff, customers and visitors for contact tracing purposes” and made clear that the information collected could not be used for direct marketing or other business purposes.  Yet in October 2020, the ICO launched an investigation into a number of digital contact tracing service providers to assess their data protection practices, including direct marketing, as concerns emerged that information collected via QR codes was being unlawfully shared with, and sold to, marketers, credit companies and insurance brokers.

ICO Enforcement

On 18 May 2021, the ICO announced that it had issued a monetary penalty notice to, and imposed a fine of £8,000 on, Tested.me Ltd, a contact tracing QR code provider, following various complaints from individuals about its sending of nearly 84,000 direct marketing emails without valid consent, in violation of Regulation 22 of the Privacy and Electronic Communications Regulations 2003 (PECR). [3]

The ICO took the opportunity to remind providers of its guidelines including: 

  • Incorporating a data protection by design approach for the development of new products from the start.
  • Ensuring that privacy policies remain clear and simple so as to be easily understood. 
  • Not retaining data collected for more than 21 days.
  • Not using the data collected for marketing or any other business purpose.
  • Complying with the ICO’s latest online guidance.

Data Sharing in Healthcare: The UK NHS Digital GPDPR Programme  

Genesis Of The GPDPR

NHS Digital is the national custodian for health and care data in England and has responsibility for standardising, collecting, analysing, publishing and sharing data and information from across the health and social care system, including general practice.  In April 2021, the Secretary of State for Health and Social Care issued a Direction under the Health and Social Care Act 2012 requiring NHS Digital to establish and operate an information system for the collection and analysis of general practice data for health and social care purposes.

To date, NHS Digital has collected patient data from general practices using a service called the General Practice Extraction Service (GPES).  On 12 May 2021, NHS Digital issued a Data Provision Notice to GPs informing them that GPES would be replaced by a brand new scheme, the General Practice Data for Planning and Research (GPDPR) programme, from 1 July 2021, with the aim of collecting pseudonymised GP data daily to support vital health and care planning and research.  

During the pandemic, NHS Digital had been legally permitted to collect and analyse healthcare information about patients to enable the identification of those most vulnerable to COVID-19, the roll-out of vaccines, and critical COVID-19 research.  In practice, the data to be collected will not include patients’ names and addresses, but could include a patient’s NHS number, date of birth and full postcode, as well as information about mental health, domestic violence, treatments and addictions.

However, the principal difference between the GPES and GPDPR programmes will not be the technology but the fact that, post-pandemic, the primary care data extracted through the GPDPR by NHS Digital is to be made available generally to third parties outside the NHS for research and planning.  The GPDPR involves a broader, general-purpose collection that would, through enhanced technology, enable faster access.  The intention was that NHS Digital would pseudonymise the data before sharing it, and that such data could only be converted back to identifiable form in certain circumstances and where there is a legal reason to do so.  Importantly, patients were entitled to opt out of the collection process completely, or only out of NHS Digital sharing their personal data onwards, but were given a very short window of time to decide and very limited public information.  In response to growing concerns that not enough time had been given to inform people about the service, its purposes and patients’ right to opt out, and that patient trust could be destroyed, the implementation date for the programme was moved from 1 July to 1 September 2021 to allow more time to speak with patients, doctors and health charities about the plans.  It has since been postponed further, until 31 March 2022. 

This latest initiative bore resemblance to the previous ill-fated “care.data” initiative, which also sought to share pseudonymised patient data collected by GPES with third parties, including commercial organisations outside the NHS, for research.  That programme was shut down in 2016 following criticism of its failure to adequately inform patients of the programme and of their right to opt out of collection.  Unfortunately, lessons do not appear to have been learned in the intervening years: concerns were raised that patients had again been inadequately informed about the collection and sharing of their data, in breach of the UK GDPR core principle that processing should be lawful, fair and transparent.  The majority of communications had been published online rather than sent to patients directly, and it was unclear how many patients had been made aware of the programme through their GP surgeries.  Additional concerns centred on the measures being taken to secure patient data and on the requirement for patients to opt out of the scheme rather than actively opt in. 

Data Protection Concerns
Lack of transparency

A key concern with the proposed GPDPR was the lack of transparency in communicating to the public and GPs how the data extraction and sharing was to work.  This was highlighted by the British Medical Association and the Royal College of General Practitioners, and reiterated in a statement from Elizabeth Denham, the then UK Information Commissioner. [4]

Although NHS Digital has said that the Department of Health and Social Care and its executive agencies, NHS England, local authorities and research organisations may need to access the data, the limits on the range of other organisations which may seek access are unclear.  “Appropriate requests” from organisations wishing to access the data will be scrutinised by NHS Digital’s Independent Group Advising on the Release of Data, and decisions will be published on NHS Digital’s publicly available Data Release Register.  However, major concerns remain around access to data within the programme by “big tech” organisations, which are likely to see significant commercial benefits in accessing the highly sensitive information held on the database.  Access to such data for research and social care purposes does not exclude data monetisation opportunities for third parties, yet the data sharing restrictions seem weak in the face of the broad definitions of “health and care planning and research” purposes and of security parameters that do not detail how the risks of re-identification of pseudonymised data will be addressed. 

It is now intended that NHS Digital will develop an engagement and communications campaign so that patients can be made aware of the scheme and be in a better position to make informed choices.  A DPIA reflecting the changes to the programme, and demonstrating how all risks and mitigation measures have been considered and addressed, will also be published before data collection commences.  This should help to answer many of the concerns and should go a long way towards making the scheme more transparent to all. [5]

Data Security

Whilst NHS Digital has said that it will be using a secure system to collect and store the data, there is little information about what security measures will be in place.

Data collected as part of the scheme will be pseudonymised when it is collected from GPs.  The UK GDPR defines pseudonymisation as the processing of personal data in such a way that it can no longer be attributed to the data subject without the use of additional information. [6]  This involves replacing personal data with pseudonyms which can only be re-identified using additional information, known as a key, which must be kept separate from the pseudonymised data.  NHS Digital has said that any data which could be used to identify someone directly will be replaced with unique codes and then also securely encrypted. [7]
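
By way of illustration only, the keyed approach described above might look something like the following Python sketch.  NHS Digital’s actual mechanism has not been published, so the key handling, function name and dummy NHS number below are assumptions made for the purpose of the example.

```python
# Illustrative sketch of keyed pseudonymisation; not NHS Digital's actual
# implementation. The key must be stored separately from the dataset.
import hmac
import hashlib

SECRET_KEY = b"held-separately-from-the-dataset"  # hypothetical key material


def pseudonymise(nhs_number: str) -> str:
    """Replace a direct identifier with a deterministic pseudonym.

    Using a keyed HMAC (rather than a plain hash) means the code cannot be
    recomputed by anyone who does not hold the key, while the same patient
    still receives the same code in each daily extract, so records remain
    linkable for research purposes.
    """
    return hmac.new(SECRET_KEY, nhs_number.encode(), hashlib.sha256).hexdigest()


record = {"nhs_number": "9434765919", "condition": "hypertension"}  # dummy data
shared = {"patient_ref": pseudonymise(record["nhs_number"]),
          "condition": record["condition"]}
print(shared)  # contains no direct identifier
```

Because only a holder of the separately stored key can re-derive a patient’s code and match it back to a known NHS number, keeping that key apart from the pseudonymised dataset is central to the safeguard described above.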

In particular, there are concerns that NHS Digital itself could re-identify the data using other data it already holds in its existing Personal Demographics Service, which contains patients’ names, addresses, dates of birth and NHS numbers.  Despite NHS Digital stating that the data collected would not be sold or used solely for commercial purposes, there are concerns that if big tech platforms such as Google, Amazon or Apple, private health providers or insurers are able to gain access to patient data through the scheme, they may be able to use it alongside other data they hold to identify patients and exploit the data for monetary gain, at the expense of the NHS. 

Again, the DPIA should provide further assurance as to what risks have been identified and how NHS Digital plans to deal with those to secure patient data. 

Opting out

Under the existing framework, if patients did not want their data to be shared with NHS Digital, they were required to actively opt out rather than opt in. 

Patients can opt out of their data being shared under the GPDPR by registering a Type 1 Opt-out directly with their GP surgery, a National Data Opt-out, or both.  A Type 1 Opt-out prevents the uploading and extraction of a patient’s data altogether, whereas the National Data Opt-out only limits the ways in which NHS Digital will be allowed to use confidential patient information for research and planning.
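
The practical difference between the two opt-outs can be pictured as gates at different points in the data flow.  The sketch below is purely conceptual; the names are illustrative and do not correspond to any NHS system.

```python
# Conceptual model of where each opt-out takes effect; names are illustrative.
from dataclasses import dataclass


@dataclass
class PatientPreferences:
    type1_opt_out: bool           # registered directly with the GP surgery
    national_data_opt_out: bool   # registered through the national service


def extracted_from_gp_record(prefs: PatientPreferences) -> bool:
    # A Type 1 Opt-out stops the data from being uploaded at all.
    return not prefs.type1_opt_out


def shared_for_planning_and_research(prefs: PatientPreferences) -> bool:
    # The National Data Opt-out does not block extraction, but limits how
    # confidential patient information may be used once collected.
    return extracted_from_gp_record(prefs) and not prefs.national_data_opt_out
```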

If patients did not opt out, their data would be automatically shared with NHS Digital when the programme went live.  There was a concern that many people, especially those members of society without access to the internet, may not have been able to take advantage of the opt-out. 

Additionally, there was a concern that although patients could opt out after the programme had commenced, this would only prevent further data from being collected.  It would not oblige NHS Digital to delete any data already collected, which by then might have been shared with multiple third parties. 

The requirement now published, that NHS Digital must ensure an individual’s data can be erased once they have requested to opt out of the scheme, should help to alleviate these concerns to some extent.  However, given that the scheme will remain subject to patient opt-out rather than opt-in, the requirements for valid consent under the UK GDPR will not be met, and NHS Digital must therefore rely on alternative bases for the lawful collection and sharing of data. 

Lawful basis

NHS Digital will only be allowed to collect and share patient data if there is an applicable lawful basis for processing as set out in the UK GDPR.  Given that, as the scheme is structured, patient consent is not an appropriate basis, NHS Digital must rely on other bases under Articles 6(1) and 9(2) of the UK GDPR. 

Fortunately, valid grounds were provided in legislation: the Health and Social Care Act 2012 (the Act) contains provisions allowing the Secretary of State for Health and Social Care (the Secretary of State) to issue directions instructing NHS Digital to collect and analyse data to help the health service.  On 6 April 2021, the Secretary of State issued the General Practice Data for Planning and Research Directions 2021 (the Directions) to NHS Digital, authorising it to collect and analyse pseudonymised data from GP practices.  Following receipt of the Directions, NHS Digital sent a Data Provision Notice to GP practices, which were then legally required to share patient data with NHS Digital on the basis of the Directions.  This notice has subsequently been withdrawn, but it is likely to be replaced once the necessary conditions to restart the scheme have been satisfied, as outlined below.

Once a new Data Provision Notice has been issued, GPs will be able to rely on Article 6(1)(c) of the UK GDPR as the lawful basis for sharing patient data with NHS Digital, as they will have a legal obligation under the Act, the Directions and the Notice to share the relevant patient data. [8] 

NHS Digital will also rely on this basis to collect, analyse, publish and share patient data. 

The UK GDPR also states that when special categories of personal data (which include health data) are being shared, one of the specified conditions in Article 9 of the UK GDPR must also be satisfied. [9]

NHS Digital has stated that the following Article 9 conditions will be relied on:

i. Article 9(2)(g): the sharing of patient data for reasons of substantial public interest, being the processing of patient data for planning and research purposes to improve health and care services.

ii. Article 9(2)(h): the sharing of patient data for the purposes of providing care and managing health and social care systems and services. 

iii. Article 9(2)(i): necessary sharing for reasons of public interest in the area of public health. 

iv. Article 9(2)(j): sharing for archiving, research or statistical purposes. [10]

Next steps 

Following the programme’s announcement on 12 May 2021, the go-live date for the GPDPR data extraction was originally set at 1 July 2021.  However, in view of the widespread concerns raised, the programme has been paused until 31 March 2022, pending satisfaction of a number of conditions, the most important in terms of data privacy being that: 

  • patient awareness of the scheme must be increased through a campaign of engagement and communication; and
  • patients must be able to have their personal data deleted if they choose to opt out of sharing it with NHS Digital, even after data has been uploaded.

Further communications from NHS Digital since the scheme was first announced have helped to clarify and address some of the concerns raised, but more needs to be done to ensure that the scheme is launched in compliance with data protection laws. 

Wearables / Medical Devices 

Wearable technology, also known as “wearables”, is evolving into an important category of the Internet of Things, supported by the growth of mobile networks, high-speed data transfer and miniaturised microprocessors, enabling life-changing applications in medicine and other fields.  During the pandemic, there has been a rise in the use of wearables, with more people taking fitness and the monitoring of their health and wellbeing into their own hands.

Transparency

While wearables are great tools for monitoring health and general wellbeing, such devices continuously collect and store masses of personal data, including health information, which, if not processed compliantly, can put data subjects at risk.  Under the UK GDPR, health data falls under “special category data”.  In order to lawfully process special category data, a lawful basis under Article 6 and a specific condition under Article 9 of the UK GDPR must be identified.  Any processing must also be fair and transparent.

Developers and owners of such devices need to ensure continued compliance with their data protection obligations, including ensuring that users are fully informed of what data is collected and how it is to be used and shared.  Users, for their part, should take the time to understand what is happening to their personal data by reading the privacy information provided. 

Lawful grounds for processing

Where the personal data of an average user is to be uploaded and processed through the wearable device’s app, the legal basis for processing any special category data is likely to be explicit consent.  Consent may also be required to process user personal data not deemed sensitive.  

However, the processing of other personal information, such as GPS location or contact details, may be justified on the grounds of contract or legitimate interests.  The provider’s privacy notice will need to be accessible to the individual at the time the data is collected, and will need to include an explanation of the data collected and the lawful grounds for processing.  For consent to be valid under the UK GDPR, it must be freely given, specific to the use for which it is collected, and clear and unambiguous.  The provider will need to ensure not only that users can easily withdraw consent, but also that, once consent is withdrawn, their personal data is not further processed.  If explicit consent is required in relation to the processing of special category data, it must be provided separately in a clear and specific statement, and cannot be inferred from an individual’s conduct.
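
As a rough sketch of how a wearable provider might implement this in practice, consent could be recorded per purpose, with explicit consent flagged and withdrawal timestamped so that processing under that basis stops from that point.  The structure and field names below are illustrative assumptions, not a prescribed design.

```python
# Illustrative consent record for a wearable app; names are assumptions.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ConsentRecord:
    user_id: str
    purpose: str                  # specific, e.g. "heart-rate trend analysis"
    explicit: bool                # special category data needs a clear statement
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None

    def is_active(self) -> bool:
        return self.withdrawn_at is None

    def withdraw(self) -> None:
        # Withdrawing must be as easy as consenting; processing under this
        # basis should cease once this timestamp is set.
        self.withdrawn_at = datetime.now(timezone.utc)


consent = ConsentRecord("user-123", "heart-rate trend analysis",
                        explicit=True, granted_at=datetime.now(timezone.utc))
consent.withdraw()
assert not consent.is_active()
```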

The position is more complicated, however, where a sports club or coach is looking to use a smart wearable device to analyse performance data of its professional athletes, including where this is built into a smart kit that the athlete is required to wear.  Where this is done in the context of an employment relationship, consent is unlikely to be the appropriate basis as it is unlikely to be deemed “freely given” by the athlete.  Other lawful bases that may be available include those under Article 9(2)(b) or 9(2)(h) UK GDPR but, in the latter case, would require any processing to be carried out by or under the supervision of any appropriate health professional.

Privacy and Security by Design

The UK GDPR requires health tech companies to implement – from the outset – appropriate technical and organisational measures to ensure the protection of individual rights.  This approach means that developers will need to incorporate UK GDPR-compliant processes at every step of the way, from the design phase of a system, service or product throughout its entire life cycle.  All of the UK GDPR principles will apply to that effect, as summarised below and illustrated in the short sketch that follows the list:

  • data minimisation means that the app may only collect and process the minimum amount of data necessary to achieve the specific purpose (which must be clearly set out);
  • data security means that personal data must be processed in a manner that ensures appropriate security of the personal data;
  • purpose limitation on the use of data means that personal data may only be processed by the app for the purpose for which the personal data was collected;
  • data retention means that personal data should not be stored for longer than necessary; and
  • accuracy of data means that personal data must be kept up to date.
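
As a minimal sketch of how such defaults might look in code, the example below stores only a contact reference rather than a full identity, and purges records once a fixed retention window has passed.  The schema and the 21-day window (echoing the ICO’s contact tracing guidance above) are illustrative choices to be set per the DPIA, not mandated values.

```python
# Minimal sketch of data minimisation and storage limitation by default;
# field names and the retention window are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=21)  # purpose-driven retention limit, set per DPIA


@dataclass
class CheckIn:
    venue_id: str
    contact_ref: str     # minimisation: a reference, not a name and address
    timestamp: datetime  # assumed to be timezone-aware UTC


def purge_expired(records: list[CheckIn]) -> list[CheckIn]:
    """Storage limitation by default: drop records past the retention window."""
    cutoff = datetime.now(timezone.utc) - RETENTION
    return [r for r in records if r.timestamp >= cutoff]
```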

As the app provider must be able to demonstrate compliance with the principles set out in the UK GDPR, conducting a DPIA will assist with this objective and remains best practice in any case when it comes to health and wellness apps, where there is a likelihood of high risk to the individual.  A DPIA will be able to assess three main considerations: (1) the definition of the nature and scope of the data collected; (2) the determination of the necessity and compliance measures required; and (3) the identification of the risks to individuals, together with the appropriate measures to mitigate those risks. 

Looking Forward: Data Protection Reforms 

The ICO and the UK Government’s Department for Digital, Culture, Media and Sport (DCMS) have launched a number of consultations impacting data protection and data security rules applicable to digital health. 

ICO Consultation on Data Transfers

The ICO launched a consultation on 11 August 2021 on “how organisations can continue to protect people’s personal data when it’s transferred outside of the UK”.  The consultation comprises a three-part suite of data transfer proposals and options:

  • Proposals and plans for updates to guidance on international transfers.
  • Transfer risk assessments.
  • The international data transfer agreement. 

At this time, the outcome of the consultation has not yet been published, but it will have a significant impact on the digital health sector, as the proposals aim to facilitate the flow of data to non-adequate jurisdictions while maintaining high standards of protection for people’s personal information when it is transferred outside of the UK.

DCMS Consultation “Data: A New Direction”

On 10 September 2021, the UK DCMS launched a consultation, “Data: A New Direction”, outlining its proposals to extensively reform the UK’s data protection and privacy regime following the UK’s departure from the EU.  A year after the publication of the National Data Strategy, [11] the Consultation further explores the potential for new data rules to better support the digital economy, establish a pro-growth and innovation-friendly regime across the UK, increase trade and improve healthcare, while maintaining high data protection standards.  The objectives set out in this paper are consistent with the proposals put forward by the UK government at the G7 roundtable of Data Protection and Privacy Authorities, [12] held on 7 and 8 September 2021, in relation to the design of artificial intelligence in line with data protection.

The Consultation sets out the areas of focus for data protection reform in five chapters:

  • Chapter 1 – Reducing barriers to responsible innovation.
  • Chapter 2 – Reducing burdens on businesses and delivering better outcomes for people.
  • Chapter 3 – Boosting trade and reducing barriers to data flows.
  • Chapter 4 – Delivering better public services.
  • Chapter 5 – Reform of the Information Commissioner’s Office.

In particular, the consultation outlines several proposals for amendments to the research provisions within the existing data protection framework, with a view to ensuring legal certainty and reducing complexity for organisations that process personal data for research.  It notably proposes to establish a statutory definition of “scientific research” and to create a new, separate lawful ground for research.  It aims to simplify the rules on using (and re-using) data for research. 

The consultation also aims to address the complexity of the governance rules applicable to AI and machine learning.  Unlocking the power of data is one of the government’s top 10 technology priorities.  The National AI Strategy, published on 22 September 2021, underscores the importance of this consultation and the role of data protection in broader AI governance.  The consultation asks for views on whether organisations should be permitted “to use personal data more freely, subject to appropriate safeguards, for the purpose of training and testing AI responsibly”. 

A source of much debate is the proposal that Article 22 of the UK GDPR, which provides that individuals must not be subject to solely automated decisions that produce legal or similarly significant effects without human intervention, should be removed, thereby permitting solely automated decision-making.  

The government supports the use of “data intermediaries”, which may well be the role for which many health-tech providers qualify, and aims to champion data intermediary activities such as data sharing, processing and pooling to “ensure responsible and trusted data use”.

Finally, obstacles to international data flows were identified as a major impediment to international data sharing and research efforts during the pandemic.  The DCMS proposal includes plans to agree a series of post-Brexit “data adequacy” partnerships with the United States, Australia, the Republic of Korea, Singapore, the Dubai International Financial Centre and Colombia.  The UK will also prioritise future partnerships with India, Brazil, Kenya and Indonesia.  “Data adequacy” partnerships are formed with countries deemed to have high data protection standards, and mean that organisations do not have to implement additional compliance measures to share personal data internationally.  A Mission Statement on the UK’s approach to international data transfers and the “UK Adequacy Manual” were also published on the same day as the consultation.

Conclusion

The ongoing nature of the COVID-19 pandemic means that the use of data for healthcare purposes such as contact tracing, medical research and public policy development is likely to continue.  The use of data will continue to play a vital role in assisting and shaping the response to the challenges that the virus poses.  To facilitate this use of data, individuals must have trust and confidence that their data will be processed securely, fairly and transparently, in accordance with data protection rules, and developers, healthcare bodies and government must work to embed data protection in their innovation processes at all stages.

Acknowledgment

The authors are grateful for the contributions made to this article by Gareth Bell, trainee solicitor within the Data/Commercial Team.

Endnotes

[1]  Blog: Regulating Through a Pandemic: J. Dipple-Johnstone, Deputy Commissioner and Chief Regulatory Officer, ICO, 27 July 2021.

[2]  ICO, COVID-19 Contact tracing: data protection expectations on app development, 4 May 2020.

[3]  ICO, ICO takes action against contact tracing QR code provider, 18 May 2021.

[4]  ICO, Elizabeth Denham Statement on Delay to the Launch of GPDPR, 8 June 2021.

[5]  Letter from Jo Churchill MP, 19 July 2021.

[6] Article 4(5) UK GDPR.

[7] NHS Digital, Collecting GP Data – advice for the public, 24 August 2021.

[8] Article 6(1)(c) UK GDPR.

[9] Article 9 UK GDPR.

[10] NHS Digital, General Practice Data for Planning and Research: GP Practice Privacy Notice, 24 August 2021.

[11] https://www.gov.uk/government/publications/uk-national-data-strategy/national-data-strategy.

[12] https://ico.org.uk/media/about-the-ico/documents/4018242/g7-attachment-202109.pdf.

Key Contacts

Dr. Nathalie Moreno

Partner, Commercial and Data Protection
London
