UK Data Use and Access Bill

Background to the UK Data Use and Access Bill

The new Labour Government introduced the Data Use and Access Bill in the House of Lords in October 2024. It is currently progressing through the House of Commons at the committee stage, before moving forward for final consideration and Royal Assent.

For UK organisations, the proposed changes may signal a shift in data management policies, requiring adjustments to compliance frameworks and operational processes.

What are the Key Proposed Changes under the UK Data Use and Access Bill?

The UK Data Use and Access (DUA) Bill introduces nuanced adjustments to the current regime rather than completely overhauling it. That said, certain changes could affect data handling, sharing, and compliance obligations for UK businesses.

1. Introduction of Recognised Legitimate Interests

The Bill introduces ‘Recognised Legitimate Interests’ as a new legal basis for data processing. It allows certain security-related activities, such as fraud prevention, public safety, and national security, to be treated as legitimate interests by default, potentially without requiring a Legitimate Interests Assessment (LIA). It is important to note, however, that while the Bill simplifies the process for organisations relying on legitimate interests, it does not eliminate the need for an LIA in all cases.

Currently, this new legal basis explicitly applies to private organisations and does not appear to extend to public authorities, potentially excluding NHS organisations. Further clarification is needed regarding the impact on NHS-held health data.

Additionally, the Bill makes it clearer that processing for direct marketing, intra-group administrative purposes, and ensuring network security may qualify as a legitimate interest, providing greater certainty for organisations engaged in these activities.

It is important to stress that the Bill still requires organisations to assess whether an individual’s rights and interests override their business interests when relying on legitimate interests for marketing - a process known as the balancing test. Data controllers must therefore evaluate the impact on individuals before using legitimate interests as a legal basis, ensuring that the processing does not override fundamental rights and freedoms.

Additionally, the Bill does not override the existing Privacy and Electronic Communications Regulations (PECR), which still require consent for certain marketing channels, such as email and SMS marketing. The rules for general commercial direct marketing remain unchanged, meaning that in many cases explicit consent will still be required under PECR.

2. Changes to Data Subject Access Requests (DSARs)

Processing Data Subject Access Requests (DSARs) can be costly and time-consuming due to the large volume of data typically involved. Under the current UK GDPR framework, organisations must respond to DSARs without undue delay and within one calendar month of receipt. The DUA Bill retains this timeframe but introduces a “reasonable and proportionate” search standard for responses.

The Bill clarifies that organisations are required to conduct “reasonable and proportionate” searches when responding to DSARs. This means that while organisations must make genuine efforts to locate and provide the requested personal data, they are not obligated to conduct exhaustive searches that would impose an excessive burden. This clarification aligns with the guidance of the ICO, which states that organisations should perform a reasonable search for the requested information.

Additionally, the Bill allows organisations to pause the response period in certain circumstances, such as:

  • When verifying the identity of the data subject.
  • When requesting additional information necessary to process the request.
  • When dealing with complex requests or multiple requests from the same individual.

Once the necessary information is provided, the response timeframe resumes. However, organisations must notify the individual of the delay and provide reasons for the extension within the original one-month period.
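For teams that track DSAR deadlines internally, the interaction between the one-month deadline and the pause (“clock-stop”) can be illustrated with a short sketch. This is an illustrative calculation only, not legal guidance: the function name, the month arithmetic, and the treatment of paused days as a simple shift of the deadline are assumptions made for the example.

  import calendar
  from datetime import date, timedelta

  def dsar_due_date(received: date, pause_days: int = 0) -> date:
      # Illustrative only: approximates the one-calendar-month deadline by moving to the
      # same day of the following month (clamped to that month's length), and treats any
      # days spent paused (e.g. awaiting identity verification or clarification) as
      # simply pushing the deadline back by the same number of days.
      year = received.year + received.month // 12
      month = received.month % 12 + 1
      day = min(received.day, calendar.monthrange(year, month)[1])  # e.g. 31 Jan -> 28 Feb
      return date(year, month, day) + timedelta(days=pause_days)

  # Request received 15 March 2025; clock stopped for 7 days while verifying identity.
  print(dsar_due_date(date(2025, 3, 15), pause_days=7))  # 2025-04-22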

3. Clarification on Automated Decision-Making

Article 22 of the UK GDPR currently restricts solely automated decision-making (ADM) that produces legal or similarly significant effects on individuals, requiring meaningful human involvement in such processes.

The Bill clarifies that ‘meaningful human intervention’ necessitates a competent person reviewing automated decisions. This ensures that human oversight in ADM processes is substantive and informed. For organisations using AI-driven processes, this means they must uphold transparency and accountability in decision-making. Additionally, they are required to inform individuals and comply with non-discrimination laws, such as the Equality Act 2010.

The Bill further specifies that ADM processes involving any type of personal data must still be subject to appropriate safeguards.

4. Changes to the Protection of Children’s Personal Data

The DUA Bill introduces several provisions aimed at strengthening the protection of children’s personal data. It defines children’s ‘higher protection matters’ as considerations for how best to safeguard and support children when using services. The Bill also acknowledges that children may be less aware of the risks and consequences of data processing and have different needs at various stages of development.

Recent developments highlight ongoing efforts to enhance children's data protection:

  • The ICO has launched investigations into platforms such as TikTok, Reddit, and Imgur regarding their handling of children’s data, focusing on content recommendations and age verification methods.
  • The ICO introduced the Age-Appropriate Design Code, also known as the Children’s Code, a UK code of practice requiring online services likely to be accessed by children to be designed with their safety and privacy in mind.
  • The Online Safety Act, while broader in scope, is aimed at improving online safety for all users, which also includes a focus on protecting children. Regulated by Ofcom, the Act introduces stricter content moderation requirements for platforms to prevent harm to minors. In December 2024, Ofcom issued its first codes of practice under the Act, targeting illegal harms such as child sexual abuse and incitement to suicide. The Act also mandates age verification measures to prevent children from accessing harmful content, including the use of AI facial checks and email analysis.

These initiatives reflect a broader effort to create a safer digital environment for children, ensuring that their personal data is handled with due care and consideration.

5. Cookies and Other Similar Tracking Technologies

The Bill expands the scope for implementing cookies and similar tracking technologies without requiring user consent, under certain conditions.

It specifies that cookies used solely for statistical purposes - such as improving services or websites - will be exempt from the consent requirement, provided users are informed of their purpose and can easily opt out. This change aims to reduce the compliance burden for organisations managing cookie consent.
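As a rough illustration of that decision logic, the sketch below encodes the conditions described above as a simple check. The field names and purpose categories are hypothetical and the rule set is deliberately simplified; it is not a statement of how any consent-management tool actually works.

  from dataclasses import dataclass

  @dataclass
  class Cookie:
      # Hypothetical fields for illustration only.
      purpose: str          # e.g. "strictly_necessary", "analytics", "advertising"
      user_informed: bool   # the purpose is clearly disclosed to the user
      easy_opt_out: bool    # the user can easily opt out

  def consent_required(cookie: Cookie) -> bool:
      # Strictly necessary cookies are already exempt under PECR.
      if cookie.purpose == "strictly_necessary":
          return False
      # Under the change described above, purely statistical cookies may be set
      # without consent, provided users are informed and can easily opt out.
      if cookie.purpose == "analytics" and cookie.user_informed and cookie.easy_opt_out:
          return False
      # Other tracking (e.g. advertising) still requires consent.
      return True

  print(consent_required(Cookie("analytics", True, True)))    # False - exempt under the new rules
  print(consent_required(Cookie("advertising", True, True)))  # True - consent still required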

The Bill also seeks to standardise enforcement across the UK GDPR, the Data Protection Act 2018, and the Privacy and Electronic Communications Regulations 2003 (PECR). Organisations are advised to ensure compliance with PECR, particularly regarding cookie usage and direct marketing.

6. Revised International Data Transfer Mechanisms

The Bill revises the rules on international data transfers, allowing transfers to countries where the standard of protection is “not materially lower” than the UK’s. This change is intended to give businesses greater flexibility in global data exchanges. While it could streamline cross-border operations, concerns may remain regarding its potential impact on the EU-UK adequacy decision.

Additionally, the DUA Bill restricts the Secretary of State’s ability to amend existing transfer safeguards. Any modifications will require secondary legislation to take effect.

7. Digital Verification Services (Digital ID)

The DUA Bill establishes a Digital ID Trust Framework to drive innovation and broader adoption of digital identities. The framework aims to streamline regulation of digital verification services, making them more efficient and accessible, to strengthen national security checks for provider registration, and to increase oversight and consultation.

8. Penalties for Non-Compliance

The DUA Bill enhances PECR enforcement powers, bringing penalties in line with the UK GDPR. It permits fines of up to 4% of annual global turnover or £17.5 million, whichever is greater, significantly raising the potential penalties for non-compliance.
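To make the “whichever is greater” formula concrete, the short sketch below shows how the theoretical maximum fine scales with turnover. The figures come from the article; the function name and the assumption that global turnover means annual worldwide turnover in GBP are illustrative only.

  def max_fine_gbp(annual_global_turnover_gbp: float) -> float:
      # Greater of 4% of annual global turnover or a fixed £17.5 million.
      return max(0.04 * annual_global_turnover_gbp, 17_500_000.0)

  # A business with £1bn turnover faces a theoretical cap of £40m;
  # a smaller business with £100m turnover is capped at the £17.5m floor.
  print(max_fine_gbp(1_000_000_000))  # 40000000.0
  print(max_fine_gbp(100_000_000))    # 17500000.0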

Given its current trajectory, it is anticipated that the DUA Bill will be enacted in 2025. As the Bill moves through Parliament, UK organisations may begin assessing its potential impact on their data management and compliance frameworks.

If you have any queries or would like further information, please visit our data protection services section or contact Christopher Beveridge.