Client Alerts

Draft Digital Omnibus: promise of simplification and alignment for AI and data

The Digital Omnibus proposal published by the European Commission on 19 November 2025 outlines sweeping changes that aim to simplify compliance for companies in and outside the EU that are subject to EU digital regulations. The scope of the proposal is far-reaching, covering data and privacy, AI, cybersecurity and online user experience, and seeks to consolidate existing regulations and eliminate overlaps, in particular with regard to security and notification obligations.

In this article, we focus on selected key proposed changes to the AI Act, GDPR and the Data Act, areas with the most overlaps and open questions on how to tackle them. Note: the draft is not yet law; details and timelines may change during trilogue and implementing acts.

A more practical approach to AI

The proposed draft aims to introduce a more pragmatic and business-friendly approach by reducing rigidity and aligning compliance with practical realities. If implemented, the changes would significantly ease operational and compliance burdens on companies, fostering innovation and increasing cost efficiency.

Some specific proposals in the draft aim to:

  • remove mandatory AI literacy requirements, replacing them with a non-binding obligation to promote AI literacy through training and best practice sharing;
  • extend regulatory simplifications granted to small and medium-sized enterprises (SMEs) to small mid-caps (SMCs), including simplified technical documentation requirements and special consideration in the application of penalties;
  • further cement the AI Office’s role as the competent central supervisory authority for certain AI systems and grant expanded rights over GPAI systems while retaining current rights and duties. Additionally, the AI Office receives supervision rights over AI systems integrated into very large online platforms and very large online search engines as defined under the Digital Services Act;
  • push back compliance deadlines for high-risk AI systems, linking the entry into application of those requirements to the availability of compliance support measures such as harmonised standards, common specifications and guidelines, including guidelines on the practical application of high-risk classification and incident reporting. High-risk AI system requirements in Annex III would apply by 2 December 2027, while those in Annex I would apply by 2 August 2028 at the latest; and
  • delay the applicability of fines for infringements of transparency and marking obligations until 2 February 2027 for AI systems placed on the market before 2 August 2026. The obligations under Article 50 themselves would still apply from 2 August 2026.

Simplification of rules and less stringent protection for data handling

Overall, the Digital Omnibus draft makes targeted, pro‑clarity adjustments to the GDPR and Data Act that streamline compliance, harmonise procedures and better align the legal framework with AI, data sharing and cloud realities. For companies, the key upsides include fewer and simpler breach filings (96‑hour, one‑stop), clearer lawful bases for AI and automated decisions, forthcoming EU‑level criteria to treat some pseudonymised data as non‑personal and stronger guardrails that protect trade secrets while narrowing public‑sector data demands to true emergencies – together reducing compliance friction and increasing legal certainty and defensibility.

Some notable changes to GDPR include the following:

  • An amended definition of personal data, where information is not considered personal data for a given controller if that controller lacks means reasonably likely to identify the person to whom the information relates. This amendment would create a “subjective, entity‑relative” approach for when datasets fall outside the GDPR for a particular controller or recipient. However, this requires robust, evidence‑based assessments and documentation of identifiability and may create uncertainty regarding processing roles, such as joint controllers/data processors and related contractual requirements. Further, misclassification risks would remain until tested by authorities or litigants.
  • Commission guidance on classifying pseudonymised data, to be provided in implementing acts specifying the means and criteria for determining when data resulting from pseudonymisation no longer constitute personal data for certain entities. This would offer a harmonised pathway at EU level to treat some pseudonymised datasets as non‑personal in defined contexts, improving legal certainty for data sharing and secondary use. Yet the actual outcome would still depend on the quality and timing of those implementing acts.
  • A higher notification threshold, longer deadlines and one‑stop reporting for data breaches. The threshold for notifying authorities would rise to cases where a breach is “likely to result in a high risk”, with a 96‑hour deadline and use of a single EU entry point. Positive consequences include fewer routine filings, more consistent practice and a lower administrative burden, as well as convergence with the obligation to inform data subjects.
  • Measures to handle abusive requests for access to personal data that allow controllers to charge a reasonable fee or refuse requests that are manifestly unfounded or excessive, including where the right is abused for purposes other than data protection. This would be another welcome improvement, curtailing the often vindictive or tactical use of access rights and reducing legal and operational drag. However, controllers would still bear the burden of proof for such abusive requests.
  • Simplified cookie consent rules that expand the scope of consent-free cookies and introduce an automated machine‑readable consent mechanism and a six‑month no‑reprompt period. Controllers must accept and honour standardised, machine‑readable choices (e.g. via browsers or other tools) once standards are available. If a user declines consent, controllers may not ask again for the same purpose for at least six months. Media service providers are exempt from the automated signal obligation when providing media services, while the six‑month no‑reprompt rule would still apply. Expected positive effects include streamlined user choice management, reduced consent fatigue and a presumption of compliance when aligned with harmonised standards.

The draft also provides clearer legal paths for automated decision-making, acknowledging that solely automated decisions may be “necessary” for contract performance (even if a human could decide) and recognising a legitimate interest basis for using personal data in AI development and operation, subject to safeguards. Further, the draft suggests that biometric verification can be permitted where it is performed under the data subject’s sole control. This would ease legal friction for authentication, pricing and AI lifecycle processing, and provide a defensible lawful‑basis framework that better reflects current technology.

Proposed changes to the Data Act worth highlighting are:

  • Stronger IP and sovereignty guardrails that give data holders the right to refuse data sharing where there is a high risk of unlawful acquisition, use or disclosure of trade secrets to third‑country entities, or to EU entities under their control, that are subject to weaker protection. This provides better protection for sensitive assets and clearer grounds to resist risky disclosures, reducing exposure to extraterritorial orders and leakage. However, refusals must be duly substantiated on objective elements and notified to the competent authority, creating dispute and supervision risk.
  • B2G access limited to “public emergencies” with streamlined procedures and a complaint mechanism. This would reduce routine exposure to broad data demands and introduce clearer safeguards and proportionality.

What’s next?

The proposal will now go through the legislative process. Given that the compliance deadline for the AI Act is less than a year away on 2 August 2026, we may expect a push for a quick turnaround. At the same time, some adjustments to the proposed text are likely to accommodate political efforts to preserve fundamental rights and freedoms.

To prepare, companies can already:

  • review current practices and policies and flag potential areas for adjustment;
  • take note of shifting timelines and consider how to align them with other changes that may be required;
  • remain committed to current compliance initiatives and timelines to ensure alignment with updated requirements; and
  • track the constant stream of guidelines and other materials issued by the Commission and the AI Office to allow for more tailored application of the AI Act, the GDPR and any relevant national legislation.
