Article

Digital Services Act explained: new obligations for online businesses and other digital services

The EU Digital Services Act (DSA) has introduced a new liability and compliance framework for digital services offered to users within the European Union. While large platform providers such as Meta or Amazon have already been subject to the new rules since August 2023, the DSA’s comprehensive compliance obligations will become fully applicable to all covered online businesses on 17 February 2024 (for background see here).

In particular, Chapter III of the DSA imposes a broad spectrum of complex and partly interrelated due diligence and transparency requirements – including annual reporting, mandatory T&C changes, effective notice and complaint mechanisms, as well as advertising and algorithmic transparency. The implementation of these new obligations will require a significant technical and organisational effort, which is why affected online businesses should allocate adequate preparation time and resources to this project.

The applicable compliance obligations follow a layered regulatory approach, however, and depend in each case on the specific type and size of the services offered. We have therefore created the chart below to help businesses navigate the DSA’s complex regulatory framework and identify the compliance obligations applicable to them.

Mere Conduit

Transmission of user information via a communication network or provision of access thereto

Examples:

  • internet service providers
  • wireless access points
  • VPNs
  • top-level domain name registries
  • interpersonal communication services (e.g. VoIP)

Applicable obligations: 1-4

Caching

Transmission of user information via a communication network with automatic and temporary storage solely for efficiency purposes

Examples:

  • web and database caching
  • content delivery networks
  • reverse proxies
  • content adaptation proxies

Applicable obligations: 1-4

Hosting

Storage of user information

Examples:

  • cloud computing (e.g. SaaS)
  • web hosting
  • referencing services
  • file storage and sharing services
  • chat functions with message storage

Applicable obligations: 1-7

Online Platforms

Hosting service including the public dissemination of user information

Examples:

  • social media
  • online marketplaces
  • comparison portals
  • comment or rating forums

Applicable obligations: 1-15

B2C Online Marketplaces

Online platforms allowing consumers to conclude distance contracts with traders

Examples:

  • B2C trading platforms

Applicable obligations: 1-18

Very Large Online Platforms (VLOPs) / Very Large Online Search Engines (VLOSEs)

Online platforms and online search engines with more than 45 million average monthly active users within the EU

Examples:

  • services of large providers such as Meta or Amazon

Applicable obligations: 1-19

Obligations under the DSA

1. Takedown and information disclosure orders

Member State judicial or administrative authorities may order service providers to remove illegal content or to disclose information on service recipients (i.e. users). In this respect, the DSA introduces a special information mechanism (Art 9 and 10), under which orders should comply with certain formal requirements and, in turn, providers are obligated to inform the issuing authorities about the effects given to these orders without undue delay. In addition, affected users must also be informed about the effects given to such takedown and disclosure orders, accompanied by information on their territorial scope and existing possibilities for redress.

2. Points of contact and legal representatives

All service providers must designate single points of contact and publish respective (electronic) contact information for both:

  • communications with Member State authorities, the Commission, and the European Board for Digital Services (Art 11) and
  • communications with users (Art 12).

In addition, providers without an EU establishment must appoint an EU-based legal representative to receive, comply with, and enforce DSA-related decisions (Art 13). These representatives may also bear liability for DSA non-compliance, without prejudice to any liability of the provider itself.


3. Terms and conditions

All service providers must revise applicable terms and conditions (T&Cs) to introduce information regarding their content moderation practices and internal complaint procedures (Art 14). These revisions should include detailed explanations on any content-related policies, procedures, measures and tools, including information on algorithmic decision-making and human review.


The information must be easily accessible, in a machine-readable format, and written in plain, intelligible, and unambiguous language. In this respect, special care must be taken (and age-appropriate language used) where the service is intended for minors. Furthermore, users must be notified of any significant modifications to the T&Cs.

4. Annual transparency reports

All service providers are required to publish annual reports that are easily accessible and in a machine-readable format, providing clear and comprehensible information with respect to any content moderation activities undertaken during the specified reporting period (Art 15). Depending on the type of service, this may include information concerning takedown or other orders received from competent authorities, content notices by users and trusted flaggers, as well as complaints received through internal complaint-handling systems.


Given the extent of information to be reported, providers should assess whether their content moderation systems (e.g. dashboards) enable them to efficiently extract, document, and report the required information. Moreover, the Commission may adopt implementing acts to introduce templates concerning the form, content, and other details of these reports, including harmonised reporting periods.
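By way of illustration, the following Python sketch shows one way a provider might aggregate logged moderation actions into a machine-readable annual report. The record structure, the trigger and measure categories, and the JSON layout are illustrative assumptions only; the DSA does not prescribe a specific schema.

```python
import json
from collections import Counter
from dataclasses import dataclass
from datetime import date

# Hypothetical internal record of a single content moderation action.
@dataclass
class ModerationAction:
    action_date: date
    trigger: str      # e.g. "authority_order", "user_notice", "trusted_flagger", "own_initiative"
    measure: str      # e.g. "removal", "demotion", "account_suspension"
    automated: bool   # whether the decision was taken by automated means

def build_annual_report(actions: list[ModerationAction], year: int) -> str:
    """Aggregate one reporting year of moderation actions into machine-readable JSON."""
    in_scope = [a for a in actions if a.action_date.year == year]
    report = {
        "reporting_period": year,
        "total_actions": len(in_scope),
        "actions_by_trigger": dict(Counter(a.trigger for a in in_scope)),
        "actions_by_measure": dict(Counter(a.measure for a in in_scope)),
        "automated_decisions": sum(a.automated for a in in_scope),
    }
    return json.dumps(report, indent=2)
```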

5. Notice and action mechanisms

Hosting providers shall implement user-friendly (electronic) mechanisms (e.g. via online forms or other functionalities) that enable users and third parties to report illegal content by submitting sufficiently precise and adequately substantiated notices (Art 16). Upon receiving a notice, the service provider must confirm receipt without undue delay and follow up with the user once a decision in respect of the notice has been reached. When communicating its reasoned decision, the provider should also inform the user about the possibilities for redress.


Decisions on notified content must be taken in a timely, diligent, non-arbitrary, and objective manner, which requires providers to implement adequate technical and organisational precautions such as sufficiently detailed notice forms and adequate staffing. Additionally, where automated means are used for decision-making, this must be disclosed to the user.
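As a simplified illustration, a notice intake mechanism could capture the core elements of a sufficiently precise and substantiated notice and return an immediate acknowledgement. The field names and the validation rule below are assumptions made for the purpose of the sketch, not requirements taken verbatim from the DSA.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from uuid import uuid4

# Hypothetical intake record for a content notice.
@dataclass
class ContentNotice:
    reporter_email: str         # contact details of the notifier
    content_url: str            # exact electronic location of the allegedly illegal content
    explanation: str            # why the notifier considers the content illegal
    good_faith_confirmed: bool  # notifier confirms the notice is submitted in good faith
    notice_id: str = field(default_factory=lambda: uuid4().hex)
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def submit_notice(notice: ContentNotice, review_queue: list[ContentNotice]) -> str:
    """Validate the notice, enqueue it for review and return an acknowledgement."""
    if not (notice.content_url and notice.explanation and notice.good_faith_confirmed):
        raise ValueError("Notice is not sufficiently precise and substantiated")
    review_queue.append(notice)
    # Confirmation of receipt without undue delay; the reasoned decision follows later.
    return f"Notice {notice.notice_id} received on {notice.received_at:%d %B %Y}."
```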

6. Statement of reasons

Hosting providers are required to provide affected users with clear and specific explanations, so-called statements of reasons, whenever they impose restrictions (e.g. removal/demotion of content or suspension/restrictions of user accounts) on the basis that user content is unlawful or in violation of the provider’s T&C (Art 17).


The statement of reasons must include, inter alia, details related to the consequences of the decision, its territorial scope and duration, information on any automated decision-making means, as well as the facts, circumstances, and legal basis relied upon in the decision. Furthermore, the affected user must be informed with regard to available redress possibilities.
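The elements listed above can be thought of as a structured record per affected user. The following sketch models such a record; the field names and the plain-text rendering are illustrative assumptions, not an official format.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical structure mirroring the elements of a statement of reasons.
@dataclass
class StatementOfReasons:
    restriction: str                  # e.g. "content removal", "account suspension"
    territorial_scope: str            # e.g. "EU-wide", "Germany only"
    duration: Optional[str]           # None for measures of indefinite duration
    facts_and_circumstances: str      # what triggered the decision
    legal_or_contractual_ground: str  # statutory provision or T&C clause relied upon
    automated_means_used: bool        # whether automated means were used in detection or decision
    redress_options: list[str]        # e.g. internal complaint, out-of-court settlement, court

def render_statement(s: StatementOfReasons) -> str:
    """Produce a plain-text statement that can be sent to the affected user."""
    return "\n".join([
        f"Measure imposed: {s.restriction} ({s.territorial_scope}, {s.duration or 'indefinite duration'})",
        f"Facts and circumstances: {s.facts_and_circumstances}",
        f"Ground relied upon: {s.legal_or_contractual_ground}",
        f"Automated means used: {s.automated_means_used}",
        "Available redress: " + ", ".join(s.redress_options),
    ])
```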

7. Reporting criminal offences

When a hosting provider becomes aware of any information that gives rise to the suspicion that a criminal offence involving a threat to the life or safety of persons has taken place, this must be promptly notified to the competent authorities (Art 18).

8. Internal complaint-handling system

Providers of online platforms must implement an effective, internal complaint-handling mechanism (Art 20), which enables users to file complaints against the provider’s content-moderation decisions (e.g. decisions to remove/disable user content or to suspend/terminate the user’s service access). The ability to make a complaint must be available to users for at least six months from their receipt of the decision, and any submitted complaints must be handled in a timely, non-discriminatory, diligent, and non-arbitrary manner. Moreover, decisions require appropriately qualified human supervision and may thus not be taken solely on the basis of automated means.


If the complaint proves to be justified on the basis of the aforementioned criteria, the provider is obligated to reverse the contested decision (e.g. to reinstate content or an account). The provider must also inform complainants of its reasoned decision along with available redress options, including detailed information about mandatory out-of-court dispute settlements (Art 21) and other remedies.
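In practice, two of these requirements lend themselves to simple technical checks: the minimum six-month complaint window and the rule that decisions may not rest solely on automated means. The sketch below illustrates both; the six-month window is approximated as 183 days and the return values are hypothetical.

```python
from datetime import datetime, timedelta, timezone

# The DSA requires a complaint window of at least six months; 183 days is an approximation.
COMPLAINT_WINDOW = timedelta(days=183)

def complaint_admissible(decision_received_at: datetime, now: datetime | None = None) -> bool:
    """Check whether a complaint against a moderation decision is still within the window."""
    now = now or datetime.now(timezone.utc)
    return now - decision_received_at <= COMPLAINT_WINDOW

def resolve_complaint(complaint_justified: bool, reviewed_by_human: bool) -> str:
    """Decide on a complaint; decisions may not be taken solely by automated means."""
    if not reviewed_by_human:
        raise RuntimeError("Complaint decisions require appropriately qualified human review")
    if complaint_justified:
        return "reverse_decision"  # e.g. reinstate the content or the account
    return "uphold_decision"       # communicate the reasoned decision and redress options
```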

9. Trusted flaggers

Providers of online platforms must ensure that notices submitted by so-called “trusted flaggers” are given priority and are processed without undue delay (Art 22). Trusted flaggers will be appointed by the Member State’s Digital Services Coordinator and the Commission will maintain a publicly accessible database of trusted flaggers. Providers should thus ensure that they have technical and organisational measures in place that allow them to identify, prioritise, and promptly process notices submitted by trusted flaggers.
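One straightforward way to give such notices priority is a review queue ordered by flagger status. The sketch below is a minimal illustration; the priority values and the queue design are assumptions, not prescribed by the DSA.

```python
import heapq

# Minimal review queue that processes trusted-flagger notices before ordinary notices.
class NoticeQueue:
    def __init__(self) -> None:
        self._heap: list[tuple[int, int, str]] = []
        self._counter = 0  # preserves first-in, first-out order within the same priority

    def add(self, notice_id: str, from_trusted_flagger: bool) -> None:
        priority = 0 if from_trusted_flagger else 1  # lower value is handled earlier
        heapq.heappush(self._heap, (priority, self._counter, notice_id))
        self._counter += 1

    def next_notice(self) -> str:
        return heapq.heappop(self._heap)[2]
```

For example, if an ordinary notice is added first and a trusted-flagger notice second, next_notice() returns the trusted-flagger notice.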

10. Protection against misuse

To combat the misuse of their systems, online platform providers must suspend, for a reasonable period of time, their services for users who frequently provide manifestly illegal content (Art 23). In addition, providers must also suspend the processing of notices or complaints submitted by individuals who frequently submit manifestly unfounded notices. In both instances, prior warnings need to be issued and the applicable process must be clearly outlined in the providers’ T&Cs, alongside examples of circumstances that are taken into account when assessing what constitutes such misuse.
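A provider’s internal policy might, for example, translate this into warning and suspension thresholds. The numeric values below are purely illustrative assumptions; the DSA requires a case-by-case assessment and a prior warning but does not fix any figures.

```python
# Illustrative internal policy values; not prescribed by the DSA.
WARNING_THRESHOLD = 3
SUSPENSION_THRESHOLD = 5

def misuse_response(manifestly_illegal_items: int, warning_already_issued: bool) -> str:
    """Return the next step for a user who repeatedly provides manifestly illegal content."""
    if manifestly_illegal_items >= SUSPENSION_THRESHOLD and warning_already_issued:
        return "suspend_for_reasonable_period"
    if manifestly_illegal_items >= WARNING_THRESHOLD:
        return "issue_prior_warning"
    return "no_action"
```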

11. Additional transparency reporting: disputes, suspensions, and monthly active users

Providers of online platforms are subject to further transparency reporting in addition to the Art 15 reports (see obligation 4 above) (Art 24). In particular, they must submit reports to the Commission including, inter alia, details concerning the disputes submitted to out-of-court dispute settlement bodies pursuant to Art 21 and the number of suspensions imposed for misuse in accordance with Art 23. Additionally, every six months, such providers must publish information on the number of average monthly active users on their online interface. Lastly, content-restriction decisions and respective statements of reasons (Art 17) must be submitted to the Commission’s transparency database. The Commission may adopt implementing acts to introduce templates concerning the form, content, and other details of such reports.

12. User interface transparency by design

Providers of online platforms are required to ensure that their interfaces meet certain design and accessibility standards (“user interface transparency by design”) (Art 25). In particular, providers must refrain from designing their online platforms in a way that seeks to steer user behaviour in a particular direction (prohibition of so-called “dark patterns”) or in any other deceitful or manipulative manner that would impair users’ ability to make free and informed decisions.


As such, the DSA inter alia prohibits platform interfaces that deceitfully or manipulatively promote certain user choices over others, or repeatedly request user choices that have already been made. Moreover, the DSA prohibits user lock-ins through termination mechanisms that are more complicated than the initial subscription or sign-up. The Commission may issue guidelines applicable to specific deceitful practices.


Online platform providers should thus conduct an audit of their online interfaces to ensure that they comply with the principle of “user interface transparency by design” and do not contain any dark patterns.

13. Advertising transparency and user profiling

Providers of online platforms are subject to heightened advertising transparency requirements (Art 26). In particular, these providers must clearly identify online advertisements as such and reveal adequate information about the advertisement’s source and financing.


Moreover, with respect to targeted advertising, providers must reveal information relating to the main parameters used to determine the target groups and, where applicable, how to change those parameters. Targeted advertisements based on profiling using special category data (as defined under Art 9 GDPR) are prohibited outright.
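A provider could, for instance, screen ad-targeting parameters against the data categories listed in Art 9 GDPR before a campaign goes live. The parameter labels below are illustrative assumptions; the category list follows Art 9 GDPR.

```python
# Data categories listed in Art 9 GDPR ("special category data"); targeting based on
# profiling with these categories is prohibited for online platforms under the DSA.
SPECIAL_CATEGORIES = {
    "racial_or_ethnic_origin", "political_opinions", "religious_or_philosophical_beliefs",
    "trade_union_membership", "genetic_data", "biometric_data", "health_data",
    "sex_life_or_sexual_orientation",
}

def validate_targeting(parameters: set[str]) -> set[str]:
    """Reject ad-targeting parameters that rely on special category data."""
    forbidden = parameters & SPECIAL_CATEGORIES
    if forbidden:
        raise ValueError(f"Targeting based on special category data is prohibited: {sorted(forbidden)}")
    return parameters
```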

14. Recommender system transparency

Providers of online platforms, which have recommender systems in place, are required to set out the main parameters used in such systems in their T&Cs, including any available options for users to modify or influence said parameters (Art 27). These main parameters should explain why certain content is suggested and include, at the very least, the most significant criteria for user recommendations and the reasons for the relative importance of those parameters.
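The disclosure could be generated from the same configuration that drives the recommender system, so that the published explanation cannot drift from the actual behaviour. The parameter names and descriptions below are hypothetical examples only.

```python
from dataclasses import dataclass, field

# Hypothetical description of a recommender system's main parameters.
@dataclass
class RecommenderSettings:
    main_parameters: dict[str, str] = field(default_factory=lambda: {
        "engagement_history": "content similar to items you interacted with ranks higher",
        "recency": "newer content ranks higher",
        "follows": "content from accounts you follow ranks higher",
    })
    user_adjustable: dict[str, bool] = field(default_factory=lambda: {
        "engagement_history": True,  # user may switch this form of personalisation off
        "recency": False,
        "follows": True,
    })

def explain_parameters(settings: RecommenderSettings) -> str:
    """Produce the plain-language explanation for inclusion in the T&Cs or the user interface."""
    return "\n".join(
        f"- {name}: {reason}" + (" (user-adjustable)" if settings.user_adjustable.get(name) else "")
        for name, reason in settings.main_parameters.items()
    )
```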

15. Protection of minors

Providers of online platforms accessible to minors are required to implement appropriate and proportionate measures to maintain a high level of security, privacy, and safety of minors (Art 28). Inter alia, providers are prohibited from presenting targeted advertisements based on profiling using the personal data of minors. The Commission may issue guidelines to assist with the implementation of these appropriate measures.

16. KYBC checks

Providers of online marketplaces allowing B2C distance contracts (B2C online marketplaces) are required to conduct KYBC (“know your business customer”) checks on new traders offering products or services to consumers in the EU, including the vetting of information provided through reliable services (Art 30). For any existing traders, providers of B2C online marketplaces are required to make best efforts to conduct such KYBC checks within 12 months.
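At onboarding, this amounts to a completeness check over the trader information collected before listings go live. The selection of fields in the sketch below is an illustrative assumption and not an exhaustive restatement of Art 30.

```python
from dataclasses import dataclass

# Hypothetical onboarding record for a new trader on a B2C online marketplace.
@dataclass
class TraderProfile:
    name: str
    address: str
    email: str
    phone: str
    payment_account: str
    trade_register_number: str | None
    self_certified_compliance: bool  # trader commits to offering only compliant products or services

def kybc_complete(profile: TraderProfile) -> bool:
    """Basic completeness check before a new trader may offer products or services in the EU."""
    required = [profile.name, profile.address, profile.email,
                profile.phone, profile.payment_account]
    return all(required) and profile.self_certified_compliance
```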

17. User interface compliance by design

Providers of B2C online marketplaces must design their online interfaces in a way that allows traders to comply with their obligations regarding pre-contractual compliance and product safety information under applicable Union law (Art 31). They must also allow traders to clearly identify those products and services they offer to consumers within the EU and to add relevant product labels (e.g. compliance markings). Finally, providers shall make best efforts to randomly check in any official, freely accessible, and machine-readable online database whether the products and services offered have been identified as illegal.


B2C marketplace providers should thus check whether their online interfaces comply with the principle of “compliance by design” and ensure that there is a mechanism in place for random checks for illegal products and services.

18. Right to information

If the provider of a B2C online marketplace becomes aware that an illegal product or service has been offered through its services, the provider must inform consumers who purchased the illegal product or service of this fact and disclose the identity of the trader and any relevant means of redress (Art 32). This shall apply to any purchases within the six months preceding the moment that the provider became aware of the illegality. If the provider does not have contact details of the consumers, the respective information must be made publicly available.
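Operationally, this means identifying the relevant purchases within the six-month look-back window and splitting them into consumers who can be contacted directly and those for whom a public notice is required. The purchase record structure below is a hypothetical simplification, and the six months are approximated as 183 days.

```python
from datetime import datetime, timedelta

LOOKBACK = timedelta(days=183)  # six-month look-back window, approximated for this sketch

def purchases_to_notify(purchases: list[tuple[str | None, datetime]],
                        became_aware_at: datetime) -> tuple[list[str], bool]:
    """Split in-scope purchases into direct contacts and a flag for a public notice.

    Each purchase is a (consumer_email_or_None, purchase_date) pair.
    """
    window_start = became_aware_at - LOOKBACK
    in_scope = [p for p in purchases if window_start <= p[1] <= became_aware_at]
    direct_contacts = [email for email, _ in in_scope if email]
    needs_public_notice = any(email is None for email, _ in in_scope)
    return direct_contacts, needs_public_notice
```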

19. Due diligence obligations for VLOPs / VLOSEs

Lastly, VLOPs and VLOSEs face extensive additional compliance obligations under Chapter III, Section 5 DSA. Specifically, providers of VLOPs and VLOSEs must undertake the following:

  • conduct a risk assessment (Art 34) and implement risk mitigation measures (Art 35)
  • implement a crisis response mechanism (Art 36)
  • commission annual independent compliance audits (Art 37)
  • implement at least one non-profiling-based recommender system option (Art 38)
  • observe heightened advertising transparency (Art 39)
  • provide data access and scrutiny to supervisory authorities (Art 40)
  • install a compliance function, such as a “DSA Officer” (Art 41)
  • file additional transparency reports (Art 42)
  • pay supervisory fees (Art 43)


*Obligations 4 and 8 – 18 (except for the obligation pursuant to Art 24(3) DSA to communicate to the Digital Services Coordinator updated information on average monthly active users) do not apply to online platforms that qualify as micro or small enterprises as defined in Commission Recommendation 2003/361/EC (i.e., those with fewer than 50 employees and less than EUR 10 million in annual turnover) that are not VLOPs.

**Obligations 8 – 18 (applicable to online platforms) do not apply to providers of hosting services where the dissemination of information to the public is merely a minor and purely ancillary feature of another service, or a minor functionality of the principal service, and cannot be used separately from that other service for objective and technical reasons (provided, of course, that the feature or functionality is not integrated into the other service in an attempt to circumvent the application of the DSA obligations).
