On October 27, 2022, the Digital Services Act (DSA) was published in the Official Journal of the European Union, and it entered into force twenty days later, on November 16, 2022. The DSA creates a first-of-its-kind regulatory framework that, like the General Data Protection Regulation (GDPR), could set an international benchmark for regulating intermediary services such as search engines, e-commerce platforms, hosting services, and more.
To achieve these regulatory goals, the DSA takes a pyramid-like, category-based approach to imposing obligations on intermediary services: those at the bottom of the pyramid face the fewest obligations, while a service that falls into a higher category must comply with stricter obligations in addition to those imposed on the categories below it.
Given that the DSA could apply internationally and introduces a plethora of onerous obligations, it is important to review its scope, requirements, and what these could mean for businesses around the world.
On March 1, 2018, the European Commission published the non-binding Commission Recommendation (EU) 2018/334, calling for action against “illegal online content” and its “serious negative consequences for users.”
On July 16, 2019, Ursula von der Leyen, then-candidate for President of the European Commission, announced her political guidelines for the 2019-2024 Commission, in which she called for a “new Digital Services Act” to upgrade liability and safety rules for digital platforms, services, and products.
To this end, the Commission launched a public consultation process to gather comments and evidence regarding how online platforms should be regulated. Then, the Commission published the proposal for the Digital Services Act on December 15, 2020, alongside an evidence-based impact assessment.
On April 22, 2022, European policymakers in Brussels reached an agreement after 16 hours of negotiations, and a few months later the European Parliament approved the DSA along with the Digital Markets Act.
And finally, a little over three years after Ursula von der Leyen first called for it, the DSA was published in the Official Journal of the European Union on October 27, 2022 and entered into force on November 16, 2022.
To whom does the DSA apply?
The DSA applies to any intermediary service offered to natural or legal persons that have their place of establishment or are located in the EU, irrespective of whether the provider of that intermediary service is established in the EU.
The DSA broadly defines “intermediary service” to include a number of service categories, including:
- Mere conduits of transmissions, such as top-level domain name registries, DNS services and resolvers, certificate authorities that issue digital certificates, and more.
- Caching services, such as the provision of content delivery networks and reverse proxies.
- Hosting services, such as cloud computing, web hosting, file storage, and more.
- Online platforms, a subcategory of hosting services that are primarily used, at the request of a recipient of the service, to store and disseminate information to the public, such as e-commerce marketplaces, app stores, social media platforms, and more.
- Search engines, such as Google, Bing, and other online services that allow users to input queries to perform searches.
- Very large online platforms and search engines, a special designation given to online platforms or search engines that reach at least 45 million average monthly active recipients in the EU and are formally designated as such by the European Commission.
Recital 29 of the DSA states that whether a specific intermediary service constitutes a mere conduit, a caching service, or a hosting service — which is the first question a business should consider — depends solely on the service’s technical functionalities and should be assessed on a case-by-case basis.
This analysis is important because the category in which a service lands determines the obligations the law imposes on it. And there are many obligations.
Can the DSA apply to companies outside of the EU?
Yes. As noted above, the DSA applies to any intermediary service offered to natural or legal persons that have their place of establishment or are located in the EU, irrespective of whether the provider of that intermediary service is established in the EU.
However, while this scope may appear overly broad, Article 3 and Recitals 7–8 clarify that the intermediary service must have a “substantial connection to the Union” to be covered. Such a substantial connection results from any of the following:
- Having an establishment in the EU; or
- Having a significant number of recipients of the service in a Member State; or
- Targeting activities toward a Member State, which can result from:
- the use of a Member State’s language or currency;
- the possibility of EU recipients ordering products or services;
- the use of a relevant top-level domain;
- the availability of an app in a relevant national app store;
- advertising in a Member State or in a language used by a Member State;
- providing customer services in a language generally used in a Member State.
While the law requires a substantial connection, the possibility of falling within its extraterritorial scope, much like under the GDPR, means companies should take care when considering how they advertise or offer their intermediary services, and whether those activities could place them squarely within the scope of the law.
Does the DSA treat all online intermediary services equally?
The DSA uses a tiered, pyramid-like approach to impose cumulative obligations on the various categories of intermediary services.
Obligations for all providers of intermediary services
The bottom of this pyramid-like framework includes all providers of intermediary services. The DSA imposes on this category a substantial list of due diligence and transparency obligations. These include:
- Designating a single point of contact for communicating with Member State authorities (Article 11).
- Designating a single point of contact for communicating with recipients of the service (Article 12).
- Providing information in the terms and conditions about any policies, procedures, measures, and tools used for content moderation, algorithmic decision-making, and the handling of internal complaints (Article 14).
- Making publicly available a yearly content moderation report (Article 15).
- For providers that do not have an establishment in the EU but fall within the law’s extraterritorial scope: designating a legal representative in a Member State and ensuring that the representative can be held liable for non-compliance with obligations under the DSA (Article 13).
Additional obligations for hosting services and the subcategory of online platforms
In addition to the above obligations, providers of hosting services, including online platforms, must satisfy the following obligations:
- Creating a mechanism through which any individual or entity can notify the provider about the presence of information on the service that the individual or entity considers illegal (Article 16).
- Providing a clear and specific statement of reasons to recipients affected by restrictions imposed on the ground that information provided by the recipient is illegal or incompatible with the provider’s terms and conditions (Article 17).
- Notifying law enforcement or judicial authorities if the provider becomes aware of information giving rise to a suspicion of certain legally prescribed criminal offenses (Article 18).
Additional obligations just for providers of online platforms
In addition to the two lists of obligations above, providers of online platforms — the subcategory of hosting services — must also satisfy the following obligations:
- Creating an internal complaint-handling system through which recipients can, free of charge, lodge complaints against the provider, and providing recipients with access to that system for at least six months following certain decisions that may affect them (Article 20).
- Allowing recipients to select any out-of-court dispute settlement body certified under the DSA to resolve disputes relating to Article 20 decisions (Article 21).
- Implementing technical and organizational measures to ensure notices submitted by trusted flaggers — that is, entities awarded this role by a Member State’s Digital Services Coordinator — are prioritized, processed, and decided upon without undue delay (Article 22).
- Suspending, for a reasonable period and after issuing a prior warning, recipients that frequently provide manifestly illegal content (Article 23).
- Making publicly available a yearly content moderation report that, in addition to the Article 15 requirements, shall detail the number of disputes submitted to out-of-court dispute settlement bodies pursuant to Article 21 and the number of recipients suspended pursuant to Article 23 (Article 24).
- Designing, organizing, and operating the online platform’s interfaces in a way that does not deceive or manipulate recipients so as to materially distort or impair their ability to make free and informed decisions (Article 25).
- Ensuring that each advertisement presented to recipients via the online platform’s interface contains certain legally-prescribed disclosures (Article 26).
- Implementing measures to ensure a high level of privacy, safety, and security for minors, if the online platform is accessible to minors (Article 28).
It is important to note that most of these obligations do not apply to providers of online platforms that qualify as micro or small enterprises. A micro enterprise is one that employs fewer than 10 people and whose annual turnover and/or annual balance sheet total does not exceed EUR 2 million. A small enterprise is one that employs fewer than 50 people and whose annual turnover and/or annual balance sheet total does not exceed EUR 10 million.
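The thresholds above can be expressed as a small illustrative sketch. Note that the function name is hypothetical and the “and/or” is read here as “at least one of the two financial ceilings is met”; the authoritative test is the SME definition in Commission Recommendation 2003/361/EC, to which the DSA refers.

```python
def enterprise_size(employees: int, turnover_eur_m: float,
                    balance_sheet_eur_m: float) -> str:
    """Classify an enterprise per the DSA-relevant SME thresholds.

    Illustrative only: "and/or" is interpreted as requiring the
    headcount limit plus at least one financial ceiling (turnover
    or balance-sheet total, in EUR millions) to be satisfied.
    """
    if employees < 10 and min(turnover_eur_m, balance_sheet_eur_m) <= 2:
        return "micro"   # most online-platform obligations do not apply
    if employees < 50 and min(turnover_eur_m, balance_sheet_eur_m) <= 10:
        return "small"   # most online-platform obligations do not apply
    return "other"       # full online-platform obligations apply
```

For example, a provider with 30 employees, EUR 8 million turnover, and a EUR 12 million balance-sheet total would count as a small enterprise under this reading, since its headcount is under 50 and its turnover is under the EUR 10 million ceiling.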
Additional obligations for very large online platforms and online search engines
The DSA imposes even more obligations on providers of “very large” online platforms or search engines. To be given this designation, the online platform or search engine must have at least 45 million average monthly active recipients in the EU and have been formally designated as “very large” by the European Commission. Once designated, the provider has four months before the following obligations apply:
- Conducting yearly risk assessments of its service and systems, including algorithmic systems (Article 34).
- Implementing mitigation measures tailored to the specific risks identified by the yearly risk assessment (Article 35).
- Taking actions specified by the European Commission in response to a crisis (Article 36).
- Paying for independent audits on a yearly basis to ensure compliance with the DSA (Article 37).
- Creating a searchable repository of legally-specified information relating to advertisements on the online platform or search engine (Article 39).
- Providing the European Commission or the Digital Services Coordinator with information necessary to monitor and assess compliance with the DSA (Article 40).
- Establishing a compliance function and giving it sufficient authority, stature, resources, and access to management to monitor compliance with the DSA (Article 41).
- Making publicly available the Article 15 content moderation report every six months (Article 42).
- Paying an annual supervisory fee for their designation as “very large” (Article 43).
Are the enforcement penalties harsher than the GDPR?
The DSA requires Member States to lay down rules on penalties for infringements of the law by providers of intermediary services, and to ensure that the maximum fine that may be imposed for a failure to comply with an obligation under the DSA is 6% of the provider’s annual worldwide turnover in the preceding financial year.
However, for less serious infringements, such as supplying incorrect, incomplete, or misleading information or failing to submit to an inspection, the maximum fine is 1% of the provider’s annual income or worldwide turnover in the preceding financial year.
By contrast, GDPR violations could result in a fine of up to EUR 20 million or 4% of a company’s worldwide annual revenue from the preceding financial year, whichever is higher.
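To make the comparison concrete, here is a minimal sketch of the two fine caps. The function names and the EUR 1 billion turnover figure are hypothetical, and the GDPR figure reflects only the top-tier cap described above.

```python
def dsa_max_fine(worldwide_turnover_eur: float, serious: bool = True) -> float:
    # DSA cap: 6% of preceding-year worldwide turnover for non-compliance,
    # 1% for less serious infringements (e.g., supplying misleading information).
    return worldwide_turnover_eur * (0.06 if serious else 0.01)

def gdpr_max_fine(worldwide_turnover_eur: float) -> float:
    # GDPR top-tier cap: the higher of EUR 20 million or 4% of
    # preceding-year worldwide annual revenue.
    return max(20_000_000, worldwide_turnover_eur * 0.04)

turnover = 1_000_000_000  # hypothetical provider with EUR 1 billion turnover
print(dsa_max_fine(turnover))   # 60000000.0 (EUR 60 million)
print(gdpr_max_fine(turnover))  # 40000000.0 (EUR 40 million)
```

For large providers, the DSA’s 6% ceiling therefore exceeds the GDPR’s 4% ceiling, while the GDPR’s EUR 20 million floor matters mainly for smaller companies.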
What are the next steps for the DSA?
The bulk of the DSA’s obligations shall apply starting February 17, 2024. However, by February 17, 2023 and at least once every six months thereafter, all providers of intermediary services must publish information on the service’s average monthly active recipients in the Union.