Digital Services Act: new rules on digital services apply to all intermediary service providers

From February 17, 2024, the DSA – initially applicable only to very large online platforms (“VLOPs”) and very large online search engines (“VLOSEs”) – became fully applicable to all intermediary service providers in the EU, thus including platforms and search engines with fewer than 45 million active users.

The Digital Services Act (Regulation (EU) 2022/2065 on a Single Market for Digital Services – hereinafter the “DSA”), together with the Digital Markets Act (Regulation (EU) 2022/1925 on contestable and fair markets in the digital sector – “DMA”), forms the so-called “Digital Services Package”, i.e. the set of rules by which the European Union intends to (i) create a safer digital space in which the fundamental rights of all users of digital services are protected, and (ii) establish a level playing field to foster innovation, growth, and competitiveness, both in the European Single Market and globally. Indeed, the DSA and the DMA represent the European Union’s effort both to govern the power of large technology companies and to promote competition in the digital market.

This paper discusses only the main innovations of the DSA, with specific reference to intermediary service providers with fewer than 45 million active users. As for VLOPs and VLOSEs, please refer to our previous article “DSA: European Commission designated 19 Very Large Online Platforms and Search Engines”.

***

Generally speaking, the DSA arises from the need to establish a targeted set of mandatory, uniform, effective and proportionate rules at European level in order to safeguard and improve the functioning of the internal market by establishing conditions for the development and expansion of innovative digital services. The regulation essentially rests on three key factors:

  • the liability of intermediary service providers, who will be subject to new rules to remove illegal and harmful content;
  • consumer rights, which are enhanced, especially with regard to product information transparency and personal data protection; and
  • competition. In fact, the DSA aims to promote competition in the digital market, particularly by fostering interoperability among digital services and preventing abuse of dominant position.

The DSA entered into force on November 16, 2022, initially applying only to those digital platforms with a systemic role in the internal market (more than 45 million active users), namely the very large online platforms (“VLOPs”) and very large online search engines (“VLOSEs”) designated by the European Commission under Article 33 of the DSA.

Starting from February 17, 2024, however, the DSA began to apply to all intermediary service providers in the EU, including platforms or search engines with fewer than 45 million active users.

What are digital services?
Digital services cover a broad category of online services, from simple websites to internet infrastructure services and online platforms. They include, for example, online marketplaces, social networks, content-sharing platforms, app stores, and online travel and accommodation platforms.

Specifically, the DSA applies to intermediary services, which are defined under Article 3, letter (g), of the DSA as:

  1. a “mere conduit” service, consisting of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network;
  2. a “caching” service, consisting of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information’s onward transmission to other recipients upon their request;
  3. a “hosting” service, consisting of the storage of information provided by, and at the request of, a recipient of the service. This definition also includes online platforms.

Which obligations does the DSA introduce for intermediary service providers?
Since February 17, 2024, all intermediary service providers have been required to comply with provisions that mandate transparency on algorithms and advertising, promote the fight against online violence and disinformation, protect minors, and prohibit user profiling based on sensitive data.

Generally speaking, the DSA distinguishes the obligations according to the type of intermediary service providers, providing respectively:

a) provisions applicable to all providers of intermediary services (Articles 11-15 of the DSA);
b) additional provisions applicable to providers of hosting services, including online platforms (Articles 16-18 of the DSA);
c) additional provisions applicable to providers of online platforms (Articles 19-28 of the DSA);
d) additional provisions applicable to providers of online platforms allowing consumers to conclude distance contracts with traders (the “e-commerce platforms” or “marketplaces”) (Articles 29-32 of the DSA);
e) additional obligations for VLOPs and VLOSEs (Articles 33-43 of the DSA).

A) Provisions applicable to all providers of intermediary services
The first set of provisions, contained in Section 1 of Chapter III of the DSA, applies to all intermediary service providers without distinction.

Specifically, these concern the duty to establish points of contact for Member States’ authorities, the Commission, and the Board (Article 11 of the DSA), as well as points of contact for recipients of the service (Article 12 of the DSA).

In addition, all intermediary service providers are required to specify, in a concise, intelligible, and accessible manner, in their general terms and conditions information regarding the restrictions they impose on the use of their services, including the policies, procedures, measures, and tools used for content moderation, covering both algorithmic decision-making and human review. The general terms and conditions must also describe the internal complaint-handling system and the out-of-court dispute resolution mechanisms available.

Intermediary service providers are also required to act diligently, objectively, and proportionately, taking into account the rights and interests of all parties involved.

Finally, intermediary service providers are obliged to publish, at least once a year, clear and easily comprehensible reports on their content moderation activities. These reports must include: the number of orders received from authorities, the number of notices submitted, the use made of automated tools, and the number of complaints received.

B) Additional provisions applicable to providers of hosting services, including online platforms
These providers are required to set up a mechanism allowing illegal content to be reported electronically. Any decision following up on a notice must be prompt, diligent, non-arbitrary, and objective. In addition, where hosting service providers use automated tools to process notices or to take the related decisions, they must inform the person or entity that submitted the notice of the use of such tools.

Where hosting service providers intend to take a restrictive measure, they must also provide all affected recipients of the service with a clear and specific statement of reasons, along with a range of information (e.g., the facts and circumstances on which the decision is based, information on any automated tools used, and information on the remedies available to the recipient).

In addition, where there is a suspicion that a criminal offence has been committed, the provider must promptly inform the relevant law enforcement or judicial authorities.

C) Additional provisions applicable to providers of online platforms
Online platforms are a subcategory of hosting services. The term refers, for example, to social networks and to platforms that allow consumers to conclude distance contracts with traders, i.e., e-commerce platforms (or marketplaces).

The DSA clarifies that, in order to avoid unreasonable burdens on micro or small enterprises (as defined by Recommendation 2003/361/EC), they will not be subject to the requirements of Articles 19 to 28 of the DSA for online platforms.

Besides general obligations (i.e., setting up an internal complaint-handling system and allowing for out-of-court dispute resolution), online platform providers:

  • Must give priority to notices received from so-called trusted flaggers, i.e., entities (never individuals) – public bodies, non-governmental organizations, or private or semi-public bodies, such as consumer associations – recognized as such by the Digital Services Coordinator (for Italy, the Italian Communications Authority – “AGCOM”) because they have proven, among other things, that they have particular skills and expertise in fighting illegal content and that they carry out their activities in a diligent, accurate and objective manner.
  • In addition to the information on annual content moderation reports prescribed for all providers, they must also report the number of disputes submitted to dispute resolution bodies, the outcomes of those disputes, the average time for their resolution, and the number of suspensions applied. Moreover, every six months they must publish, for each online platform and each online search engine, information on the average monthly number of active recipients of the service in the EU.
  • They must design, organize, and manage their interfaces in such a way that recipients of the service are not deceived or manipulated and that their ability to make free and informed decisions is not materially distorted or impaired. This is the prohibition of so-called dark patterns (deceptive design patterns), i.e., interfaces and navigation paths designed to influence users, often by exploiting cognitive biases, into taking unconscious or unwanted actions.
  • In the event that they display advertisements on their interfaces, they are also required to ensure that recipients can identify clearly, concisely, unambiguously and in real time:
      – whether the information is an advertisement;
      – the natural or legal person on whose behalf the advertisement is presented;
      – the natural or legal person who paid for the advertisement, if that person is different from the person on whose behalf it is presented;
      – the main parameters used to determine the recipient to whom the advertisement is presented and how those parameters can be changed.
  • They must comply with the transparency obligation regarding their recommender systems (i.e., content suggestion systems): they must disclose, in a clear and transparent manner, the main parameters used, as well as any options available to recipients of the service to modify or influence those parameters.
  • Where platforms are accessible to minors, they must take appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors. Moreover, where the online platform provider is aware that a user is a minor, advertising based on profiling is prohibited.

D) Additional provisions applicable to e-commerce platforms providers
As pointed out in the previous paragraph, although the DSA contains no specific definition of them, e-commerce platforms fall under the category of online platforms.

The DSA specifies that, in order to avoid any disproportionate burdens on micro or small enterprises, they will not be subject to the additional obligations under Articles 29 to 32 of the DSA for e-commerce platform providers.

Crucial in this regard is the requirement for the traceability of traders. It is based on the “KYBC – know your business customer” principle and requires e-commerce platforms to design and organize their online interface so as to enable traders to provide consumers with a range of information – such as, among other things, the trader’s name, address, and telephone number; any trade register in which the trader is registered; and a self-certification by which the trader commits to offering only products or services that comply with EU rules – in a clear, easily accessible and understandable way. The purpose is to make clear to consumers that the trader they are buying from is an entity distinct from the marketplace itself.

Marketplaces also have an obligation to assess the reliability of the information provided to them by traders. Where the e-commerce platform provider becomes aware, or has reason to believe, that the information it has received is inaccurate, incomplete, or out of date, it must request the trader to remedy the situation without delay. If the trader fails to comply, the provider must suspend the provision of its service until the request is fulfilled.

Marketplace providers are also obliged to design and organize their online interface so that compliance is built in from the outset (so-called compliance by design): the aim is to enable traders to fulfil their obligations under EU law regarding pre-contractual information, product compliance and product safety, providing consumers with all the information necessary to identify the products and services offered.

Lastly, there are information requirements: where e-commerce platform providers become aware that an illegal product or service has been offered through their platform, they must inform the consumers who purchased it during the preceding six months, also making them aware of the identity of the trader and of any relevant means of redress.

***

Over the coming months, the Commission will enforce the DSA together with the national authorities – such as AGCOM for Italy – which will ensure compliance by providers established in their territory.

In the meantime, all operators concerned are required to adopt a compliance plan in line with the provisions prescribed. 

Failing that, in the event of a violation of the obligations under the DSA, penalties under Article 52(3) of the DSA may reach 6 percent of the provider’s total annual turnover, and recipients of digital services may seek compensation for damages or losses suffered as a result of infringements by platforms, as provided in Article 54 of the DSA.

Enterprises should also take care to submit correct and complete information, to rectify, where requested, the information submitted, and to submit to inspections. Failure to do so may result in penalties of up to 1 percent of the annual income or turnover of the provider concerned, pursuant to Article 52(3) of the DSA.