Content moderation and the processing of personal data: the interplay between DSA and GDPR
The DSA, which became fully applicable on 17 February 2024, establishes a harmonized framework for intermediary services in the digital market, aimed at creating a safe and trustworthy online environment and at combating the dissemination of illegal content and disinformation.
The DSA rules primarily apply to intermediary services providing mere conduit, caching, and hosting, ranging from internet infrastructure services and simple website hosting to online platforms (marketplaces, social networks, content-sharing platforms, app stores, etc.). However, the DSA introduces specific rules and more stringent obligations for Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), i.e. online platforms and online search engines with more than 45 million average monthly active users in the EU.
It is now widely acknowledged that digital services have a profound impact on our daily lives. They are used to communicate, make purchases, order food, access information, watch movies, listen to music, and much more, facilitating cross-border trade and enabling businesses to reach new markets. Nevertheless, alongside the many benefits of digital transformation, several critical issues have emerged that directly affect users’ fundamental rights, including the protection of personal data.
In this context, the EDPB Guidelines 03/2025 (the “Guidelines”) have been issued to clarify the relationship between the DSA and the GDPR, with the aim of ensuring a coherent and harmonized application of the two regulatory frameworks.
Indeed, the relationship between the DSA and the GDPR is one of complementarity rather than substitution: the former safeguards the proper functioning of the digital market, while the latter ensures the protection of personal data.
The Guidelines are therefore intended to guide providers of intermediary services in the application of the GDPR within contexts regulated by the DSA, promoting a coordinated interpretation of the two frameworks in order to ensure a digital environment compliant with the principles of lawfulness, proportionality, and transparency.
Among the key provisions of the DSA are “voluntary” online content moderation and the “notice and action” mechanism, both of which very frequently entail the processing of personal data.
“Voluntary” online content moderation
Pursuant to Article 7 of the DSA, providers of mere conduit, caching, and hosting services (“Providers”) may carry out, on their own initiative, voluntary investigations or other appropriate measures to identify and remove illegal content, or to disable access to it.
These actions may be carried out using machine-learning technologies capable of detecting specific characteristics of content based on the large datasets used for training. This clearly entails the processing of personal data and therefore requires Providers to comply with the GDPR, including provisions relating to lawfulness, fairness, and transparency towards data subjects, as well as data minimization and the obligations of privacy by design and by default.
In this regard, the EDPB clarifies that voluntary investigations aimed at identifying and removing illegal content may rely on the following legal bases:
- legal obligation (Article 6(1)(c) GDPR), where removal is required by law;
- legitimate interest (Article 6(1)(f) GDPR), where the Provider acts to protect its users or its service.
In the latter case, the Provider shall also meet three cumulative conditions, which should be documented in the Legitimate Interest Assessment (“LIA”):
- the pursued interest shall be legitimate;
- the processing of personal data shall be necessary to pursue that legitimate interest, which cannot reasonably be achieved just as effectively by other, less intrusive means;
- the interests or fundamental rights and freedoms of the data subjects shall not override the legitimate interest pursued by the controller.
“Notice and action” mechanism and statement of reasons obligation
Article 16 of the DSA requires hosting service providers, including online platforms, to establish reporting mechanisms (“notice and action”) enabling any individual to notify, by electronic means, the presence of illegal content. Upon receipt of a notice, the provider may decide whether to take action, for example by removing or restricting the content.
Such systems entail the processing of personal data relating to the notifier, the recipients of the service, and, in some cases, third parties.
In particular, the Guidelines emphasize the need for proportionate processing limited to the purposes of the DSA, specifying that Providers (acting as data controllers) shall:
- collect and process only data that are strictly necessary for the pursued purpose;
- request the identification of the notifier only on an optional basis, except where identification is strictly necessary to assess whether the content is illegal;
- inform the notifier in a clear and transparent manner if their identity is disclosed to the user concerned.
The DSA also allows the use of automated systems for handling notices, provided that notifiers are informed in accordance with Article 13 GDPR. Where such decisions fall within the scope of Article 22 GDPR, stringent safeguards shall be respected, including authorization by EU or national law, the protection of data subjects’ rights, and the prohibition of decisions based on special categories of data, unless explicit consent or substantial public interest grounds apply.
Finally, Article 17 of the DSA requires hosting providers to provide users with a clear and specific statement of reasons for each decision to remove or restrict content, specifying any use of automated means, the legal basis for the action taken, and the available redress mechanisms. This obligation does not apply to removals ordered by competent authorities.
Furthermore, within the framework of activities aimed at combating illegal content, Section 3 of Chapter III of the DSA imposes additional obligations solely on providers of online platforms, which may entail the processing of personal data.
Pursuant to Article 20 of the DSA, both the recipients affected by decisions finding content illegal or incompatible with the online platform’s terms and conditions and the individuals or entities that submitted a notice have the right to lodge a complaint: the former to challenge a decision adversely affecting them, the latter to contest an allegedly inadequate action taken in response to their notice.
The EDPB welcomes the fact that, in both cases, the DSA requires online platform providers to ensure that such decisions are taken under the supervision of appropriately qualified staff and not solely on the basis of automated means.
Moreover, considering that “misuse of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints […] undermines trust and harms the rights and legitimate interests of the parties concerned”, Article 23 of the DSA allows online platform providers to suspend the relevant activities of persons engaged in abusive behavior, i.e. to suspend the provision of their services to recipients that frequently provide manifestly illegal content and the processing of notices and complaints submitted by notifiers or complainants that frequently submit manifestly unfounded ones. While providing safeguards against the misuse of online platforms, the DSA also requires that such safeguards be “appropriate, proportionate and effective” and that they “respect the rights and legitimate interests of all parties involved, including the applicable fundamental rights and freedoms enshrined in the Charter”.
In this regard, the EDPB welcomes the safeguards already identified by the DSA, as they help prevent the adoption of automated decision-making in such cases, and reminds online platform providers to take into account, when identifying measures to counter abuse and defining related policies in their terms of use, the need to ensure compliance with all data protection principles set out in Article 5 GDPR, in particular the principles of data minimization, accuracy, transparency, and storage limitation.
Conclusions
The analysis of the interplay between the DSA and the GDPR clearly shows that the two regulatory frameworks should not be viewed as alternatives, but rather as complementary instruments. The DSA aims to ensure a safer, more transparent, and more accountable digital ecosystem, while the GDPR remains the primary reference point for the protection of users’ personal data. Content moderation and the implementation of notice and action mechanisms therefore require an integrated approach, in which the principles of lawfulness, proportionality, data minimization, and transparency guide every stage of data processing.
For companies falling within the scope of application of the DSA, there is a clear need to adopt structured processes and robust governance measures. In practical terms, several operational priorities can be identified, including:
- Mapping of processing activities related to content moderation: companies should update their records of processing activities, clearly identifying operations related to both voluntary investigations and reporting mechanisms, including the legal bases relied upon, categories of data processed, retention periods, and any automated systems used (see the illustrative sketch after this list).
- LIA and DPIA: where legitimate interest is relied upon, it shall be rigorously documented through a LIA; where machine-learning technologies or automated moderation are used, a DPIA should be carried out.
- User information: privacy notices and terms of service shall be reviewed in light of the DSA requirements.
- Complaint and redress management systems (Article 20 DSA): platforms should establish effective and accessible channels to (i) allow users to challenge the removal or restriction of content and (ii) allow notifiers to contest inaction or the inadequacy of the response.
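By way of illustration only, the sketch below shows one possible way to represent an entry of such a record of processing activities as structured data, assuming a simple in-house compliance register. Every field name and value is hypothetical and merely mirrors the elements listed above (legal basis, categories of data, data subjects, retention period, automated systems, safeguards); none of them is prescribed by the DSA, the GDPR, or the Guidelines.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical, simplified entry in a record of processing activities for
# content moderation; all field names and values are illustrative only and
# are not prescribed by the DSA, the GDPR, or the Guidelines.
@dataclass
class ProcessingActivity:
    name: str                    # description of the moderation activity
    dsa_provision: str           # DSA provision the activity relates to
    legal_basis: str             # GDPR Article 6(1) legal basis relied upon
    data_categories: List[str]   # categories of personal data processed
    data_subjects: List[str]     # notifiers, recipients of the service, third parties
    retention_period: str        # how long the data are kept
    automated_means: bool        # whether automated systems are used
    safeguards: List[str] = field(default_factory=list)  # e.g. LIA, DPIA, human review

# Illustrative entries mirroring the two DSA mechanisms discussed above.
activities = [
    ProcessingActivity(
        name="Voluntary own-initiative investigations",
        dsa_provision="Article 7 DSA",
        legal_basis="Article 6(1)(f) GDPR (legitimate interest)",
        data_categories=["user-generated content", "account identifiers"],
        data_subjects=["recipients of the service"],
        retention_period="no longer than necessary for the moderation decision",
        automated_means=True,
        safeguards=["LIA", "DPIA", "human review of automated decisions"],
    ),
    ProcessingActivity(
        name="Notice and action handling",
        dsa_provision="Articles 16-17 DSA",
        legal_basis="Article 6(1)(c) GDPR (legal obligation)",
        data_categories=["notifier contact details", "reported content"],
        data_subjects=["notifiers", "recipients of the service", "third parties"],
        retention_period="duration of the complaint and redress procedure",
        automated_means=False,
        safeguards=["transparency information under Article 13 GDPR"],
    ),
]

for activity in activities:
    print(f"{activity.name}: {activity.legal_basis} ({activity.dsa_provision})")
```

A structured register of this kind can also help keep the documentation maintained under Article 30 GDPR aligned with the moderation mechanisms introduced by the DSA.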