Protecting minors in the digital sphere: the European Commission’s guidelines on the DSA

In recent years, the debate on protecting minors online has evolved from a primarily ethical concern into a field shaped by detailed regulatory and technical requirements. With the publication on 10 October 2025 of the final version of the Guidelines on the protection of minors (the “Guidelines”) under the Digital Services Act (“DSA”, Regulation (EU) 2022/2065), the European Commission has set out clear standards to assess compliance with Article 28(1) of the DSA. This provision requires online platforms to ensure a high level of protection, safety, and privacy for minors.

The purpose of the Guidelines is to support online platforms that are accessible to minors in identifying, assessing, and mitigating risks by providing a non-exhaustive list of measures designed to strengthen the protection of minors.

When is an online platform considered “accessible to minors”?

It is not sufficient for platforms to state in their terms and conditions that a service is “not permitted for minors”. According to the Guidelines, an online platform can be considered accessible to minors if:

  • it in practice allows minors to access the service without implementing effective measures to prevent such access;
  • it is predominantly used by, or targeted at, minors; or
  • the provider knows, or can reasonably be expected to know, that part of its user base consists of minors (for example, because it processes data revealing users’ age or because the service is widely known among minors).

What safety measures do the Guidelines identify?

The Guidelines outline several key measures to ensure a high standard of protection for minors:

(i) Risk assessment

The Guidelines suggest that online platforms should conduct a comprehensive, evidence-based risk assessment. This involves not only identifying illegal content but also understanding how minors interact with the service, which risks they face, and whether current measures are effective. The assessment should follow existing laws and recognised best practices 1, be updated at least annually, and be published in a summarised form, indicating the level of risk for minors (e.g., low, medium or high).
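
Purely by way of illustration, the elements of such an assessment could be captured in a simple record like the sketch below; the field names, the annual-review check and the summary wording are hypothetical and are not prescribed by the Guidelines.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum


class RiskLevel(Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"


@dataclass
class MinorRiskAssessment:
    """Hypothetical record of a platform's risk assessment for minors."""
    service_area: str                # e.g. "public comments", "live streaming"
    identified_risks: list[str]      # e.g. ["contact by unknown adults"]
    mitigation_measures: list[str]   # measures already in place
    residual_risk: RiskLevel         # level to be disclosed (low/medium/high)
    last_reviewed: date              # should be refreshed at least annually

    def is_due_for_review(self, today: date) -> bool:
        # The Guidelines suggest updating the assessment at least once a year.
        return (today - self.last_reviewed).days >= 365

    def public_summary(self) -> str:
        # A summarised, publishable indication of the risk level for minors.
        return (f"{self.service_area}: residual risk for minors assessed "
                f"as {self.residual_risk.value}")
```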

(ii) Age-assurance measures

To ensure that minors do not access services or content inappropriate for their age, platforms must have reliable ways to determine whether users meet age requirements. Age assurance refers to methods to estimate or verify users’ age or to confirm they are above or below a given threshold.
The Guidelines identify three common methods:

  • self-declaration: users state their age or age range. This is easy to bypass and the Guidelines do not consider it adequate;
  • age estimation: technologies that estimate age using elements such as facial analysis, typing style, interests or online activity. This method may raise accuracy and privacy concerns;
  • age verification: more reliable checks that use official documents (e.g., passport, ID card) or trusted digital credentials (e.g., government-issued ID or the EU Digital Identity Wallet) 2.

When choosing a method, the principle of proportionality should be applied: platforms should use stricter methods only where the risk is higher (e.g., gambling or anonymous chat services) 3, and avoid collecting more information than necessary 4. Platforms should also clearly explain how and why age information is requested, provide users with more than one option, and make available an accessible complaints channel 5.
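
As an illustrative sketch only, the proportionality principle described above might be expressed as follows; the risk levels, age thresholds and method names are assumptions made for this example rather than requirements taken from the Guidelines.

```python
from enum import Enum


class RiskLevel(Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"


class AgeAssuranceMethod(Enum):
    NONE = "no age assurance required"
    ESTIMATION = "age estimation (e.g. behaviour- or facial-analysis-based)"
    VERIFICATION = "age verification (e.g. eID or EU Digital Identity Wallet)"


def select_age_assurance(risk: RiskLevel, minimum_age: int) -> AgeAssuranceMethod:
    """Illustrative mapping from identified risk to an age-assurance method:
    stricter methods only where the risk, or the legal age threshold, is higher."""
    if risk is RiskLevel.HIGH or minimum_age >= 18:
        # e.g. gambling content, anonymous chat, or 18+ terms and conditions
        return AgeAssuranceMethod.VERIFICATION
    if risk is RiskLevel.MEDIUM or minimum_age > 0:
        # e.g. a 13+ or 16+ service with medium-level residual risks
        return AgeAssuranceMethod.ESTIMATION
    return AgeAssuranceMethod.NONE


# Example: an anonymous-chat feature assessed as high risk on a 16+ service
print(select_age_assurance(RiskLevel.HIGH, minimum_age=16).value)
```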

(iii) Registration, account settings, interface design and other tools

The Guidelines consider that, when age assurance is deemed necessary, registration or authentication may serve as an initial step. The Guidelines also highlight the critical role of design and default settings: because most users do not change default configurations, these settings shape minors’ behaviour and online experiences. This means, for example, implementing restrictive privacy settings by default (e.g., controlling who can follow or message the minor), automatically disabling risky features (such as geolocation, autoplay, microphone and camera, contact syncing and tracking), avoiding addictive design features (such as infinite scrolling, constant notifications, automatic video playback), and providing support tools that are easy to find and use.
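
By way of illustration only, a “safety by default” configuration of the kind described above might look like the sketch below; every field name and default value is hypothetical and simply mirrors the examples mentioned in the Guidelines.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class MinorAccountDefaults:
    """Hypothetical default settings for an account identified as a minor's,
    reflecting the 'safety by default' approach described in the Guidelines."""
    profile_visibility: str = "contacts_only"        # restrict who can see the profile
    direct_messages_from: str = "approved_contacts"  # limit unsolicited contact
    geolocation_enabled: bool = False                # risky features off by default
    contact_syncing_enabled: bool = False
    camera_and_microphone_enabled: bool = False
    tracking_enabled: bool = False
    autoplay_enabled: bool = False                   # avoid addictive design features
    infinite_scroll_enabled: bool = False
    push_notifications: str = "essential_only"


# A new minor account would start from these defaults; loosening any of them
# would require an explicit, informed choice by the user.
print(MinorAccountDefaults())
```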

(iv) Recommendation systems and commercial practices

The Guidelines devote significant attention to commercial practices and recommendation systems 6. Minors are particularly vulnerable to persuasive techniques: they do not always distinguish between content and advertising, or between suggestions and manipulation. For this reason, the Guidelines recommend, among other things, explaining why content is recommended, allowing users to reset their feed, ensuring that advertisements and sponsored content (e.g., by influencers) are clearly recognisable, preventing algorithms from amplifying harmful or age-inappropriate content, and ensuring that AI systems are not used as tools to influence or sell to minors (e.g., via chatbots).

(v) Content moderation

Content moderation involves monitoring and removing content or users that may harm minors’ privacy, safety, and wellbeing. It is an important tool to safeguard users and prevent serious risks such as bullying, exposure to harmful content or grooming 7. In this regard, the Guidelines provide that platforms should, among other things, clearly define harmful content and behaviours, adopt specific procedures and policies for the prompt removal of harmful or illegal content and accounts, ensure human oversight of content moderation (in addition to automated tools), provide regular training for moderators, adopt technical safeguards to prevent AI systems from creating or disseminating harmful content, and regularly review and improve moderation systems.
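
As a purely illustrative sketch, a moderation flow that combines automated tools with human oversight, as the Guidelines envisage, might be structured as follows; the thresholds, function names and escalation logic are assumptions made for this example.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class ModerationDecision:
    remove: bool
    reason: str
    reviewed_by_human: bool


def moderate_item(content: str,
                  automated_classifier: Callable[[str], float],
                  human_review: Callable[[str], bool],
                  removal_threshold: float = 0.9,
                  review_threshold: float = 0.5) -> ModerationDecision:
    """Automated tools triage the content; borderline cases are escalated to a
    human moderator, reflecting the Guidelines' emphasis on human oversight."""
    score = automated_classifier(content)  # estimated probability of harm
    if score >= removal_threshold:
        return ModerationDecision(True, "removed automatically (high confidence)", False)
    if score >= review_threshold:
        remove = human_review(content)  # escalate borderline content
        return ModerationDecision(remove, "escalated to human moderator", True)
    return ModerationDecision(False, "no action", False)


# Example with placeholder classifier and reviewer (both stand-ins):
decision = moderate_item(
    "some user-generated post",
    automated_classifier=lambda text: 0.7,
    human_review=lambda text: True,
)
print(decision)
```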

What does this mean for online platforms?

Platforms face a dual challenge: ensuring the technical effectiveness of safety measures (such as age verification, content moderation and transparent recommendation systems) while also striking a fair balance with fundamental rights, privacy and user experience.

As age-verification systems continue to develop and become interoperable across the EU, platforms should map risks for minors, document their decisions, and publish their risk assessment. They will need to adopt “child-friendly” terms and conditions and describe their services in clear, age-appropriate language (e.g., through infographics).

Ultimately, and as emphasised in the Guidelines, platforms should implement credible governance mechanisms for the protection of minors, with dedicated roles, policies and training.


 1. The Guidelines refer to the models, templates, and other guidance provided by UNICEF (e.g., https://www.unicef.org/childrightsandbusiness/workstreams/responsible-technology/D-CRIA), by the Dutch Ministry of the Interior and Kingdom Relations, or by the European standardisation body CEN-CENELEC. For very large online platforms and search engines, this risk analysis may also be carried out as part of the general assessment of systemic risks under Article 34 of the DSA.

 2. Once implemented, the European Digital Identity Wallets will provide secure, reliable and privacy-preserving electronic identification means for everyone in the EU. Each Member State is required to provide all of its citizens, residents and businesses with at least one wallet by the end of 2026, allowing them to prove their identity and to securely store, share and sign important digital documents. Until the European Digital Identity Wallets become available, the Commission is piloting a European age-verification solution (in the form of an app) that can only confirm whether the user is over 18 and that, once completed, will serve as an example of compliance and a benchmark standard for age-verification methods.

 3. The Guidelines emphasise that only certain content, sections or features of a platform may pose risks to minors, and that there may be areas where such risks can be mitigated through other measures and areas where this is not possible. In such cases, instead of imposing age limits on access to the service as a whole, providers of such online platforms should assess which content, sections or features on their platform pose risks to minors and implement access restrictions, supported by age-verification methods, to reduce those risks in a proportionate and appropriate manner. For example, the Guidelines consider age-verification systems proportionate for access to gambling-related content; where, due to the risks identified for minors, the terms and conditions require users to be at least 18 years old to access the service; where risks are identified relating to harmful content and behaviours, consumer risks, or contact risks (e.g., features such as chat conversations, anonymous messaging or image/video sharing) and those risks cannot be mitigated through less intrusive measures; or where EU or national law requires a minimum age to access certain products or services. Conversely, the use of an age-estimation system will be considered proportionate where, due to the risks identified, the platform’s terms and conditions require users to exceed a minimum age lower than 18 to access the service, or where the platform has identified medium-level risks for minors on its service that cannot be mitigated through less restrictive measures.

 4. When considering age-estimation methods that require the processing of personal data, providers of online platforms accessible to minors should ensure that data-protection principles, in particular data minimisation, are properly implemented and remain robust over time, and should take into account the EDPB 2025 Statement on Age Assurance. In particular, age verification should be treated as a separate and distinct process, not linked to the other data-collection activities of online platforms, and it should not allow the retention of personal data beyond information relating to age ranges.

  5. The Guidelines specify that such a system may also be integrated into the internal complaint-handling system (Article 20 of the DSA).

  6. These are systems that determine how information is prioritised, optimised and displayed to minors. For further details, please refer to our previous contribution, available on AgendaDigitale (in Italian).

  7. These are situations in which someone seeks to build a relationship of trust with a minor in order to manipulate, exploit or abuse them.