Last Tuesday, the European Association for Digital Transition (AETD in its Spanish acronym) warned that the large digital platforms are unwilling to implement truly effective age verification systems, despite growing evidence of the risks the current digital environment poses to children. The Association underlines that, to date, the measures taken by the platforms have generally been cosmetic, aimed at avoiding responsibility rather than ensuring effective protection.
In this context, the AETD welcomes the Spanish government’s announcement that it will ban access to social media for children under 16. This measure is consistent with existing social and educational concerns, and is aligned with the proposal for a State Pact in defence of minors in the digital realm, advocated by the Association itself in 2023. That initiative was promoted by Ana Caballero, vice-president of the AETD, who chaired the group of experts created two years ago by the Ministry of Youth and Children to improve the protection of minors in the digital sphere.
THREE MEASURES: VERIFICATION, LABELLING AND CATEGORIZATION
Nevertheless, Caballero warns that “improvement of the digital environment for minors cannot rest only on restricting access”, and she applauds the fact that the Spanish government has explicitly noted the responsibility of platform executives.
As the Association has repeatedly noted, digital platforms have not shown a real inclination to implement effective age verification systems, and any regulatory progress will be insufficient if it is not accompanied by concrete and verifiable technical measures. In this sense, Caballero stresses the need to “establish mechanisms for smart labelling of content, as well as a clear and unified categorization of content that allows risks to be identified and limits minors’ exposure”. The organisation warns that without joint implementation of these three measures – effective verification, smart labelling and content categorization – it will not be possible to make significant progress in protecting minors in digital environments.
The AETD also points out that the Spanish government’s announcement remains vague in significant areas, and that in some respects it reads more like a declaration of intent than a fully operational plan. Specifically, the Association considers that the commitment to investigate the algorithms that control content distribution is insufficiently defined, since there are no effective mechanisms to audit them, nor are the platforms subject to clear accountability or real transparency obligations. In the absence of these instruments, the organisation warns, it is impossible to rigorously assess the real impact of recommendation systems.
PROFITABILITY CANNOT BE BUILT AT THE EXPENSE OF HEALTH AND RIGHTS
The association understands that the business model of large digital platforms is legitimately based on obtaining profits, but stresses that this profitability cannot be built at the expense of the health and fundamental rights of citizens. This concern is even more serious when the main people affected are especially vulnerable groups, such as minors, who have not yet fully developed the critical and executive capacities necessary to identify the risks.
In this regard, the AETD stresses that this is neither an isolated problem nor one of poor management by individuals, but rather a structural, corporate and systemic problem, linked to the very functioning of the platforms and the incentives they create. The organisation considers it essential to analyse which practices and designs are facilitating the creation of addictive patterns, as well as their relationship with phenomena such as disinformation, cyberbullying and the deterioration of minors’ emotional well-being.
Given all this, the AETD calls on the Spanish government to equip the Spanish National Markets and Competition Commission (CNMC in its Spanish acronym) with the tools necessary to strengthen effective oversight of the platforms, in line with the Digital Services Act.