(AGENPARL) – Roma, Wed 26 November 2025
Press service
European Parliament
Press release
26-11-2025
Plenary session
IMCO
Children should be at least 16 to access social media, say MEPs
Grave concern over physical and mental health risks to minors online, with 25% of them displaying “problematic” smartphone use
Stricter enforcement of EU digital rules, with fines and potential bans for non-compliant platforms
Bans on engagement-based recommender algorithms and loot boxes in games
Push to act on generative AI tools such as deepfakes and nudity apps
MEPs are calling for ambitious EU action to protect minors online, including an EU-wide minimum age of 16 and bans on the most harmful addictive practices.
On Wednesday, MEPs adopted a non-legislative report by 483 votes in favour, 92 against and 86 abstentions, expressing deep concern over the physical and mental health risks minors face online and calling for stronger protection against manipulative strategies that can fuel addiction and undermine children’s ability to concentrate and engage healthily with online content.
Minimum age for social media platforms
To help parents manage their children’s digital presence and ensure age-appropriate online engagement, Parliament proposes a harmonised EU digital minimum age of 16 for access to social media, video-sharing platforms and AI companions, while allowing 13- to 16-year-olds access with parental consent.
Expressing support for the Commission’s work to develop an EU age verification app and the European digital identity (eID) wallet, MEPs insist that age assurance systems must be accurate and preserve minors’ privacy. Such systems do not relieve platforms of their responsibility to ensure their products are safe and age-appropriate by design, they add.
To incentivise better compliance with the EU’s Digital Services Act (DSA) and other relevant laws, MEPs suggest senior managers could be made personally liable in cases of serious and persistent non-compliance, with particular respect to protection of minors and age verification.
Stronger action by the Commission
Parliament is also calling for:
– a ban on the most harmful addictive practices and default disabling of other addictive features for minors (including infinite scrolling, autoplay, pull-to-refresh, reward loops and harmful gamification);
– a ban on sites not complying with EU rules;
– action under the forthcoming Digital Fairness Act to tackle persuasive technologies such as targeted ads, influencer marketing, addictive design and dark patterns;
– a ban on engagement-based recommendation systems for minors;
– application of DSA rules to online video platforms and outlawing of loot boxes and other randomised gaming features (in-app currencies, fortune wheels, pay-to-progress);
– protection of minors from commercial exploitation, including by prohibiting platforms from offering financial incentives for kidfluencing (children acting as influencers);
– urgent action to address the ethical and legal challenges posed by generative AI tools, including deepfakes, companionship chatbots, AI agents and AI-powered nudity apps (which create non-consensual manipulated images).
Quote
Rapporteur Christel Schaldemose (S&D, Denmark) said during the debate: “I am proud of this parliament, that we can stand together in protecting minors online. Together with strong, consistent enforcement of the Digital Services Act, these measures will dramatically raise the level of protection for children. We are finally drawing a line. We are saying clearly to platforms: your services are not designed for children. And the experiment ends here.”
Background
The report refers to research according to which 97% of young people go online every day and 78% of 13- to 17-year-olds check their devices at least hourly. At the same time, one in four minors display ‘problematic’ or ‘dysfunctional’ smartphone use, i.e. behavioural patterns mirroring addiction.
According to the 2025 Eurobarometer, over 90% of Europeans believe action to protect children online is a matter of urgency, not least in relation to social media’s negative impact on mental health (93%), cyberbullying (92%) and the need for effective ways to restrict access to age-inappropriate content (92%).
Member states are starting to respond with measures such as age limits and verification systems.
Further information
Committee on the Internal Market and Consumer Protection
Procedure file
European Commission guidelines on protecting minors online
EP Research Service: Study “Harmful Internet Use Part I: Internet addiction and problematic use”
EP Research Service: Briefing “Protecting children online: Selected EU, national and regional laws and initiatives”
MEPs want to ban engagement-based recommender algorithms and gambling-like game features to protect minors © Adobe Stock / olehslepchenko
Yasmina YAKIMOVA
Press Officer (BG)