Members of the European Parliament (MEPs) are advocating for strong EU measures to safeguard minors online. They propose a minimum age of 16 for access to social media and a ban on the most harmful addictive design practices.
On Wednesday, 26 November, MEPs adopted a non-legislative report by 483 votes to 92, with 86 abstentions. The report expresses grave concern about the physical and mental health risks minors face online and calls for stronger protection against manipulative practices that can foster addiction and undermine children’s ability to concentrate and engage healthily with online content.
The report reveals that 97% of young people go online daily, with 78% of 13- to 17-year-olds checking their devices at least once an hour. Additionally, one in four minors shows signs of problematic smartphone usage akin to addiction. The 2025 Eurobarometer finds that over 90% of Europeans see the need for urgent action to protect children online, particularly against social media’s negative effects on mental health (93%), cyberbullying (92%), and access to inappropriate content (92%). In response, member states are starting to implement age limits and verification systems.
“I am proud of this parliament, that we can stand together in protecting minors online. Together with strong, consistent enforcement of the Digital Services Act, these measures will dramatically raise the level of protection for children. We are finally drawing a line. We are saying clearly to platforms: your services are not designed for children. And the experiment ends here,” said rapporteur Christel Schaldemose (S&D, Denmark) during the debate.
To help parents manage their children’s digital presence and ensure safe online engagement, Parliament proposes a minimum EU age of 16 for accessing social media, video-sharing platforms, and AI companions, allowing 13- to 16-year-olds access with parental consent. MEPs support the development of an EU age-verification app and the European digital identity (eID) wallet, emphasising the need for accurate age-assurance systems that protect minors’ privacy.
To enhance compliance with the EU’s Digital Services Act (DSA), MEPs suggest holding senior managers personally liable for serious non-compliance, especially in protecting minors.
They also call for:
– A ban on harmful addictive practices and the automatic disabling of addictive features for minors, such as infinite scrolling and auto-play.
– A ban on non-compliant websites.
– Action against persuasive technologies, including targeted ads and dark patterns, under the upcoming Digital Fairness Act.
– A ban on recommendation systems based on engagement for minors.
– The application of DSA rules to online video platforms and a prohibition on loot boxes and similar features.
– Protection of minors from commercial exploitation, including a ban on financial incentives for kidfluencing.
– Urgent measures to address ethical and legal challenges posed by generative AI tools, such as deepfakes and non-consensual image manipulation.
