European Parliament Proposes Ban on Social Media Use for Children Under 16
The European Parliament has approved a resolution urging an EU-wide ban on social media use for children under 16 unless parents provide explicit consent. While the resolution is not legally binding, it significantly increases pressure on EU policymakers to consider stricter regulation of social media platforms. A clear majority of Members of the European Parliament (MEPs) supported the motion, reflecting rising concern about digital risks facing young users.
The proposal focuses on the impact of addictive platform design. Features like auto-play videos, infinite scrolling and constant notifications encourage prolonged use, raising questions about how these patterns affect children’s wellbeing. The European Parliament argues that these design choices exploit young users and justify a stronger stance on social media use among minors.
This development also follows global debate. Australia is preparing to implement the world’s first official measure that bans social media for under-16s, and European Commission President Ursula von der Leyen has publicly criticized algorithms that target children’s vulnerabilities.
Read more in our article Australia’s Social Media Ban for Children Faces a High Court Challenge, published on SquaredTech on November 26, 2025.

Why the Age 16 Social Media Limit Matters
Under the resolution, the minimum age for holding a social media account across the EU would be 16. Children aged 13 to 15 could join only with verified parental consent, while access for children under 13 would be heavily restricted.
The move aligns with recommendations from a French government report commissioned by President Emmanuel Macron. That report encouraged limiting smartphone access before age 13 and delaying social media use until adulthood due to rising mental health concerns. Research across Europe links early digital exposure to anxiety, short attention spans and increased emotional stress among young people.
Christel Schaldemose, the Danish MEP who drafted the resolution, said protecting children is a shared responsibility. She called for default platform settings that restrict addictive design elements when used by minors, such as disabling auto-play and limiting endless scrolling.
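To make the proposed age tiers and safer defaults concrete, the sketch below shows one way a platform could gate accounts and apply restrictive settings based on a verified age. This is a minimal illustration of the idea only: the type names, thresholds and the under-18 defaults cutoff are our own assumptions, not anything prescribed by the resolution or by any existing platform API.

```typescript
// Hypothetical sketch of age-based account gating and minor-safe defaults.
// Thresholds mirror the tiers discussed in the resolution; everything else
// (names, structure, the under-18 cutoff for defaults) is illustrative only.

type AccessLevel = "blocked" | "parental_consent_required" | "allowed";

interface PlatformDefaults {
  autoPlayEnabled: boolean;        // auto-play videos
  infiniteScrollEnabled: boolean;  // endless scrolling
  pushNotificationsEnabled: boolean;
}

function accessLevelForAge(age: number): AccessLevel {
  if (age < 13) return "blocked";                   // under 13: heavily restricted
  if (age < 16) return "parental_consent_required"; // 13 to 15: verified parental consent
  return "allowed";                                 // 16 and over: proposed minimum age
}

function defaultSettingsForAge(age: number): PlatformDefaults {
  const isMinor = age < 18; // assumption: restrictive defaults for all minors
  return {
    autoPlayEnabled: !isMinor,
    infiniteScrollEnabled: !isMinor,
    pushNotificationsEnabled: !isMinor,
  };
}

// Example: a verified 14-year-old needs parental consent and gets restrictive defaults.
console.log(accessLevelForAge(14));     // "parental_consent_required"
console.log(defaultSettingsForAge(14)); // { autoPlayEnabled: false, ... }
```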
Challenges, Opposition, and the Road Ahead
Not all lawmakers support the direction of the resolution. Some Eurosceptic MEPs, such as Kosma Złotowski, argue that these decisions belong closer to families and national governments, warning that imposing uniform EU rules may clash with cultural differences or undermine parental authority. This opposition reframes the issue not as one of child safety alone, but as a matter of EU overreach and member-state sovereignty.
This criticism reflects broader concerns among groups who feel the European Parliament should avoid setting overly prescriptive standards for children’s digital use. They argue that parental judgment, local values and domestic laws are better suited to decide how and when minors access social media, rather than a continent-wide framework.
The debate also comes as the European Commission reevaluates its digital regulation agenda, delaying parts of the Artificial Intelligence Act to reduce compliance burdens on companies. Despite this pause, Christel Schaldemose maintains that protecting children’s wellbeing online cannot wait. She highlights manipulative platform designs, often referred to as “dark patterns,” as an urgent problem that existing laws like the Digital Services Act do not fully address.
The resolution strengthens calls for updated safeguards targeting addiction-driven features and commercial exploitation of minors, signaling momentum for more comprehensive EU-wide child-protection rules.
What SquaredTech Sees in the Future of Child Online Protection
Children today grow up immersed in connected environments, and safeguarding them requires policies that balance access with protection. SquaredTech supports the European Parliament’s effort to modernize online safety rules, limit harmful design features, and require active parental approval before a child uses social media.
We believe platforms should adopt safer defaults, restrict engagement-driven design and provide transparent tools for families. As discussions continue in Brussels, we will track these developments and advocate for evidence-based solutions that protect minors without excluding them from positive digital experiences.
The resolution does not yet create law, but it signals growing European momentum toward a unified social media ban framework for minors and could shape future child-protection policies across the EU.
Stay Updated: TechNews

