EU: France and Greece on the offensive to keep children away from social networks

June 6, 2025

Cyberbullying, disinformation, hate speech... Content dangerous to children is proliferating online, prompting several European countries, including France, to seek to limit minors' access to social networks.

The European Union already has one of the strictest legislative arsenals in the world to regulate digital giants.

But calls to go further are growing among the EU's 27 member states, as studies demonstrate the negative effects of social networks on mental and physical health.

Supported by France and Spain, Greece has proposed regulating the use of online platforms by children, in the face of concerns about their addictive nature.

The three countries will present the measures on Friday at a ministerial meeting in Luxembourg.

"We have an opportunity that we cannot miss, and that is what I also came to tell the European Commission today," declared the Minister for Digital Affairs, Clara Chappaz.

The proposal includes setting an EU-wide digital age of majority, below which children would not be able to access social media without parental consent.

"Age verification is possible. It's already underway in France for pornographic sites, and we want the same for social media," the minister affirmed.

France has been at the forefront of platform regulation, with a law passed in 2023 requiring platforms to obtain parental consent from users under the age of 15. The measure has not yet received the necessary approval from the EU.

France also introduced a requirement this year for pornographic sites to verify users' ages to prevent children from accessing them. This measure led three of them – Youporn, Pornhub, and Redtube – to go offline this week in protest.

Under pressure from the French government, TikTok on Sunday also banned the hashtag #SkinnyTok, which promotes extreme thinness.

– Age verification –

France, Greece, and Spain are denouncing algorithms that expose children to addictive content that can worsen anxiety, depression, and low self-esteem.

These countries are also concerned about children's exposure to screens from a very young age, which is suspected of hindering the development of their social skills and other essential learning.

Other EU member states have expressed support for the initiative, including Denmark, which will hold the six-month rotating presidency of the EU Council starting in July and has pledged to make the issue a priority.

The authors of the proposal call for "an EU-wide application that supports parental control mechanisms, allows for proper age verification and limits the use of certain applications by minors."

They would also like devices such as smartphones to include an age verification system.

The European Commission, the EU's digital watchdog, plans to launch an age verification app next month, with assurances that it will not involve the disclosure of personal data.

In May, the EU published interim guidelines for platforms to better protect minors. They are due to be finalized this month following a public consultation.

These non-binding guidelines currently include setting children's accounts to private mode by default, as well as simplifying blocking and muting options.

Brussels is currently investigating the social media platforms Facebook and Instagram, owned by the American group Meta, as well as TikTok, under its new Digital Services Act (DSA). These platforms are suspected of failing to adequately protect children from harmful content.

Last week, the Commission also opened an investigation into four pornography sites (Pornhub, Stripchat, XNXX, and XVideos) suspected of failing to prevent children from accessing adult content.
