France Moves toward Stricter Social Media Regulations

On January 26, 2026, the National Assembly of France introduced a bill banning social media for children and young people under the age of 15. Following Australia and Greece, France could become the third country to impose an age restriction on social media. However, the law must first pass the Senate and could then face review by the European Court of Justice (ECJ), since legal requirements for online platforms in the European Union are governed by the Digital Services Act (DSA), which limits the scope of action of individual EU member states.

Dr. Stephan Dreyer is critical of the French approach. In statements for the Science Media Center (SMC), he and other experts explain the limited scope of action available to EU countries regarding bans on social networks and the potential impact of the law in France.

The expert opinion on “Vorstoß in Frankreich für strengere Regelung sozialer Medien” [France Moves toward Stricter Social Media Regulations] is only available in German. Detailed background information can be found on the website of the Science Media Center.

Statement by Stephan Dreyer

Legislative Proposal in France

Yesterday, the French National Assembly passed a bill that must now go through the second chamber. It does not, however, oblige platform providers to enforce a ban: owing to the primacy of the Digital Services Act (DSA), such an obligation would not be applicable. Instead, the French regulation takes a more indirect route to circumvent the DSA: it simply declares all current and future contracts between underage users and platform providers null and void. Technically, this is a civil law provision, not a ban under media regulation.

“For platform providers, the French law would create a strong incentive to verify the age of users, as otherwise they would be providing services and processing data without a contractual basis. Platform operators do not want this for a variety of reasons, including the liability risk associated with processing data from underage individuals. It would therefore make sense to carry out age checks to ensure the validity of the contracts. I cannot say whether this ‘circumvention’ would ultimately conflict with the full harmonization of the DSA, because the legal consequence is not dissimilar to a de facto obligation. That would have to be clarified by the European Court of Justice (ECJ) if someone were to complain — and that does not seem likely at present.”

Assessment of the French Approach

“I consider the French approach to be problematic because the new requirement creates an indirect, yet de facto, age verification obligation for platforms, while a direct obligation at the national level would be blocked by European law under the DSA. Against this backdrop, the approach appears to circumvent European full harmonization, i.e., the legal framework for online platforms conclusively regulated at the EU level. Full harmonization means that an EU regulation conclusively governs its area of application at the EU level; national provisions within the scope of such a regulation that pursue the same regulatory objective are then not applicable.”

“I consider legally prescribed age limits for important information and communication platforms, which play a central and important role in the everyday media lives of adolescents, to be the wrong approach. In terms of the proportionality of the measure, these age limits are legally problematic. As a result, comprehensive age verification procedures will also be introduced, which will affect all adults on these platforms. At the same time, young people will find ways to access the platforms by circumventing the requirements or will find their way to less well-protected offerings.”

“In addition, platform providers will be able to dispense with youth protection measures on a large scale in the future, since, by law, young people will no longer be present on their platforms. This would undo the protective functions and awareness that politicians and society have already achieved with the major providers over the last ten years. Certainly, platform providers can do more today to protect young people. Since 2024, we have had a legal framework in place with the DSA that obliges providers to design their platforms in an age-appropriate manner and allows for a tougher stance towards the platforms. With general age limits, we effectively abandon the path of vigorously enforcing existing law.”

The Scope of Application of French Law

“The French draft law regulates the validity of contracts based on age. However, for cross-border contracts, the law of the service provider’s country generally applies; for the large social media platforms, this would be Ireland. The French approach would therefore only apply to service providers based in France. The same applies to media regulatory provisions: even in the case of bans directed at platforms, a French standard would only bind providers based in France. Providers from other EU countries can invoke the country of origin principle, under which only the national law of the country of establishment applies to a service provider. The same would apply to a German regulation, should one be adopted in the future.”

“National legal initiatives increase complexity for providers offering their services across Europe or globally. To comply with a national standard, providers usually have to use IP geolocation to determine which users are subject to a specific national provision, and must then implement special hurdles, measures, or functions for those users. This process is costly and error-prone, and it works against the aim of the European legal framework: it fragments legal requirements in Europe, undermining the uniform digital single market in the EU.”

National Regulations versus EU Regulations

“The DSA does not regulate which content is inadmissible, but rather determines the procedures and obligations that apply when illegal content exists on platforms. In Germany, for instance, content is considered illegal under both criminal law (StGB) and youth media protection law (§§ 4, 5 JMStV).”

However, age limits for social media platforms would not involve legal requirements regarding the illegality of certain content. Rather, they would restrict access to an entire category of services (social media platforms, social networks, and similar services, for which no legal definition currently exists), regardless of whether illegal content is available on these platforms. This category comprises services that fall within the scope of the DSA: “intermediary services” in the form of “online platforms.” Any national youth-protection requirement that obliges these services to act or refrain from acting would amend or tighten the applicable EU legal framework; it would therefore be superseded by the primacy of EU law and would not be applicable.

National age restriction requirements can therefore only remain applicable if they do not impose obligations on platform providers. Greece, for example, blocks access for younger users outside the platforms’ sphere of influence. If platforms themselves were obliged to implement such restrictions, that legislative decision would have to be made at the EU level.

Requirements That Must Be Taken Into Account When Imposing a Ban

“In addition to the compatibility of a national regulation with European law, constitutional and human rights requirements must be taken into account above all. These include the Basic Law, the Charter of Fundamental Rights, and the European Convention on Human Rights. In the case of general prohibitions, the key issue is proportionality: a prohibition must pursue a legitimate purpose, be suitable and necessary to achieve that purpose, and be proportionate. With the exception of the legitimate purpose, I see critical points in all of these aspects that would need to be examined in detail before a prohibition is enacted. However, such in-depth proportionality tests and fundamental rights impact assessments have been lacking in political debates to date.”

Legislative proposals at the EU level

“The EU Parliament and the member states have each called for an age limit in statements. The Commission is still keeping a low profile, but Commission President von der Leyen has already expressed her support for an age limit. At the EU level, the question will likely be about the framework within which such a regulation will be introduced. With the plan for a Digital Fairness Act, a possible legal act is already in the works; however, a concrete EU standard is not expected until fall 2026.”

Reporting on Legislation Changes

“Some reporting on the demand in France referred to a ‘change’ to the DSA that would allow national age limits. However, this is not entirely correct, as there has been no change to the DSA. I suspect this refers to the guidelines on Article 28, where a sentence from the EU Commission is interpreted by member states as ‘permission’ for national regulations (point 37): ‘Where Union law or national law in accordance with Union law prescribes a minimum age for access to certain products or services offered and/or presented in any way on an online platform, including specifically defined categories of online social media services.’”

“In my view, this was a smokescreen by the Commission because it says ‘in accordance with Union law.’ Therefore, the primacy of the DSA must also be considered. However, the member states want to interpret it as meaning that they can impose national age limits on entire types of services, based on a guideline that is not legally binding. Unfortunately, the EU Commission is currently very cautious when it comes to the question of primacy of application. This is not good for the unity of the legal system or the harmonization of the digital single market. We already addressed children’s rights in the digital world at the SMC briefing last summer.”

Greece

“Greece has had a legal framework in place since October that does not impose any obligations on platform providers that could cause problems with the DSA. Smartphones and tablets must be equipped with an app from the Greek government that activates an access filter for social media for children under 15, provided that their age has been verified and parental consent has been given for children aged 13 and over. From a legal perspective, this is a smarter approach.”

Denmark

“I haven’t heard anything from Denmark recently. At the beginning of the year, it seemed like there were legal indications that a national age limit for certain types of media would not be problematic. However, in my opinion, this idea was not properly thought through.”

Great Britain

“The UK is also planning to introduce an age limit for platform providers. This is not a problem in terms of the DSA, as the UK is no longer part of the EU. Therefore, the UK is not prevented by EU law from imposing an age limit on certain services. However, human rights, children’s rights, and the principle of proportionality still apply. The Starmer government has been cautious about implementing the parliamentary mandate thus far and has requested time to consider the matter.”

Last update: 31.01.2026
