Why Banning Social Media Is Not the Solution

Since December 2025, minors under the age of 16 have no longer been allowed to create social media profiles in Australia. A growing number of European countries are planning to introduce age limits for social media as well: following Australia, Greece, and France, Denmark, Ireland, Austria, and Norway are currently looking for ways to implement an age limit. However, Dr. Stephan Dreyer, an expert on legal issues at the intersection of youth protection and data protection, argues in his interviews and commentaries that such a ban would undermine youth protection and be disproportionate.

The following interviews and articles by Stephan Dreyer are only available in German.

Latest Contributions

In an interview with Sebastian Meineck for netzpolitik.org, Stephan Dreyer explains that there are essentially three obstacles to implementing a social media ban for minors at the national level: the primacy of the Digital Services Act (DSA), the country-of-origin principle, and the principle of proportionality. If EU member states enact national regulations that conflict with the DSA, those regulations are not applicable. “That’s why we actually have to understand these national initiatives primarily as an attempt to put pressure on the EU to take a political decision here and enact a Europe-wide ban,” says Stephan Dreyer in an interview with Karin Helmstaedt for DW News.

He does not consider a ban to be the right approach. In his commentary for Mediendiskurs Online, Dreyer writes that youth media protection is increasingly treated as a remedy for a diffuse sense of unease. “Age checks are the regulatory Swiss Army knife of 2026. According to the narrative, they fit everywhere, solve all our problems, and are, at best, invisible. In practice, however, three conflicting goals emerge: effectiveness, privacy, and a low threshold. You can’t maximize all three at the same time,” he writes.

What is the Current State of Research on Social Media Use?

Dreyer stated, “It is incredibly difficult to prove that media use is a causal factor in mental illness (…) However, there is growing evidence that certain types of use are linked to certain illnesses.”

In an interview with Karin Helmstaedt for DW News, he refers to the expert commission established by Federal Minister Karin Prien. Its members, including Dr. Claudia Lampert from the HBI, are working to develop recommendations based on empirical evidence. The recommendations are expected in summer 2026.

Why a Blanket Ban Is Not the Solution

“When adolescents are excluded from central communication environments, the result is circumvention practices, a shift to new shadow offerings, and, once again, increased inequalities,” writes Dreyer in his commentary for Mediendiskurs Online.

“The moment we say that only people over the age of 16 are allowed on these platforms, providers can sit back and say, ‘Then we’ll basically turn off all the child and youth protection measures we’ve already implemented, because children aren’t allowed on our platform.’ And this has two fatal consequences. First, children will always find ways to use these platforms anyway. This means they will be less protected than before. Second, they know they are doing something forbidden. In the event of problems or attacks, for example, they may not dare to ask adults for help because they have done something forbidden. So it’s fatal in two ways,” he explains in an interview with Karin Helmstaedt for DW News.

A Milder Measure: The Digital Services Act (DSA)

Although minors are not excluded from platforms under the DSA, platform providers are required by Article 28 of the DSA to design their offerings in an age-appropriate way. According to Dreyer, there is still room for improvement in the implementation of Article 28, but there is already a good example of enforcement: according to last week’s EU Commission communication, TikTok is in violation of the DSA. Dreyer says this is “proof that with the DSA, we have an existing legal framework that can be enforced and that, above all, results in the age-appropriate design of these offerings.” A total ban, by contrast, would allow platforms to withdraw, because they could claim that there are no longer any young people on the platform. This would enable them to disable all youth and child protection functions.

How Could Age Verification Work Technically?

According to Stephan Dreyer, age verification procedures would be a possible solution. Users would have to prove to platform providers electronically that they have reached a certain minimum age. Dreyer: “There are a whole range of age verification procedures. (…) Either by holding a photo of an ID card up to the webcam or by estimating age using biometric facial features. There is also voice recognition and hand movement recognition, which allow an algorithm to determine someone’s age. (…) There are wallet-based methods, for example, where I can store my verified age in a wallet provided by my bank. The platform can then read this wallet without my name or face being transmitted. Only the age signal is transmitted, i.e., whether I am over 18 or 16. Australia conducted a large-scale age assurance trial and wrote an expert report on it. The result: there is no single good solution; rather, several procedures should be kept available.”
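The wallet-based approach Dreyer describes, in which only an age signal is transmitted rather than any identity data, can be sketched minimally as follows. This is an illustrative sketch, not any real wallet standard: all names are hypothetical, and a shared-key HMAC stands in for the asymmetric credential signatures a real system would use.

```python
import hashlib
import hmac
import json

# Hypothetical issuer key. In a real system, the issuer (e.g. a bank)
# would sign with a private key and the platform would verify with the
# corresponding public key; HMAC is used here only to keep the sketch short.
ISSUER_KEY = b"demo-issuer-key"

def issue_age_token(over_16: bool) -> dict:
    """Issuer attests only to the age signal; no name or face is included."""
    claim = json.dumps({"over_16": over_16}).encode()
    sig = hmac.new(ISSUER_KEY, claim, hashlib.sha256).hexdigest()
    return {"claim": claim.decode(), "sig": sig}

def platform_verify(token: dict) -> bool:
    """Platform checks the attestation and learns only the boolean age signal."""
    expected = hmac.new(ISSUER_KEY, token["claim"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["sig"]):
        return False  # tampered or forged token
    return json.loads(token["claim"])["over_16"]
```

The point of the design is data minimization: the platform can confirm “over 16” without ever seeing who the user is, which is what distinguishes wallet-based methods from ID uploads or biometric estimation.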

Photo: AI-generated symbolic image of a road with many obstacles

Last update: 20.02.2026
