In TechCrunch’s recent article, Aisha Malik reports that a wave of countries, from Australia to Spain, France, and Greece, is moving toward restricting or banning social media use for children under 16. These actions reflect a growing global unease with how current platforms affect youth mental health, safety, and development.
What is most striking about Aisha Malik’s report is how many of these policies are being debated or adopted in Europe: France has advanced a bill banning social media for under-15s, Spain has announced plans to bar under-16s, and similar conversations are underway in Germany, Portugal, Greece, Denmark and elsewhere. This European momentum is a clear signal that governments are willing to intervene where platforms have failed to self-regulate.
These age restrictions deserve credit for acknowledging that something is fundamentally wrong. They show that policymakers are no longer content to leave this to the industry’s voluntary standards or vague terms of service. Governments are asserting that when social networks are designed to maximize engagement and profit, regulators must step in to protect the most vulnerable.
But there is a hard truth beneath all of this: setting age limits does not change the systems that create harm in the first place.
The Limit of Age Thresholds
It is not completely naive to think that restricting access for younger teens will solve the problem of social media harm. After all, the risk of exposure to cyberbullying, addictive recommendation feeds, and harmful content is real. Research points to significant mental health concerns tied to heavy social media use. But what happens when a teenager turns 15 or 16? Or 18? Or 25?
The underlying platforms don’t suddenly become kinder to older users, and the algorithms that drive engagement remain the same regardless of age. The recommendation systems, infinite scroll mechanics, personalized feeds and behavioral cues that make these services “addictive” are not designed around well-being. These systems are optimized for attention. Millions of adults struggle with compulsive social media behaviour. In the United States, parents and individuals have joined lawsuits alleging that addictive design has caused real harm to their children’s mental health and wellbeing. “Big Tech” is already facing legal scrutiny over the very patterns of engagement those age bans are trying to mitigate.
If the problem were simply age, then reaching the age of majority would inherently offer protection. But it doesn’t! Adults are equally susceptible to manipulative systems that exploit cognitive biases and emotional vulnerabilities.
This is why age restrictions, while a positive regulatory gesture, risk becoming a symbolic fix: a way to show action without confronting the fundamental structure of social media platforms.
The System, Not Age, Is the Issue
The real issue is not whether someone is 14 or 16. It is that the core mechanics of legacy social media platforms were never built with human flourishing in mind. They were constructed for growth at all costs, using engagement as currency and our attention as the commodity. Whether a user is a teenager, a parent, or a grandparent, this system incentivizes the same things: maximized time on site, emotional arousal, virality and increasing data extraction.
This is why regulatory efforts that focus solely on age, even if they adopt minimum ages of 16, or require parental consent for younger teens, are only one piece of a much larger puzzle. European lawmakers themselves have finally recognised this. Some Members of the European Parliament have urged not only minimum age standards but also reforms that ban the most harmful addictive features, restrict algorithmic recommendation systems and ensure platforms are safe by design.
But there is another side to this conversation that gets less attention: What a healthier alternative ecosystem might look like. One that doesn’t require users to be protected from their own platforms’ logic.
A Different Structure for Social Interaction
If age bans are one form of protective regulation, the accompanying questions are: what are we leading people toward? What kind of digital environment do we want once the age-based gate is lifted?
Europe’s strong regulatory stance opens a window of opportunity. It is not enough to demand that existing platforms adjust their policies or add verification checks. The deeper challenge is to reimagine social technology so that it aligns with human needs rather than advertising economics.
Emerging social platforms demonstrate that different design choices are possible when user agency, ownership and well-being are central. Some of these newer networks, such as VOJVOJ, frame social interaction around community relationships, chronological engagement, and user control. These platforms illustrate that it is possible to offer meaningful connection without exploiting psychological triggers for engagement.
By design, VOJVOJ and others like it treat content creators and users as owners of their own voices and data, not as products whose behaviour is monetised by an algorithm. In that sense, they are not just compliant with regulations, but genuinely aligned with the intent behind them.
Time for Europe to Lead By Example
The fact that six of the seven countries actively pursuing age-based social media restrictions are European highlights more than just concern; it reflects a broader desire for digital sovereignty and cultural agency. Europe has not only strong digital safety laws like the Digital Services Act, but also a values-driven approach to data protection and online rights.
Given this leadership, we should be asking not only whether age limits are necessary, but whether European governments should champion platforms that inherently reflect European values. Platforms that give users ownership of their content and their data, that prioritise human agency over algorithmic manipulation, and that treat digital interaction as a public good rather than a commodity.
If the conversation shifts from banning harmful behaviour for minors to fostering healthier systems for everyone, then policy and technology can move in tandem toward a future that is not merely safer, but truly supportive of human dignity and agency.
Age restrictions are a necessary first step. But unless we also rethink the structures that underlie social interaction online, we risk creating a world where adults are free to use the same systems that continue to harm them, just without an age label attached.


