Beyond the Ban. Part 2: A Child-Centred Approach to Online Protection Regulation

Amanda Third   

Two decades of research and practice demonstrate that enabling children to navigate the risks of digital harm effectively, while maximising the very real benefits of digital participation, requires a whole-of-community, cross-sector approach. The role of regulation in these efforts is not simply to prohibit or restrict, but to create the conditions for a thriving ecosystem — one that not only protects children from harm but actively enables them to seize the opportunities that digital technologies offer for their growth, learning, connection, and civic participation.

Effective regulatory design demands, first and foremost, a genuinely child-centred approach. This means investing seriously in understanding the rich, diverse, and contextually specific ways in which children engage with the digital world, rather than designing regulation around adult anxieties. It means attending to the full range of children’s rights and being wary of frameworks that privilege, for example, mental health above children’s equally fundamental rights to participation, expression, information, and association. And it means being guided by the evidence, including the finding that, for most children, the relationship between social media use and wellbeing is neutral or positive, with harmful associations concentrated in a smaller subset of children who are already experiencing vulnerability. This is not a reason for complacency. It is a reason for precision. Decision makers must be wary of applying the same intervention to a teenager using Instagram to stay in touch with friends after changing schools as to one who is being targeted by online harassment. Similarly, they should be cautious about treating all platforms as equivalent or all children as identical. Solutions must be sensitive to age, developmental stage, context, and need. They must acknowledge the significant body of research demonstrating that digital connection, under the right conditions, supports rather than undermines young people’s resilience.

Perhaps most importantly, the regulatory question that has received the least attention in the social media ban debate is the one that may matter most: how can regulation be designed to genuinely incentivise technology platforms to centre children’s rights, needs, and aspirations in the design and implementation of their products and services — and to hold them rigorously accountable when they fail to do so? Counterintuitively, bans risk letting platforms off the hook. By evicting children from social media, age-restriction legislation shifts platforms into compliance mode — satisfying a legal threshold while relieving them of the deeper obligation to design digital environments that are genuinely accountable to children’s rights. Keeping children in digital spaces, within a robust regulatory framework that specifies what platforms owe them, is essential to driving the kind of platform accountability that children’s safety and wellbeing actually require. The question is not whether children should be online, but what obligations platforms must meet to deserve their presence.

On the Alternatives

It will be some time before there is robust evidence on the efficacy of the Australian legislation. In the meantime, I urge decision makers across Europe and elsewhere to scrutinise the Australian experience carefully, to digest the evidence as it emerges, and to give serious consideration to the many alternatives to age-based restrictions that the research base supports.

These alternatives are not hypothetical or untested. They include the development of robust regulatory ecosystems supported by genuine cross-sector coordination, in which governments, platforms, civil society, educators, families, and children work together within a shared framework of accountability. They include promoting meaningful awareness of the existing minimum age of 13, which to date has been widely ignored and unenforced, and investing in digital literacy and citizenship education. And they include embedding safety, child rights, and evolving capacities into design itself, as exemplified by the UK’s Age Appropriate Design Code, which requires platforms to build child-centred protections into the architecture of their products rather than treating children’s safety as an afterthought.

Crucially, as an alternative to age-based restrictions, regulators might explore how to regulate harmful features rather than platforms or users, targeting the specific algorithmic and design practices that generate harm. Such efforts could be supported by the development of industry codes to regulate harmful content and strengthened redress and enforcement mechanisms.

Moreover, it will be critical to develop transnational governance mechanisms capable of matching the global reach of the platforms being regulated. Here, drawing on their history of cross-national regulatory efforts, European regulators are ideally positioned to play a leadership role in meeting the challenges social media pose to children and their families. By working together, and perhaps with other regional bodies like ASEAN and the African Union, European nations can hold technology platforms to account at the scale at which they operate. I urge European regulators across national jurisdictions to work closely with experts, drawing on the international evidence base to inform the design of technical standards that can then be legislated by individual nations. Doing so will create the consistency of expectation and enforcement, indeed a culture of accountability, that can inspire platforms to genuinely centre children’s best interests in the design of technology products and services.

Going forward, regulation itself must be co-designed with children and families. Research consistently demonstrates that children are not naive about the risks of social media. They are often acutely aware of them. But children also articulate, consistently and compellingly, the value of digital connection in their lives: the friendships maintained, the communities found, the creative expression enabled, the access to information unavailable offline. Policies developed without meaningfully consulting children do not merely fail to honour their rights to participation — they actively undermine the civic trust and digital agency that young people will need to flourish in an increasingly technological world. Children’s perspectives are not an optional supplement to the policy process; they are indispensable intelligence about the problems decision makers are actually trying to solve.

Conclusion: The Ambition We Owe Our Children

Ultimately, the social media ban debate has suffered from a failure of ambition. We have been so focused on what we want to protect children from that we have failed to ask the equally important, and arguably more demanding, question: what kind of digital experiences do we want children to have? What would it mean to build online environments that genuinely nurture children’s growth, creativity, learning, civic participation, and wellbeing? What would it mean to hold platforms not merely to a standard of harm reduction but to a genuinely positive standard of children’s rights by design?

It is time to move from a necessary but insufficient regulatory focus on safety to one centred on optimal experiences for children in digital environments. And we must build the legislative and technical mechanisms capable of delivering them. This is harder work. It is slower work. It is potentially less politically compelling than legislating an age limit. But it is the work that actually needs doing.

To get there, we need to be clear about what this is really about. It is not about the power dynamics between governments and technology companies. It is not about votes or political careers. It is not even, primarily, about the anxieties of parents, though these deserve to be addressed. It is about our children, and about what we are collectively obligated to do to ensure they can thrive in the digital world they already inhabit and will continue to inhabit long after today’s policymakers have left the stage.

Children deserve a digital world built for them, with them, and around their rights. That is the standard to which European policymakers and all of us must now rise.


Professor Amanda Third is Professorial Research Fellow and Co-Director of the Young and Resilient Research Centre in the Institute for Culture and Society at Western Sydney University, and Faculty Associate at the Berkman Klein Center for Internet and Society at Harvard University. She is the lead author of Young People in Digital Society: Control/Shift (Palgrave, 2019) and was a lead researcher on the consultations informing UNCRC General Comment No. 25 on children’s rights in the digital environment.


This blog post appeared on Social Media Ban for Kids, an interactive website managed by The Lisbon Council, a Brussels-based think tank, to gather available evidence and data points on the social media ban for children. The site is available at https://socialmediaban.lisboncouncil.net/.