Across the globe, governments are debating how young is "too young" to use social media, with some introducing age-related restrictions across platforms.
These restrictions reflect genuine concern: children are facing bullying, exploitation, and exposure to harmful content online, with negative impacts on their mental health and well-being. The status quo is failing children and overwhelming families.
While UNICEF welcomes the growing commitment to children's online safety, social media bans come with their own risks, and they may even backfire.
Social media is not a luxury - for many children, especially those who are isolated or marginalised, it is a lifeline providing access to learning, connection, play, and self-expression. What's more, many children and young people will still access social media, whether through workarounds, shared devices, or turning to less regulated platforms, ultimately making it harder to protect them.
Age restrictions must be part of a broader approach that protects children from harm, respects their rights to privacy and participation, and avoids pushing them into unregulated, less safe spaces. Regulation should not be a substitute for platforms investing in child safety: laws introducing age restrictions are no alternative to companies improving platform design and content moderation.
UNICEF calls on governments, regulators, and companies to work with children and families to build digital environments that are safe, inclusive, and respect children's rights. This includes:
- Governments must ensure that age-related laws and regulations do not replace companies' obligations to invest in safer platform design and effective content moderation, and should mandate companies to take responsibility by proactively identifying and addressing adverse impacts on children's rights.
- Social media and tech companies must redesign products with child safety and well-being at the centre, invest in safer platform design and effective content moderation, and develop rights-respecting age-assurance tools and differentiated experiences that offer younger users safer, developmentally appropriate environments. These protections must apply in all contexts, including fragile or conflict-affected countries where institutional capacity to regulate and enforce protections may be low.
- Regulators must put in place systemic measures to effectively prevent and mitigate the online harms experienced by children.
- Civil society and partners must amplify the voices and lived experiences of children, young people, parents, and caregivers in debates on social media age limits. Decisions around how to best protect children in a digital age must be informed by quality evidence, including evidence coming directly from children.
- Parents and caregivers should be supported with improved digital literacy - they have a crucial role but are currently being asked to do the impossible to protect their children online: monitor platforms they didn't design, police algorithms they can't see, and manage dozens of apps around the clock.
UNICEF is committed to continuing our work for and with children, young people and families to ensure legislation, regulations and technology design reflect children's views, needs and rights. We stand ready to work with governments, business and communities to ensure every child can safely learn, connect, and thrive in the digital age.