The Irish government has signalled that it is exploring options to introduce age restrictions on social media use for under-16s. The proposal sits within the government's new National Digital and AI Strategy 2030, which frames online safety and age verification as part of Ireland's broader ambition to act as a European digital regulatory hub.
Author
- Sinan Aşçı
Postdoctoral Researcher at the Anti-Bullying Centre, Dublin City University
The proposals include a "digital wallet" age-verification system. Detailed technical specifications have not yet been published. However, digital identity wallet models typically work by allowing a user to verify their age once through a trusted authority. After that, they can share only a simple confirmation - such as whether they are over 16 - rather than handing over full identity documents. The government has not set out the final architecture, but the stated aim is to reduce repeated data sharing with individual platforms.
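The verify-once, confirm-many model described above can be illustrated with a minimal sketch. Everything here is hypothetical: the function names, the token format and the use of a single shared HMAC secret are illustrative simplifications, not the government's design. A real digital identity wallet would use asymmetric signatures and standardised credential formats, but the privacy idea is the same: the platform receives a signed yes/no answer, never the identity document.

```python
import base64
import hashlib
import hmac
import json

# Illustrative only: a real system would use asymmetric keys so that
# platforms could verify attestations without holding the issuing secret.
AUTHORITY_KEY = b"demo-secret"

def issue_attestation(over_16: bool) -> str:
    """Trusted authority checks ID documents once, then issues a
    minimal signed claim containing only the yes/no answer."""
    claim = json.dumps({"over_16": over_16}).encode()
    sig = hmac.new(AUTHORITY_KEY, claim, hashlib.sha256).hexdigest()
    return base64.b64encode(claim).decode() + "." + sig

def verify_attestation(token: str) -> bool:
    """A platform verifies the signature and learns only whether the
    user is over 16 - no name, date of birth or document is shared."""
    claim_b64, sig = token.rsplit(".", 1)
    claim = base64.b64decode(claim_b64)
    expected = hmac.new(AUTHORITY_KEY, claim, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("invalid attestation")
    return json.loads(claim)["over_16"]

token = issue_attestation(True)   # done once, with the authority
print(verify_attestation(token))  # prints True; reusable across platforms
```

The point of the design is data minimisation: the attestation can be presented to any number of platforms without repeating the identity check or re-sharing documents.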
Ireland is not alone in looking at age restrictions. Australia introduced a statutory ban, and other European countries are considering stricter access rules. But Ireland's position is distinctive. It hosts the European headquarters of many major technology companies. It also plays a central role in EU enforcement of the Digital Services Act, which requires very large platforms to assess and mitigate systemic risks to minors.
The debate is not simply whether social media is good or bad for children. Blanket restrictions for under-16s raise an important question: are bans the most effective way to reduce harm? Or do they offer reassurance while leaving deeper problems - such as platform design - unchanged?
The Irish context
Ireland's situation is significant because structural regulatory tools already exist at European level. Under the EU Digital Services Act, very large platforms must conduct systemic risk assessments, including risks to minors, and implement mitigation measures. Ireland plays a key role in this through Coimisiún na Meán, the country's statutory media and online safety regulator.
Established under the Online Safety and Media Regulation Act 2022, the regulator has powers to oversee video-sharing platforms, develop binding online safety codes and investigate non-compliance by the technology companies based in Ireland. This includes enforcement of the EU Digital Services Act in Ireland. This raises the question of whether new access restrictions will be introduced before these structural obligations have been fully implemented.
Ireland's proposed digital wallet pilot also intersects with EU plans for a European Digital Identity framework. The EU's forthcoming European Digital Identity Wallet is intended to support digital proof of certain facts about a person, such as their age. No specific design for any Irish pilot has been produced. However, alignment with EU interoperability standards would be required if it is to integrate into the wider European system.
Evidence driving the debate
Ireland's proposed ban is framed primarily in child-protection terms. These include concerns about youth mental health pressures, exposure to harmful or age-inappropriate material, and risks such as online grooming and exploitation. These concerns are not unfounded.
A 2020 review of research studies found associations between heavy social media use and anxiety or depressive symptoms. However, large-scale analyses suggest that average effects on wellbeing are small and highly variable. They can differ significantly depending on context and individual vulnerability. Risks exist, but they are not uniform.
Exposure to harmful content, including self-harm material, misogynistic narratives, or extremist content, is often shaped by how platforms recommend and amplify posts. Research from my colleagues in the DCU Anti-Bullying Centre shows how recommender systems can contribute to the circulation of toxic content.
Social media platforms are not neutral spaces. Their business models rely on maximising engagement and attention. Recommender systems prioritise emotionally charged material, and feedback mechanisms reward visibility and interaction.
These systems operate regardless of age. If a 17-year-old and a 15-year-old encounter harmful amplified content, the risk doesn't go away for one user just because they're over 16.
Age restrictions may form part of a broader safeguarding approach. However, on their own, they do not address recommender systems, addictive design features or the amplification of harmful material.
Risk and opportunity
At the same time, research consistently shows that risk and opportunity are intertwined. Children who are more active online may encounter greater exposure to harm. On the other hand, they may also gain more social connection and access to information. That complexity matters when designing policies intended to reduce harm without undermining participation.
Research on children's own experiences suggests that many see social media as a normal part of their lives and use in-app safety tools to manage risks. Many also say they prefer safer platform design and clearer accountability rather than outright bans.
Children's rights bodies in Ireland have similarly emphasised the need to balance protection with participation. They also point out that children's views should be considered in the development of any pilot measures.
Ireland's proposal reflects a broader shift away from relying solely on platform self-regulation. However, the key question is whether systems that amplify harmful content and reward attention can be effectively governed.
Ireland's Digital and AI Strategy 2030 positions the country as both a host to global platforms and a digital regulatory leader. That dual role gives particular weight to how these measures are designed and enforced. Ultimately, the effectiveness of Ireland's approach will depend not only on age thresholds, but on how robustly structural risk obligations are implemented.
Sinan Aşçı is employed as a postdoctoral researcher by DCU Anti-Bullying Centre on the Observatory project funded by the Department of Justice.