Irish regulator threatens X with severe penalties for failing to protect minors from pornography

Ireland’s media regulator has issued a serious ultimatum to social media platform X, threatening substantial financial penalties if the company fails to implement adequate measures to protect minors from accessing pornographic content. The regulatory body has set a clear deadline, demanding that X demonstrate compliance with the country’s Online Safety Code or face potentially severe consequences.

Ireland’s new online safety regulations target X platform

The Irish media regulator, Coimisiún na Meán (the Media Commission), has taken a firm stance against X, the social media platform owned by billionaire Elon Musk, regarding child protection measures. In a statement released on July 24, 2025, the regulatory body announced that X must provide a detailed explanation of its compliance with the Online Safety Code by Friday or risk significant sanctions.

“We have requested X to explain the measures taken to comply with the Online Safety Code by Friday, and we will take action if evidence of non-compliance exists,” the Media Commission declared in its official communication. This warning represents a critical moment for the platform’s operations in Ireland.

The Online Safety Code, implemented in October 2024, gave social media companies nine months to develop and deploy effective age verification systems. These regulations specifically target platforms with European headquarters in Ireland, including:

  • X (formerly Twitter)
  • TikTok
  • YouTube (Google)
  • Instagram (Meta)

The Irish regulator’s examination of X’s platform revealed concerning gaps in compliance. “Based on our initial review of the X platform, we see no evidence of measures taken to meet this age verification requirement,” stated the Commission, highlighting additional concerns about “the availability of parental control tools.”

Age verification requirements and potential penalties

Under Ireland’s enhanced online safety framework, simple self-declaration methods are no longer sufficient for age verification. The days when users could merely tick a box confirming they are over 18 or self-report their age to access restricted content have ended. Social media platforms must now implement robust mechanisms to verify user age accurately.

The regulations specifically aim to “prevent children from encountering pornography or gratuitous violence” online. This represents a significant shift in how platforms must approach content moderation and access control for younger users.

The financial implications for non-compliance are substantial. Companies violating these regulations face penalties that could reach:

  • Percentage of annual revenue: 10%
  • Fixed financial penalty: €20 million

The Commission will apply whichever amount is higher, creating a particularly serious concern for larger tech companies with substantial revenues. This approach ensures that penalties remain proportional to a company’s financial capacity while maintaining a significant deterrent effect.
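To make the "whichever is higher" rule concrete, here is a minimal sketch of the calculation, assuming a purely hypothetical annual revenue figure rather than X's actual turnover:

    # Illustrative only: the 10% / €20 million thresholds come from the rule
    # described above; the revenue figure below is a hypothetical placeholder,
    # not X's actual turnover.
    FIXED_CAP_EUR = 20_000_000   # fixed maximum financial penalty
    REVENUE_SHARE = 0.10         # 10% of annual revenue

    def max_penalty_eur(annual_revenue_eur: float) -> float:
        """Return the higher of the two statutory maximums."""
        return max(FIXED_CAP_EUR, REVENUE_SHARE * annual_revenue_eur)

    # Hypothetical example: a platform with €3 billion in annual revenue
    print(max_penalty_eur(3_000_000_000))  # 300000000.0, i.e. the 10% share applies

In practice, this means that for any company with annual revenue above €200 million, the 10% share rather than the fixed cap sets the ceiling.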

When contacted by news agencies, X did not immediately respond to requests for comment regarding the regulatory warning. This silence raises questions about the platform’s readiness to address the compliance issues identified by the Irish regulator.

Child protection measures across digital platforms

The regulatory focus on X highlights broader concerns about child safety across social media ecosystems. Ireland’s position as the European headquarters for multiple major platforms gives its regulatory decisions particular significance in the broader context of online safety governance.

Effective age verification systems represent a technical and ethical challenge for platforms. Current approaches include:

  1. Document verification using government-issued IDs
  2. Biometric verification methods
  3. Credit card verification
  4. Mobile carrier-based verification
  5. Third-party age verification services

Each method presents tradeoffs between user privacy, convenience, and verification effectiveness. Platforms must balance these considerations while meeting regulatory requirements and maintaining user growth objectives.
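As a rough illustration of how such a layered approach might be wired together, the following sketch defines a generic verification interface and a default-deny gate. The class and function names are hypothetical placeholders; they do not correspond to any real vendor's API or to X's actual systems.

    # Illustrative only: a generic, default-deny age gate. The AgeVerifier
    # interface and the provider classes below are hypothetical placeholders,
    # not any real vendor's API and not a description of X's actual systems.
    from abc import ABC, abstractmethod

    class AgeVerifier(ABC):
        """Common interface for the verification methods listed above."""

        @abstractmethod
        def verify(self, user_id: str) -> bool:
            """Return True only if the user is confirmed to be 18 or older."""

    class DocumentCheck(AgeVerifier):
        def verify(self, user_id: str) -> bool:
            # Placeholder: would validate a government-issued ID in practice.
            raise NotImplementedError

    class MobileCarrierCheck(AgeVerifier):
        def verify(self, user_id: str) -> bool:
            # Placeholder: would query the user's mobile carrier in practice.
            raise NotImplementedError

    def gate_adult_content(user_id: str, verifiers: list[AgeVerifier]) -> bool:
        """Allow access only if at least one verification method succeeds."""
        for verifier in verifiers:
            try:
                if verifier.verify(user_id):
                    return True
            except NotImplementedError:
                continue  # skip methods not wired up in this sketch
        return False  # default-deny: unverified users stay gated

Whatever mix of methods a platform adopts, the key design choice the regulations push toward is that access defaults to denied until a check succeeds, rather than trusting a self-declared age.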

Parental control tools represent another critical component of child safety measures. These features allow parents to monitor and restrict content their children can access, but their implementation and effectiveness vary significantly across platforms.

Broader implications for digital regulation

Ireland’s enforcement action against X reflects an intensifying global trend toward stricter digital regulation, particularly regarding child protection. The country’s role as a European tech hub positions it at the forefront of efforts to establish meaningful guardrails for online platforms.

This regulatory approach mirrors similar initiatives in other jurisdictions, including the UK’s Online Safety Act and Australia’s eSafety Commissioner framework. These parallel developments suggest a growing international consensus about the need for more rigorous protections for younger users online.

For X, compliance challenges extend beyond age verification. The platform has faced scrutiny regarding content moderation practices since Elon Musk’s acquisition and subsequent policy changes. The current regulatory action in Ireland adds another dimension to these ongoing concerns.

As the Friday deadline approaches, stakeholders across the digital ecosystem will watch closely to see how X responds to the Irish regulator’s demands and what precedents might be established for platform accountability regarding child protection measures.

Aoife Gallagher