iToverDose/Technology · 15 MAY 2026 · 13:01

X strengthens UK content moderation with new anti-hate and terror policies

Under pressure from UK regulators, X commits to faster removal of illegal hate and terror content, pledging to act on at least 85% of reports within 48 hours and share quarterly compliance data with Ofcom.

The Verge · 2 min read

The UK’s communications regulator, Ofcom, has confirmed that X will implement stricter measures to combat illegal hate speech and terrorist material on its platform for British users. The agreement, announced today, reflects growing regulatory scrutiny over how social platforms handle harmful content while balancing free expression concerns.

A shift in moderation enforcement

X has outlined a series of commitments aimed at reducing exposure to illegal content in the UK. According to the agreement, the platform will now withhold UK users' access to accounts confirmed to be operated by terror groups and used to post illegal terrorist content. This move represents a departure from past practices, where enforcement often lagged behind user reports.

The company has also pledged to assess at least 85% of user-reported instances of hate and terror content within a 48-hour window—a timeline designed to align with Ofcom’s expectations for rapid intervention. While X has not disclosed how it will achieve this target, the commitment signals an acknowledgment of regulatory pressure to improve moderation efficiency.

Collaboration with experts and transparency measures

Beyond faster response times, X will collaborate with external experts to refine its reporting systems for illegal content. This partnership could include input from academics, civil society groups, and law enforcement advisors to strengthen detection and classification protocols. The goal is to create a more responsive system that adapts to evolving tactics used by bad actors on the platform.

To ensure accountability, X has agreed to submit quarterly performance reports to Ofcom over the next 12 months. These updates will detail the platform’s progress in addressing reported content, including metrics on response times, enforcement actions, and any gaps identified during audits. Such transparency could set a precedent for how other global platforms engage with regulators in high-stakes content moderation cases.

Regulatory pressure mounts across Europe

This agreement comes as Ofcom intensifies its oversight of online safety, particularly for platforms with significant user bases in the UK. The regulator’s broader Online Safety Act grants it sweeping powers to demand compliance, issue fines, and even block services that fail to meet standards. X’s concessions suggest a willingness to cooperate, at least in the short term, to avoid harsher penalties or reputational damage.

Industry observers note that similar pressures are building across Europe, with regulators in Germany and France also scrutinizing social media platforms for their handling of harmful content. For X, these commitments may serve as a test case for balancing regulatory demands with operational feasibility, especially as it continues to navigate a complex geopolitical landscape.

Looking ahead, the success of these measures will depend on consistent execution and measurable outcomes. If X meets its targets, it could reshape expectations for how major platforms address illegal content in regulated markets. However, persistent gaps in enforcement—whether due to resource constraints or algorithmic limitations—could reignite criticism from advocacy groups and policymakers alike.

AI summary

X has agreed to implement new protective measures against terror and hate content at the request of Ofcom, the UK's digital safety regulator. So what do these steps involve, and what do they mean for users?
