Social media giant X has promised quicker action on removing hate and terror-related content in the UK, responding to pressure from Ofcom, the UK’s communications regulator. The commitments follow recent crimes targeting Jewish communities, which intensified scrutiny over how platforms handle extremist material. X’s pledge comes as regulators and lawmakers demand faster responses to illegal content online.

The commitments were outlined in a letter to Ofcom, seen by BBC News, where X acknowledged the need for urgent improvements. The company did not specify timelines but stated it would prioritize content flagged as hate speech or terrorism. Ofcom said the commitments were a response to concerns over rising threats to Jewish and other minority communities in the UK.

Regulatory scrutiny grows after UK hate crimes rise

Recent attacks on Jewish communities in the UK have heightened concerns about online radicalization and extremist content. Ofcom’s warning to X reflects broader regulatory efforts to hold social media platforms accountable for failing to curb illegal material. The UK government has also pushed for stricter enforcement under the Online Safety Act, which requires platforms to remove illegal content quickly or face fines.

X’s response comes amid broader scrutiny of its moderation practices. Critics argue the platform has been slow to act on reports of hate speech and terrorist propaganda, particularly since its rebranding from Twitter. The company’s new leadership has framed these commitments as a step toward rebuilding trust with regulators and users alike.

How X plans to enforce stricter content rules

In its letter to Ofcom, X outlined plans to enhance its automated detection systems to identify and remove hate speech and terror-related content more efficiently. The company also pledged to increase human moderation teams in the UK to review flagged content faster. However, details on staffing levels or technology upgrades remain undisclosed.

The commitments follow a series of high-profile incidents where extremist content on X was linked to real-world violence. Ofcom’s intervention signals that regulators are prepared to take enforcement action if X fails to meet its promises. The agency has not ruled out imposing penalties under the Online Safety Act if the platform falls short.

What happens next for X and UK online safety

Ofcom will monitor X’s progress over the coming months, with a focus on whether the platform meets its commitments. If violations are found, the regulator could issue warnings or fines. Meanwhile, Jewish community leaders and advocacy groups have welcomed the commitments but say they will continue pushing for accountability.

X’s pledge is part of a wider trend among social media platforms facing regulatory heat in the UK. Companies like Meta and TikTok have also faced scrutiny over their handling of harmful content. The outcome of X’s efforts could set a precedent for how regulators enforce online safety laws in the future.

What You Need to Know

  • Source: BBC News
  • Published: May 15, 2026 at 12:57 UTC
  • Category: Business
  • Topics: #bbc · #business · #economy · #ofcom · #jewish · #x-hate-content-removal-uk

Read the Full Story

This is a curated summary. For the complete article, original data, quotes and full analysis:

Read the full story on BBC News →

All reporting rights belong to the respective author(s) at BBC News. GlobalBR News summarizes publicly available content to help readers discover the most relevant global news.


Curated by GlobalBR News · May 15, 2026



🇧🇷 Summary in Portuguese

X, the social media platform formerly known as Twitter, has promised to act more swiftly to remove hate and terror content in the UK, following pressure from Ofcom, the British regulator. The company, which has recently drawn criticism for amplifying hate speech and disinformation, now faces even greater scrutiny, especially after warnings about growing threats to the country's Jewish community.

In Brazil, where the debate over social media regulation is gaining momentum, X's decision reinforces the discussion about platforms' responsibility to moderate content that incites violence or hatred. With more than 50 million users in the country, the platform has been a venue for the spread of false news and attacks on minorities, which makes X's stance in the UK a point of reflection for the Brazilian landscape. The move can be seen as a test of the effectiveness of content moderation policies on global platforms, which are often criticized for acting late or inconsistently.

The next step will be to watch whether the promise of faster content removal is kept in practice, both in the UK and in other markets, including Brazil, where social media regulation is still under debate.


🇪🇸 Summary in Spanish

X, the platform formerly known as Twitter, has announced an accelerated commitment to remove hate and terrorist content in the UK, following Ofcom's warnings about the growing threat to Jewish communities. The move comes at a time when disinformation and speech inciting violence are gaining ground on social networks, forcing technology companies to rethink their role in content moderation.

Context is key: Ofcom, the British communications regulator, has warned of an increase in threats against the country's Jewish population, especially after the recent conflicts in the Middle East. For Spanish-speaking users, this is not just a distant debate but a warning about how global polarization on social networks can escalate quickly. X's decision reflects growing pressure on platforms to act more rigorously, something that directly affects the millions of Spanish speakers who use these tools daily, whether as consumers of information or as creators of content.