Elon Musk's X has committed to faster removal of hate speech and terrorist content in the United Kingdom following pressure from Ofcom, the UK's communications regulator. The pledge comes after a series of crimes targeting Jewish communities across Britain raised alarm over the platform's content moderation practices.
Ofcom had been investigating X's handling of illegal material and discriminatory posts. The regulator emphasized that speed matters in removing such content, particularly given recent violent incidents. X's new commitments include accelerated response times for flagged hate speech and terror-related material.
The social media platform faces mounting scrutiny globally over content moderation. In the UK specifically, the Online Safety Act gives Ofcom enforcement powers over digital platforms. X has undergone deep staff cuts under Musk's leadership, including to its trust and safety teams, raising questions about the company's capacity to moderate at scale. The regulator's intervention signals it will hold platforms accountable if they fail to act.
This moment reflects a broader tension between free speech protections and platform responsibility. Major advertisers have already fled X over concerns about the proliferation of hateful content. The UK's regulatory approach differs from that of the US, where Section 230 protections largely shield platforms from liability for user-generated content. Under British law, companies must demonstrate that they take illegal material seriously.
X's commitments remain vague on timelines and enforcement mechanisms, and Ofcom is expected to monitor compliance closely. Other platforms, including Meta and TikTok, face similar expectations in the UK market. The regulator has signaled it will not hesitate to issue fines or compel compliance through legal action if platforms fail to meet their obligations.
