
Elon Musk’s X platform has pledged to block UK access to accounts linked to banned terrorist groups under an agreement with the communications regulator to crack down on terrorist and hate content.
X will also review suspected illegal terrorist and hate content within 48 hours and seek expert advice on how to handle user reports of such content.
The UK’s media regulator, Ofcom, announced the commitments as part of a drive to ensure social media platforms have the right systems in place to deal with terrorist and hate material, amid concerns that dangerous content is still not being dealt with on large sites.
Oliver Griffiths, Ofcom’s online safety group director, said: “Following intensive engagement carried out by Ofcom’s online safety team, X have committed to implementing stronger protections for UK users, which we will now monitor closely.”
Griffiths said the issue of online terrorist and hate content had become even more pressing in the wake of a spate of hate crimes committed against the UK’s Jewish community.
Under the agreement, X will block UK access to accounts that post illegal terrorist content and are linked to terrorist organisations proscribed by the UK government. It will also review, within 48 hours, at least 85% of illegal terrorist and hate content flagged by its illegal-content reporting tool. The UK’s Online Safety Act aims to protect people in the UK from illegal content including terror and hate-related material.
Ofcom said it was continuing its investigation into X showing images manipulated with the Grok AI tool, also owned by Musk, to depict women and girls as partly unclothed.
Danny Stone, the chief executive of the Antisemitism Policy Trust, said the agreement was a “good start” but that X was still “failing in so many regards” to tackle racism on its platform.
Adam Hadley, the executive director of Tech Against Terrorism, which aims to tackle online extremism, said the announcement was a “powerful example of what constructive dialogue between regulators and platforms can deliver”.
X has faced regular criticism over its moderation since it was bought by Musk for $44bn (£33bn) in 2022, when the platform was known as Twitter. Last year Amnesty International accused X of creating a “staggering amplification of hate” during the riots that broke out after the Southport murders in 2024.
X declined to comment.
