Racialized Grooming Gangs: How Musk and X Amplified Islamophobia and Racism in the UK
Aishik Saha / May 14, 2025

Reform UK's improved performance in the 2025 local elections highlights a growing legitimacy crisis within the Conservative Party. Yet this shift is less a case of the electorate drifting organically toward the far right and more the product of a media and platform ecosystem that has steadily normalized far-right rhetoric. A direct example of these changes is the transformation of X (formerly Twitter) into a cesspit of hate and disinformation under Elon Musk.
Since acquiring X in late 2022, Musk has branded himself a “free speech absolutist,” dismantled trust and safety systems, and reinstated and actively engaged with previously banned far-right X accounts. In practice, however, his hands‑off approach to content moderation and personal amplification of conspiratorial claims has turned the platform into fertile ground for Islamophobia, racism, and disinformation. Nowhere is this clearer than in the so‑called “grooming gangs” panic that resurfaced in January 2025.
Historically, the term “grooming gangs” has referred to networks of offenders who prey on vulnerable children and young people through manipulation and shared tactics. Since the early 2010s, however, the label has been racialized into a collective indictment of Muslims, particularly British Pakistanis, other South Asians, and immigrant communities. Recent discourse on X has revived and amplified these stereotypes, merging hate speech with mainstream political debate. Importantly, official data consistently show that white men account for the vast majority of sexual assaults against both adults and children in the UK.
Our new report at the Center for the Study of Organized Hate (CSOH) analyzed 1,365 X posts about “grooming gangs” between January 2024 and January 2025. In the peak period, from January 1 to January 30, 2025, 1,208 posts generated a staggering 1.53 billion engagements (1.51 billion views, 11.5 million likes, 3.17 million reposts, 625,000 replies, and 347,000 bookmarks). Musk’s 51 posts alone accounted for 1.2 billion of those engagements. His centrality rests on both his algorithmic privilege and the convergence of an extensive global Islamophobic network on X spanning the UK, Europe, the US, and India.
The report found that the “grooming gangs” discourse coalesces around three harmful themes. First, Muslim men, particularly those of British Pakistani heritage, are racialized as collective perpetrators of sexual violence through entrenched orientalist tropes. Second, a powerful myth of institutional cover-up accuses British authorities of hiding crimes to protect minorities or preserve political advantage. Third, attacks on multiculturalism frame diversity and “political correctness” as the root of social decay, pitting majority and minority communities against one another.
More than half of the January posts (650 out of 1,208) amplified Islamophobic and racist tropes, portraying Muslims, British Pakistanis, South Asians, and immigrant communities as sexual predators. Nearly 48 percent of posts (578) advanced the conspiracy that authorities were concealing these crimes, blaming the Labour Party in 45.5 percent of those cases, the judiciary in 13.5 percent, and the media in 9.6 percent.
The harmful language slid seamlessly between insinuation and overt racism, portraying Muslims and immigrants as rapists, infiltrators, and existential threats. These posts didn’t just misinform—they mobilized outrage, encouraged calls for vigilante violence, and framed state repression as a moral imperative.
A growing far-right digital alliance
What appears to be coordinated activity by UK-based far-right figures like Tommy Robinson and international far-right actors, ranging from Visegrád24 and Radio Genoa in Europe to OpIndia in India, further amplified Islamophobic and racist “grooming gangs” discourse. This illustrates how social media platforms are actively facilitating transnational alliances among extremist movements.
The growing convergence between far-right movements and big tech platforms has long been the subject of speculation. Silicon Valley was once viewed as a stronghold of liberal values, but its recent shifts, such as relaxing hate-speech policies and embracing conservative talking points, will come as little surprise to those who have been paying attention. X’s downsized trust and safety team and Musk’s own embrace of “free speech absolutism” have created a permissive environment where disinformation and bigotry flourish.
Perhaps most striking is the way Hindu nationalist outlets have welded British and Indian conspiracies into a single narrative. Indian far-right websites such as OpIndia and its editor, Nupur Sharma, have repeatedly platformed figures like Robinson, simultaneously invoking racialized “grooming gangs” panic and India’s “Love Jihad” myth. During the January 2025 “grooming gangs” surge, we identified 116 posts (9.6 percent of the dataset) by prominent Hindu nationalist accounts, which together amassed 17.3 million views, 331,000 likes, 108,000 reposts, 8,100 replies, and 15,800 bookmarks. These posts not only echoed far-right tropes but also engaged reciprocally with European far-right handles, suggesting mutual reinforcement if not overt coordination. This pattern of “coordinated inauthentic behavior,” in which identical text and video circulate across multiple accounts, was evident as early as the UK riots of July–August 2024, when Hindu nationalist X users stoked tensions with identical clips and misleading narratives. This cross-pollination exemplifies the broader consolidation of far-right discourse. Musk himself has signaled alignment with parties like Germany’s AfD, indicating that these are not isolated episodes but rather an emerging global far-right alliance.
These contemporary panics are, in fact, echoes of a much older lineage of white supremacist moralizing. Victorian-era fears of “white slavery,” Nazi caricatures of Jews as sexual predators, and more recently, India’s “love jihad” conspiracy theories, all mirror the same logic, portraying racialized men as threats to national and moral purity. Musk’s interventions tap into that lineage, creating newer discursive trends for the far-right to emulate further.
Online Safety Act and Ofcom
The “grooming gangs” discourse must also be understood as part of the global far-right’s broader search for legitimacy in a post-2024 UK political landscape. Following the decline of the Conservative Party, far-right actors have repositioned themselves as the “persecuted voice” of a silenced majority, adopting a martyr narrative. Musk has directly enabled this framing by positioning the unrestricted hate allowed on his platform as the only acceptable interpretation of freedom of expression.
X’s much-touted community notes, presented as an alternative to fact-checking, not only failed to combat the proliferation of disinformation about grooming gangs but in one instance even amplified the falsehood that Pakistani men are the dominant demographic in the UK’s grooming gangs. Despite these failures, platforms like Meta have indicated that they are open to adopting the same community notes framework while abandoning fact-checking mechanisms.
Under the UK’s Online Safety Act 2023 (OSA), X, as a Category 1 service, is legally required to undertake comprehensive measures to mitigate the dissemination of illegal and harmful content on its platform. This includes conducting thorough risk assessments to identify and address content that could incite hatred against Muslims, British Pakistanis, South Asians, and immigrants. The content also clearly falls within the OSA’s definition of “Illegal Online Harms,” posing both a threat to public order and a risk of foreign interference under Ofcom’s Illegal Content Judgements Guidance (ICJG).
The OSA’s success depends not only on its statutory power but also on the political will to apply it impartially, including to platforms led by powerful figures. If Elon Musk’s direct involvement in proliferating racialized hate online goes unchecked, it will set a dangerous precedent.