The deadly knife attack in Southport, England, had taken place only a few hours earlier when the far-right influencer Andrew Tate shared astonishingly detailed claims about the perpetrator on social media: “An unregistered migrant decided today to go to a Taylor Swift dance class and stab six little girls. You heard that right: this was someone who came to the United Kingdom on a boat—nobody knew who he was, nobody knows where he came from.” None of it was true. At that point, no details about the perpetrator had been released. The next day it emerged that he had been born in Cardiff and had grown up in the UK. But the disinformation had already spread.
Tate and figures like Stephen Yaxley-Lennon, the former leader of the Islamophobic “English Defence League,” have a massive amplification effect. Yaxley-Lennon, who has been convicted of crimes on several occasions, described the riots as the result of “legitimate concerns” and called for “mass deportations,” warning that otherwise “total anarchy” would break out in England. Twitter, as the platform was then known, banned him in 2018, but his account was reinstated after Elon Musk took over. His large following reflects his long-standing role as a leader of the far-right scene and a frequent presence at demonstrations. In 2009 he founded the English Defence League (EDL) in his hometown of Luton, outside London, after Muslim extremists insulted British soldiers returning from Iraq; Luton at the time had a large Muslim population and a radical Islamist scene. As a teenager, Yaxley-Lennon joined local football hooligans, and he had to abandon an apprenticeship as an aircraft mechanic after being sentenced to a year in prison for assaulting a police officer. He later adopted the pseudonym Tommy Robinson, which concealed his true identity while evoking a prominent hooligan associated with Luton Town FC.
Academics refer to such figures as “disinfluencers,” and their online influence has grown markedly once again. They cast themselves as the voice of a disenfranchised white working class standing up to advancing Islamism, and they turn their hate speech into a lucrative source of income: over the years, people like Tate and Yaxley-Lennon have raised millions of euros in donations, often from supporters who themselves live on the poverty line. The British newspaper “Mirror” reported that Yaxley-Lennon is currently sending his hate messages from a “five-star vacation in Cyprus.”
Many hands mixed the cocktail of anti-immigration sentiment and lies about an Islamist motive behind the Southport attack. The newly elected Reform UK MP Nigel Farage asked in a Facebook post whether the police might be withholding “the truth” from the public. The police never comment on ongoing investigations, but it is telling that Farage did not use the opportunity to raise his question that same day in the House of Commons, which had not yet gone into summer recess. Instead, he chose the online route, knowing he would find the audience he was targeting there.
There are also obscure sources that may be linked to destabilization efforts by hostile state actors. A news site called “Channel3 Now” was the first to claim that the Southport suspect was an asylum seeker named Ali Al Shakati who was “on MI6’s watchlist,” a false claim picked up both by Farage and by anti-Islam and anti-immigrant networks. The “Daily Mail” traced “Channel3 Now” back to Russia: the associated YouTube channel had started out as a Russian channel eleven years earlier. The Channel3 Now website itself was created in the summer of 2023, and one of the four Facebook pages using the same name and branding was repurposed twice, once in 2023 and again in May 2024, when it became “Channel3 Now.” According to Sir Richard Dearlove, the former head of the British foreign intelligence service MI6, this kind of disinformation campaign is part of a “gray war” that Russia is waging against the West.
A major problem is that X (formerly Twitter) itself surfaces accounts that spread disinformation, selecting them on the basis of user preferences and thereby further amplifying their messages. This could, in theory, be changed: the platforms could review the algorithms that push such sources of disinformation to certain users and stop doing so. Large tech companies also receive information from Western intelligence agencies identifying accounts controlled by hostile actors such as Russia or Iran, and they have the option to suspend them. The implementation of the UK’s “Online Safety Act,” which holds large tech companies more accountable for the content they carry, could help as well.

Normally, personal details of underage suspects are not disclosed in the UK. Because of the ongoing disinformation campaign, however, a court in Liverpool released the information last week. To counter the online lies, established media such as the BBC reported not only that the Southport perpetrator, Axel R., was British-born and not a Muslim, but also that his parents came from Rwanda. Yet the rioters took the mere ethnic origin of the Southport murderer as confirmation of their suspicion that an “immigrant” had killed little girls.
All publishing rights and copyrights reserved to MENA Research Center.