Social media bosses were yesterday told to ‘act now’ to remove online content inciting violence or hatred which could fuel the riots that have swept Britain.
Regulator Ofcom wrote an open letter to tech chiefs warning them that they have a duty to protect users from ‘videos likely to incite violence or hatred’.
It came as Deputy Prime Minister Angela Rayner hit out at the ‘fake news’ and ‘hate’ that has been widely circulated online.
She was speaking during a visit to a hotel in Rotherham, formerly used to house asylum seekers, which was stormed by far-Right rioters last week.
Asked about comments made online by X boss Elon Musk, who branded the Prime Minister ‘two-tier Keir’ and has suggested he has treated Muslims more favourably than white people during the riots, Ms Rayner said: ‘Well what I’d say is social media companies have responsibilities as well to deal with fake news.
‘We’ve seen a lot of fake information being shared on online platforms. We’ve seen a lot of hate.
‘And people have a responsibility not to conduct themselves and to amplify that, but actually to deal with the online misinformation, but also not to spread that hate and we don’t want to see that whether it’s online or offline.
‘And people who are online spreading violence and inciting violence will be met with the law as well as those that try and turn up and try and throw missiles at police.’
Mr Musk’s outburst came after one government minister branded him ‘deeply irresponsible’ for claiming ‘civil war is inevitable’ amid the riots.
His ‘two-tier’ jibe at the PM was a reference to a theory pushed by far-Right rioters that they are treated differently to Muslim activists.
The world’s richest man, who has almost 193 million followers on X, has now made more than 30 comments about immigration, crime, policing and politics in the UK since the Southport stabbings last Monday.
His platform has also been accused of refusing to delete racist comments while allowing controversial activists such as Tommy Robinson and Andrew Tate back on the site.
The false claim that a Muslim asylum seeker was the suspect in the Southport stabbings was first made on X by a British woman before going viral when it was spread by Russia-linked fake news website Channel3 Now.
Ofcom’s letter said it has spoken to ‘various online services to discuss their actions to prevent their platforms from being used to stir up hatred, provoke violence and commit other offences under UK law’.
Gill Whitehead, the regulator’s Group Director for Online Safety, wrote: ‘In a few months, new safety duties under the Online Safety Act will be in place, but you can act now – there is no need to wait to make your sites and apps safer for users.’
The letter added: ‘Under Ofcom’s regulations that pre-date the Online Safety Act, UK-based video-sharing platforms must protect their users from videos likely to incite violence or hatred.
‘We therefore expect video-sharing platforms to ensure their systems and processes are effective in anticipating and responding to the potential spread of harmful video material stemming from the recent events.’
The landmark Online Safety Act passed by the previous government will mean tech giants face multi-million-pound fines if they fail to protect users from illegal content such as messages provoking hatred, disorder or violence.
But the duties on the firms will not come into force for several months, as Ofcom is still consulting on its codes of practice and guidance.
Regulations which pre-date the Act mean some online platforms already have a duty to protect users from videos likely to incite violence or hatred and can be fined by Ofcom.
But they do not apply to most of the major platforms, such as X and Facebook; they cover only TikTok (which was recently fined £1.9 million by Ofcom), Twitch, Snapchat and Vimeo.
Ms Whitehead’s letter added: ‘We expect continued engagement with companies over this period to understand the specific issues they face and we welcome the proactive approaches that have been deployed by some services in relation to these acts of violence across the UK.’
It came as it emerged that a secretive government agency accused of ‘spying’ on anti-lockdown campaigners during the Covid-19 pandemic has been deployed to monitor social media amid the riots.
The Telegraph reported that the Counter Disinformation Unit (CDU), now rebranded as the National Security Online Information Team (NSOIT), has been asked to scour social media despite MPs only a few months ago calling for an independent review of its activities.
Campaigners have expressed concern over the agency playing a central role in the riot response despite outstanding questions over whether it is fit for purpose.
Meanwhile, a newly elected Labour MP accused Elon Musk of fanning the flames of hate over the riots, in the strongest condemnation yet of the maverick Twitter/X owner by a UK politician.
Andrew Lewin, who defeated former Defence Secretary Grant Shapps, wrote on the social media network: ‘Musk is using his platform to sow division and fan the flames of hate. It is indefensible.
‘We are now on a platform run by someone who talks up ‘civil war’, seemingly with relish.’
But he added: ‘It’s equally vital that mainstream voices are not drowned out by him, his bots & algorithms. I’m here for now, just less often.’
His comments came after Courts Minister Heidi Alexander called the world’s richest man ‘deeply irresponsible’ for claiming that civil war was inevitable in the UK.
Mr Musk has commented more than 30 times on crime, immigration and the riots in recent days, including branding the Prime Minister ‘two-tier Keir’ over the controversial suggestion that Muslims are given greater protection by police.