On the evening of August 5, Aaron Johnson arrived at the Britannia Hotel in Stockport, northern England. The 32-year-old, from a nearby area, had come to see what he was calling a “migrant hotel.”

“This is civil unrest, you know what I’m saying,” he told viewers of his YouTube livestream, who would ultimately number more than 41,000 before the video was removed.

Over the next 25 minutes, Johnson circled the building, probing doors and windows while his audience on the Alphabet Inc.-owned video-sharing platform egged him on in the live chat. He filmed through windows into bedrooms, capturing one couple sleeping and harassing another.

“I’m uncomfortable in my own f******g country,” he said when confronted by hotel security guards. “These people are getting put up in a f******g hotel while our own people are on the streets.”

On Wednesday, Johnson’s video landed him in court, where he pleaded guilty to distributing a recording with intent to stir up racial hatred. He is set to be sentenced later this month.

Johnson is one of nearly a dozen people who have so far been charged with offenses related to online speech as far-right riots swept through UK cities after three girls were killed on July 29. Yet while the prosecutions themselves have materialized relatively quickly, the cases have raised questions about how quickly social media companies are able to act in cases of illegal incitement.

Johnson’s video was removed for violating YouTube’s harassment policies after Bloomberg News contacted the site for comment. A spokesperson for YouTube said it has teams “working around the clock to monitor for harmful footage.” A spokesman for Greater Manchester Police declined to comment.

Lawmakers have long criticized social media companies for their approach to moderating hate speech, but the recent events in the UK have put renewed focus on the platforms. Many of the posts that led to convictions had been taken down, but others with a similar tone and tenor have remained online, highlighting the inconsistent way in which these policies are applied.

Johnson is one of several people to face authorities’ scrutiny for posts over the last week. The day after he published his livestream, the Crown Prosecution Service announced that 28-year-old Jordan Parlour had been convicted over a series of posts on Meta Platforms Inc.’s Facebook advocating an attack on a hotel housing migrants in Leeds. The CPS said it was the first conviction for posting online in relation to the riots.

On Friday, Parlour was sentenced to 20 months’ imprisonment.

“Although you said you had no intention of carrying out active violence, there can be no doubt you were inciting others to do so, otherwise why post the comment?” Judge Guy Kearl told Parlour. “You expressed remorse, but by that time it was too late.”

“We are in contact with law enforcement and are supporting them in every way we can,” a spokesperson for Meta said.

The social media firms were already set to face new legislation, but there is now pressure to tighten it even further.

The UK’s Online Safety Act, a broad set of laws designed to protect people online, with penalties for social media companies that host dangerous content, was passed last year.
While law enforcement in the UK can already hold people accountable for what they say online, the act pushes liability onto the companies themselves for failing to moderate illegal content.

However, the law has yet to be fully implemented, with enforcement not expected until next year.

And even when those powers are in place, they won’t go far enough on misinformation, with the new legislation unlikely to be of use to police in incidents like the most recent riots, lawyers said.

“Whilst the Home Secretary may have said ‘if it’s a crime offline, it’s a crime online,’ and whilst that may be correct, the Online Safety Act provides no additional support to the pre-existing criminal law covering incidents of incitement of violence,” said Mark Jones, a lawyer at Payne Hicks Beach.

UK government officials are now considering revisiting key parts of the Online Safety Act, according to people familiar with the situation. The last government watered down the bill, removing language that would have regulated “legal but harmful” content in order to allay the concerns of free speech campaigners. London Mayor Sadiq Khan told the Guardian newspaper the act was “not fit for purpose” in its current form.

For cases like Parlour’s and Johnson’s, prosecutors are relying on the Public Order Act 1986, traditionally used against rioters who stir up disorder in person rather than online. The Crown Prosecution Service has managed to move these offenses through the backlogged courts more swiftly than usual with fast-tracked convictions.

It took just a day for Tyler Kay, 26, to be arrested and charged with intending to stir up racial hatred online after a series of anti-immigration posts on X, formerly Twitter. He pleaded guilty after an Aug. 7 post calling for hotels housing asylum seekers to be set alight and another inciting violence against an immigration lawyer. He was jailed for 38 months, prosecutors said.

Mainstream social networks such as Facebook have long maintained that they remove or block all forms of illegal content and explicit calls for violence, but fringe groups, such as those that helped propel the recent violence in the UK, have a long history of crafting their messages to skirt the platforms’ rules.

Prime Minister Keir Starmer on Friday warned that online content “is not a law-free zone” but conceded that, following the recent disorder, the government is “going to have to look more broadly” at social media. “People should be mindful of the first priority, which is to ensure that our communities are safe and secure,” he said.