SOCIAL MEDIA & THE U.K. RIOTS

What Is the Online Safety Act and Why Have Riots in the U.K. Reopened Debates About It?
Social media played a key role in the widespread coordination of riots in locations across the country. Online platforms have also served as a vehicle through which misinformation and hateful rhetoric have spread.
Recent rioting and unrest in the UK have led to calls for the Online Safety Act to be revisited. London Mayor Sadiq Khan has called it “not fit for purpose”, and Cabinet Office minister Nick Thomas-Symonds has suggested that the government could change the law. The act, passed under the previous government, includes a raft of measures relevant to the recent riots, including powers to fine social media companies.
Prime Minister Keir Starmer has been less forthcoming about the act and has said only that he would “look more broadly at social media after this disorder”. His spokesperson suggested the act was not under active review.
The act, enforced by the independent communications regulator Ofcom, governs online speech and aims to protect users from potential harms including abuse and harassment, fraudulent activity and hate offences.
Specifically, it seeks to place more responsibility on social media companies to ensure their platforms are safe, with providers whose platforms are deemed unsafe facing fines of up to 10% of their annual revenue.
In more extreme cases, Ofcom has the power to require advertisers and internet service providers to cease working with platforms that do not comply with the regulations. The act passed into law in October 2023, and its provisions creating individual offences are already in force. For example, it is now an offence to share false information with an intention to cause non-trivial psychological or physical harm.
However, much of the frustration in the wake of the riots stems from the fact that parts of the act are not due to come into effect until late 2024. These include enforcement powers and other measures that Ofcom could apply to social networks and other providers, such as online forums and instant messaging services. This raises questions as to what might have been different in the past 14 days had those powers already been in place.
Algorithm Concerns
A key concern has been the way recommendation algorithms on social networking platforms, which decide what content users are shown, may have propagated harmful content in relation to the riots, including racist, hateful and violent material.
For example, people were found to be using TikTok to live-stream footage of the riots as they unfolded.