Social media platforms, including Facebook and X (formerly Twitter), must adhere to UK laws, according to Science Secretary Peter Kyle. His statement follows Meta’s announcement that it will change its fact-checking rules in the United States, a move that has sparked concerns about content moderation and online safety.
Meta’s Rule Changes and UK Law
Meta CEO Mark Zuckerberg recently stated that the new policy in the US would result in fewer false positives but also reduce the moderation of harmful content. Kyle, addressing the issue on the BBC, emphasized that while the rule change applies only in the US, platforms operating in the UK must remove illegal content as required by law.
“If you come and operate in this country, you abide by the law, and the law says illegal content must be taken down,” Kyle said.
Call for Stronger Online Safety
Ian Russell, father of Molly Russell—a 14-year-old who tragically died after exposure to harmful online content—urged the UK government to strengthen internet safety rules. He criticized tech giants Meta and X for moving towards an “anything-goes model,” which he said could reintroduce harmful content similar to what his daughter encountered.
Despite these concerns, a Meta spokesperson said there had been no changes to how the company handles content promoting suicide, self-harm, or eating disorders, adding that automated systems will continue to scan for such high-risk content.
Gaps in the UK’s Online Safety Law
Critics have pointed out that the UK’s Online Safety Act has limitations, especially regarding live streaming and content promoting self-harm. Initially, the act included provisions to tackle “legal-but-harmful” content, such as posts encouraging eating disorders. However, this clause was removed for adult users after backlash over potential censorship concerns.
Conservative MP David Davis warned that the original proposal risked “the biggest accidental curtailment of free speech in modern history.” Instead, the law now requires platforms to offer users tools to filter unwanted content, while retaining protections for children.
Online Safety Act and Enforcement
The Online Safety Act, set to be enforced later this year, includes strict rules:
- Platforms must remove illegal content, such as child sexual abuse materials, incitements to violence, and posts promoting suicide.
- Companies must shield children from harmful material, including bullying, pornography, and dangerous stunts.
- Firms are expected to adopt “age assurance technologies” to prevent children’s exposure to harmful content.
- Platforms must combat illegal disinformation and state-sponsored propaganda.
Companies failing to comply will face “very strident” sanctions, Kyle warned.
Adapting to Emerging Technologies
Kyle stressed the importance of keeping legislation up to date with rapidly evolving technologies. He noted that ministers would soon gain powers to enforce age-appropriate content standards and said new laws could be introduced if needed.
“Parliament needs to get faster at updating laws to address emerging safety concerns,” he said, expressing openness to new regulations.
Conclusion
As the UK prepares to enforce the Online Safety Act, platforms like Facebook and X are under pressure to comply with stringent laws aimed at protecting users, particularly children. The ongoing debate highlights the delicate balance between promoting online safety and safeguarding free speech. For now, the UK government appears firm in holding tech giants accountable for the content on their platforms.