Social media platforms and other online services operating in the UK are facing new regulations. Ofcom, the UK’s communications regulator, has published more than 40 safety measures that in-scope organizations must implement by mid-March 2025. The new guidance follows last year’s passage of the Online Safety Act, which introduced new protections for children and adults online; Ofcom’s role under the Act includes issuing codes of practice and compliance guidance for the companies it covers.
Ofcom’s new measures tackle areas such as fraud, moderation and child sexual abuse material (CSAM). Online services must take steps like nominating a senior person who is accountable for compliance with their duties around illegal content, complaints and reporting. Moderation teams must be “appropriately” trained and resourced well enough to remove illegal content quickly. Plus, relevant companies, such as social media platforms, should improve their algorithms to limit the spread of illegal content.
The regulator’s required anti-CSAM safety practices include hiding children’s profiles and locations, preventing unknown accounts from messaging children, and using hash-matching and URL detection to find and take down CSAM quickly.
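Ofcom’s codes don’t prescribe a particular implementation, but the general idea behind hash-matching and URL detection is straightforward: compare an uploaded file’s digest, or a shared link, against a blocklist maintained by a vetted body. The sketch below is a minimal illustration of that idea, with hypothetical blocklist entries; real deployments use perceptual hashes (such as Microsoft’s PhotoDNA) sourced from organizations like the Internet Watch Foundation, which can match re-encoded or resized copies, whereas a plain SHA-256 only catches byte-identical files.

```python
import hashlib

# Hypothetical blocklist of hex digests. In practice these come from a
# vetted third-party database (e.g., the IWF), never hard-coded like this.
KNOWN_BAD_HASHES = {
    "d2f31e5e93f3placeholder",  # illustrative placeholder entry only
}

# Hypothetical URL blocklist for the URL-detection side of the check.
BLOCKED_URLS = {
    "example.com/known-abuse-path",
}


def file_digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()


def is_known_csam(data: bytes) -> bool:
    """Exact-match check of the upload's digest against the hash blocklist."""
    return file_digest(data) in KNOWN_BAD_HASHES


def is_blocked_url(url: str) -> bool:
    """Naive URL detection: normalize the link, then compare to the blocklist."""
    normalized = (
        url.lower().removeprefix("https://").removeprefix("http://").rstrip("/")
    )
    return normalized in BLOCKED_URLS


if __name__ == "__main__":
    print(is_known_csam(b"example upload"))                        # False
    print(is_blocked_url("https://example.com/known-abuse-path"))  # True
```

The exact-match digest here is only a stand-in: the point of perceptual hashing in production systems is precisely that trivial edits to an image don’t change the match, which a cryptographic hash can’t provide.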
Ofcom consulted with the tech industry, charities and parents, among other groups. It also heard from children about their horrifying experiences of receiving predatory messages online, along with their views on the new regulations. “As an evidence-based regulator, every response has been carefully considered, alongside cutting-edge research and analysis, and we have strengthened some areas of the codes since our initial consultation,” Ofcom stated in its release. “The result is a set of measures — many of which are not currently being used by the largest and riskiest platforms — that will significantly improve safety for all users, especially children.”
The Online Safety Act’s duties apply to “organizations big and small, from large and well-resourced companies to very small ‘micro-businesses.’ They also apply to individuals who run an online service,” Ofcom states. The scope gets a bit vague, though, with Ofcom adding that a business must have a “significant number” of UK users or have the UK as a target market. The Act covers “user-to-user services,” such as social media, online gaming and dating sites, as well as “search services” and online businesses that publish pornographic content.
Ofcom has the power to fine non-compliant sites £18 million ($22.7 million) or 10 percent of their qualifying global revenue, whichever is greater — so a platform with, say, £1 billion in qualifying revenue could face a fine of up to £100 million. In “very serious cases,” Ofcom can seek a court order to block a site’s access in the UK. Ofcom plans to release further guidance across the first half of 2025.