UK’s internet watchdog finalizes first set of rules for Online Safety law

On Monday, the U.K.’s internet regulator, Ofcom, published the first set of final guidelines for online service providers subject to the Online Safety Act. This starts the clock ticking on the sprawling online harms law’s first compliance deadline, which the regulator expects to kick in in three months’ time.

Ofcom has been under pressure to move faster in implementing the online safety regime following riots in the summer that were widely perceived to have been fuelled by social media activity. However, the regulator has been following the process lawmakers set out, which requires it to consult on final compliance measures and have parliament approve them.

“This decision on the Illegal Harms Codes and guidance marks a major milestone, with online providers now being legally required to protect their users from illegal harm,” Ofcom wrote in a press release.   

“Providers now have a duty to assess the risk of illegal harms on their services, with a deadline of March 16, 2025. Subject to the Codes completing the Parliamentary process, from March 17, 2025, providers will need to take the safety measures set out in the Codes or use other effective measures to protect users from illegal content and activity.”

“We are ready to take enforcement action if providers do not act promptly to address the risks on their services,” it added.

According to Ofcom, more than 100,000 tech firms could be in scope of the law’s duties to protect users from a range of illegal content types — in relation to the over 130 “priority offences” the Act sets out, which cover areas including terrorism, hate speech, child sexual abuse and exploitation, and fraud and financial offences.

Failure to comply risks fines of up to 10% of global annual turnover (or up to £18 million, whichever is greater).

In-scope firms range from tech giants to “very small” service providers, with various sectors impacted including social media, dating, gaming, search, and pornography.

“The duties in the Act apply to providers of services with links to the UK regardless of where in the world they are based. The number of online services subject to regulation could total more than 100,000 and range from some of the largest tech companies in the world to very small services,” wrote Ofcom.

The codes and guidance follow a consultation process Ofcom has run since the legislation passed parliament and became law in October 2023, with the regulator drawing on research and stakeholder responses to shape the rules.

The regulator has outlined measures for user-to-user and search services to reduce risks associated with illegal content. Guidance on risk assessments, record-keeping, and reviews is summarized in an official document.

Ofcom has also published a summary covering each chapter in today’s policy statement.

The approach the U.K. law takes is the opposite of one-size-fits-all: generally, more obligations are placed on larger services, and on platforms where multiple risks may arise, than on smaller services with fewer risks.

However, smaller, lower-risk services do not get a carve-out from obligations either. Many requirements apply to all services, such as having a content moderation system that allows for swift takedown of illegal content; having a mechanism for users to submit content complaints; having clear and accessible terms of service; and removing the accounts of proscribed organizations. That said, many of these blanket measures are features that mainstream services, at least, are likely to offer already.

But it’s fair to say that every tech firm offering user-to-user or search services in the U.K. will need to assess how the law applies to its business, at a minimum, if not make operational revisions to address specific areas of regulatory risk.

For larger platforms with engagement-centric business models — where their ability to monetize user-generated content is linked to keeping a tight leash on people’s attention — greater operational changes may be required to avoid falling foul of the law’s duties to protect users from myriad harms.

A key lever to drive change is the law introducing criminal liability for senior executives in certain circumstances, meaning tech CEOs could be held personally accountable for some types of non-compliance.

Speaking to BBC Radio 4’s Today program on Monday morning, Ofcom CEO Melanie Dawes suggested that 2025 will finally see significant changes in how major tech platforms operate.

“What we’re announcing today is a big moment, actually, for online safety, because in three months time, the tech companies are going to need to start taking proper action,” she said. “What are they going to need to change? They’ve got to change the way the algorithms work. They’ve got to test them so that illegal content like terror and hate, intimate image abuse, lots more, actually, so that doesn’t appear on our feeds.”

“And then if things slip through the net, they’re going to have to take it down. And for children, we want their accounts to be set to be private, so they can’t be contacted by strangers,” she added.

That said, Ofcom’s policy statement is just the start of its work to implement the legal requirements, with the regulator still developing further measures and duties relating to other aspects of the law, including what Dawes described as “wider protections for children” that she said would be introduced in the new year.

So the more substantive child safety changes that parents have been clamouring for may not filter through to platforms until later in the year.

“In January, we’re going to come forward with our requirements on age checks so that we know where children are,” said Dawes. “And then in April, we’ll finalize the rules on our wider protections for children — and that’s going to be about pornography, suicide and self harm material, violent content and so, just not being fed to kids in the way that has become so normal but is really harmful today.”

Ofcom’s summary document also notes that further measures may be required to keep pace with tech developments such as the rise of generative AI, indicating that it will continue to review risks and may further evolve requirements on service providers.

The regulator is also planning “crisis response protocols for emergency events” such as last summer’s riots; proposals for blocking the accounts of those who have shared CSAM (child sexual abuse material); and guidance for using AI to tackle illegal harms.



