How social media will change under new Ofcom rules this year

New rules aim to make social media safer for children (PA Wire)

Ofcom has revealed when social media companies operating in the UK will need to begin complying with new rules designed to make platforms safer.

In December, the communications regulator will publish its first ‘illegal harms’ code and guidance, giving social media companies three months to comply with the measures.

The changes come as part of new powers given to Ofcom under the Online Safety Act, which was passed in 2023. The Department for Science, Innovation and Technology (DSIT) says the act will make the UK “the safest place in the world to be a child online”. It will require platforms to prevent children from accessing harmful or age-inappropriate content online.

The act also names Ofcom as the UK’s independent regulator of online safety, giving the telecoms watchdog new powers to enforce the rules it lays out.

Technology secretary Peter Kyle has asked Ofcom for an online safety response to the summer riots (PA Wire)

Ofcom chief executive Dame Melanie Dawes said: “The time for talk is over. From December, tech firms will be legally required to start taking action, meaning 2025 will be a pivotal year in creating a safer life online.

“We’ve already engaged constructively with some platforms and seen positive changes ahead of time, but our expectations are going to be high, and we’ll be coming down hard on those who fall short.”

A number of new criminal offences are laid out as part of the act: cyberflashing, intimate image abuse (revenge porn), and epilepsy trolling. “Threatening communications” and “sending false information intended to cause non-trivial harm” are also included.

The act also requires social media firms to take action against illegal content and activity. Both “racially or religiously aggravated public order offences” and “inciting violence” are included as types of illegal content.

Ofcom will be able to enforce the rules laid out in the Online Safety Act in a number of ways. Platforms will need to begin providing evidence to the watchdog of how they are meeting the set requirements. Ofcom will then evaluate and monitor these companies, before deciding whether to take action for non-compliance.

Companies that do not comply can be fined up to £18 million or 10 per cent of their worldwide revenue, whichever is greater. Criminal action can even be taken against senior managers who fail to ensure information requests from Ofcom are fulfilled.

Specifically on child-related offences, the watchdog will be able to hold companies and managers criminally liable for non-compliance.

In “extreme” cases, Ofcom will be able to require internet providers and advertisers to stop working with platforms, essentially banning them from operating in the country. This would be subject to agreement from the courts.

The regulator has laid out a timeline of milestones it wants to achieve over the next year, with more measures related to children’s safety to come. All measures should be finalised by July 2025.

Some experts recently criticised the UK’s social media laws as too lax, after online misinformation was found to have played a significant role in fuelling far-right riots in July and August.