How Ofcom’s new online safety changes will protect your child - new rules explained

Children’s social media feeds could soon look a little different 📱
  • Ofcom has announced more than 40 new child safety changes tech companies will have to make
  • It has given websites, apps, and social media platforms until late July to make the required changes
  • If they don’t comply, they may face fines or even more serious penalties
  • The changes may give parents more peace of mind that their children aren’t being exposed to harmful content

Social media platforms will soon have to update their ‘recommended’ content algorithms to better protect young users - just one of many new legally-required changes tech companies now face.

Last week, Ofcom - the UK’s communications regulator - finalised a host of new child safety measures, which websites and apps will have to introduce this July under the Online Safety Act. The watchdog’s chief executive Dame Melanie Dawes described it as an online “reset” for children.


“[The changes] will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content,” she said in a statement. “Ofcom has been tasked with bringing about a safer generation of children online, and if companies fail to act they will face enforcement.”

But what exactly is going to change, and what will it mean for young people and their families? Here’s what you need to know:

What new rules will websites and apps have to follow?

By July 24, all providers of services likely to be accessed by children in the UK must carry out a risk assessment of the dangers their platform could pose to young users. Ofcom will be able to request this at any time. Then from the following day (July 25), they will need to implement safety measures to mitigate these risks.

The full list includes more than 40 measures tech companies will have to take, and is published in full on Ofcom’s website.

Ofcom's collection of new law changes are aimed at keeping children safer online | (Image: National World/Adobe Stock/Getty)

But some of the most important ones include creating safer feeds - by configuring their algorithms to filter out harmful content from being recommended to children. Others include carrying out effective age checks to identify which users are children; having processes in place to take quick action once they become aware of harmful content; making it straightforward even for young users to report this content; and giving children more control over their online experience.

This includes allowing them to indicate what content they don’t like, to accept or decline group chat invitations, to block and mute certain accounts, and to disable comments on their own posts, Ofcom says. If children do search for potentially harmful content themselves - like eating disorder, self-harm or bullying-related content - the site or app will need to signpost them to places they can get support.

Companies will also have to appoint a named person who will be accountable for children’s safety, as well as creating a senior team to carry out annual risk management reviews.


What will happen if they don’t follow these rules?

If companies fail to comply with their new duties to protect younger users, they may be penalised. Ofcom has the power to impose fines of up to £18 million or 10% of a company’s qualifying worldwide revenue, whichever is greater.

In very serious cases, however, the regulator has another option up its sleeve. Ofcom is able to apply for a court order, which may prevent the offending site or app from being available in the UK altogether.

What will the changes mean for families?

Ofcom said that young internet users told it seeing violent content online was “unavoidable”, and that self-harm and eating disorder content was “prolific”. Among young people who had seen pornography online, the average age of first exposure was just 13. Three in ten 8 to 12-year-olds had seen something online they found worrying, while more than half of 11 to 14-year-old boys had engaged with influencers tied to the “manosphere”.

For parents, the rule changes should bring some peace of mind that their children will be better protected from harmful content while using social media or browsing the web. This may be especially reassuring to parents of older children, who often use the internet without supervision.


Potential further changes that could help parents better regulate their child’s social media use have also been floated. The UK’s technology secretary Peter Kyle recently said that he’d be watching the impact of TikTok’s new ‘wind down’ feature with interest. The feature will discourage under-16s from using the app after 10pm, and was introduced alongside new parental controls - including the ability to set customisable daily screen time limits, and a ‘time away’ feature to give children a break from the app.

Mr Kyle confirmed that he was looking into what the next steps should be after Ofcom’s new rules were rolled out. “I’m not going to act on something that will have a profound impact on every single child in the country without making sure that the evidence supports it, but I am investing in [researching] the evidence.”

