How can you keep your child safe online?




Tech firms will have to do more to protect young people from harmful content under new safety measures announced by the media regulator.

Ofcom’s own research found that 59% of 13- to 17-year-olds surveyed had seen “potentially harmful content” online in the previous month.

As part of implementing the Online Safety Act, the regulator has finalised a series of child safety rules which will come into force for social media, search and gaming apps and websites on 25 July 2025.


Ofcom says the rules will prevent young people from encountering the most harmful content relating to suicide, self-harm, eating disorders and pornography.

They are also designed to protect children from misogynistic, violent, hateful or abusive material, online bullying and dangerous challenges.

Firms which wish to continue operating in the UK must adopt more than 40 practical measures, including “highly effective” age checks to stop children seeing pornography and other harmful material, safer algorithms that filter harmful content out of children’s feeds, faster removal of harmful content, and a named person accountable for children’s safety.

Failure to comply could result in businesses being fined up to £18m or 10% of their global revenue, whichever is greater, or their executives being jailed.

In very serious cases Ofcom says it can apply for a court order to prevent the site or app from being available in the UK.

A number of campaigners want to see even stricter rules for tech firms, and some want under-16s banned from social media completely.


Ian Russell, chairman of the Molly Rose Foundation – which was set up in memory of his daughter who took her own life aged 14 – said he was “dismayed by the lack of ambition” in the codes.



The Duke and Duchess of Sussex are also calling for stronger protection from the dangers of social media, saying “enough is not being done”.


They unveiled a temporary memorial in New York City dedicated to children who have died due to the harms of the internet. “We want to make sure that things are changed so that… no more kids are lost to social media,” Prince Harry told BBC Breakfast.

The NSPCC children’s charity argues that the law still doesn’t provide enough protection for private messaging apps. It says that the end-to-end encrypted services which they offer “continue to pose an unacceptable, major risk to children”.

On the other side, privacy campaigners complain that the new rules threaten users’ freedom.

Some also argue age verification methods are invasive without being effective enough. Digital age checks can lead to “security breaches, privacy intrusion, errors, digital exclusion and censorship,” according to Silkie Carlo, director of Big Brother Watch.


The Act also requires firms to show they are committed to removing illegal content, including child sexual abuse material, content promoting or facilitating suicide, extreme sexual violence, and the sale of illegal drugs or weapons.


The Act has also created new offences, such as cyber-flashing (sending unsolicited sexual images) and sharing “deepfake” pornography.

Children aged eight to 17 spend between two and five hours online per day, according to Ofcom research.


It found that nearly every child over 12 has a mobile phone and almost all of them watch videos on platforms such as YouTube or TikTok.



About half of children over 12 think being online is good for their mental health, according to Ofcom.


However, the Children’s Commissioner said that half of the 13-year-olds her team surveyed reported seeing “hardcore, misogynistic” pornographic material on social media sites. Children also said material about suicide, self-harm and eating disorders was “prolific” and that violent content was “unavoidable”.


The NSPCC says it’s vital that parents talk to their children about internet safety and take an interest in what they do online.


Two-thirds of parents say they use controls to limit what their children see online, according to Internet Matters, a safety organisation set up by some of the big UK-based internet companies.

It has a list of parental controls available and step-by-step guides on how to use them.


These include advice on how to manage teen or child accounts on social media, video platforms such as YouTube, and gaming platforms such as Roblox or Fortnite.

However, Ofcom data suggests that about one in five children are able to disable parental controls.


Instagram has already introduced “teen accounts” which turn on many privacy settings by default – although some researchers have claimed they were able to circumvent the promised protections.


Phone and broadband networks may block some explicit websites until a user has demonstrated they are over 18.

Some also have parental controls that can limit the websites children can visit on their phones.

Android and Apple devices also offer options for parents to block or limit access to specific apps, restrict explicit content, prevent purchases and monitor browsing.


Game console controls also let parents ensure age-appropriate gaming and control in-game purchases.

