Online Safety Act will help ensure children like my daughter Sophie, 13, aren't driven to suicide by harmful content – Ruth Moss

Ofcom has put forward 40 practical actions for online services to lower the risk of children seeing harmful content, ahead of the Online Safety Act coming into full force

This week brought a pivotal moment in the effort to keep children safe on social media. It may have passed some people by, and others may feel it doesn’t apply to them. But it matters a lot, particularly to me.

In 2014, my beautiful daughter Sophie died by suicide. She was 13. As many parents who have lost a child will understand, my life changed irrevocably. Suicide is complex, but it wasn’t long after her death that I turned to Sophie’s internet use on her phone and tablet (which she needed for school).


What I found was toe-curling. Sophie had set up fake social media accounts and seen daily content around depression, pornography, self-harm, and suicide, and had chatted to strangers online, some of whom sent her semi-naked photos of themselves.

Many children have smartphones

I was overwhelmed. Where had I gone wrong? I had limited her access, had parental controls in place, set boundaries and talked about safe internet use. Yes, I had given her a smartphone at the age of 12, but this is not unusual.

Ninety-five per cent of 12 to 15-year-olds own a smartphone, and 53 per cent of eight to 12-year-olds have a social media presence, despite most social media platforms setting a minimum age requirement of 13. Controlling the output of multi-billion-dollar companies wasn’t possible on my own.


I’ve spent seven years campaigning with organisations such as the NSPCC and the National Suicide Prevention Leadership Group for a safer online environment for children. And on May 8, 2024, Ofcom laid out 40 practical actions for online services to lower the risk of children encountering harmful content, ahead of the Online Safety Act coming into full force.

End-to-end encryption

The proposals include mandating rigorous age-verification protocols, ensuring effective moderation, and removing algorithms that push harmful content. The legislation forces companies to enact robust safety precautions to mitigate the risks their websites pose to children. And when things go wrong, social media companies are required to have clear, easy-to-access complaints processes. These changes are due to be implemented next year.

Of course, no-one ever gets everything they want in life, and some gaps in the measures remain. For example, Sophie had set up an account on WhatsApp, which uses end-to-end encryption. First designed for military use to keep information secret, end-to-end encryption makes it impossible for tech firms to identify child abuse on their platforms, because even the companies themselves cannot break the security. With a 25 per cent rise in online child sex-abuse image offences recorded by UK police in 2022/23, there should be no place for end-to-end encryption on messaging sites used by children.

Individuals still have nowhere to go if they have exhausted the social media companies’ complaints procedures without success. Ofcom is the regulator, but not an ombudsman.

There are also risks associated with the legislation itself. The tech landscape changes quickly, and changes to legislation take time, so Ofcom and the government will need to be agile in responding to emerging threats. The Online Safety Act is only the beginning of keeping children safe online, and it will need to be adapted in the light of real-world testing. But across the UK, it is a positive step for children and parents, and I am cautiously optimistic about its impact. Now the hard work of enforcement begins.
