Digital Parenting | 09 Dec 2022

Social media on trial: Who’s protecting our kids?

The Channel 4 drama I Am Ruth, starring Kate Winslet and her daughter Mia Threapleton, shone a light on the damage that an obsession with social media can wreak on teenagers' mental health. We explore what more social media companies, and the Government, should be doing to protect our children.

What is my child seeing on social media? Do I really know? Questions anxious parents often ask themselves.

“Time spent online can be an important part of a child’s development, but it can also pose real-life dangers, including bullying and exploitation, sexual abuse and grooming and exposure to pornography and self-harm forums,” warns Barnardo’s Chief Executive, Lynn Perry MBE.

“New technologies must be designed and developed with children’s safety and wellbeing in mind, to avoid harmful content appearing on these sites.”

She isn’t the only one saying that enough is enough.

When the inquest into the death of 14-year-old Molly Russell released its findings in October, senior coroner Andrew Walker said that the material she viewed on social media “shouldn’t have been available for a child to see”.

Molly ended her own life in November 2017 after viewing suicide and self-harm content online.

Right now “there are very few mechanisms on social media to prevent even very young people from being drawn into worlds of behaviours and thought processes that are very alien to them,” says Emma Citron, consultant clinical psychologist and spokesperson for the British Psychological Society.

The Government’s Online Safety Bill, currently before the House of Commons, aims to change that. So what concrete measures do the experts believe governments and social media platforms should be taking to protect their youngest users?

Age verification

“There are several important things we believe must be included in the Bill to protect children,” says Ms Perry.

“Firstly, age verification for pornographic content must be robust, and sites must be required to implement independent third-party age assurance.”

Governments play a role here, too, she believes: “Statutory guidelines should be developed and implemented to ensure this.”

Richard Collard, NSPCC Child Safety Online Policy Manager, agrees: “Social media sites need to understand which of their users are children if they are going to be effective in protecting them.

“Age assurance measures which use a range of methods to determine if a user is a child can be an effective and reliable way to understand this, and they should be being implemented across the board.”

Pornography

The next step? “Once platforms have identified child users on their site, it’s then essential that they make sure that they are serving them age-appropriate content,” says Mr Collard.

In October, Ofcom published new research showing that, while some companies – including TikTok and Snapchat – have already made some positive changes to protect users from harmful content online, many adult video platforms were still accessible to children, who only have to tick a box claiming to be 18.

Dame Melanie Dawes, Ofcom’s Chief Executive, called it “deeply concerning to see yet more examples of platforms putting profits before child safety.”

Barnardo’s is also concerned. “The Online Safety Bill is an opportunity for the Government to introduce legislation which would require pornography websites to introduce age verification, and help to prevent millions of young children from stumbling across harmful videos online,” says Ms Perry.

Self-harm

More than half of parents are concerned about their children encountering content online that might encourage them to harm themselves.

“The duty to protect children from legal but harmful content, such as material promoting self-harm and suicide, must remain in the Bill, so that algorithms do not drive vulnerable children and young people to this content,” says Ms Perry.

“Even if a child or teen has searched for ‘feeling better’ or ‘mindfulness’ – quite innocent subjects – algorithms can bring up darker aspects of mental health like self-harm and suicide,” explains psychologist Emma Citron.

“This can legitimise self-harm and actually lead to the youngster’s mental health being made worse as they get drawn into a cycle of negative statements and negative feelings.”

Mia Threapleton in I Am Ruth
Children are often exposed to unsuitable content online fed to them by social media algorithms

Youth violence

“It’s hard to switch off from any conflicts, unpleasantness or nasty comments from peers when going home, as youngsters tend to be drawn to respond quickly to comments from friends, even late at night and throughout the evening,” says Ms Citron.

In fact, a recent report accuses both the Government and the tech sector of a “collective blind spot” when it comes to the relationship between youth violence and social media which, it says, often amplifies offline conflicts between kids.

Researchers worked with Thames Valley Police’s violence reduction unit as well as the parents of Olly Stephens, a 13-year-old from Reading who was stabbed to death following a dispute on social media.

They concluded that Ofcom should oversee an official five-star rating system, indicating how safe social media platforms are for children. The system would help parents and carers make more informed decisions about whether to allow their children access to sites.

The report also said that police, schools and children’s services should be better equipped with the resources and knowledge necessary to protect children from harm online.

Next steps

“The Online Safety Bill must be delivered without any further delay,” says Mr Collard.

Lynn Perry agrees: “These vital changes must come back to Parliament urgently,” she says, “because the truth is children cannot afford to wait any longer.”

Stay up-to-date with the latest news from Vodafone by following us on Twitter and signing up for News Centre website notifications.