Social media giants to be bound by code of conduct to help protect children's privacy online

The ICO has published a new Code of Practice to help protect children online [Photo: Getty]

Social media firms will have to adhere to a new code, which aims to help protect children’s privacy online.

The UK's data regulator has published a set of standards which it hopes will force tech companies to take protecting children online seriously.

Drawn up by the Information Commissioner's Office (ICO), the Age Appropriate Design Code covers everything from apps to connected toys, social media platforms and online games, and even educational websites and streaming services.

Not only will companies be expected to prioritise the data protection of young people, but firms such as Facebook, Google and other tech giants will also be prevented from serving children any content that is “detrimental to their physical or mental health or wellbeing.”

It is hoped the code will come into effect by the autumn of 2021, following approval from parliament, and any breaches could incur large fines.


The code will require digital services to automatically provide children with a built-in baseline of data protection whenever they download a new app or game, or visit a website.

It lays out 15 different principles that sites, apps and other online services likely to have users under 18 in the UK must follow.

The provisions include setting privacy settings to high by default, switching off location settings by default, minimising data collection and sharing, and switching off by default the profiling that can allow children to be served targeted content.

Commenting on the Age Appropriate Design Code, Elizabeth Denham, the Information Commissioner, said future generations will be “astonished to think that we ever didn't protect kids online.”

“Personal data often drives the content that our children are exposed to – what they like, what they search for, when they log on and off and even how they are feeling,” she said.

“In an age when children learn how to use an iPad before they ride a bike, it is right that organisations designing and developing online services do so with the best interests of children in mind. Children’s privacy must not be traded in the chase for profit.”

The code says that the best interests of the child should be a primary consideration when designing and developing online services.

“One in five internet users in the UK is a child, but they are using an internet that was not designed for them,” Ms Denham continued.

“There are laws to protect children in the real world – film ratings, car seats, age restrictions on drinking and smoking. We need our laws to protect children in the digital world too.

“In a generation from now, we will look back and find it astonishing that online services weren’t always designed with children in mind.”

Andy Burrows, the NSPCC’s head of child safety online policy, said: “This transformative code will force high-risk social networks to finally take online harm seriously and they will suffer tough consequences if they fail to do so.

“For the first time, tech firms will be legally required to assess their sites for sexual abuse risks, and can no longer serve up harmful self-harm and pro-suicide content. It is now key that these measures are enforced in a proportionate and targeted way.”


The new code has been designed to offer children more protection online [Photo: Getty]

News of the new code comes a week after it was argued that social media companies such as Facebook and Instagram should be forced to hand over data about who their users are and why they use the sites, in an attempt to reduce suicide among children and young people.

The report, from the Royal College of Psychiatrists, is backed by the grieving father of Molly Russell, who died aged 14 in 2017 and was found to have viewed harmful content online before her death by suicide.

Last year Instagram revealed it was banning graphic images of self-harm after Health Secretary Matt Hancock said social media companies “need to do more” to curb their impact on teenagers’ mental health.

A Government spokesman said: “We are developing world-leading plans to make the UK a safer place to be online. This includes a duty of care on online companies, overseen by an independent regulator with tough enforcement powers, to hold them to account.

“The regulator will have the power to require transparency reports from companies outlining what they are doing to protect people online. These reports will be published so parents and children can make informed decisions about their internet use.”

Last year it was revealed that teenagers who spend more than three hours a day on social media may be at higher risk of mental health problems.

Findings from a study of 6,595 youngsters aged 12 to 15 in the US showed that those who used social media more heavily were more likely to report issues such as depression, anxiety and loneliness, as well as aggression and anti-social behaviour, than teenagers who did not use social media.