Social media giants that fail to tackle the spread of child abuse on their platforms face multimillion-pound fines, and could be blocked in the UK, under new plans.
The Online Safety Bill will introduce stiff financial penalties for companies that fail to crack down on images of abuse, content glorifying terrorism or self-harm, ministers have confirmed.
But plans to introduce a criminal offence for senior executives at companies that flout the rules appear to have been scaled back.
Labour have said the plans leave tech giants “to mark their own homework” and don’t go far enough to protect the vulnerable.
From April 2017 to June 2020, police in England and Wales recorded 14,145 Sexual Communication with a Child offences.
One in 25 young people have sent, received or been asked to send sexual content to an adult, according to an NSPCC survey of more than 2,000 young people.
Total online child sex offences recorded by police increased by 16% last year, to 10,391.
Under the new rules, which the Government will bring forward in an Online Safety Bill next year, Ofcom – in its newly confirmed role as regulator – will have the power to fine companies up to £18 million or 10% of global turnover, whichever is higher, for failing to abide by a duty of care to their users, particularly children and the vulnerable.
It will also have the power to block non-compliant services from being accessed in the UK. The Government said it would also reserve the right to impose criminal sanctions on senior executives, powers it says it would not hesitate to bring into force through additional legislation if firms fail to take the new rules seriously.
The proposed legislation will apply to any company in the world hosting user-generated content online which is accessible by people in the UK or enables them to interact with others online.
Peter Wanless, NSPCC CEO, said: “This is a landmark moment – the NSPCC have long called for an enforceable legal Duty of Care on tech companies and today is a major step towards legislation that can make that a reality. For too long children have been exposed to disgraceful abuse and harm online.
“Child protection and children’s voices must remain front and centre of regulatory requirements. We set out six tests for robust regulation – including action to tackle both online sexual abuse and harmful content and a regulator with the power to investigate and hold tech firms to account with criminal and financial sanctions.
“We will now be closely scrutinising the proposals against those tests. Above all, legislation must ensure Ofcom has the power and resources to enforce the duty of care and be able to identify and then take appropriate action against tech firms that fail.”
A small group of high-profile platforms will face tougher responsibilities under a two-tier system, with Facebook, TikTok, Instagram and Twitter to be placed in Category 1 as the companies with the largest online presences and most features deemed high-risk.
In addition to being required to take steps to address illegal content and activity, and to provide extra protections for children who access their services, firms in this group will be asked to assess what content or activity on their platform is legal but could pose a risk of harm to adults, and to clarify what “legal but harmful” content they see as acceptable in their terms and conditions.
Home Secretary Priti Patel said: “We will not allow child sexual abuse, terrorist material and other harmful content to fester on online platforms. Tech companies must put public safety first or face the consequences.”
The scope of the new legislation will not include online articles and comment sections, as part of efforts to protect freedom of speech.
The Government said it was also working with the Law Commission on whether the promotion of self-harm should be made illegal.
Labour’s Shadow Culture Secretary Jo Stevens said: “Yet again the government is leaving social media companies to mark their own homework which they know, and we all know, doesn’t work.
“Failing to bring in criminal sanctions seems to be an accommodation with senior social media executives rather than a redressing of the balance of power in the interest of citizens and their safety.
“Far from being world-leading, this is disappointingly unambitious.”
Anne Longfield, Children’s Commissioner for England, said the Bill needed to be introduced in Parliament “as soon as possible” next year.
She said: “The signs are that this regulation will have teeth, including strong sanctions for companies found to be in breach of their duties, and a requirement on messaging apps to use technology to identify child abuse and exploitation material when directed to by the regulator.
“However, much will rest on the detail behind these announcements, which we will be looking at closely.
“It is now essential that the Online Safety Bill is introduced into Parliament as soon as possible in 2021, so that children can enjoy all the benefits of the online world while being kept safe from harm.”