
Regulator Ofcom to have more powers over UK social media



Image caption: Ofcom currently only regulates the media, not internet safety

New powers will be given to the watchdog Ofcom to force social media firms to act over harmful content.

Until now, firms like Facebook, TikTok, YouTube, Snapchat and Twitter have largely been self-regulating.

The companies have defended their own rules about taking down unacceptable content, but critics say independent rules are needed to keep people safe.

It is not yet clear what sanctions Ofcom will be able to impose on firms that fail to tackle violence, cyber-bullying and child abuse.

There have been widespread calls for social media firms to take more responsibility for their content, especially after the death of Molly Russell who took her own life after viewing graphic content on Instagram.

Later on Wednesday, the government will officially announce the new powers for Ofcom – which currently only regulates the media, not internet safety – as part of its plans for a new legal duty of care.

Ofcom will have the power to make tech firms responsible for protecting people from harmful content such as violence, terrorism, cyber-bullying and child abuse – and platforms will need to ensure that content is removed quickly.

They will also be expected to “minimise the risks” of it appearing at all.

Image caption: Molly Russell’s family found she had been accessing distressing material about depression and suicide on Instagram

“There are many platforms who ideally would not have wanted regulation, but I think that’s changing,” said Digital Secretary Baroness Nicky Morgan.

“I think they understand now that actually regulation is coming.”

New powers

Communication watchdog Ofcom already regulates television and radio broadcasters, including the BBC, and deals with complaints about them.

This is the government’s first response to the Online Harms consultation it carried out in the UK in 2019, which received 2,500 replies.

The new rules will apply to firms hosting user-generated content, including comments, forums and video-sharing – that is likely to include Facebook, Snapchat, Twitter, YouTube and TikTok.

The intention is that government sets the direction of the policy but gives Ofcom the freedom to draw up and adapt the details. By doing this, the watchdog should have the ability to tackle new online threats as they emerge without the need for further legislation.

A full response will be published in the spring.

Children’s charity the NSPCC welcomed the news.

“Too many times social media companies have said: ‘We don’t like the idea of children being abused on our sites, we’ll do something, leave it to us,'” said chief executive Peter Wanless.

“Thirteen self-regulatory attempts to keep children safe online have failed.

“Statutory regulation is essential.”

Image caption: Seyi Akiwowo set up the campaign group Glitch after experiencing online harassment

Seyi Akiwowo set up the online abuse awareness group Glitch after she suffered sexist and racist harassment when a video of her speaking in her role as a councillor was posted on a neo-Nazi forum.

“When I first suffered abuse the response of the tech companies was below [what I’d hoped],” she said.

“I am excited by the Online Harms Bill – it places the duty of care on these multi-billion pound tech companies.”

Global regulation

In many countries, social media platforms are permitted to regulate themselves, as long as they adhere to local laws on illegal material.

Germany introduced the NetzDG Law in 2018, which states that social media platforms with more than two million registered German users have to review and remove illegal content within 24 hours of being posted or face fines of up to €5m (£4.2m).

Australia passed the Sharing of Abhorrent Violent Material Act in April 2019, introducing criminal penalties for social media companies, jail sentences of up to three years for tech executives, and fines of up to 10% of a company’s global turnover.

China blocks many western tech giants including Twitter, Google and Facebook, and the state monitors Chinese social apps for politically sensitive content.


