Social media platforms must take steps to protect all users from illegal content or face hefty fines under measures introduced this week.
According to Government guidance, the provisions of the Online Safety Act require “all companies to take robust action against illegal content and activity”, including child sexual abuse, extreme pornography, and the sale of illegal drugs.
Ofcom, appointed by the Government to regulate the Act, has the power to fine companies that breach their duties up to £18 million or 10 per cent of their qualifying worldwide revenue, “whichever is greater”.
Safety first
Announcing the changes on Monday, Technology Secretary Peter Kyle said: “In recent years, tech companies have treated safety as an afterthought. That changes today.
“This is just the beginning. I’ve made it clear that where new threats emerge, we will act decisively. The Online Safety Act is not the end of the conversation, it’s the foundation.”
But the NSPCC has said new laws are already needed to deal with AI, which is being used to generate images of young people for sextortion and bullying.
Under 18s
From January, under guidelines issued by Ofcom to implement the Online Safety Act, sites that host pornography are required to “deploy” age-verification systems considered “highly effective”.
Speaking to LBC Radio recently, the watchdog’s Chief Executive Dame Melanie Dawes said it was “very clear that our under-18s deserve a very different experience than the one they’re getting now”.
This means, she said, “no pornography, no suicide and self-harm material, and significant down ranking of things like violent content, misogyny, racist content, and so on”.
She added: “Parents need to be part of this. Children can do things to keep themselves safe. But above all, I want the platforms to make the service safer.”