The Government must tackle the “alarming proliferation of deepfake pornography” in Britain, a Guardian journalist has said.
Deepfake pornography uses AI tools to generate pornographic images of real people. Lucia Osborne-Crowley warned that such images are easy to create, and that search engines and social media platforms are failing to tackle the degrading content.
Sharing deepfake pornography and ‘downblousing’ images, as well as cyberflashing, became criminal offences earlier this year, but creating such content remains legal.
Social media
Osborne-Crowley urged the Government to tackle the issue in all its forms, making it impossible to evade a ban on creating deepfakes by claiming it was “a joke” or by placing digital stickers over an image.
In addition, she warned that such pornography is being “sold openly on social media”, including through advertisements on Instagram, TikTok and X.
The journalist explained: “If the government made deepfakes a criminal offence, the regulator would be forced to act.
“Our new prime minister has already made it clear that his government is all about change. Let’s hope that protecting the victims of sexual abuse and stemming the tide of deepfake pornography is part of this.”
Labour
Earlier this year, a Labour Party think tank called for a ban on “nudification” tools.
In a policy paper containing proposals to tackle ‘misinformation’, Labour Together said Ofcom should fine web companies and search engines that do not take reasonable steps to prevent “harmful deepfakes”.
Peter Kyle, now Secretary of State for Science, Innovation and Technology but then in the Shadow Cabinet, said the Labour Party would consider the recommendation amid the “deeply concerning” rise of such technology.
The Institute worked with Parliamentarians to improve protections for children from pornography in the Online Safety Act, ahead of it coming into law in October 2023.