‘Porn is fuelling sexual violence against women’, MP warns

The Government must not ignore pornography’s contribution to violence against women, a Labour MP has said.

In a debate on commercial sexual exploitation, Tonia Antoniazzi MP said there is “extensive evidence” that “pornography consumption fuels sexual violence”. She warned that it “serves to dehumanise and sexually objectify women”, while distorting viewers’ understanding of sexual behaviour.

The MP for Gower urged ministers to “rein in the lawless activities of pornography sites”, several of which host illegal material, by ensuring that regulation is consistent both online and offline.

‘Online brothels’

Antoniazzi highlighted that prostitution is another form of “violence against women”, but that gaps in the law allow pimping websites to carry online adverts that would be illegal to place in a phone box.

She explained: “They function like mass online brothels, making it as easy to order women to exploit as it is to order a takeaway.”

The MP called for the laws to be updated for the “21st century”, concluding: “This Government must shift power out of the hands of punters, pimps and pornographers, and place it into the hands of women and girls.”

‘Emergency’

Jess Phillips, Parliamentary Under-Secretary of State for the Home Department, agreed that the Government “must not rest” until it has tackled the “national emergency” of violence against women and girls.

Although the Minister said that any policies would be announced in due course, she did pledge to “build on the Online Safety Act 2023 to ensure that online companies fulfil their duty to eradicate this exploitation from their sites”.

Phillips highlighted that she has been involved in “many cases where this has not been handled well at all”, including images of herself that have not been removed from pornographic websites.

The Institute worked with Parliamentarians to improve protections for children from pornography in the Online Safety Act, before it came into law in October 2023.

Deepfakes

Earlier this month, a Guardian journalist urged the Government to tackle the “alarming proliferation of deepfake pornography” in Britain.

Deepfake pornography uses AI tools to generate pornographic images of real people. Lucia Osborne-Crowley warned that such content is easy to create, and that search engines and social media platforms are failing to tackle the degrading material.

Sharing deepfake pornographic, cyberflashing or ‘downblousing’ images became a criminal offence earlier this year, but creating such content remains legal.

Also see:

Pornhub shuts down in more US states after refusing to protect kids

Social media network X allows users to share pornography

Labour think tank backs ban on AI ‘nudification’ software
