Kids exposed to ‘massive grooming problem’ in VR platforms

Children are ‘undeniably’ at risk of sexual abuse in virtual reality (VR) spaces, researchers have warned.

In the NSPCC report ‘Child Safeguarding and immersive technologies: An outline of the risks’, authors Catherine Allen and Verity McIntosh warned that although they had expected only to identify future risks, the “reality is that the threat is present and is likely to grow”.

The report, which gathered data through interviews, focus groups and a literature review, noted that some “offender behaviours discussed by our interview participants seem more like something from a harrowing dystopian sci-fi world than activities happening in the present”.

‘Stupidly easy’

One VR user was quoted as stating: “There’s a massive grooming problem in multi-user VR platforms, a lot of grooming situations: paedophiles, dating age gaps…people lying about their ages.”


In an interview with a police investigator, one person convicted of possessing child sex abuse materials (CSAM) claimed: “The link between having a VR headset and going down the rabbit hole to a server is stupidly easy”, before adding, “You don’t have to be technically adept.”

The researchers also found a lack of age verification on VR platforms, meaning children under 16 years old can access areas offering “erotic role play”.

Online Safety Bill

NSPCC’s Chief Executive Officer Peter Wanless stated: “It is clear that the risks children are experiencing when using immersive technology are no longer on the horizon. They are happening now.”

“As the Online Safety Bill completes its passage through Parliament, this paper shows how important it is that new and emerging technology is within scope of the legislation. The new online safety regulatory regime needs to be future-proofed, so children are protected from new and emerging harms.”

But the report highlighted that in the gap before the Online Safety Bill is implemented “law enforcement must be given extra resources to implement the existing laws that already apply to these spaces”.

This echoes recent remarks from The Christian Institute’s Ciarán Kelly, who emphasised that the Government shouldn’t “put off to tomorrow what can be done today”.

Search engines

In Australia, search engines such as Google will be required to remove results that link to child abuse material.

eSafety Commissioner Julie Inman Grant said: “We are seeing ‘synthetic’ child abuse material come through. Terror organisations are using generative AI to create propaganda. It’s already happening. It’s not a fanciful thing. We felt it needed to be covered.”

The new safety code forms part of Australia’s Online Safety Act, which covers the regulation of restricted content such as child sex abuse material and online pornography.
