Meta allegedly fails to address gender-based violence in its metaverse
"Misogyny in the metaverse: is Mark Zuckerberg’s dream world a no-go area for women?", 10 June 2025
Everybody knows that young women are not safe. They are not safe in the street, where 86% of those aged 18 to 24 have experienced sexual harassment. They are not safe at school, where 79% of young people told Ofsted that sexual assault was common in their friendship groups and almost a third of 16- to 18-year-old girls report experiencing “unwanted sexual touching”. They are not safe in swimming pools or parks, or at the beach. They are not even safe online, with the children’s safety charity the NSPCC reporting that social media sites are “failing to protect girls from harm at every stage”.
This will come as no surprise to any woman who has ever used social media. But it is particularly relevant as Meta, the operator of some of the biggest social platforms on the internet, is busily engaged in constructing a whole new world. The company is pumping billions of dollars a year into building its metaverse...
Mark Zuckerberg has grandly promised: “In the metaverse, you’ll be able to do almost anything you can imagine.” It’s the sort of promise that might sound intensely appealing to some men and terrifying to most women.
Indeed, the deeply immersive nature of the metaverse will make the harassment and abuse so many of us endure daily in text-based form on social media feel 100 times more real and will simultaneously make moderation 100 times more difficult. The result is a perfect storm. And I am speaking from experience, not idly speculating: I spent days in the metaverse researching my book, The New Age of Sexism.
...
Less than two hours after I first entered the metaverse, I saw a woman’s avatar being sexually assaulted. When I approached her to ask her about the experience, she confirmed: “He came up to me and grabbed my ass.”
“Does that happen a lot?” I asked.
“All the time,” she replied, wearily.
I used my haptic controller to “pick up” a bright-yellow marker and moved towards a giant blackboard. “HAVE YOU BEEN ASSAULTED IN THE METAVERSE?” I wrote.
The response was near instantaneous. “Yeah, many times,” someone shouted.
“I think everybody’s been assaulted in the damn metaverse,” one woman replied immediately, in a US accent.
“Unfortunately, it is too common,” a British woman added, nodding.
Both women told me they had been assaulted multiple times.
During my time in the metaverse, sexual harassment and unwanted sexual comments were almost constant. ...My virtual breasts were commented on repeatedly. I did not witness any action taken in response – whether by a moderator or by another player.
A damning TechCrunch report from 2022 found that human moderators were available only in the main plaza of Meta’s metaverse game Horizon Worlds – and that they seemed more engaged in giving information on how to take a selfie than moderating user behaviour.
More worryingly still, I visited worlds where I saw what appeared to be young children frequently experiencing attention from adult men they did not know. In one virtual karaoke-style club, the bodies of the singers on stage were those of young women in their early 20s. But based on their voices, I would estimate that many of the girls behind the avatars were perhaps nine or 10 years old. By contrast, the voices of the men commenting on them from the audience, shouting out to them and following them offstage were often unmistakably those of adults.
It is particularly incumbent on Meta to solve this problem. Of course, there are other companies, from Roblox to Microsoft, building user-generated virtual-reality gaming platforms and virtual co-working spaces. But, according to NSPCC research, although 150 different apps, games and websites were used to groom children online between 2017 and 2023, 47% of online grooming offences (where the means of communication was known) took place on products owned by Meta.
These are not isolated incidents or cherry-picked horror stories. Research by the Center for Countering Digital Hate (CCDH) found that users were exposed to abusive behaviour every seven minutes in the metaverse. Across 11 and a half hours of recorded user behaviour, the researchers identified 100 potential violations of Meta’s policies, including graphic sexual content, bullying, abuse, grooming and threats of violence.
In a separate report, the CCDH found repeated instances of children being subjected to sexually explicit abuse and harassment, including an adult asking a young user: “Do you have a cock in your mouth?” and another adult shouting: “I don’t want to cum on you,” to a group of underage girls who explicitly told him they were minors.
Since its inception, Meta’s virtual world has been plagued with reports of abuse. Users have reported being virtually groped, assaulted and raped. Researchers have also described being virtually stalked in the metaverse by other players, who tail them insistently, refuse to leave them alone and even follow them into different rooms or worlds.
In December 2021, a beta tester of the metaverse wrote in the official Facebook group of the Horizon platform: “Not only was I groped last night, but there were other people there who supported this behaviour.”
What was even more revealing than the virtual assault itself was Meta’s response. Vivek Sharma, then vice-president of Horizon at Meta, responded to the incident by telling the Verge it was “absolutely unfortunate”. After Meta reviewed the incident, he claimed, it determined that the beta tester didn’t use the safety features built into Horizon Worlds, including the ability to block someone from interacting with you. “That’s good feedback still for us because I want to make [the blocking feature] trivially easy and findable,” he continued.
This response was revealing. First, the euphemistic description of the event as “unfortunate”, which made it sound on a par with poor sound quality. Second, the immediate shifting of the blame and responsibility on to the person who experienced the abuse – “she should have been using certain tools to prevent it” – rather than an acknowledgment that it should have been prevented from happening in the first place. And, finally, most importantly, the description of a woman being abused online as “good feedback”.
...
When it was revealed in 2024 that British police were investigating the virtual gang-rape of a girl below the age of 16 in the metaverse, a senior officer familiar with the case told the media: “This child experienced psychological trauma similar to that of someone who has been physically raped”.
Second, technology to make the metaverse feel physically real is developing at pace. ...
But most importantly, regardless of how similar to or different from physical offline harms these forms of abuse are, what matters is that they are abusive, distressing, intimidating, degrading and offensive and that they negatively affect victims. And, as we have already seen with social media, the proliferation of such abuse will prevent women and girls from being able to fully use and benefit from new forms of technology.
If Zuckerberg’s vision comes to fruition and the boardrooms, classrooms, operating theatres, lecture halls and meeting spaces of tomorrow exist in virtual reality, then closing those spaces off from women, girls and other marginalised groups, because of the tolerance of various forms of prejudice and abuse in the metaverse, will be devastating. If we allow this now, when the metaverse is (relatively speaking) in its infancy, we are baking inequality into the building blocks of this new world.
At the time of the aforementioned virtual-reality rape of an underage girl, Meta said in a statement: “The kind of behaviour described has no place on our platform, which is why for all users we have an automatic protection called personal boundary, which keeps people you don’t know a few feet away from you.”
In another incident, when a researcher experienced a virtual assault, Meta’s comment to the press was: “We want everyone using our services to have a good experience and easily find the tools that can help prevent situations like these and so we can investigate and take action.”
The focus always seems to be on users finding and switching on tools to prevent harassment or reporting abuse when it does happen. It is not on preventing abuse and taking serious action against abusers.
But of the 100 potential violations of Meta’s VR policies identified in the CCDH research, just 51 could be reported to Meta using the web form the platform created for this purpose, because Meta refuses to examine policy violations it cannot match to a predefined category or username in its database.
Worse, not one of those 51 reports of policy violations (including sexual harassment and grooming of minors) was acknowledged by Meta, and as a result no action was taken. ...
...
The Guardian invited Meta to reply to this article, but the company did not respond.