“Metaverse: another cesspool of toxic content,” a report published on May 24 by the nonprofit advocacy group SumOfUs, details a researcher’s violent encounter in Meta’s Horizon Worlds.
According to SumOfUs’s account, users invited the researcher to a private party in Horizon Worlds earlier this month. Users in the same room then asked her to disable a setting that prevents others from getting within four feet of her avatar.
The report linked to a video that the group said shows what happened to the researcher’s avatar from her perspective. In the 28-second clip, a male avatar gets very close to her while another male avatar stands nearby, watching, and a bottle of what appears to be alcohol is passed between the two. Two male voices can be heard making lewd comments.
In a portion of the video SumOfUs opted not to share but described, the researcher “was led into a private room at a party where she was raped by a user who kept telling her to turn around so he could do it from behind while users outside the window could see – all while another user in the room watched and passed around a vodka bottle,” per the report.
Even though the incident happened in virtual reality, it left the researcher “disoriented,” she said in the report. She noted that her controller vibrated when the male avatars touched her avatar, producing a physical sensation tied to what she was experiencing online.
“One part of my brain was like wtf is happening, the other part was like this isn’t a real body, and another part was like, this is important research,” she said in the report.
SumOfUs researchers also reported being targeted with homophobic and racist slurs in Horizon Worlds and said they witnessed gun violence on the platform.
Meta launched Horizon Worlds last December to users 18 and up in the US and Canada. By February, there were at least 300,000 users on the platform, according to The Verge.
Four other users have also recently claimed that their avatars were sexually assaulted or harassed in Horizon Worlds and other Meta VR platforms, according to the SumOfUs report.
In November, a beta tester reported that her avatar had been groped on Horizon Worlds.
At the time, Meta spokesperson Kristina Milian told the MIT Technology Review that users should have “a positive experience with safety tools that are easy to find and it’s never a user’s fault if they don’t use all the features we offer. We will continue to improve our UI and to better understand how people use our tools so that users are able to report things easily and reliably. Our goal is to make Horizon Worlds safe, and we are committed to doing that work.”
But the next month, metaverse researcher Nina Jane Patel disclosed in a Medium post that within 60 seconds of joining Horizon Worlds, her avatar was gang-raped by three to four male-looking avatars.
That same month, The New York Times reported that a female player’s avatar was groped on a Meta-owned shooter game. Separately, a player on the sports game Echo VR said a male player told her he had recorded her voice so that he could “jerk off” to her cursing.
SumOfUs and Meta did not immediately respond to Insider’s requests for comment. In response to the SumOfUs report, however, a Meta spokesperson told the Daily Mail that the company doesn’t recommend “turning off the safety feature with people you do not know.”
At least two major Meta investors expressed concern over emerging details of harassment and abuse on the company’s metaverse platforms
Meta has staked its future on building an immersive virtual-reality metaverse, plowing $10 billion into the project. CEO Mark Zuckerberg is playing the long game with his investment, recently saying the project could continue to lose money for the next three to five years, Insider reported.
However, at least two major Meta investors were alarmed by emerging details of harassment and abuse on its metaverse platforms.
In December, investors Arjuna Capital and Storebrand Asset Management, together with SumOfUs and several other advocacy organizations, co-filed a motion demanding that Meta publish a report that would examine the potential harms users could face on its metaverse platforms, they said in a press release.
“Investors need to understand the scope of these potential harms, and weigh in on whether or not this is a good idea before we throw good money after bad,” Arjuna Capital’s managing partner Natasha Lamb said in the release.
At Meta’s May 25 shareholder meeting, a proposal was introduced to commission a third-party assessment of “potential psychological and civil and human rights harms to users that may be caused by the use and abuse of the platform” and “whether harms can be mitigated or avoided, or are unavoidable risks inherent in the technology.”
However, the proposal was voted down.
Earlier this month, Nick Clegg, president for global affairs at Meta Platforms, said in a blog post that “the rules and safety features of the metaverse will not be identical to the ones currently in place for social media. Nor should they be.”
But, he continued, “In the physical world, as well as the internet, people shout and swear and do all kinds of unpleasant things that aren’t prohibited by law, and they harass and attack people in ways that are. The metaverse will be no different. People who want to misuse technologies will always find ways to do it.”