Catherine Cross, who studies online harassment at the University of Washington, says that because virtual reality is immersive and feels real, the toxic behavior that occurs in that environment is real too. “Ultimately, the nature of virtual reality spaces is designed to trick the user into thinking they are physically in a particular space, that their every bodily action takes place in a 3D environment,” she says. “That is part of the reason why emotional reactions can be stronger in that space, and why VR triggers the same internal nervous system and psychological responses.”
This was true of the woman who was groped in Horizon Worlds. According to The Verge, her post read: “Sexual harassment is no joke on the regular internet, but being in VR adds another layer that makes the event more intense. Not only was I groped last night, but there were other people there who supported this behavior, which made me feel isolated in the Plaza [the virtual environment’s central gathering space].”
Sexual violence and harassment in virtual worlds are not new, nor is it realistic to expect a world in which these problems disappear completely. As long as there are people who hide behind their screens to evade moral responsibility, such incidents will keep happening.
The real problem may lie in the perception that when you play a game or join a virtual world, there is what Stanton describes as a “developer-player contract.” “As a player, I agree to play in the developer’s world and do what I want according to their rules,” he says. “But once that contract is broken and I no longer feel comfortable, it is the company’s duty to return the player to where they want to be and back to a comfortable position.”
The question is: whose responsibility is it to ensure that users feel comfortable? Meta, for example, says it gives users access to tools to protect themselves, effectively shifting the burden of safety onto them.
“We want everyone at Horizon Worlds to have a positive experience with safety tools that are easy to find – and it’s never the user’s fault if they don’t use all the features we offer,” said Meta spokeswoman Christina Milian. “We will continue to improve our user interface and to better understand how people use our tools so that users can report things easily and reliably. Our goal is to make Horizon Worlds safe, and we are committed to doing that.”
Milian said users must go through an onboarding process before joining Horizon Worlds that teaches them how to launch the Safe Zone. She also said regular reminders appear on screens and posters within Horizon Worlds.
But the fact that the groping victim either did not think to use the Safe Zone or could not access it is exactly the problem, says Cross. “The structural issue is the big problem for me,” she said. “Generally speaking, when companies address online abuse, their solution is to outsource it to the user and say, ‘Here, we give you the power to take care of yourself.’”
And that is unfair and does not work. Safety must be easy and accessible, and there are many ideas on how to make it so. For Stanton, all it would take is some kind of universal signal in virtual reality – perhaps Quivr’s V gesture – that can tell moderators something is wrong. Fox wonders whether automatic personal distance, unless two people agree to be closer to each other, would help. And Cross said it would be useful for training sessions to explicitly set out norms that mirror those of ordinary life: “In the real world, you wouldn’t randomly touch someone, and that should carry over to the virtual world.”
Until we figure out whose job it is to protect users, one of the main steps toward a safer virtual world is disciplining aggressors, who often remain free and eligible to participate online even after their behavior becomes known. “We need deterrents,” Fox said. That means making sure bad actors are found and suspended or banned. (When asked what happened to the alleged groper, Milian said Meta doesn’t share details on individual cases.)
Stanton regrets not having pushed for industry-wide adoption of the power gesture, and not having spoken more about the groping incident involving Belamir. “It was a missed opportunity,” he said. “We could have avoided that incident at Meta.”
If one thing is clear, it is this: no body is clearly responsible for the rights and safety of those who participate online anywhere, let alone in virtual worlds. Until something changes, the metaverse will remain a dangerous, problematic space.