Katherine Cross, who researches online harassment at the University of Washington, says that when virtual reality is immersive and real, toxic behavior that occurs in that environment is real as well. “After all, virtual reality spaces are designed to trick the user into thinking they are physically in a certain space, and that their every bodily action is taking place in a three-dimensional environment,” she says. … “It is part of the reason why emotional reactions can be stronger in that space, and why VR triggers the same internal nervous-system and psychological responses.”
This was the case for the woman who was groped in Horizon Worlds. According to The Verge, her post read: “Sexual harassment is no joke on the regular internet, but being in VR adds another layer that makes the event more intense. Not only was I groped last night, but there were other people there who supported this behavior, which made me feel isolated in the plaza [the virtual environment’s central gathering space]…”
Sexual violence and harassment in virtual worlds are not new, and it is unrealistic to expect a world in which these problems disappear entirely. As long as there are people who hide behind their computer screens to evade moral responsibility, they will keep happening.
The real problem may lie in the perception that when you play a game or take part in a virtual world, there is what Stanton describes as a “developer-player contract.” “As a player, I agree that I can do whatever I want in the developer’s world according to their rules,” he says. “But the moment that contract breaks down and I no longer feel comfortable, the company has a responsibility to return the player to where they want to be and to feeling comfortable again.”
This raises the question: who is responsible for making sure users feel comfortable? Meta, for example, says it gives users access to tools to keep themselves safe, effectively shifting the burden onto them.
“We want everyone in Horizon Worlds to have a positive experience, with safety tools that are easy to find, and it is never a user’s fault if they do not use all the features we offer,” said Meta spokesperson Cristina Milian. “We will continue to improve our user experience and to better understand how people use our tools so that users can report things easily and reliably. Our goal is to make Horizon Worlds safe, and we are doing our best.”
Milian said that before joining Horizon Worlds, users must complete an onboarding process that teaches them how to activate Safe Zone. She also said that regular reminders are displayed on screens and posters inside Horizon Worlds.
But the fact that the victim of the groping in Horizon Worlds either did not think to use Safe Zone or could not access it is exactly the problem, says Cross. “The structural questions are the big issue for me,” she says. “Generally speaking, when companies address online abuse, their solution is to outsource it to the user and say, ‘Here, we give you the ability to take care of yourselves.’”
And that is unfair, and it does not work. Safety needs to be easy and accessible, and there are many ideas for how to make that possible. For Stanton, all it would take is some kind of universal signal in virtual reality, perhaps QuiVr’s power gesture, that could convey to moderators that something was wrong. Fox wonders whether automatically keeping a personal distance between two people would help unless they have agreed to come closer. And Cross believes it would be useful for trainings to clearly state norms that mirror those of everyday life: “In the real world, you would not casually grope someone, and you have to carry that over to the virtual world.”
Until we figure out whose job it is to protect users, one major step toward a safer virtual world is to punish aggressors, who often go unpunished and remain free to participate online even after their behavior becomes known. “We need constraints,” says Fox. That means making sure attackers are identified and suspended or banned. (When asked what happened to the alleged groper, Milian said that Meta “[doesn’t] share details about individual cases.”)
Stanton regrets that he did not push for large-scale adoption of the power gesture and was unable to say more about the incident with Belamire. “It was a missed opportunity,” he says. “We could have prevented this incident on Meta’s platform.”
If anything is clear, it is this: no body is directly responsible for the rights and safety of people who participate anywhere online, let alone in virtual worlds. Until something changes, the metaverse will remain a dangerous, problematic space.