The metaverse already has a groping problem


Katherine Cross, who researches online bullying at the University of Washington, says that when virtual reality is immersive and real, the toxic behavior that occurs in that environment is real too. “At the end of the day, the nature of virtual reality spaces is such that it is designed to fool the user into believing that they are physically in a certain space, that their every bodily action is occurring in a 3D environment,” she says. “It’s part of the reason why emotional reactions can be stronger in that space, and why VR triggers the same internal nervous system and psychological responses.”

That was true of the woman who was groped in Horizon Worlds. According to The Verge, her post read: “Sexual harassment is no joke on the regular internet, but being in VR adds another layer that makes the event more intense. Not only was I groped last night, but there were other people there who supported this behavior, which made me feel isolated in the Plaza [the virtual environment’s central gathering space].”

Assault and sexual harassment in virtual worlds are not new, nor is it realistic to expect a world in which these problems disappear completely. As long as there are people who hide behind their computer screens to evade moral responsibility, they will continue to occur.

The real problem, perhaps, has to do with the perception that when you play a game or participate in a virtual world, there is what Stanton describes as a “developer-player contract.” “As a player, I agree that I can do what I want in the developer’s world according to their rules,” he says. “But as soon as that contract is broken and I no longer feel comfortable, the company’s obligation is to return the player to where they want to be and to make them feel comfortable again.”

The question is: Whose responsibility is it to make sure that users are comfortable? Meta, for example, says it gives users access to tools to stay safe, effectively transferring responsibility to them.

“We want everyone in Horizon Worlds to have a positive experience with safety tools that are easy to find, and it’s never a user’s fault if they don’t use all of the features we offer,” said Meta spokeswoman Kristina Milian. “We will continue to improve our user interface and to better understand how people use our tools so that users are able to report things easily and reliably. Our goal is to make Horizon Worlds safe, and we are committed to doing that work.”

Milian said users must go through an onboarding process before joining Horizon Worlds that teaches them how to launch Safe Zone. She also said regular reminders are loaded onto screens and posters within Horizon Worlds.

Screenshots of the Safe Zone interface, courtesy of Meta

But the fact that the victim of the groping on Meta’s platform either did not think to use Safe Zone or was unable to access it is precisely the problem, says Cross. “The structural issue is the big problem for me,” she says. “Generally speaking, when companies address online abuse, their solution is to outsource it to the user and say, ‘Here, we give you the power to take care of yourselves.’”

And that’s unfair, and it doesn’t work. Safety should be easy and accessible, and there are plenty of ideas for making this possible. For Stanton, all it would take is some kind of universal signal in virtual reality, perhaps Quivr’s V-gesture, that could convey to moderators that something was wrong. Fox wonders whether a default personal distance between avatars, unless two people mutually agreed to be closer, would help. And Cross thinks it would be useful for training sessions to explicitly set out norms that mirror those that prevail in everyday life: “In the real world, you wouldn’t randomly touch someone, and you should carry that over into the virtual world.”

Until we figure out whose job it is to protect users, one important step toward a safer virtual world is disciplining aggressors, who are often left blameless and remain eligible to participate online even after their behavior becomes known. “We need deterrents,” Fox says. That means making sure bad actors are found and suspended or banned. (Milian said Meta “[doesn’t] share details about individual cases” when asked what happened to the alleged groper.)

Stanton regrets not pushing harder for the whole industry to adopt the V-gesture and not speaking out more about the Belamire groping incident. “It was a missed opportunity,” he says. “We could have prevented that incident at Meta.”

If one thing is clear, it is this: there is no body clearly responsible for the rights and safety of those who participate anywhere online, let alone in virtual worlds. Until something changes, the metaverse will remain a dangerous, problematic space.


www.technologyreview.com
