Online games and social networks are already hotbeds of bad behavior, but virtual reality offers an even higher degree of potential harm — and experts are urging VR innovators to build in more safeguards from the ground up. That includes building in deterrents to bad behavior and giving users more control over their own environments and interactions with others.
Virtual harassment is no joke
Video game players are used to getting physically attacked by monsters or other players as part of the game. But when those attacks become personal or sexual, take place outside a gaming environment on a social platform, or follow the victim across multiple channels or offline, the situation can become uncomfortable, scary, or even genuinely dangerous in real life.
“Virtual reality makes emotional and physical action more intense; things are heightened in virtual reality,” Pooky Amsterdam, CEO of PookyMedia, told Hypergrid Business.
For example, when the virtual avatar mirrors the user’s own hand motions, facial expressions, or full body movements, the feeling of being immersed in the virtual environment is intensified, she said. And so is the experience of harassment.
Game designer Patrick Harris showed a video demonstrating in-game virtual harassment during the GDC 2016 conference in March that included obscene gestures and invasion of a female player’s personal space that left the conference audience in “stunned, dismayed silence,” Polygon reported.
“It is intense, it is visceral [and] it triggers your fight or flight response,” said Harris, who works for game studio Minority Media. “They can lean in and touch your chest and groin and it’s really scary.”
Other developers have reported that it is common to see male avatars swarming around female avatars, groping 3D models of female users, and shaking controllers around their crotch areas while making sexually suggestive comments.
A male avatar rubbing his chest while making suggestive comments, though intended as playful banter with other male players, created a tense and uncomfortable experience for video game company executive Renee Gittins.
“While I found the experience to be extremely unpleasant, it did not reduce my interest in virtual reality gaming,” Gittins told Hypergrid Business. “Of course, I can’t say that I am the best example for dealing with the effects of such an experience, because I am one of the women who have continued gaming while battling sexism and harassment in games for the past decade and a half. There are plenty of women who would have reduced interest in virtual reality after such an experience, just as I have had many female friends drop out of gaming over the years. I just happen to be a particularly stubborn breed.”
Widespread harassment, especially on virtual social networks, could directly hurt a company’s bottom line by costing it female users.
“The freedom to create, to make friends and to express oneself is what draws so much talent and energy to virtual reality,” said PookyMedia’s Amsterdam. “A climate where people are also free from harassment encourages more community and productivity.”
Many security options available for developers
Proactive companies have a variety of options for curbing virtual harassment before it starts, responding to it after it happens, and helping users protect themselves.
Platforms can allow users to easily report instances of harassment, filter offensive or abusive language, automatically put offenders on notice and then ban repeat offenders. Reporting functionality could include recordings of what occurred before and after the report was made.
That will not completely eliminate harassment, said Amsterdam, but it will definitely make players stop and think.
The worst offenders, those who are seriously dedicated to harassing others, may find ways around the automated controls. But many will change their behavior in order to continue using the platforms.
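The report-then-escalate flow described above can be sketched in a few lines. This is a minimal illustration, not any platform's actual moderation system; the thresholds and class names are assumptions chosen for the example.

```python
from collections import defaultdict

WARN_THRESHOLD = 3   # assumed: reports before an offender is put on notice
BAN_THRESHOLD = 5    # assumed: reports before a repeat offender is banned


class ModerationLog:
    """Toy model of a report/warn/ban pipeline for a virtual platform."""

    def __init__(self):
        self.reports = defaultdict(list)  # offender id -> list of report reasons
        self.warned = set()
        self.banned = set()

    def report(self, offender_id, reason):
        """Record one harassment report and escalate when thresholds are crossed.

        A real platform would also attach recordings of what happened
        before and after the report, as suggested above.
        """
        self.reports[offender_id].append(reason)
        count = len(self.reports[offender_id])
        if count >= BAN_THRESHOLD:
            self.banned.add(offender_id)
            return "banned"
        if count >= WARN_THRESHOLD:
            self.warned.add(offender_id)
            return "warned"
        return "logged"
```

In practice the escalation step would notify human moderators rather than act fully automatically, but even this simple structure shows how repeat offenses can be tracked per user.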
Giving users defensive capabilities will also help.
Features that people can use to protect themselves in a virtual environment include the option to mute or ignore other users, said Amsterdam. Event organizers could have the power to allow only invited guests, or to eject those who behave badly.
Users can also be given the power to control who can see their avatars, to make other users invisible, to hide gestures or body motions, and to create bubbles of personal space that other users cannot enter, suggested Gittins.
“Obviously not all of these options work for all games, especially competitive multiplayer games, but they do help,” she said. “I have talked with many developers to help ensure that the virtual reality space and interactions can be controlled by the user so that harassment can be shut down immediately. Hopefully, these will reduce the occurrence and effects of these negative experiences.”
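The personal-space bubble idea reduces to a simple distance check against each nearby avatar. The sketch below, with an assumed radius and data layout, hides any avatar that enters the bubble; a real engine would more likely fade the intruder out or clamp its position at the boundary.

```python
import math

BUBBLE_RADIUS = 1.2  # assumed personal-space radius, in meters


def distance(a, b):
    """Euclidean distance between two (x, y, z) positions."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))


def visible_avatars(my_pos, others, radius=BUBBLE_RADIUS):
    """Return only the avatars outside the user's personal-space bubble.

    `others` is a list of dicts with a "pos" key holding an (x, y, z)
    tuple -- an assumed layout for this illustration.
    """
    return [o for o in others if distance(my_pos, o["pos"]) >= radius]
```

Because the check runs per user, one player's bubble hides intruders only from that player's own view, leaving everyone else's experience unchanged.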
One platform with plenty of experience in dealing with virtual harassment is Second Life. Amsterdam hosts The 1st Question, a science-based quiz show in Second Life, which has been a target of attacks from in-world harassers known as “griefers.”
For example, in one case, the set had to be closed so that only invited guests were allowed to enter the region.
“To keep people away from the set itself we had an invisible barrier constructed so those who weren’t on the show were held back,” said Amsterdam.
In addition, the “particle” functionality was turned off for avatars in the area, and a second set was ready as a backup at an undisclosed location in case the griefers continued their attack from a neighboring region.
“There are certainly ways to contain these kinds of willful distractions,” she said.