Opinion

Metaverse harassment doesn’t bode well for Zuck’s dreams


Published: 24 Feb 2022 07:57 PM | Updated: 24 Feb 2022 07:58 PM

When Mark Zuckerberg described the metaverse last year, he conjured an image of harmonious social connections in an immersive virtual world. But his company’s first iterations of the space have not been very harmonious.

Several women have reported incidents of harassment, including a beta tester who was virtually groped by a stranger and another who was virtually gang-raped within 60 seconds of entering Facebook’s Horizon Venues social platform. I had several uncomfortable moments with male strangers on social apps run by both Meta Platforms Inc. (formerly known as Facebook Inc.) and Microsoft Corp. when I visited them in December.

These are early days for the metaverse, but that's precisely the problem. If safety isn't baked into the design of a new social platform early on, it becomes much harder to secure down the line.

Gaming firms such as Riot Games Inc., the maker of League of Legends, have faced an uphill battle trying to rescue a virtual community from toxic behavior. Facebook also knows this problem well: It struggled to put the proverbial toothpaste back in the tube with Covid vaccine misinformation, as a whistleblower highlighted last year. If the company had been quicker to design a prompt asking users to stop and read certain links before sharing them, it might have kept anti-vax posts from going viral and costing lives.

It turns out Facebook has grappled internally with building safety features into its new metaverse services. In 2016, it released Oculus Rooms, an app where anyone with an Oculus headset could hang out in a virtual apartment with friends and family. Despite the cartoonish-looking avatars, meeting a friend in Rooms was a captivating experience. But the service was destined to remain niche, with so few people in possession of a headset. The way to encourage more growth was to bring together headset owners who didn’t know each other, according to Jim Purbrick, who was an engineering manager on Facebook’s VR products between 2017 and 2020.

In 2017, Facebook built Oculus Venues (now Horizon Venues), a virtual space where it would show films or professional sports games in the hope that visitors would mingle and make connections. It was a critical shift, strengthening the company’s ability to grow its new VR platform — but also opening it up to new risks.

Engineers and product managers began holding meetings to discuss how they might design safety features into Venues, Purbrick recalls, something the team hadn't done when designing Rooms. Managers did pay significant attention to safety, he tells me; people had to watch a safety video before entering Venues, for instance. But there were some noticeable problems. Only one in 10 engineers and one in four product managers working on virtual reality at Facebook were women, according to Purbrick. Meta said it didn't break out demographics by team.

Purbrick also warned engineers early on that avatars should fade out and disappear when they got too close to another user, to help address the unsettling sensation of having someone zoom up within millimeters of your own avatar. They liked the idea, he says, but it was never implemented. “Everyone had too much to do,” Purbrick recalls.

A spokeswoman for Facebook didn’t say why the firm had not implemented a fade-to-disappear feature, and instead highlighted its new “personal boundary” tool, which prevents certain avatars from coming within a radius of two virtual feet of your own.

The boundary tool can backfire, Purbrick says, pointing to how similar features have been misused in gaming. “You can end up with gangs of people creating rings around others, making it difficult for them to move out,” he says. “If there’s a big crowd and you have a bunch of personal boundaries, it makes navigation harder.” Meta said avatars would still be able to move forward with the boundary tool.

“Oculus definitely cared about people having good experiences in VR and understood that a bad first experience could put people off VR forever, but I think they underestimated the size of the problem,” Purbrick adds.

He believes Meta should make safety features easier for users to find — like a fire extinguisher — and work on introducing volunteers who can officially help monitor behavior, potentially even becoming a new branch of governance for the platform. Gaming, despite its history of toxic behavior, has some helpful templates: The role-playing game EVE Online, for instance, has a council of 10 elected volunteers who meet with the game’s developers to regularly discuss problems on the platform.

Until now, Meta has centralized the task of moderating content on Facebook, but it will struggle to take the same approach with millions of inhabitants of a new virtual world.

The company has "the most centralized decision-making structure" that one early backer says they have ever encountered in a large company, a description underscored by Mark Zuckerberg's control of 57% of its voting shares. But virtual worlds are human communities at their core, which means people will want more of a say in how they are run. Relinquishing some of that central control could ultimately help Meta mitigate harassment.

Educating visitors about what constitutes potentially criminal behavior would, too. Holly Powell Jones, a criminologist and lecturer at City, University of London, has found that an alarming number of children and teenagers shrug off harassment or the sharing of indecent images because they have no idea these are criminal offenses. People have "almost certainly" been harassed at a criminal level in virtual reality already, she says.

“Harassment in digital spaces is nothing new, and it’s something we and others in the industry have been working to address for years,” Meta’s spokeswoman said. “We’re constantly shipping new features based on people’s feedback.”

With police already stretched by cases from social media and the offline world, technology companies should try more radical solutions to address harassment in the metaverse now, before it becomes entrenched. The dearth of women in the development process for virtual reality certainly isn't helping, and it could be fixed.

Facebook’s Horizon Venues and Microsoft’s AltspaceVR have both given visitors vague warnings about behavior before entry — reminding them that they must treat other avatars as human beings, for instance. But Microsoft on Wednesday announced a more drastic move to combat harassment: It was shutting down several of its platforms for socializing with strangers, including Campfire; muting all attendees when they joined an event; and activating “safety bubbles” as a default.

That shows Microsoft is serious about stopping toxicity from becoming the norm. Facebook should consider a similarly bold overhaul of its socializing platforms for the metaverse. Otherwise, the dream that Zuckerberg conjured will never go beyond being just that.


Parmy Olson is a Bloomberg Opinion columnist. Source: Bloomberg