The Community Zuck Longs to Build Remains a Distant Dream

Mark Zuckerberg wants Facebook to build a global community. Easier said than done.

On February 16, Mark Zuckerberg published "Building Global Community," a 6,000-word open letter directly addressed to Facebook's users. "To our community," Zuckerberg begins. "On our journey to connect the world, we often discuss products we're building and updates on our business. Today I want to focus on the most important question of all: are we building the world we all want?"

Zuckerberg goes on to propose a solution that---in true Silicon Valley fashion---involves the very company he created. With a bolded flourish, he writes, "In times like these, the most important thing we at Facebook can do is develop the social infrastructure to give people the power to build a global community that works for all of us."

The goal is noble. Membership in real-life groups has declined in the US "by as much as one-quarter" since the 1970s, Zuckerberg writes in his letter. And while the disenchanted might say the lure of an ersatz world like Facebook actually contributes to the diminishment of IRL community, it's probably a fairer argument that community hasn't eroded so much as transferred to a different space. Specifically, to Facebook, which has emerged as our world's dominant social platform.

But Facebook’s desire to become the bedrock of social infrastructure is somewhat troubling. The United Nations defines civil society as the “third sector,” alongside government and business. In both of those spheres, transparency is a prerequisite. The government is accountable to its citizens. Businesses are accountable to their shareholders and to the government. Yet Facebook straddles a sort of gray space. During the 2016 presidential election, Facebook helped register 2 million of its users to vote and is often the platform where potential candidates interface with those users-slash-constituents. Facebook also sells information about the habits of its users to advertisers.

And that's why things got so complicated when *The Guardian* recently published details from a leaked copy of the manual that Facebook gives its thousands of "content moderators," the people who effectively monitor, police, and determine what we see in our Facebook feeds. What the document revealed is a deeply arbitrary set of guidelines that confuse the moderators who are helping to shape the civil society that millions of people rely on to, as Zuckerberg has put it, find meaning in their lives.

There's a lot in the specific rules that is problematic, but the biggest problem is that the guidelines were kept secret at all. Their secrecy appears to go against one of the very suggestions Zuckerberg outlined in his manifesto: "The Community Standards should reflect the cultural norms of our community," he wrote. "The approach is to combine creating a large-scale democratic process to determine standards with AI to help enforce them."

A spokesperson for Facebook tells WIRED that the company regularly consults with regulators, NGOs, academics, and advocates in areas like self-harm, terrorism, and free speech. But as was evident from the outcry over the *Guardian* story, not only did Facebook users seem left out of the democratic process, they didn't appear to know any such process existed at all.

This Land Is Facebook's

To its credit, Facebook has facilitated near-frictionless potential for positive community building: You can meet friends and connect with groups; you can publish ideas and broadcast videos; you can learn things, educate people, lobby lawmakers around the world.

But, in accordance with the manual, you can also threaten and bully. You can upload a picture of a dog being tortured. You can say, "To snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat." You can tell your friends to beat up "fat kids." You can show yourself killing someone. You can say you're going to kill yourself, but only if you plan to do it at least five days in the future.

You can't say you like seeing animals getting tortured. Or upload naked pictures of a woman without her permission. You can’t announce you are going to kill yourself right now. You can’t say someone should kill the president, or write, "We should put all foreigners in gas chambers." If you do, someone else on Facebook can notify the authorities---the Facebook authorities, aka moderators---and they, in turn, can shut down your account or even call actual law enforcement.

If it seems hard to find the organizing principle, that's because there appears not to be one.

"It seems so reactionary right now, rather than saying, 'Let's take a step back and ask what our moral aims are in this privately owned, very public space that we have,'" says ethicist Christopher Robichaud, a senior lecturer at Harvard's Kennedy School of Government, who described the guidelines as "haphazard and schizophrenic."

To a degree, the rules do feel reactionary, a slapdash necessity borne of urgency. Self-harm and suicide videos flood onto the site, often at a rate of 5,000 every two weeks, according to The Guardian. Last month a Cleveland man murdered a stranger and posted a video of it to Facebook. Two weeks later, a man killed his baby and then himself on Facebook Live. Since then, safety appears to have become Facebook's main focus. Zuckerberg reiterated this at F8, the company's developer conference, earlier this month: “We have a full road map of products to help build groups and community, help build a more informed society, and help keep our community safe,” he said.

But the guidelines for the company's content moderators are the first glimpse the public has had at what providing that safety actually looks like on a technical level. The Guardian notes that many of the moderators profess confusion about how to follow the manual. Take the suicide rules, for instance, which state that moderators should keep videos or posts up as long as there's a chance someone could save the person. "Once there's no longer an opportunity to help the person," they should take the livestream or post down. Why allow a suicidal threat to stay on the site for five days? It's probably an arbitrary time frame, says Daniel J. Reidenberg, executive director of the suicide prevention outreach group SAVE, which has helped Facebook for more than a decade with its suicide prevention tools and policies. But the idea is based on traditional suicide prevention techniques. "The concept behind it is that if you can see out into the future, that gives us hope and a chance to intervene," he says. Still, that's an incredibly difficult and subjective call to expect a content moderator, typically a low-paid employee with no professional experience in suicide prevention, to make. (And AI isn't far enough along to spot these horrific videos automatically.)

"They need a set of relatively straightforward organizing moral guidelines and principles that could then be applied to individual cases," Robichaud says. Without them, the laws of the land become contradictory and hard to follow.

In his manifesto, Zuckerberg lays out a plan for developing these guiding moral principles: he suggests asking the people. "The idea is to give everyone in the community options for how they would like to set the content policy for themselves," he wrote. "Where is your line on nudity? On violence? On graphic content? On profanity? What you decide will be your personal settings." Once again, Facebook's secretive practices and tightly held decision-making seem at odds with Zuckerberg's democratic vision.

This Land Is Our Land

It's not just community guidelines that feel opaque; Facebook appears to fight transparency at every turn. The company has yet to release the data it has on how Russian hackers abused its algorithms to spread propaganda during the 2016 presidential election. How is that consistent with fostering an “informed” society?

"For a private company like Facebook, which owns such an unfathomably large public space, to go about thinking about the ethics of it without bringing in the public at various stages would be really problematic," Robichaud says. He argues that Facebook should let the Facebook citizenry weigh in on the creation of the community by helping to write the guiding moral principles that would inform any moderator manual.

This makes sense. But when asked why the company didn't make the manual public, a Facebook spokesperson said it feared people would find workarounds to the rules. One answer to that problem would be to establish a core philosophy rather than reacting in a piecemeal way, and to be transparent about what that core philosophy is. It's good that Zuckerberg is engaging directly with his company's responsibilities in "building global community" and "strengthen[ing] our social fabric." It's also troubling that he is only now addressing this topic. Facebook has two billion users across the world. His company isn't building global community so much as it has already built one.

And if Facebook is intent on building a community that people increasingly live in, shouldn't its citizens get a look at the blueprints?