Facebook’s Deadly Impact



On the morning of August 25, a self-proclaimed militia group on Facebook called Kenosha Guard put out a public call for people to “take up arms” and defend the city of Kenosha, Wisconsin, from “evil thugs” — that is, people protesting the police shooting of Jacob Blake, an unarmed Black man, two days earlier. Responses rolled in throughout the day, including ones like, “I fully plan to kill looters and rioters tonight,” according to an investigation published Thursday by BuzzFeed News.

Horrified Facebook users flagged the event to the company at least 455 times, BuzzFeed reported, and Facebook moderators replied that it didn’t violate the platform’s rules. They left it up, and that evening, two protesters were shot and killed after clashes with militia-style groups of young men with rifles. (It isn’t yet clear whether the gunman was inspired by the Facebook group specifically or heard about the event through some other channel.)

Facebook CEO Mark Zuckerberg called the moderation decision an “operational mistake” made by contractors. The Kenosha Guard page and event did violate Facebook’s rules, he said, and should have been taken down.

The Pattern

There’s no longer any excuse for outsourcing content moderation.

  • Zuckerberg is right that there’s a serious problem with Facebook’s content moderation processes. But it’s not a mistake. It’s a policy decision that Facebook made long ago, at the highest levels, to outsource the vast, traumatic, nigh-impossible task of moderating a platform used by some 2.7 billion people around the world. By delegating the critical work of content moderation to third-party contractors, Facebook has abdicated its responsibility to both its users and society at large.
  • Content moderation is hard. It’s hard philosophically to make fair and consistent judgments about what people should and should not be allowed to say to each other. It’s hard to gain the level of contextual understanding needed to properly interpret every post. And, as numerous reports have detailed, it’s grueling psychologically to spend all day sifting through the worst of humanity, making rapid-fire decisions that could have life-or-death consequences. The Verge’s Casey Newton in particular has exposed in a series of stories just how traumatic the job can be — especially for contractors viewing heinous footage and images day in and day out for subpar wages with no say in the often arbitrary-seeming policies they’re compelled to enforce.
  • Content moderation has high stakes. Kenosha is only the latest reminder that it can have history-altering consequences. The grimmest example comes from Myanmar, where Facebook helped fuel a genocide under the watch of moderators who mostly didn’t even speak Burmese. Facebook’s rulings on coronavirus-related posts can also have serious consequences, affecting whether potentially dangerous misinformation spreads to millions of users.
  • Content moderation is sensitive. Moderators have access to information and tools that could compromise people’s privacy if abused. When Covid-19 forced offices around the world to close, Facebook’s own employees simply shifted to working from home. But Facebook didn’t trust its contract moderation workforce to do that, so it stopped using most human moderators for a time. It shifted most of the load to its A.I. tools, even as it acknowledged that doing so would lead to more mistakes.
  • Content moderation is critical to Facebook’s business. Without it, the social network would quickly go the way of MySpace, the spam-ridden incumbent that it crushed a decade ago, thanks in large part to its cleaner interface and safer-feeling environment. All the pornography, the bestiality, the child abuse, the hate speech, the terrorist beheadings that make the work of moderation so miserable would be inflicted instead on Facebook’s users, who surely wouldn’t stand for it for long.
  • Given that content moderation is extremely hard, high-stakes, sensitive, and mission-critical, you might think Facebook would attack the problem with some of its most skilled employees, giving them all the resources they need. That’s how Facebook treats problems such as A.I., which it entrusts to teams of highly paid engineers overseen by world-renowned experts in the field. Instead, Facebook treats content moderation as an afterthought, outsourcing the vast majority of it to third-party contracting firms in undisclosed locations. It has roughly 15,000 moderators in locations ranging from Manila to Hyderabad to Austin. Other dominant internet platforms do much the same, including Google, which outsources most of its 10,000 moderation positions.
  • In June, a major report from NYU’s Stern Center for Business and Human Rights called for Facebook to stop outsourcing content moderation. The report, authored by Paul Barrett, executive director of the center, recommended that Facebook double its moderation workforce to 30,000, give workers more time to process the images crossing their screens, and provide better mental health support, among other proposals. The report made headlines, but Facebook has never offered a substantive response. Contacted for this story, a Facebook spokesperson offered a canned quote: “The people who review content make our platforms safer, and we’re grateful for the important work that they do. Our safety and security efforts are never finished, so we’re always working to do better and to provide more support for content reviewers around the world.”
  • So why, even after blaming contracted moderators for the Kenosha Guard mistake, has Facebook not made any move to bring moderation in-house? At times, Zuckerberg has downplayed the importance of the task, positioning the social network as a bastion of free expression. In the leaked transcript of a 2019 all-hands call with Facebook employees, Zuckerberg told them the company outsources moderation so that “we can scale up and down and work quickly and be more flexible on that.” While he said human moderation would always be needed, he implied that in the long run, the company hoped to automate much of the process. (Zuckerberg also called reports of moderators’ inhumane working conditions “a little overdramatic.” This May, Facebook agreed to pay $52 million to moderators who sued after developing post-traumatic stress disorder from the job.)
  • I spoke with Barrett, the author of the NYU report, about why Facebook seems determined to keep outsourcing one of its most essential jobs. Cost is surely part of it, he said. But that can’t be the whole story. The report’s recommendations would cost the company tens if not hundreds of millions of dollars, but it made some $5 billion in profit in the second quarter of 2020 alone. Barrett told me he believes there’s something else at work: a sense among Facebook’s leadership that content moderation is dirty work, unfit for the brilliant engineers and innovators the company prides itself on hiring. “I think there’s this psychological impulse to push away this unsavory but absolutely necessary function of running a global media platform,” he said.
  • Bringing moderation in-house wouldn’t immediately solve the company’s problems, of course — far from it. A big part of the issue at Facebook and YouTube is the platforms’ sheer scale: If they weren’t so dominant, their moderation decisions wouldn’t feel quite so momentous because people could always just use a different social network. That’s an issue that could be addressed through antitrust or Section 230 reform, perhaps, but not by rearranging the moderation chairs.
  • Sarah T. Roberts, the UCLA social media scholar who wrote Behind the Screen: Content Moderation in the Shadows of Social Media, told me the very concept of “commercialized, industrial-scale global content moderation” serves the interests of companies like Facebook and Google because it takes their dominance for granted. “Of course this is a problem of scale; one that is not clear to me can actually be surmounted or addressed by continuing to grow the content moderation workforce in new and different ways,” she said. “But it’s also a question of how the ‘problem’ of content has been framed. It’s a framing that has largely been at the service and to the benefit of the firms themselves.”
  • Making moderators full-time employees also wouldn’t automatically make the job less devastating. In one of Newton’s Verge investigations, he spoke with a content moderator employed full-time by Google, earning near six figures with benefits. She detailed how the job had shaken her to the core. “No matter how well you are paid or how good the benefits are, being a content moderator can change you forever,” Newton concluded.
  • And yet, at a time when the problems wrought by social media can feel overwhelming and intractable, bringing content moderation in-house is a concrete step that the big platforms could take now that would almost certainly make at least some difference. No doubt full-time employees would make content moderation errors, just as contractors do. But their employers would have far more incentive to train them, to invest in their development and well-being, and to help them cultivate domain knowledge that would allow them to do their jobs better. It might lead Zuckerberg to realize that “the ability to scale up and down and work quickly” is less important than treating workers humanely and giving them the tools to make thoughtful decisions about users’ posts. It seems notable that while Zuckerberg blamed contracted moderators for the initial “mistake” on Kenosha Guard, he credited a more specialized team focused on violent groups with reversing the decision the following day. Presumably, those specialists are Facebook employees, although the company declined to confirm that for me.
  • At the very least, insourcing moderation would bring the ugliness of the platform’s worst elements closer to home, forcing Facebook and Google to grapple with the job’s trauma at the level of the employee-employer relationship. It might even spur employees to organize for changes to both their working conditions and Facebook’s systems — which, come to think of it, could be another reason the company prefers to keep the job at arm’s length. So far, there has been no stronger lever for ethical reform at the Big Tech companies than internal pressure from their own employees.
  • Making content moderation a core part of social media platforms’ businesses, to reflect its central role in their products and societal impact, won’t solve everything. But without that as a first, basic step toward taking responsibility for the project of moderating the world’s speech, how can we expect them to solve anything?
Acknowledgement and thanks to: Wired | OneZero
Sept. 13, 2020