Incident 215: Facebook Content Moderators Demand Better Working Conditions Due to Allegedly Inadequate AI Content Moderation

Description: Content moderators and employees at Facebook demand better working conditions after the company's automated content moderation system allegedly failed to perform adequately, exposing human reviewers to psychologically hazardous content such as graphic violence and child abuse.
Alleged: Facebook developed and deployed an AI system, which harmed Facebook content moderators.

Suggested citation format

Dickinson, Ingrid. (2020-04-01) Incident Number 215. In Lam, K. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID
215
Report Count
1
Incident Date
2020-04-01
Editors
Khoa Lam


Incident Reports

November 2020

Mark Zuckerberg, Sheryl Sandberg, Anne Heraty (CEO, CPL/Covalen), Julie Sweet (CEO, Accenture)

Via email and posting on Facebook’s Workplace channels

Open letter from content moderators re: pandemic

Dear Mr. Zuckerberg, Ms. Sandberg, Ms. Heraty, Ms. Sweet,

We, the undersigned Facebook content moderators and Facebook employees, write to express our dismay at your decision to risk our lives—and the lives of our colleagues and loved ones—to maintain Facebook’s profits during the pandemic.

After months of allowing content moderators to work from home, faced with intense pressure to keep Facebook free of hate and disinformation, you have forced us back to the office. Moderators who secure a doctor's note about a personal COVID risk have been excused from attending in person. Moderators with vulnerable relatives, who might die were they to contract COVID from us, have not.

The pandemic has been good for Facebook. More than 3 billion people have now joined Facebook services, creating more demand for our work than ever. Mr. Zuckerberg nearly doubled his fortune during the crisis. He is now worth well over $100 billion. It has been good for Facebook’s contractors, too: CPL, one of the main European contractors, is due to be sold for €318m.

Despite vast sums flowing to each of you as corporate executives, you have refused moderators hazard pay. A content moderator at Accenture’s office in Austin, Texas generally earns $18/hour.

Before the pandemic, content moderation was easily Facebook's most brutal job. We waded through violence and child abuse for hours on end. Moderators working on child abuse content had their targets increased during the pandemic, with no additional support.

Now, on top of work that is psychologically toxic, holding onto the job means walking into a hot zone. In several offices, multiple COVID cases have occurred on the floor. Workers have asked Facebook leadership, and the leadership of your outsourcing firms like Accenture and CPL, to take urgent steps to protect us and value our work. You refused. We are publishing this letter because we are left with no choice.

Stop Needlessly Risking Moderators’ Lives

It is important to explain that the reason you have chosen to risk our lives is that this year Facebook tried using ‘AI’ to moderate content—and failed.

At the start of the pandemic, both full-time Facebook staff and content moderators worked from home. To cover the pressing need to moderate the masses of violence, hate, terrorism, child abuse, and other horrors that we fight for you every day, you sought to substitute our work with the work of a machine.

Without informing the public, Facebook undertook a massive live experiment in heavily automated content moderation. Management told moderators that we should no longer see certain varieties of toxic content, such as graphic violence or child abuse, coming up in the review tool in which we work.

The AI wasn’t up to the job. Important speech got swept into the maw of the Facebook filter—and risky content, like self-harm, stayed up.

The lesson is clear. Facebook’s algorithms are years away from achieving the necessary level of sophistication to moderate content automatically. They may never get there.

This raises a stark question. If our work is so core to Facebook’s business that you will ask us to risk our lives in the name of Facebook’s community—and profit—are we not, in fact, the heart of your company?

Without our work, Facebook is unusable. Its empire collapses. Your algorithms cannot spot satire. They cannot sift journalism from disinformation. They cannot respond quickly enough to self-harm or child abuse. We can.

Facebook needs us. It is time that you acknowledged this and valued our work. To sacrifice our health and safety for profit is immoral.

These are our demands.

  1. Keep moderators and their families safe. At the moment, only individual content moderators with a doctor's note indicating that they are high risk are excused from working in the office. Even this is not offered in some workplaces. Those who live with an at-risk person – who have, for example, a child with epilepsy – have been forced to come in. All content moderators who are high risk, or who live with someone who is high risk for COVID, should be permitted to work from home indefinitely.

  2. Maximize at-home working. Work that can be done from home should continue to be done from home. You have previously said content moderation cannot be performed remotely for security reasons. If that is so, it is time to fundamentally change the way that the work is organized. There is a pervasive and needlessly secretive culture at Facebook. Some content, such as content that is criminal, may need to be moderated in Facebook offices. The rest should be done at home.

  3. Offer hazard pay. If you want moderators to risk their lives to maintain 'community' and profit, you should pay. Moderators who are working in the office on high-risk material (e.g., child abuse) should be paid hazard pay of 1.5x their usual wage.

  4. End outsourcing. There is, if anything, more clamor than ever for aggressive content moderation at Facebook. This requires our work. Facebook should bring the content moderation workforce in house, giving us the same rights and benefits as full Facebook staff.

  5. Offer real healthcare and psychiatric care. Facebook employees enjoy various benefits, including private health insurance and visits to psychiatrists. Content moderators, who bear the brunt of the mental health trauma associated with Facebook’s toxic content, are offered 45 minutes a week with a ‘wellness coach’. These ‘coaches’ are generally not psychologists or psychiatrists and are contractually forbidden from diagnosis or treatment. And they generally cannot build a relationship of trust with moderators, since workers know that Facebook management (and Accenture/CPL management) ask ‘coaches’ to reveal confidential details of counselling sessions. Moderators deserve at least as much mental and physical health support as full Facebook staff.

The current crisis highlights that at the core of Facebook’s business lies a deep hypocrisy. By outsourcing our jobs, Facebook implies that the 35,000 of us who work in moderation are somehow peripheral to social media. Yet we are so integral to Facebook’s viability that we must risk our lives to come into work.

It is time to reorganize Facebook’s moderation work on the basis of equality and justice. We are the core of Facebook’s business. We deserve the rights and benefits of full Facebook staff. We look forward to your public response.

Very sincerely yours,

Andrea, Angela De Hoyos Hart, Ani Niow, Audrey Martin, Aune Mitchell, Azer Gueco, Baris Aytan, Brady Bennett, Cam Herringshaw, Carlin Scrudato, Carlos Ancira, Charles Maxwell, Chris Chan, Christopher Glenn, Claire Sexton, Crystal Chan, Danica Michaels, Daniel Baxley, Daniel Finlayson, Daniel Rezende Fuser, Danille Sindac, Diego Ramirez, Dominick Martinez, Douglas Hart, Erin Donohue, Fletcher West, Hua Hoai Nam, James J. Morrow, Jeremy Calvert, Jess L, Jessica den Boer, John Reese, John Royales McTurk, Jonathan Daniel, Jonathan de la Rosa, Joseph Pouttu, Joseph Sarhan, Joshua Sklar, Katie Adamsky, Kelly Lambert, Kevin Fei, Kevin Liao, Kiara Gaytan, Lucy Yang, Marcus Rodriguez, Maria Sam, Mark Reitblatt, Mayra Ota Coffey, Michael Thot, Mike Vitousek, Naomi Shiffman, Nathan Tokala, Niccolo Coluccio, Nicholas O'Brien, Nick Azcarate, Nick Martens, Noah Korotzer, Nuno Picareta, Palina Andrayuk, Phil Wills, Phillip Shih, Phong Vu, Purnam Jantrania, Raimonds Gabalis, Ramazan Sahin, Rena, Robert Boyce, Ryan Hoyt, Sam Ringel, Sara Valderrama, Sarah Dunn, Shom Mazumder, Steffan Voges, Stephanie Marina, Stuart Millican, Tariq Yusuf, Thi Cat Tuong Trinh, Tina Wall, Tom G, Tristam MacDonald, Vahid Liaghat, Vitor Cordeiro Pileggi, Zoya Waliany, (and a further 248 content moderators who’ve signed anonymously)