Three Brave Content Moderators Take On Facebook On Behalf Of Thousands In Historic Class Action
Social media platforms have completely transformed our lives. We rely on them for information, advice, and opinions, and to connect with family, friends, and colleagues in many positive ways. What we do not see are postings of humanity’s worst impulses: murders, sexual crimes, bestiality, child abuse, and other vile behavior. For this, we can thank “content moderators”: a non-existent profession a short time ago, now one we rely on so that we, and our children, can navigate these platforms safely.
To maintain sanitized platforms, maximize profits, and cultivate their public images, social media companies rely on content moderators to review objectionable postings and remove any that violate their terms of use. These moderators are our “front line soldiers” in a modern war against online depravity that we all have a stake in winning. They have one of the most difficult jobs in the world. Constant, unrelenting exposure to toxic postings causes many to develop and suffer from psychological trauma and/or PTSD. Tech companies have developed counseling, training, and safety standards meant to protect them. But the companies have ignored these standards in practice, instead requiring moderators to work under conditions known to cause and intensify psychological trauma. Also, many moderators work for third-party agencies across several states and are prohibited by non-disclosure agreements from talking about their work concerns publicly. They frequently receive low wages under short-term contracts and minimal health benefits. When I first learned of this nightmare world, I was equally shocked and angered. I vowed I would do everything I could to help them.
In 2018, my firm and co-counsel filed suit against Facebook in San Mateo County Superior Court. Our plaintiff class of content moderators working on behalf of Facebook through third-party agencies alleged they were denied protection against severe psychological and other injuries resulting from viewing objectionable postings. The lawsuit sought mental health screening, treatment, and compensation, and a requirement that Facebook improve working conditions to live up to its own safety standards.
When we initiated the case, we were a lone voice in the wilderness raising this type of legal claim. It was a complicated, “first of its kind” lawsuit, so there was no easy path to follow from previous litigation. To avoid having our afflicted clients’ claims shunted into the workers’ compensation system, we instead aimed to achieve a medical monitoring remedy. This remedy is available under California law, but successful cases are as rare as Bigfoot sightings.
Surprisingly, Facebook CEO Mark Zuckerberg had acknowledged the content moderation problem in a Facebook post. By doing so, he put himself in the middle of the primary issue of the case. I was taken aback by the tone coming from the lawyers on the other side. They took an aggressive approach when we sought access to information and to Zuckerberg and COO Sheryl Sandberg. They insisted we had no legal claim. They were in a hard spot because of Zuckerberg’s post; he did not want to testify under oath.
As the case progressed, we filed motions for summary judgment and for judgment on the pleadings. Facebook asked for settlement talks. A JAMS neutral mediated a deal that took a year to hammer out. While confronting numerous obstacles, I was always comforted by my firm’s support and my unwavering recognition that my legal struggles were minuscule compared to the harm my clients were suffering every day, harm which unquestionably had to be relieved.
Settlement Reached
In 2020, our class reached a preliminary settlement with Facebook. In 2021, the Court granted final approval of a $52 million settlement and workplace improvements for over 14,000 class members who work for Facebook vendors in California, Arizona, Texas, and Florida. The workplace improvements apply to any U.S.-based content moderation operations for Facebook. It was exhilarating to achieve this outcome and to share the welcome news with the class, especially since success was never guaranteed but was so desperately needed.
Scola v. Facebook, Inc.’s Legacy
This case’s settlement and resulting media attention have opened the door to similar litigation. In 2020, my firm brought a proposed class action against YouTube, Inc., alleging it failed to protect a former content moderator and her co-workers from mental harm caused by reviewing disturbing footage. In 2022, the Court granted preliminary approval to an approximately $4.3 million settlement and about $3.9 million in injunctive relief. Also in 2022, my firm filed a similar suit against TikTok, which is ongoing.
The role and plight of content moderators are endemic across the social media landscape. All of us need to monitor and support them. The fact that moderators are treated as disposable scares me, as it should everyone. It has been my honor and privilege to represent them at every step in this case. There is no limit to what you can do if you are prepared to lose: sometimes, in the face of formidable odds, you want and need to make a statement about something that is just plain wrong. This case’s successful resolution has improved our clients’ lives. That is the ultimate gift, and indeed the ultimate impact, that any lawyer can receive: one that I have achieved and continue to aspire to, and which I would encourage everyone in the legal profession to strive for.
Selena Scola, Gabriel Ramos, and Erin Elder were inducted into the Impact Fund Class Action Hall of Fame on February 24, 2023, in recognition of their courage, sacrifice, commitment, and determination, which led to a significant advance in social justice.