Ex-content moderator sues YouTube, claims job led to PTSD symptoms and depression

The worker watched videos that included beheadings, shootings and child abuse, according to the lawsuit.

Queenie Wong

YouTube is being accused of not doing enough to safeguard the mental health of its content moderators.

Angela Lang/CNET

A former content moderator is suing Google-owned YouTube after she allegedly developed depression and symptoms associated with post-traumatic stress disorder from repeatedly watching videos of beheadings, child abuse and other disturbing content.

"She has trouble sleeping and when she does sleep, she has horrific nightmares. She often lays awake at night trying to go to sleep, replaying videos that she has seen in her mind," says the lawsuit, which was filed in a California superior court on Monday. The former moderator also can't be in crowded places because she's afraid of mass shootings, suffers from panic attacks and has lost friends because of her anxiety. She also has trouble being around kids and is now frightened to have children, according to the lawsuit. 

The proposed class-action lawsuit accuses YouTube of violating California law by failing to provide a safe workplace for content moderators and not doing enough to safeguard their mental health. Moderators spend more than four hours a day reviewing graphic video content because YouTube is "chronically understaffed," the suit says. These long hours run afoul of YouTube's best practices, according to the lawsuit. Workers are required to review "between 100 and 300 pieces of content per day with an error rate of two to five percent," creating stress and increasing the risk that content moderators develop psychological trauma from the job, according to the lawsuit.

The former moderator, who isn't named, is seeking medical treatment, compensation for the trauma she suffered and the creation of a YouTube-funded medical monitoring program that would screen, diagnose and treat content moderators.

She worked at YouTube through the staffing agency Collabera in an office in Austin, Texas, from January 2018 to August 2019. Collabera and YouTube didn't immediately respond to requests for comment. 

During her time on the job, the worker saw thousands of disturbing videos that showed graphic images such as people eating from a smashed-open skull, school shootings with dead children, a fox being skinned alive and a person's head getting run over by a tank, the lawsuit said. She suffered psychological trauma from the job and paid out of pocket to get treatment, according to the lawsuit.

YouTube, like other tech companies such as Facebook and Twitter, relies on both technology and humans to review posts and videos that could violate its rules against violence, hate speech and other offensive content. More contract workers are speaking out about the toll this job takes on their mental health because they're constantly exposed to graphic content.

At the same time, tech companies are under more pressure to combat hate speech and misinformation ahead of the US presidential election in November.

Joseph Saveri Law Firm, which also filed a 2018 lawsuit on behalf of moderators who reviewed Facebook content, is representing the former YouTube content moderator. In May, Facebook agreed to pay $52 million to content moderators as part of a settlement.

An unsafe work environment 

The lawsuit against YouTube alleges the company failed to adequately inform prospective content moderators about what the job involved and the negative impact it could have on their mental health. Candidates are told they might be required to review graphic videos, but they aren't given more details about the work or its potential psychological toll.

During training, workers aren't told how to assess their reactions to graphic videos, and YouTube doesn't ease moderators into the job "through controlled exposure with a seasoned team member followed by counseling sessions," according to the lawsuit. 

Content moderators are told they can step out of the room when YouTube shows graphic videos during training, but workers are afraid they'll lose their jobs if they do. That's because they must pass a test that asks them to determine whether certain content violates YouTube's rules.

YouTube also didn't do enough to provide support for these employees after they started their job, according to the lawsuit. The company allows workers to speak with wellness coaches, but the coaches don't have medical expertise and aren't available to moderators who work at night.

The ex-moderator who is suing YouTube sought the advice of a wellness coach in 2018 after she felt traumatized by a video she reviewed. The coach recommended the worker take illegal drugs and didn't provide any resilience training or ways to cope with her symptoms, according to the lawsuit. Another coach told a content moderator to just "trust in God." The Human Resources department also didn't provide content moderators with any help, and YouTube requires workers to sign nondisclosure agreements, making it harder for them to talk about their problems.

Tech companies can also blur graphic images, mute audio or reduce image size to limit the negative effects that viewing offensive content can have on moderators, but YouTube failed to provide these technological safeguards, according to the lawsuit.

The lawsuit alleges that YouTube is strictly liable for the harms caused to content moderators because the work is "abnormally dangerous." The lawsuit also accuses YouTube of negligent behavior and of providing "unsafe equipment," making the company responsible for the damages even though content moderators are contract workers. 

If content moderators choose to leave their jobs, they'll lose their pay and health benefits.

"Content Moderators were left with a Hobbesian's choice -- quit and lose access to an income and medical insurance or continue to suffer in silence to keep their job," the lawsuit states.