The Big Take

YouTube Went to War Against Terrorists, Just Not White Nationalists

A new book details the video site’s ongoing struggles with the blurry lines of extremist speech.


Unlike many of her colleagues at YouTube, Tala Bardan doesn’t remember the company retreat in June 2017 as a nice long weekend. YouTube employees stayed at the Westin hotel in downtown Los Angeles, enjoyed a private Snoop Dogg performance, and took day trips to a nearby Harry Potter theme park. The drinks were free. As the partying began that Friday morning, though, Bardan was one of about a dozen unlucky workers whom Chief Executive Officer Susan Wojcicki pulled into the hotel’s basement for a sobering meeting about the video site’s problem with terrorists.

Discussions about terrorists were nothing new to Bardan, who held a relatively junior position overseas, watching violent videos in Arabic for the YouTube division that screened footage categorized as “VE,” company shorthand for violent extremism. (Tala Bardan is a pseudonym used to protect her identity, given her sensitive work.) In the meeting, a top engineer explained that YouTube had decided to try to eliminate from its site the entire ideology that had given rise to groups such as Islamic State, the Sunni Muslim militant organization. The company would recode YouTube’s promotional algorithm to bury “inflammatory religious and supremacist content.” Policy staff would devise a list of 14 incendiary figures, all Muslim men, who would be banned no matter what they posted.