Report: YouTube Execs Ignored Employee Warnings About Toxic Content


Over the past few months, YouTube has come under increased scrutiny regarding its tendency to allow toxic content to flourish on its platform. Now, a report by Bloomberg has revealed YouTube executives ignored the warnings of their own employees. 

More than 20 former and current YouTube staffers said that employees proposed ways to stop the spread of toxic videos — including those with extremist content and conspiracy theories — but were ignored.

Privacy engineer Yonatan Zunger proposed a third tier of content moderation: videos that were “close to the line” of violating policy could remain on the platform but would no longer be recommended to viewers. YouTube rejected the proposal in 2016.

“I can say with a lot of confidence that they were deeply wrong,” Zunger told Bloomberg.

Rather than solving content issues, executives focused on engagement, employees told Bloomberg. The company had an internal goal of one billion hours of watch time per day.

A former staffer told Bloomberg that YouTube CEO Susan Wojcicki “never put her fingers on the scale” and her view was to just “run the company, not deal with this.”

This is a big issue because the content YouTube ignores has far-reaching consequences. In 2018, two reports found that YouTube was playing a key role in developing the next generation of the far right, as reported by Vox.

Employees at YouTube tried to counter that problem. In 2018, one employee built a hypothetical alt-right category to demonstrate that it was as popular as music, sports, and gaming — an attempt to show how central alt-right video creators were to YouTube’s business model.

The company has started to make some changes, such as no longer recommending conspiracy videos. However, this report goes to show that when companies make money off of harmful content, they’re less likely to do anything about it.