Conspiracy videos occupy their own niche on YouTube and have long been flagged as a source of potentially harmful misinformation. In a move to clean up the site, YouTube says it will reduce how often it recommends conspiracy videos to users.
In a Jan. 25 blog post, YouTube wrote, “We’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.”
Guillaume Chaslot, a former engineer at Google, YouTube’s parent company, called the move a “historic victory.” In a thread of tweets, Chaslot said, “It’s only the beginning of a more humane technology. Technology that empowers all of us, instead of deceiving the most vulnerable.”
The affected videos will be those YouTube identifies as coming close to violating its community guidelines without actually crossing that line. According to the blog post, recommendations will no longer point users only to similar videos but will instead “pull in recommendations from a wider set of topics.”
The change won’t affect the availability of those videos, though. The company noted that users who are subscribed to a channel producing conspiracy content, or who search for it directly, will still see related recommendations.
To start, the change will affect recommendations for only a small set of videos in the United States, making up less than 1% of YouTube’s content.
“Over time, as our systems become more accurate, we’ll roll this change out to more countries,” YouTube wrote.