
YouTube says it will stop recommending conspiracy videos that harmfully ‘misinform’ users

YouTube announced today that it plans to stop recommending conspiracy videos to its users without removing that content from the platform. The change is meant to stop users from seeing content that comes right up against the line of YouTube’s Community Guidelines without technically crossing it.

In a post today on its official blog, YouTube explains this upcoming change to its recommendations system. YouTube will work on “reducing” recommendations of “borderline content” that doesn’t quite break Community Guidelines, but arguably gets very close to doing so. That includes content that “could misinform users in harmful ways.” YouTube cites examples such as fake miracle cures for serious illnesses, flat Earth claims, and blatantly false claims about historic events such as 9/11.

Apparently, this change will only affect a small portion of content uploaded to YouTube, with roughly one percent of videos on the platform falling into this category. YouTube specifically notes that this content won’t be removed. Rather, these videos simply won’t show up in users’ recommendations on the Home feed. They’ll still appear in video and channel recommendations alongside related videos, as well as in search results.

YouTube says that this change to its recommendations will rely on both machine learning and actual human beings. Human evaluators will work to help train the systems that generate recommendations. As that system improves, the change will expand to more users in more regions. At the start, it will only affect some videos for users in the US. As the company explains:

This change relies on a combination of machine learning and real people. We work with human evaluators and experts from all over the United States to help train the machine learning systems that generate recommendations. These evaluators are trained using public guidelines and provide critical input on the quality of a video.

This will be a gradual change and initially will only affect recommendations of a very small set of videos in the United States. Over time, as our systems become more accurate, we’ll roll this change out to more countries. It’s just another step in an ongoing process, but it reflects our commitment and sense of responsibility to improve the recommendations experience on YouTube.
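The announcement stays high level, but the pipeline it describes, where human evaluators label examples, a model learns from those labels, and the recommendation surface filters on the model’s output, can be illustrated with a minimal sketch. To be clear, everything below is hypothetical: the titles, labels, threshold, and use of scikit-learn are illustrative assumptions, not details of YouTube’s actual system.

```python
# Illustrative sketch only -- not YouTube's real pipeline.
# Assumes human-evaluator labels (1 = borderline, 0 = fine) and toy text features.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical evaluator-labeled training data.
titles = [
    "Miracle cure heals any serious illness overnight",
    "Proof the Earth is actually flat",
    "How to change a bike tire",
    "Hands-on review of the latest Pixel phone",
]
labels = [1, 1, 0, 0]  # 1 = borderline/misinforming, 0 = acceptable

# Train a toy classifier on the human labels.
vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(titles)
clf = LogisticRegression().fit(X, labels)

def filter_home_feed(candidates, threshold=0.5):
    """Drop candidates the model scores as likely borderline.

    Note that videos are only removed from Home-feed *recommendations*;
    per the announcement, they would still surface in search results
    and alongside related videos.
    """
    scores = clf.predict_proba(vectorizer.transform(candidates))[:, 1]
    return [video for video, score in zip(candidates, scores) if score < threshold]

print(filter_home_feed([
    "Doctors hate this one weird cure",
    "Pixel camera tips and tricks",
]))
```

A real system would obviously operate at a vastly different scale, with far richer signals than video titles, but the basic division of labor is the same: humans supply the quality judgments, and the model generalizes them across the catalog.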

The company further says that it believes “this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users.” This also comes just a couple of weeks after YouTube banned life-threatening challenges and dangerous pranks.
