
Google’s fight to keep terror content off of YouTube continues with encouraging results

Chances are you use YouTube. Whether you upload your own videos or just watch content, the impact YouTube has had on media consumption over the past few years is undeniable. And as YouTube grows and more people use the service, the chances increase that content will be posted that has no place on the platform.

Google recently updated us on its fight to block and remove terror content on YouTube, and the progress being made looks quite promising.

Google recently started using machine learning on YouTube to work alongside its human reviewers in keeping the platform as safe a place as possible for everyone, and that move already appears to have been quite effective.

According to Google, its machine learning systems have allowed it to remove more than 75% of violent extremism videos over the past month before a single human had manually flagged them.

Our machine learning systems are faster and more effective than ever before. Over 75 percent of the videos we’ve removed for violent extremism over the past month were taken down before receiving a single human flag.

Additionally, Google says these systems have been more accurate than its human staff at flagging videos for removal, at least "in many cases."

Lastly, Google reports that it has more than doubled both the number of videos removed for violent extremism and the rate at which that content is being taken down.

With over 400 hours of content uploaded to YouTube every minute, finding and taking action on violent extremist content poses a significant challenge. But over the past month, our initial use of machine learning has more than doubled both the number of videos we’ve removed for violent extremism, as well as the rate at which we’ve taken this kind of content down.

To go along with the progress being made, Google is also hiring more experts from leading organizations (the No Hate Speech Movement, the Anti-Defamation League, and the Institute for Strategic Dialogue), showing curated playlists that help debunk violent extremist messaging to people searching for "sensitive keywords," and introducing more rigorous standards for videos that aren't illegal but have still been flagged by users for violating YouTube's hate speech policies.

“Altogether, we have taken significant steps over the last month in our fight against online terrorism,” Google writes on the YouTube blog. “But this is not the end.”

