Combating videos that violate Community Guidelines is a constant struggle for Google, with videos occasionally slipping through the cracks of its detection systems. The company last December promised to take additional steps and today is sharing an update as it hopes to be more transparent about the process.
The first YouTube Community Guidelines Enforcement Report — part of Google’s Transparency Report site — covers October to December 2017 and will be released quarterly. Detailing enforcement, it shows progress in removing violative content, with Google planning to “refine” its reporting systems and provide additional data on comments, removal speed, and policy removal reasons by the end of 2018.
Meanwhile, Google will make it much easier for users to keep track of videos they manually flag as problematic with a Reporting History dashboard. It notes each video’s review status, the reason it was flagged, and whether it’s still online.
YouTube also reviewed its progress with machine learning for examining videos. Of the 8 million videos removed in the three-month period beginning last October, 6.7 million were first flagged for review by machines. The majority involved spam and adult content, and 76 percent were removed before receiving a single view.
The video site also notes that leveraging ML requires more people to review content, with YouTube having hired full-time specialists with expertise in violent extremism, counterterrorism, and human rights, as well as expanded regional expert teams.