YouTube terminates 270 accounts, removes 150k videos & more in response to ‘sexualized child imagery’

YouTube has responded to reports of ‘sexualized videos of children’ that attracted numerous comments from suspected pedophiles.

The company says that it has terminated more than 270 accounts, removed more than 150,000 inappropriate videos, and turned off comments on over 625,000 videos that had attracted interest from child predators …

The company made the statement to VICE News, noting that it had also removed ads from videos falsely portrayed as family-friendly.

Over the past week we removed ads from nearly 2 million videos and over 50,000 channels masquerading as family-friendly content. Content that endangers children is abhorrent and unacceptable to us.

Volunteer moderators had earlier claimed that the tools used to screen comments have proven ineffective, and that inappropriate comments on videos featuring children are still being made by somewhere between 50,000 and 100,000 accounts.

YouTube was also serving disturbing autocomplete suggestions following an apparent targeted attack.

Major brands have been pulling ads from YouTube, or the Google network as a whole, as they did over a similar issue with hate videos.

Photo: Pierce Freeman

