Failure in YouTube’s filters sees UK government and mainstream brand ads embedded in hate videos

A BBC ad appearing alongside a neo-Nazi video

UK government ads, as well as those for major brands like L’Oréal, have been embedded in hate videos on YouTube, reports the Times.

The ads have appeared within and alongside videos of former Ku Klux Klan official and Holocaust denier David Duke, as well as Steven Anderson, a preacher banned from Britain after praising the terrorist attack on a gay nightclub in Orlando …

These were not the only examples found, reports the paper.

More government-funded adverts, for organisations as varied as the Royal Navy, the BBC and Visit Scotland, pop up on YouTube videos of […] Michael Savage, a DJ banned in the UK for “fostering hatred”, Wagdi Ghoneim, an extremist Islamic preacher who has reportedly praised Osama bin Laden, and the National Rebirth of Poland, a group of Polish nationalists based in London and Manchester.

YouTube serves ads through a personalized ad-selection system known as programmatic advertising. It uses cookies stored on viewers’ devices to anonymously select ads that match their known interests, inferred from browsing history. This is why you’ll often see ads relating to products you have researched or bought online.
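To make the mechanism concrete, here is a minimal sketch of interest-based ad selection. This is purely illustrative and in no way Google’s actual ad-serving logic; every name, profile, and data structure below is hypothetical. The point it makes is that selection keyed only to an anonymous cookie profile says nothing about the video the ad ends up running against, which is exactly the gap that let brand campaigns land on extremist channels.

```python
# Illustrative sketch of cookie-based (programmatic) ad selection.
# All names and data here are hypothetical, not Google's real system.

AD_INVENTORY = {
    "skincare": ["Cosmetics spring campaign"],
    "travel":   ["Tourism board promotion"],
    "careers":  ["Armed forces recruitment"],
}

# Interest profile keyed by an anonymous cookie ID, built from browsing history.
COOKIE_PROFILES = {
    "anon-cookie-123": ["skincare", "travel"],
}

def pick_ad(cookie_id):
    """Return an ad matching the viewer's inferred interests.

    Note what is *not* checked: nothing about the content of the video
    the ad will appear on. Without a separate brand-safety filter, the
    match is driven entirely by the viewer's profile.
    """
    for interest in COOKIE_PROFILES.get(cookie_id, []):
        ads = AD_INVENTORY.get(interest)
        if ads:
            return ads[0]
    return None

print(pick_ad("anon-cookie-123"))  # -> "Cosmetics spring campaign"
```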

A number of advertisers – including the British government – have now pulled their ads from YouTube.

When informed that their adverts were appearing alongside extremist videos, brands rushed to remove them from YouTube. Channel 4, whose adverts were plastered across several extremist videos, said that the platform was “no longer safe”. Transport for London, one of the main government advertisers, said it was suspending its YouTube ads. The Guardian and Sainsbury’s said that Google’s actions were unacceptable. L’Oréal said it was horrified that its campaign was placed next to Anderson’s videos.

Johnny Hornby, founder of the advertising group The&Partnership, said that Google risked a boycott from the world’s big advertisers unless they ‘sort this out.’

Google issued a somewhat vague statement acknowledging the problem and promising to do better.

We have strict guidelines that define where Google ads should appear, and in the vast majority of cases, our policies work as intended, protecting users and advertisers from harmful or inappropriate content. We accept that we don’t always get it right, and that sometimes, ads appear where they should not. We’re committed to doing better, and will make changes to our policies and brand controls for advertisers.

Google is already under pressure in the UK for failing to be proactive in seeking out and removing hate content.

Peter Barron, Google’s vice-president for communications, was put under pressure by MPs on the home affairs select committee after he told them that the company did not look for hate content on YouTube, instead relying on users to notify it. Google, Facebook and Twitter were engaged in “commercial prostitution”, David Winnick, MP, said, adding that he would be “ashamed” to work there […]

Last night Chuka Umunna, another member of the committee, said that it was “staggering” to hear that both Google and extremists were making money out of adverts appearing alongside extreme and grotesque content.

Barron said Google received 200,000 reports a day of inappropriate content, and that 98% were reviewed within 24 hours.

