
[Update: Fake cures too] YouTube will remove conspiracy videos falsely linking 5G to the coronavirus

YouTube has confirmed that it will stop promoting, and will remove, videos that falsely link 5G technology to the coronavirus.

New wireless technology has always drawn crackpots and conspiracy theorists spouting unsubstantiated claims, but YouTube will now actively suppress the dangerous spread of misinformation on its video-hosting platform. Many videos will lose monetization options, with ads cut entirely, alongside removal from video search results.


[Update 04/27]: Speaking directly to CNN, YouTube CEO Susan Wojcicki confirmed that videos making unsubstantiated or false claims that run counter to WHO and health-professional guidance, such as herbal remedies and disinfectant concoctions, are already being removed from the video-sharing site to help curb the spread of misinformation during the COVID-19 pandemic.

Wojcicki said that uploads of news from legitimate or confirmed sources have increased by 75% since the start of 2020. “People saying, ‘Take vitamin C, take turmeric, we’ll cure you,’ those are the examples of things that would be a violation of our policy,” she told the news network. She went on to confirm that YouTube is removing information that is “false” or “medically unsubstantiated.”

The CEO suggested that, as you’d expect, there have been large shifts in the content being shared, uploaded, and consumed. Wojcicki noted large spikes in home workout, DIY, and education content over the past few weeks. However, she also noted that policy changes have been pushed out faster than in recent years to ensure that only authoritative sources and substantiated information are promoted on the platform.


According to a report by the Guardian, YouTube will outright remove videos that promote 5G coronavirus conspiracy theories. However, it may allow other “borderline content” that doesn’t actively mention the coronavirus. In a statement to the Guardian, YouTube said:

We also have clear policies that prohibit videos promoting medically unsubstantiated methods to prevent the coronavirus in place of seeking medical treatment, and we quickly remove videos violating these policies when flagged to us.

We have also begun reducing recommendations of borderline content such as conspiracy theories related to 5G and coronavirus, that could misinform users in harmful ways.

This decision has no doubt been taken in response to recent arson attacks on UK wireless networking infrastructure, in which members of the British public attacked what they believed were 5G sites. Networking hardware from O2 and Vodafone was destroyed by people who had likely encountered conspiracy-driven content on platforms like YouTube and Facebook claiming a link between the spread of the coronavirus and the rollout of 5G networks.

Sifting fact from fiction has become far more difficult, but this move should help ensure that accurate information reaches the general public.





Author

Damien Wilde

Damien is a UK-based video producer for 9to5Google. Find him on Twitter: @iamdamienwilde. Email: damien@9to5mac.com