YouTube has been at the center of several controversies in recent years, and according to a report from Bloomberg, executives were warned those problems were coming. Susan Wojcicki and other YouTube executives reportedly ignored internal warnings about toxic video recommendations and more.
In recent years, employees at YouTube and Google reportedly raised concerns to upper management about false or toxic content on the platform and how easily it could spread. Some employees wanted to flag videos that fell just short of hate speech and stop recommending them, while others wanted to track those videos to gauge their popularity; one reportedly did just that.
They were all given the same response: don't "rock the boat."
Reportedly, this was because YouTube's sole focus was on increasing "engagement," a measure of views, time spent watching, and interactions. More than 20 employees told Bloomberg that YouTube's corporate leadership was either unwilling or unable to act on their warnings for fear of "throttling" engagement. One person specifically said that Wojcicki would "never put her fingers on the scale."
A YouTube spokesperson "contested the notion that Wojcicki is inattentive to these issues and that the company prioritizes engagement above all else," pointing to the past two years the company has spent trying to find solutions. She said in a statement:
Our primary focus has been tackling some of the platform’s toughest content challenges. We’ve taken a number of significant steps, including updating our recommendations system to prevent the spread of harmful misinformation, improving the news experience on YouTube, bringing the number of people focused on content issues across Google to 10,000, investing in machine learning to be able to more quickly find and remove violative content, and reviewing and updating our policies — we made more than 30 policy updates in 2018 alone. And this is not the end: responsibility remains our number one priority.
Later in the Bloomberg story, it's explained that one employee proposed a third "tier" for videos that would pull videos with "bad virality" from recommendations. Yonatan Zunger, a former privacy engineer at Google, brought that suggestion to YouTube staff at the time but was turned down. Fast forward to this year, and YouTube has implemented something similar, at least for conspiracy videos.
Interestingly, one idea tossed around by YouTube executives was a complete rewrite of how videos were monetized. Instead of paying creators based on the ads in their videos, YouTube would pool all of its incoming ad revenue and pay creators based on engagement, even if they didn't run ads on their videos. A year's worth of effort went into the project, but it was ultimately killed by Google CEO Sundar Pichai, who felt the method could make the "filter bubble" problem worse. Bloomberg also points out that paying creators this way would have rewarded videos with "bad virality," with personalities such as the now-booted Alex Jones of InfoWars potentially becoming among the highest-paid names on the platform.
It's not entirely clear what changed YouTube's strategy, but the past few months have certainly seen a shift in how the platform handles the information it spreads. For example, there are now info panels on sensitive topics such as vaccines to help curb misinformation, as well as new ways of handling content that "pushes the line."
More on YouTube:
- YouTube will disable comments on videos featuring minors, adding new ‘comments classifier’
- YouTube’s new info panels fact-check sensitive topics in video search
- YouTube subscriber king PewDiePie admits defeat to T-Series right before taking back the lead