YouTube is likely the biggest source of radicalization in the Western world, as its algorithm has a way of recommending conspiratorial content whether you want it or not. Here is MSNBC’s Chris Hayes walking you through a standard YouTube rabbit hole.
This is how YouTube makes money. Speaking as someone who tried to start their own social network centered on an algorithm, it's clear as day what is (was?) happening on YouTube. Their most ardent users are conspiracy theorists, and the algorithm prioritizes videos that get a lot of engagement and keep people on the site as long as possible. So the algorithm assumes that when someone watches a popular cooking tutorial, the next thing they want to watch is a 20-minute diatribe with 14 million views on how lizard people really run the world. This weekend, YouTube announced that they will finally do something to stem the constant waterfall of toxic sludge they're pumping into the political discourse every day:
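To see why this dynamic falls out of the incentives, here is a deliberately crude sketch of engagement-first ranking. This is not YouTube's actual code; every name and number below is invented for illustration. It just assumes the single objective critics like Chaslot describe: maximize expected minutes watched.

```python
# Toy illustration (NOT YouTube's real system) of what happens when the
# only ranking objective is predicted watch time: the sensational,
# high-engagement video beats the on-topic follow-up.

def rank_recommendations(candidates):
    """Order candidate videos purely by expected minutes watched.

    expected watch time = (probability the user clicks)
                        * (average minutes a viewer stays)
    """
    return sorted(
        candidates,
        key=lambda v: v["click_prob"] * v["avg_minutes_watched"],
        reverse=True,
    )

# Hypothetical candidates after someone finishes a cooking tutorial.
candidates = [
    {"title": "More ravioli techniques",
     "click_prob": 0.30, "avg_minutes_watched": 6},
    {"title": "THE TRUTH they don't want you to see",
     "click_prob": 0.25, "avg_minutes_watched": 19},
]

ranked = rank_recommendations(candidates)
# 0.25 * 19 = 4.75 expected minutes vs. 0.30 * 6 = 1.8,
# so the conspiratorial video takes the top recommendation slot.
print(ranked[0]["title"])
```

The point of the sketch: nothing in the objective knows or cares what the video claims, so content that holds people longest wins by construction.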
YouTube has announced that it will no longer recommend videos that “come close to” violating its community guidelines, such as conspiracy or medically inaccurate videos.
Former Google engineer Guillaume Chaslot told a very personal story about how Google's algorithm works (or used to), and how it is specifically designed to radicalize people.
This is a big, big deal. Around 1.3 billion people use YouTube. For years, their algorithm eschewed any standards of basic humanity for a cash grab underwritten by conspiracy theories that have done immeasurable damage to democracy in the West. YouTube's algorithm was so hapless at identifying manipulative BS that Kremlin propaganda made it through its filters, and YouTube didn't fix it until NBC alerted them to their failure.
The absolutely insane QAnon conspiracy theory, which falsely asserts that Robert Mueller and Donald Trump are actually teaming up together to take down a worldwide pedophilia ring run by the Democrats and Hollywood, would have nowhere near as much reach today without YouTube. When QAnon named Tom Hanks as complicit in their fantasy, YouTube verified it as true.
Same with Steven Spielberg.
What YouTube is doing here is admitting that they are an editorial company. Silicon Valley has fought this distinction tooth and nail in favor of a “we are not responsible for what people put on our platform” approach. The problem is that as your platform gets larger and larger, it becomes harder and harder to assert zero editorial guidance. The logical outcome of this kind of mindset is YouTube, pre-algorithm change, where you could watch a tutorial on how to make ravioli and then have YouTube suggest a Jordan Peterson video with ten million views in which he argues that misogyny is necessary because of some minute trait in lobsters (you may think I'm being unnecessarily hyperbolic to lampoon the famed Canadian “philosopher,” but nope, he really argued this should be the case).
The central problem in this whole mess is that capitalism has no moral center. Strictly adhering to the whims of the market has led YouTube to be dominated by conspiracy theorists—to such an extreme degree that the recommendations coming out of the algorithm made it seem like conspiracy theories are YouTube’s product. Cracking down on this nonsense is an unequivocally good thing, and YouTube deserves some amount of credit for finally doing the bare minimum, but the fundamental truth of their platform is unchanged. By placing their fate in the whims of the market, without any serious thought whatsoever as to what kind of company they want to be, YouTube has become the place on the web to host conspiracy theories, and it will take a lot more than an algorithm tweak to change that.
Jacob Weindling is a staff writer for Paste politics. Follow him on Twitter at @Jakeweindling.