How YouTube helps to promote fake news

Having worked on YouTube’s recommendation algorithm, I started investigating, and came to the conclusion that the powerful algorithm I helped build plays an active role in the propagation of false information.

 

We’ve all heard about conspiracy theories, alternative facts and fake news circulating on the internet. How do they become so popular? What’s the impact of the state-of-the-art algorithms on their success?

 

In order to see what YouTube is currently promoting the most, I wrote an open-source recommendation explorer that extracts the most frequently recommended videos for a query (try it for yourself at the end of this article). I compared these recommendations with the first 20 results returned by Google and YouTube Search for the same queries.
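For the curious, here is a rough sketch of the explorer’s approach: seed with a query’s top search results, follow the “up next” recommendations a few hops deep, and count which videos show up most often. This is not the actual tool’s code; `search_video_ids` and `get_recommended_video_ids` are stubs standing in for whatever scraping or API layer actually fetches the data.

```python
from collections import Counter

def search_video_ids(query):
    """Stub: replace with real YouTube search scraping or API calls."""
    return [f"seed-{i}" for i in range(20)]

def get_recommended_video_ids(video_id):
    """Stub: replace with code that reads the recommendations shown on a watch page."""
    return [f"{video_id}/rec-{i}" for i in range(5)]

def most_recommended(query, depth=3, branching=5):
    counts = Counter()
    frontier = search_video_ids(query)          # start from the top search results
    for _ in range(depth):
        next_frontier = []
        for video_id in frontier:
            recs = get_recommended_video_ids(video_id)[:branching]
            counts.update(recs)                 # tally every recommendation encountered
            next_frontier.extend(recs)
        frontier = next_frontier                # follow recommendations one hop deeper
    return counts.most_common(20)

print(most_recommended("is the earth flat or round?"))
```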

 

The results on these 5 queries speak for themselves:

 

1. Basic fact: “Is the earth flat or round?”

 


 

2. Religion: “Who is the Pope?”

 

 

3. Science: “Is global warming real?”

 

 

4. Conspiracies: “Is Pizzagate real?”

 

Pizzagate is a conspiracy theory according to which the Clintons ran a pedophile ring out of a pizzeria in Washington DC. Videos promoting this theory were recommended millions of times by YouTube in the months preceding the 2016 US presidential election.

 

 

5. Celebrities: “Who is Michelle Obama?”

 

 

Why do recommendations differ from search?

 

YouTube’s search and recommendation algorithms yield surprisingly different results in these examples, despite both algorithms drawing on the same data. This shows that small differences in the algorithms can produce large differences in the results. Search is probably optimized more toward relevance, whereas recommendations likely give more weight to watch time.
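As a toy illustration of how two objectives can re-order the same candidates (the videos and numbers below are invented, and this is in no way YouTube’s actual code):

```python
# Toy illustration with invented numbers: the same candidates, two objectives.
candidates = [
    # (title, relevance_to_query, predicted_minutes_watched)
    ("NASA footage: the Earth seen from orbit", 0.95,  4.0),
    ("Physics lecture: why planets are round",  0.90,  7.0),
    ("Top 10 PROOFS the Earth is FLAT",         0.55, 22.0),
]

search_like         = sorted(candidates, key=lambda v: v[1], reverse=True)
recommendation_like = sorted(candidates, key=lambda v: v[2], reverse=True)

print("Ranked by relevance: ", [title for title, _, _ in search_like])
print("Ranked by watch time:", [title for title, _, _ in recommendation_like])
```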

 

YouTube doesn’t recommend what people “like”

 

Surprisingly, “likes” and “dislikes” on a video have little impact on recommendations. For instance, many videos claiming Michelle Obama was “born a man” have more dislikes than likes, yet they are still heavily recommended by YouTube. YouTube seems to put more weight on maximizing watch time than on likes. Hence, if “the earth is flat” keeps users online longer than “the earth is round”, that theory will be favored by the recommendation algorithm.
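Here is a minimal sketch of that idea, assuming a hypothetical score that weights predicted watch time far more heavily than the like ratio. The weights and numbers are invented for illustration; they are not YouTube’s.

```python
# Hedged sketch: the scoring function, weights and numbers are all invented.
def score(predicted_minutes, likes, dislikes, w_watch=0.95, w_likes=0.05):
    like_ratio = likes / max(likes + dislikes, 1)
    return w_watch * (predicted_minutes / 30.0) + w_likes * like_ratio

conspiracy = score(predicted_minutes=25, likes=200, dislikes=800)  # mostly disliked, binge-watched
factual    = score(predicted_minutes=6,  likes=900, dislikes=100)  # mostly liked, watched briefly
print(conspiracy > factual)  # True: watch time dominates, likes barely matter
```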

 


 

The snowball effect that boosts conspiracies

 

Once the A.I. favors a conspiracy video, content creators have an incentive to upload additional videos corroborating the conspiracy. Those additional videos, in turn, increase the conspiracy’s retention statistics, so the conspiracy gets recommended even more.
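A toy simulation of that feedback loop, with invented parameters, shows how quickly it can compound:

```python
# Toy feedback loop; the parameters are made up, the point is the compounding dynamic.
videos = 10          # conspiracy videos currently on the platform
watch_hours = 100.0  # daily watch time those videos generate

for day in range(1, 6):
    recommendations = watch_hours * 100       # more watch time -> more recommendations
    videos += int(recommendations / 2000)     # visibility attracts new uploads
    watch_hours = videos * 12.0               # more videos -> more total watch time
    print(f"day {day}: {videos} videos, {watch_hours:.0f} daily watch-hours")
```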

 

Eventually, the sheer number of videos favoring a conspiracy makes it appear more credible. For instance, in one of the “flat earth” videos, the author commented: “There are 2 millions flat earth videos on YouTube, it cannot be B.S. !”

 

What we can do

 

The point here is not to pass judgement on YouTube. They’re not doing this on purpose; it’s an unintended consequence of the algorithm. But every single day, people watch more than one billion hours of YouTube content.

 

And because YouTube has such a large influence on what people watch, it also has considerable power to curb the spread of alternative news. The first step toward finding a solution is to measure the problem.

 

Experiment with the recommendation explorer if you want to find out what YouTube recommends most on the subjects you care about.

 

Republished with permission from Medium.

