YouTube attempts to increase advertisement views by promoting extremist, radical videos

YouTube’s current algorithm recommends videos that viewers are likely to watch for their entire duration in an attempt to increase advertisement revenue. These videos often feature conspiracy theories and radical political opinions, which may wrongfully influence viewers’ beliefs (Fank Stone, Berg Eckle/Wikimedia Commons).

YouTube is both the second largest search engine and the second largest social media platform in the world, according to Forbes, due in large part to a mid-2000s boom of memes, viral videos and sensational content.

However, a great deal of YouTube’s recent expansion has come from the company deliberately exploiting the effect these kinds of videos have on viewers, enticing them to watch more content in order to increase advertising revenue.

The YouTube algorithm, which recommends videos to viewers based on the content they’ve already watched, is designed to increase watch time, often by recommending more sensational content such as conspiracy theories or alt-right messaging.

This tendency of recommendations to drift toward the fanatical is called the “rabbit hole effect,” which YouTube denies exists. But there is a great deal of evidence for the trend from viewers who say the platform played a key role in the formation of their conspiratorial beliefs. In an article for New York Magazine, Brian Feldman discusses a survey of Flat-Earthers which “found YouTube to be a central tool in their conversion to accepting the theory.”

Curious viewers seeking knowledge about a specific topic end up experiencing accidental indoctrination, but how? If the viewer were in complete control of the content viewed, this phenomenon would be impossible, but they are not. Autoplay, a key feature of the platform’s recommendation system, automatically plays the next recommended video if the viewer clicks nothing for about ten seconds.

This, according to Feldman, is how the viewer often finds their way from a popular video titled “What is the Flat Earth theory?” to one called “Flat Earth theory PROVEN RIGHT — [Why ILLUMINATI and *NASA* Lied to Us] (and why they must die),” without really intending to. The algorithm will continue to feed the viewer content if they are not quick enough to say no, putting them at risk of encountering radical content that they may perceive as “normal,” simply because the platform recommended it to them.

Autoplay came as part of a decision by YouTube to focus on watch time rather than the number of views when rewarding creators for putting out consistent, high-quality content. Ben McOwen Wilson, an executive at YouTube, said, “the videos which are most viewed are those that people watch in their entirety,” and that these videos are more likely to be recommended to other viewers, according to BBC News. 

Under this system, videos about conspiracies and those from the alt-right thrive; they tend to be intense and fanatical throughout, making viewers more likely to watch them in full and the algorithm therefore more likely to recommend them. This dynamic has led to the accidental indoctrination of countless YouTube viewers into the alt-right.
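To make that incentive concrete, consider a minimal, purely illustrative sketch; it is not YouTube’s actual code, and the scoring rule, example videos and field names are all assumptions. It shows how ranking by expected watch time rather than by raw view counts can favor a video that holds attention all the way through over one that is merely clicked often.

```python
from dataclasses import dataclass


@dataclass
class Video:
    title: str
    length_minutes: float
    avg_fraction_watched: float  # observed share of the video viewers typically finish (0.0–1.0)
    views: int


def expected_watch_time(video: Video) -> float:
    """Toy scoring rule: rank by expected minutes watched, not by view count."""
    return video.length_minutes * video.avg_fraction_watched


videos = [
    Video("Measured explainer", length_minutes=12, avg_fraction_watched=0.35, views=900_000),
    Video("Sensational conspiracy video", length_minutes=10, avg_fraction_watched=0.95, views=80_000),
]

# Ranking by raw views favors the widely clicked explainer; ranking by expected
# watch time favors the video that keeps people glued to the screen, despite
# having far fewer views.
top_by_views = max(videos, key=lambda v: v.views)
top_by_watch_time = max(videos, key=expected_watch_time)

print("Top by views:     ", top_by_views.title)       # Measured explainer
print("Top by watch time:", top_by_watch_time.title)  # Sensational conspiracy video
```

In this toy setup the explainer yields about 4.2 expected minutes per viewer while the sensational video yields about 9.5, so a watch-time objective surfaces the latter even though it is clicked far less often.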

In Brazil, for example, the recent election of Jair Bolsonaro has been largely credited to his popularity on YouTube, where he speaks out against homosexuality and gender equality and endorses political violence.

Brazil is one of the largest markets in the world for YouTube videos due to widespread illiteracy and low-quality television networks. New York Times reporters Amanda Taub and Max Fisher traveled to Brazil to investigate the widespread belief in conspiracy, alt-right ideology and other forms of misinformation. Viewers who cannot afford to pay for the internet in Brazil use WhatsApp, a messaging app, to “spread misinformation [which] often come[s] from videos that first went viral on YouTube, where they had been boosted by the extremism-favoring algorithms,” according to The New York Times.

Suppressing these voices outright would be an obvious instance of unjust censorship on YouTube’s part, but the corporation is under no obligation to promote misleading or inflammatory content either. Its focus on profit has made YouTube an inadvertent stronghold for misinformation, a status it has effectively accepted by denying that these problems exist in the first place.

In