New study further debunks “far-right” rabbit hole YouTube narrative

YouTube uses recommendation algorithms to suggest videos based on what you watch. There have been many stories claiming that these algorithms have "radicalized" people by filling their feeds with a single subject, particularly "conspiracy theories."

Yet these accusations have themselves been called a "conspiracy theory," as several studies have debunked such claims.

A new study published Monday further suggests that video recommendations on YouTube do not radicalize people.

The study examined whether the alleged radicalization is merely anecdotal or represents a genuine trend. The results do not rule out the existence of radicalization through social media, but they strongly suggest that it is not at all common.



Author: HP McLovincraft

Seeker of rabbit holes. Pessimist. Libertine. Contrarian. Your huckleberry. Possibly true tales of sanity-blasting horror also known as abject reality. Prepare yourself. Veteran of a thousand psychic wars. I have seen the fnords. Deplatformed on Tumblr and Twitter.
