New study further debunks “far-right” rabbit hole YouTube narrative

YouTube uses algorithms to recommend videos based on what you watch. There have been many stories claiming that YouTube's recommendation algorithms have "radicalized" people by flooding their feeds with a single subject, particularly when that subject involves "conspiracy theories."

Yet these accusations have themselves been called a "conspiracy theory," as several studies have debunked such claims.

A new study published Monday further suggests that video recommendations on YouTube do not radicalize people.

The study examined whether the alleged radicalization is merely anecdotal or represents a genuine trend. The results do not rule out the existence of radicalization through social media, but they strongly suggest that such radicalization is not at all common.


Author: HP McLovincraft

Seeker of rabbit holes. Pessimist. Libertine. Contrarian. Your huckleberry. Possibly true tales of sanity-blasting horror also known as abject reality. Prepare yourself. Veteran of a thousand psychic wars. I have seen the fnords. Deplatformed on Tumblr and Twitter.
