Free keywords:
Computer Science, Computers and Society, cs.CY
Abstract:
YouTube is by far the largest host of user-generated video content worldwide.
Alas, the platform also hosts inappropriate, toxic, and/or hateful content. One
community that has come into the spotlight for sharing and publishing hateful
content is the so-called Involuntary Celibates (Incels), a loosely defined
movement ostensibly focused on men's issues, which has often been linked to
misogynistic views.
In this paper, we set out to analyze the Incel community on YouTube by
focusing on the evolution of this community over the last decade and
understanding whether YouTube's recommendation algorithm steers users towards
Incel-related videos. We collect videos shared on Incel-related communities
within Reddit, and perform a data-driven characterization of the content posted
on YouTube. Among other things, we find that the Incel community on YouTube is
gaining traction, and that the number of Incel-related videos and comments has
risen substantially over the last decade. We also quantify the probability that
a user will encounter an Incel-related video by virtue of YouTube's
recommendation algorithm. Starting from a non-Incel-related video, this
probability is 1 in 5 within five hops, which is alarmingly high given that
such content is likely to promote toxic and misogynistic views.