"It is just a flu": Assessing the Effect of Watch History on YouTube's Pseudoscientific Video Recommendations


Zannettou,  Savvas
Internet Architecture, MPI for Informatics, Max Planck Society;

Papadamou, K., Zannettou, S., Blackburn, J., De Cristofaro, E., Stringhini, G., & Sirivianos, M. (2020). "It is just a flu": Assessing the Effect of Watch History on YouTube's Pseudoscientific Video Recommendations. Retrieved from https://arxiv.org/abs/2010.11638.

Cite as: https://hdl.handle.net/21.11116/0000-0007-89E5-C
YouTube has revolutionized the way people discover and consume videos,
becoming one of the primary news sources for Internet users. Since content on
YouTube is generated by its users, the platform is particularly vulnerable to
misinformative and conspiratorial videos. Worse still, the role played by
YouTube's recommendation algorithm in unwittingly promoting questionable
content is not well understood and may exacerbate the problem. This can have
dire real-world consequences, especially when
pseudoscientific content is promoted to users at critical times, e.g., during
the COVID-19 pandemic.
In this paper, we set out to characterize and detect pseudoscientific
misinformation on YouTube. We collect 6.6K videos related to COVID-19, flat
earth theory, and the anti-vaccination and anti-mask movements; using
crowdsourcing, we annotate them as pseudoscience, legitimate science, or
irrelevant. We then train a deep learning classifier to detect pseudoscientific
videos with an accuracy of 76.1%. Next, we quantify user exposure to this
content on various parts of the platform (i.e., a user's homepage, recommended
videos while watching a specific video, or search results) and how this
exposure changes based on the user's watch history. We find that YouTube's
recommendation algorithm is more aggressive in suggesting pseudoscientific
content when users are searching for specific topics, while these
recommendations are less common on a user's homepage or when actively watching
pseudoscientific videos. Finally, we shed light on how a user's watch history
substantially affects the type of recommended videos.
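The exposure measurement described above — following recommendation chains and counting how often annotated pseudoscientific videos appear — can be illustrated with a minimal sketch. This is not the paper's actual crawler: the recommendation graph, video IDs, and labels below are entirely hypothetical toy data, and real measurements would query YouTube with logged-in user profiles carrying different watch histories.

```python
import random

# Hypothetical recommendation graph: video id -> recommended video ids.
RECS = {
    "v1": ["v2", "p1"],
    "v2": ["v1", "v3"],
    "v3": ["p1", "v1"],
    "p1": ["p2", "v2"],
    "p2": ["p1", "v3"],
}

# Hypothetical annotations, mirroring the paper's three label classes.
LABELS = {"v1": "science", "v2": "science", "v3": "irrelevant",
          "p1": "pseudoscience", "p2": "pseudoscience"}

def exposure(start, steps, walks, rng):
    """Fraction of recommendations visited during random walks that are
    labeled pseudoscience; a crude proxy for user exposure."""
    hits = total = 0
    for _ in range(walks):
        node = start
        for _ in range(steps):
            node = rng.choice(RECS[node])  # follow one recommendation
            total += 1
            hits += LABELS[node] == "pseudoscience"
    return hits / total

if __name__ == "__main__":
    rng = random.Random(0)  # fixed seed for reproducible walks
    print(round(exposure("v1", steps=5, walks=1000, rng=rng), 2))
```

Varying the start node (or, in a real study, the account's watch history) and comparing the resulting exposure fractions is the basic idea behind the homepage / search / video-page comparisons in the abstract.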