  "It is just a flu": Assessing the Effect of Watch History on YouTube's Pseudoscientific Video Recommendations

Papadamou, K., Zannettou, S., Blackburn, J., De Cristofaro, E., Stringhini, G., & Sirivianos, M. (2020). "It is just a flu": Assessing the Effect of Watch History on YouTube's Pseudoscientific Video Recommendations. Retrieved from https://arxiv.org/abs/2010.11638.


Basic

Genre: Paper
LaTeX: "It is just a flu": {A}ssessing the Effect of Watch History on {YouTube}'s Pseudoscientific Video Recommendations

Files

arXiv:2010.11638.pdf (Preprint), 9MB
Name: arXiv:2010.11638.pdf
Description: File downloaded from arXiv at 2020-12-08 16:24
OA-Status:
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]
Technical Metadata:
Copyright Date: -
Copyright Info: -

Creators

Papadamou, Kostantinos¹, Author
Zannettou, Savvas², Author
Blackburn, Jeremy¹, Author
De Cristofaro, Emiliano¹, Author
Stringhini, Gianluca¹, Author
Sirivianos, Michael¹, Author
Affiliations:
¹ External Organizations, ou_persistent22
² Internet Architecture, MPI for Informatics, Max Planck Society, ou_2489697

Content

Free keywords: Computer Science, Computers and Society, cs.CY, cs.SI
Abstract: YouTube has revolutionized the way people discover and consume videos, becoming one of the primary news sources for Internet users. Since content on YouTube is generated by its users, the platform is particularly vulnerable to misinformative and conspiratorial videos. Even worse, the role played by YouTube's recommendation algorithm in unwittingly promoting questionable content is not well understood and could exacerbate the problem. This can have dire real-world consequences, especially when pseudoscientific content is promoted to users at critical times, e.g., during the COVID-19 pandemic.
In this paper, we set out to characterize and detect pseudoscientific misinformation on YouTube. We collect 6.6K videos related to COVID-19, the flat earth theory, and the anti-vaccination and anti-mask movements; using crowdsourcing, we annotate them as pseudoscience, legitimate science, or irrelevant. We then train a deep learning classifier to detect pseudoscientific videos with an accuracy of 76.1%. Next, we quantify user exposure to this content on various parts of the platform (i.e., a user's homepage, recommended videos while watching a specific video, or search results) and how this exposure changes based on the user's watch history. We find that YouTube's recommendation algorithm is more aggressive in suggesting pseudoscientific content when users search for specific topics, while these recommendations are less common on a user's homepage or while the user is actively watching pseudoscientific videos. Finally, we shed light on how a user's watch history substantially affects the type of recommended videos.

Details

Language(s): eng - English
Dates: 2020-10-22, 2020-11-03, 2020
Publication Status: Published online
Pages: 13 p.
Publishing info: -
Table of Contents: -
Rev. Type: -
Identifiers: arXiv: 2010.11638
BibTeX Citekey: Papadamou_arXiv2010.11638
URI: https://arxiv.org/abs/2010.11638
Degree: -
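
For reference, the identifiers above can be assembled into a BibTeX entry. The sketch below is not an official export: the @misc entry type, the eprint/archivePrefix fields, the cs.CY primary class, and the LaTeX-style quotation marks are conventions assumed here; the citekey, authors, brace-protected title, year, and URL are taken from this record.

@misc{Papadamou_arXiv2010.11638,
  author        = {Papadamou, Kostantinos and Zannettou, Savvas and Blackburn, Jeremy and
                   De Cristofaro, Emiliano and Stringhini, Gianluca and Sirivianos, Michael},
  title         = {``It is just a flu'': {A}ssessing the Effect of Watch History on
                   {YouTube}'s Pseudoscientific Video Recommendations},
  year          = {2020},
  eprint        = {2010.11638},
  archivePrefix = {arXiv},
  primaryClass  = {cs.CY},
  url           = {https://arxiv.org/abs/2010.11638},
}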
