
Released

Journal Article

Subjective evidence evaluation survey for many-analysts studies

MPS-Authors

Trübutschek, Darinka
Research Group Neural Circuits, Consciousness, and Cognition, Max Planck Institute for Empirical Aesthetics, Max Planck Society

Fulltext (public)

ncc-24-tru-02-subjective.pdf
(Publisher version), 2MB

Supplementary Material (public)
There is no public supplementary material available.
Citation

Sarafoglou, A., Hoogeveen, S., van den Bergh, D., Aczel, B., Albers, C. J., Althoff, T., et al. (2024). Subjective evidence evaluation survey for many-analysts studies. Royal Society Open Science. doi:10.1098/rsos.240125.


Cite as: https://hdl.handle.net/21.11116/0000-000F-B0F6-4
Abstract
Many-analysts studies explore how well an empirical claim withstands plausible alternative analyses of the same dataset by multiple independent analysis teams. Conclusions from these studies typically rely on a single outcome metric (e.g. effect size) provided by each analysis team. Although informative about the range of plausible effects in a dataset, a single effect size from each team does not provide a complete, nuanced understanding of how analysis choices are related to the outcome. We used the Delphi consensus technique with input from 37 experts to develop an 18-item subjective evidence evaluation survey (SEES) to evaluate how each analysis team views the methodological appropriateness of the research design and the strength of evidence for the hypothesis. We illustrate the usefulness of the SEES in providing a richer evidence assessment with pilot data from a previous many-analysts study.