Many-analysts studies explore how well an empirical claim withstands plausible alternative analyses of the same dataset by multiple, independent analysis teams. Conclusions from these studies typically rely on a single outcome metric (e.g. effect size) provided by each analysis team. Although informative about the range of plausible effects in a dataset, a single effect size from each team does not provide a complete, nuanced understanding of how analysis choices are related to the outcome. We used the Delphi consensus technique with input from 37 experts to develop an 18-item subjective evidence evaluation survey (SEES) to evaluate how each analysis team views the methodological appropriateness of the research design and the strength of evidence for the hypothesis. We illustrate the usefulness of the SEES in providing richer evidence assessment with pilot data from a previous many-analysts study.
Alexandra Sarafoglou, Suzanne Hoogeveen, Don van den Bergh, Balázs Aczél, Casper J. Albers, Tim Althoff, Rotem Botvinik‐Nezer, Niko A. Busch, Andrea M. Cataldo, Berna Devezer, Noah N. N. van Dongen, Anna Dreber, Eiko I. Fried, Rink Hoekstra, Sabine Hoffman, Felix Holzmeister, Jürgen Huber, Nick Huntington‐Klein, John P. A. Ioannidis, Magnus Johannesson, Michael Kirchler, Eric Loken, Jan-Francois Mangin, Dóra Matzke, Albert J. Menkveld, Gustav Nilsonne, Don van Ravenzwaaij, Martin Schweinsberg, Hannah Schulz-Kuempel, David R. Shanks, Daniel J. Simons, Barbara A. Spellman, Andrea H. Stoevenbelt, Barnabás Szászi, Darinka Trübutschek, Francis Tuerlinckx, Eric Luis Uhlmann, Wolf Vanpaemel, Jelte M. Wicherts, Eric‐Jan Wagenmakers (2024). Subjective evidence evaluation survey for many-analysts studies. Royal Society Open Science, 11(7). DOI: https://doi.org/10.1098/rsos.240125.
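As a purely illustrative sketch of the point made in the abstract, the snippet below tabulates per-team effect sizes alongside survey-style Likert ratings. The item names, rating scale, and data are hypothetical inventions for illustration, not the published 18 SEES items or any results from the paper.

```python
# Hypothetical sketch: summarizing per-team effect sizes alongside
# invented SEES-style Likert ratings (1-7). Item names and numbers are
# illustrative assumptions, not taken from the published survey.
from statistics import mean, median

teams = [
    {"team": "A", "effect_size": 0.21, "design_appropriateness": 5, "evidence_strength": 4},
    {"team": "B", "effect_size": 0.05, "design_appropriateness": 3, "evidence_strength": 2},
    {"team": "C", "effect_size": 0.34, "design_appropriateness": 6, "evidence_strength": 5},
]

# A single summary metric across teams...
effects = [t["effect_size"] for t in teams]
print(f"Median effect size across teams: {median(effects):.2f}")

# ...versus the extra context survey ratings can add about how each team
# judged the design and the strength of evidence.
for key in ("design_appropriateness", "evidence_strength"):
    ratings = [t[key] for t in teams]
    print(f"Mean {key}: {mean(ratings):.1f} (range {min(ratings)}-{max(ratings)})")
```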
Type: Article
Year: 2024
Authors: 40
Datasets: 0
Total Files: 0
Language: en
DOI: https://doi.org/10.1098/rsos.240125