Objective. The ability of peer review to improve the scientific endeavour, e.g., the conduct, reporting, and validity of study findings, is increasingly being questioned (Tennant & Ross-Hellauer, 2020), and calls have been made to showcase the changes that occur to each study because of peer review (Limbu, 2020). Until such transparency is achieved, our objective is to identify and collect studies that analysed differences between preprints or submitted manuscripts and their peer-reviewed journal articles.

Design. We identified studies based on our knowledge of the field and by checking all research presented at peer review conferences (as podium presentations or posters). We also checked the references of identified studies. For all included studies we then extracted the year of publication, sampling method, conflicts of interest, funding, data and protocol sharing, number of analysed version-pairs, sample size calculation, scholarly discipline, method used to compare versions, variables (i.e., manuscript sections) analysed for changes, and the metric with which changes were quantified or qualitatively classified. Future steps will include a search of bibliographic databases (and preprint servers) and the launch of an online form that will allow anyone to submit missed studies for inclusion in the review. Current findings are only descriptive, but meta-analyses are planned.

Results. Of 25 studies published from 1990 to the end of 2021, 16 analysed changes between submitted and published papers and 9 between preprints and published papers. Changes were most often analysed by filling out questionnaires or scoring each of the two manuscript versions separately (n=11) or by comparing them visually (n=6). The median number of analysed version-pairs was 59 (IQR, 41-122). Most studies analysed changes in health (n=18) or social sciences (n=4) manuscripts. Overall, these studies found very high similarity between version-pairs, with the largest changes occurring in the introduction and discussion sections.

Conclusions. Current results indicate that submitted or pre-printed manuscript versions and their peer-reviewed journal versions are very similar, with the main (analysis) methods and main findings rarely changing. Quantification of these results is pending. Large differences between studies, types of changes, and the methods with which they were measured indicate a greater need for collaboration in the peer-review field and for core outcome measures for manuscript version changes.
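For illustration only: the Design paragraph lists the fields extracted per included study, and the Results paragraph reports a median and interquartile range over version-pair counts. The sketch below is a minimal, hypothetical Python representation of such an extraction record and summary, not the authors' actual data model or analysis code; all class, field, and function names are assumptions.

from dataclasses import dataclass, field
from statistics import median, quantiles

@dataclass
class ExtractionRecord:
    """One hypothetical record per included study, mirroring the extracted fields."""
    year: int
    sampling_method: str
    conflicts_of_interest: str
    funding: str
    data_and_protocol_sharing: str
    n_version_pairs: int
    sample_size_calculation: bool
    discipline: str                 # e.g. health, social sciences
    comparison_method: str          # e.g. separate scoring, visual comparison
    variables_analysed: list[str] = field(default_factory=list)
    change_metric: str = ""

def summarise_version_pairs(records: list[ExtractionRecord]) -> tuple[float, float, float]:
    """Return (median, Q1, Q3) of version-pair counts across the included studies."""
    counts = [r.n_version_pairs for r in records]
    med = median(counts)
    q1, _, q3 = quantiles(counts, n=4)  # quartile cut points
    return med, q1, q3

A structure like this would make the reported figures (e.g. median 59, IQR 41-122) reproducible from the extraction sheet, but the review's actual tooling is not described in the abstract.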
Mario Malički, Ana Jerončić, L. M. Bouter, Gerben ter Riet, John P. A. Ioannidis, IJsbrand Jan Aalbersberg, Steven N. Goodman (2022). Elucidating the effects of peer review. DOI: https://doi.org/10.15291/pubmet.3941.
Type: Article
Year: 2022
Authors: 7
Datasets: 0
Total Files: 0
Language: en
DOI: https://doi.org/10.15291/pubmet.3941