When new research is submitted to a journal, other experts in the field (peer reviewers) check the research to make sure it is reliable and clear. One important part of this process is ensuring that researchers follow reporting guidelines about what information should be included in their papers so that readers can understand how the research was conducted. We wanted to find out whether reminding peer reviewers to focus on the key parts of these guidelines (ie, the 10 most important items) would improve the reporting quality of published research papers. For this purpose, we conducted two studies in which we randomized manuscripts to either an intervention group or a control group. Peer reviewers of the manuscripts in the intervention group (half of the included manuscripts) received a reminder asking them to check whether the 10 most important reporting items were well described in the manuscript, whereas peer reviewers of manuscripts in the control group did not receive a reminder. In our previously published main results of these studies, we found that the reporting quality of the published articles did not improve with this intervention. To find out why this approach did not work, we looked more closely at the individual peer-review reports and checked how often reviewers asked for these important details and whether authors made the requested changes. We found that the reminders did lead to more requests about reporting items from peer reviewers. However, because a high proportion of peer-reviewed articles is rejected during the peer review process, and because not all requests for improvements are addressed by authors, this effect was "diluted" and no longer visible when assessing the published research articles.
Hillary Wnfried Ramirez, Malena Chiaborelli, Christof M Schönenberger, Katie Mellor, Alexandra Griessbach, Paula Dhiman, Pooja Gandhi, Szimonetta Lohner, Arnav Agarwal, Ayodele Odutayo, Michael Maia Schlüssel, Philippe Ravaud, David Moher, Matthias Briel, Isabelle Boutron, Sally Hopewell, Sara Schroter, Benjamin Speich (2025). Do peer reviewers comment on reporting items as instructed by the journal? A secondary analysis of two randomized trials. Journal of Clinical Epidemiology, 183. DOI: https://doi.org/10.1016/j.jclinepi.2025.111818.
Type: Article
Year: 2025
Authors: 18
Datasets: 0
Total Files: 0
Language: English
DOI: https://doi.org/10.1016/j.jclinepi.2025.111818