Guidelines for Reviewers
Reviewers will be recruited to the Artifact Evaluation Committee (AEC) from the Climate Informatics community (join here) and the Turing Environment and Sustainability Grand Challenge community (join here). Please join either (or both!) communities to receive updates on reviewer training opportunities.
Requirements for reviewers
Reviewers must have a minimum level of practical experience in climate data science. This ensures that the review process can focus on technical detail rather than computational literacy. Reviewers will be required to evidence their experience through their own software publications.
Expected workload
- Each reviewer will be assigned 2 submissions.
- There will be a 2-hour onboarding session to get everyone up to speed.
- Each review can be expected to take up to 2 working days to complete, spread over the ~5 weeks of the review process, including time spent communicating with the submitting author via HotCRP. An artifact that takes longer than 2.5 working days to review will be considered “unreproducible” for the purposes of this review process.
- Final reviews should be submitted by Thursday 24 October.
Benefits to reviewers
Reviewers will benefit from hands-on training in high-fidelity computational reproducibility, which we anticipate will support their own development of reproducible research artifacts. Reviewers will be able to cite their contribution as evidence of leadership in culture change towards reproducibility, and will be invited to co-author a retrospective report to be published in Environmental Data Science after the review process is complete. Throughout, they will be supported by the Climate Informatics Reproducibility team and the AEC, strengthening their connections with this highly skilled team.
Evaluation process
Reviewers will assess the artifacts against a checklist provided for each “badge” level to be awarded.
For each checklist point, reviewers will be required to briefly note the evidence for that point, anything they did to validate it (e.g., what was needed to reproduce a claim), the outcome (negative, neutral, or positive), and a brief reason for their judgement.
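To make the expected shape of a review concrete, here is a minimal sketch in Python of how a single checklist-point judgement could be recorded. The field names and the example entry are illustrative assumptions, not an official template; reviews themselves are submitted through HotCRP.

```python
from dataclasses import dataclass
from enum import Enum


class Outcome(Enum):
    NEGATIVE = "negative"
    NEUTRAL = "neutral"
    POSITIVE = "positive"


@dataclass
class ChecklistPointReview:
    """One reviewer judgement against a single checklist point.

    Field names are hypothetical, chosen to mirror the four things
    a reviewer is asked to note for each point.
    """
    point: str        # the checklist point being assessed
    evidence: str     # where in the artifact the evidence lives
    validation: str   # what the reviewer did to validate the point
    outcome: Outcome  # negative, neutral, or positive
    reason: str       # brief reason for the judgement


# Hypothetical example of an entry a reviewer might record:
review = ChecklistPointReview(
    point="Code reproduces the paper's main figure",
    evidence="notebooks/figure2.ipynb with committed output cells",
    validation="Re-ran the notebook in the environment provided by the authors",
    outcome=Outcome.POSITIVE,
    reason="The regenerated figure matched the published version",
)
```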
Support
There was a 1-hour informal session on 19 September 2024 for reviewers who expressed interest in evaluating the submitted artifacts. The recording is available here.
We also organised a panel about reproducibility at CI2024, which provided some motivation and inspiration. The recording is available here.
The artifact chairs are available to answer questions about the process, to advise on preparing an artifact, and to help if any part of the process is unclear.