Rationale

Climate Informatics, like many other communities and fields, has software at its heart. Underlying most publications is a novel piece of software playing some critical role: embodying a model, processing or analysing data, or producing a visualisation. For such software artifacts to have the greatest impact, they should be available, functional, and reusable, so that other researchers can benefit from the work, verify the claims of the paper, and build upon the software to do further great work. These ideals are summarised by the FAIR principles, originally formulated for data but equally applicable to software: research software should be Findable, Accessible, Interoperable, and Reusable. To help promote FAIR software, Climate Informatics is embarking, for the first time, on an Artifact Evaluation phase for full paper submissions. Evaluation takes place after a submission has been accepted, via the traditional peer-review process, for publication in the conference proceedings in Environmental Data Science. Artifact Evaluation provides a lightweight, opt-in way to embed the values of reproducibility into the publication process, encouraging authors to make their software available and the results of their papers reproducible.

Selection Criteria

Artifacts serve two broad purposes: facilitating reproduction of the work, and enabling its reuse by future scientists. Reuse goes beyond reproduction by allowing future scientists to, for example, extend a tool with new features or inspect the exact definitions used in a formal proof.

To facilitate reproduction and reuse, an artifact should be:

  • consistent with the claims of the paper and the results it presents;

  • as complete as possible, supporting all claims of the paper;

  • well-documented;

  • future-proof;

  • easy to extend and modify (a minimal sketch illustrating these criteria in practice follows this list).
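
To make these criteria concrete, here is a minimal sketch of a reproduction script an artifact might ship. It is purely illustrative, not a requirement of the evaluation, and every name in it (the script, the function, the seed, the output file) is hypothetical: the idea is simply that fixing the random seed behind the paper's numbers and recording provenance alongside the results supports the consistency, documentation, and future-proofing criteria above.

```python
# reproduce.py -- hypothetical illustration only; nothing here is
# prescribed by the Climate Informatics artifact evaluation.
import json
import platform
import random
import sys

SEED = 42  # fixing the seed makes the reported numbers exactly repeatable


def run_experiment(seed: int) -> list:
    """Stand-in for the paper's analysis; a real artifact would call its own code here."""
    rng = random.Random(seed)
    return [rng.gauss(0.0, 1.0) for _ in range(10)]


def main() -> None:
    results = run_experiment(SEED)
    # Record provenance next to the results so a reviewer can see exactly
    # what produced them and repeat the run on their own machine.
    provenance = {
        "python": sys.version,
        "platform": platform.platform(),
        "seed": SEED,
    }
    with open("results.json", "w") as out:
        json.dump({"results": results, "provenance": provenance}, out, indent=2)


if __name__ == "__main__":
    main()
```

Running `python reproduce.py` then produces a results file that is deterministic given the recorded seed and environment: the kind of self-documenting, repeatable behaviour the criteria above point toward.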

Artifacts that satisfy these criteria will be awarded at least one of the ACM-based “Available”, “Functional” and “Reusable” badges. For more details on the badges and the evaluation criteria, see the Evaluation Guidelines.

Publishing workflow

We want to avoid disincentivising authors who opt into the reproducibility challenge by subjecting them to a slower publication timeline. Papers that opt in will therefore initially be published without badges, so as not to delay publication; after artifact evaluation, we will publish an addendum and retrospectively update the article with the badge(s).

The CUP platform has no functionality for versioning articles. However, there is precedent for retrospectively updating a published article with a badge via an addendum: see, for example, https://doi.org/10.1017/dap.2022.5. In that case the omission was a mistake: the original article (https://doi.org/10.1017/dap.2021.38) should have been awarded the Open Data and Open Materials badges but was not, so we published an addendum and retrospectively updated the original article and its Data Availability Statement with the badges.

Retrospective report

After the artifact evaluation process, the reproducibility chairs will produce a general report documenting the approach and reporting on experiences. Reviewers and authors will be invited to co-author the report and contribute their experiences; any experiences reported will, however, be anonymised with respect to the artifacts they concern.