
Table 1 Infectious Disease Modeling Reproducibility Checklist elements reported in COVID-19 modeling papers

From: Inter-rater reliability of the infectious disease modeling reproducibility checklist (IDMRC) as applied to COVID-19 computational modeling research

| Question | Mean Percent Agreement (95% CI) | Fleiss Kappa (95% CI) |
|---|---|---|
| **Computational Environment** | | |
| 1.1) Is the operating system documented? | 0.94 (0.87, 1.00) | 0.90 (0.81, 0.97) |
| 1.2) Is the operating system version documented? | 0.94 (0.87, 1.00) | 0.90 (0.79, 0.98) |
| **Analytical Software** | | |
| 2.1) Is the name of the analytical software documented (e.g., the programming language name)? | 0.88 (0.79, 0.97) | 0.82 (0.71, 0.92) |
| 2.2) Is the analytical software accessible for free? | 0.82 (0.71, 0.93) | 0.68 (0.54, 0.81) |
| 2.3) Is the version of the analytical software documented? | 0.79 (0.67, 0.90) | 0.75 (0.64, 0.85) |
| 2.4) Do the authors include a specific identifier (DOI, URL, citation) that points to the analytical software that was used? | 0.76 (0.64, 0.89) | 0.72 (0.61, 0.81) |
| 2.5) Is the analytical software installation guide accessible online? | 0.89 (0.80, 0.98) | 0.75 (0.60, 0.89) |
| **Model Description** | | |
| 3.1) Is the complete, structured model description provided in the publication, supplement, or referenced publication? | 0.53 (0.38, 0.67) | 0.58 (0.51, 0.66) |
| 3.2) Is the model specified in the publication or supplement (contrary to being referenced in other papers)? | 0.91 (0.83, 0.99) | 0.84 (0.70, 0.94) |
| **Model Implementation ("Code")** | | |
| 4.1) Is the model implementation (e.g., code, workflow) openly accessible online? | 0.86 (0.75, 0.96) | 0.86 (0.77, 0.94) |
| 4.2) Does the model implementation (e.g., code, workflow) have a version or modification date? | 0.84 (0.73, 0.94) | 0.69 (0.55, 0.84) |
| 4.3) Does the model implementation (e.g., code, workflow) have an identifier? | 0.79 (0.67, 0.90) | 0.74 (0.62, 0.85) |
| 4.4) Is the computer language of the model implementation (e.g., code, workflow) documented? | 0.62 (0.48, 0.76) | 0.39 (0.22, 0.54) |
| 4.5) Are all model implementation (e.g., code, workflow) dependencies clearly specified in either the publication or supplemental files? | 0.69 (0.56, 0.83) | 0.63 (0.50, 0.75) |
| 4.6) Are the model implementations (e.g., code, workflow) annotated with comments? | 0.94 (0.87, 1.00) | 0.80 (0.66, 0.92) |
| **Data** | | |
| 5.1) Does the model in the publication use input data? | 0.75 (0.62, 0.87) | 0.59 (0.47, 0.70) |
| 5.2) Has the source and content of the input data been described in the publication or supplement? | 0.59 (0.45, 0.74) | 0.36 (0.20, 0.52) |
| 5.3) Does the paper cite a specific, unique, and persistent identifier to refer to each input dataset? | 0.54 (0.39, 0.68) | 0.36 (0.20, 0.52) |
| 5.4) Is the input data openly accessible? | 0.66 (0.52, 0.80) | 0.34 (0.16, 0.52) |
| 5.5) Is the data in a format that can be easily re-formatted (or "parsable") to meet the input specifications of the model implementation? | 0.55 (0.41, 0.69) | 0.23 (0.10, 0.40) |
| **Experimental Protocol** | | |
| 6.1) Are all the mentioned parameter values for the model implementation (e.g., code, workflow) documented in a single location (e.g., table or list in the publication or supplement)? | 0.65 (0.51, 0.79) | 0.69 (0.60, 0.77) |
| 6.2) Is there an explanation of how the described/mentioned categories (computational environment, analytical software, model implementation, and data) were used together to create the results (e.g., figures and/or tables)? | 0.50 (0.36, 0.64) | 0.58 (0.52, 0.64) |

Abbreviations: CI, confidence interval.
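For readers unfamiliar with the chance-corrected agreement statistic reported in the right-hand column, a minimal sketch of Fleiss' kappa is below. This is the standard textbook formula, not code from the study; the function name and the toy rating matrices are illustrative. Each row of the input gives, for one rated item, how many raters assigned it to each category.

```python
def fleiss_kappa(counts):
    """Fleiss' kappa for a subjects-by-categories count matrix.

    counts[i][j] = number of raters who assigned subject i to category j.
    Every row must sum to the same number of raters n.
    """
    N = len(counts)            # number of rated items
    n = sum(counts[0])         # raters per item
    k = len(counts[0])         # number of categories
    total = N * n
    # Marginal proportion of all assignments falling in each category.
    p = [sum(row[j] for row in counts) / total for j in range(k)]
    # Observed agreement: mean proportion of agreeing rater pairs per item.
    P_bar = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts
    ) / N
    # Expected agreement under chance, from the category marginals.
    P_e = sum(pj * pj for pj in p)
    return (P_bar - P_e) / (1 - P_e)
```

For example, three raters who unanimously agree on every item yield a kappa of 1.0, while systematic 2-vs-1 splits drive kappa below zero, which is why the lower values in the Data rows (e.g., 0.23 for question 5.5) indicate only slight agreement beyond chance.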