JASA Reproducibility Guide


Reproducibility guide for authors and reviewers publishing in the Journal of the American Statistical Association (JASA).

View the Project on GitHub jasa-acs/repro-guide

Reproducibility Review Form: Evaluation Criteria

Criterion 1 - Data availability

Are the data available in a public repository (or, if not yet available, is it clear how the data will be made available upon publication)? If not, does the authors' rationale for not making the data available seem reasonable?

Criterion 2 - Data integrity

Do the data provided with the submission represent as closely as possible the data originally available to the authors, whether from a public source (e.g., US Census data) or data collected by the authors or their colleagues?

Criterion 3 - Data documentation and usability

Are the data in a form, including with clear metadata and in a non-proprietary format, that can be used and understood by others? Does the documentation adequately describe the variables used in the analyses?
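One common way to satisfy this criterion is to ship a plain-text codebook alongside the data. The sketch below is purely illustrative (the variable names and `write_codebook` helper are hypothetical, not part of any JASA requirement): it writes a CSV data dictionary that describes each analysis variable without requiring proprietary software.

```python
import csv
import io

# Hypothetical codebook: one row per analysis variable, giving its
# name, type, units, and a short human-readable description.
CODEBOOK = [
    {"variable": "age", "type": "integer", "units": "years",
     "description": "Participant age at enrollment"},
    {"variable": "sbp", "type": "numeric", "units": "mmHg",
     "description": "Systolic blood pressure, average of 3 seated readings"},
]

def write_codebook(rows, stream):
    """Write the codebook as CSV, a non-proprietary format that any
    reviewer can open without special software."""
    writer = csv.DictWriter(
        stream, fieldnames=["variable", "type", "units", "description"])
    writer.writeheader()
    writer.writerows(rows)

buf = io.StringIO()
write_codebook(CODEBOOK, buf)
print(buf.getvalue().splitlines()[0])  # → variable,type,units,description
```

A reviewer checking this criterion would look for exactly this kind of file: every variable used in the analyses documented, in a format readable outside the authors' software stack.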

Criterion 4 - Code availability

Is the code available in a public repository (or if not yet available, is it clear how it will be made available upon publication)?

Criterion 5 - Code clarity

Is the code in a form that can be used and understood by others, including being readable at a line-by-line level in terms of syntax and comments?
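As a hypothetical illustration of what "readable at a line-by-line level" can look like in practice, the function below (a trimmed mean chosen only for the sake of example) pairs a docstring that documents each argument with inline comments explaining the non-obvious steps:

```python
def trimmed_mean(values, trim_fraction=0.1):
    """Return the mean after dropping the lowest and highest
    trim_fraction of observations from each tail.

    Parameters
    ----------
    values : list of float
        Raw observations.
    trim_fraction : float
        Fraction trimmed from EACH tail; must lie in [0, 0.5).
    """
    if not 0 <= trim_fraction < 0.5:
        raise ValueError("trim_fraction must be in [0, 0.5)")
    ordered = sorted(values)
    k = int(len(ordered) * trim_fraction)  # observations dropped per tail
    kept = ordered[k:len(ordered) - k] if k else ordered
    return sum(kept) / len(kept)

print(trimmed_mean([1, 2, 3, 4, 100], trim_fraction=0.2))  # → 3.0
```

Code written to this standard lets a reviewer verify each line's intent against its syntax, which is precisely what this criterion asks about.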

Criterion 6 - Documentation of workflow

Is there a clear, documented workflow (including data preparation/cleaning steps and analyses) to reproduce the results? Are all key results (figures and tables) supported by the documented workflow?

Are the inputs to and outputs from the different components of the workflow adequately described? Are input values, function arguments, and parameter settings appropriately documented?

Are system requirements for the workflow appropriately documented?
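A documented workflow often takes the form of a single driver script in which every step names its inputs, outputs, and parameter settings. The sketch below is a deliberately simplified, hypothetical example (the step functions and `PARAMS` values are invented for illustration), showing the kind of traceability this criterion asks reviewers to look for:

```python
"""Hypothetical end-to-end driver: each step documents its inputs,
outputs, and parameter settings so every key result can be traced
back through the workflow."""

# All tunable settings collected in one documented place.
PARAMS = {
    "seed": 20240101,     # RNG seed for any stochastic steps
    "n_bootstrap": 2000,  # bootstrap replicates (illustrative)
    "alpha": 0.05,        # nominal level for all intervals
}

def clean_data(raw):
    """Step 1 -- data preparation.
    Input: raw list of observations (None = missing outcome).
    Output: cleaned list with missing outcomes dropped."""
    return [r for r in raw if r is not None]

def analyze(cleaned, params):
    """Step 2 -- analysis.
    Input: cleaned data and the PARAMS dict above.
    Output: the summary statistic reported in the paper
    (here just a mean, as a stand-in for a real analysis)."""
    return sum(cleaned) / len(cleaned)

raw = [4.0, None, 6.0]
result = analyze(clean_data(raw), PARAMS)
print(result)  # → 5.0
```

In a real submission the same structure would be accompanied by a statement of system requirements (language version, package versions, operating system), for example in a README or an environment file.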

Criterion 7 - Reproducibility

[NOTE: Reproducibility reviewers may choose to, but are NOT required to, run the submitted code to verify that it reproduces the key results. Depending on whether you have chosen to undertake this step, please answer the relevant question below.]

As best you can judge without having run the code, do you have any concerns that the code would not reproduce the key results?

Based on having run the code, did the workflow allow you to reproduce the key results?
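A frequent reason code fails to reproduce key results is unseeded or globally shared randomness. The hypothetical sketch below (the `simulate` function is invented for illustration) shows the kind of pattern a reviewer might check for: a fixed seed passed to a local generator, so repeated runs yield identical draws.

```python
import random

def simulate(seed, n=5):
    """Draw n pseudo-random values from a locally seeded generator,
    so the 'key result' is identical on every run and is not
    affected by other code touching the global random state."""
    rng = random.Random(seed)  # local generator, explicit seed
    return [rng.random() for _ in range(n)]

run1 = simulate(seed=12345)
run2 = simulate(seed=12345)
print(run1 == run2)  # → True: identical draws across runs
```

Code that instead relies on an unseeded global generator may produce different figures and tables on each execution, which is exactly the concern this criterion asks reviewers to flag.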

Other comments

Do you have any specific comments on issues not covered by the criteria above? Are any parts of the Author Contributions Checklist (ACC) Form incomplete, inadequate, or unclear?


Please summarize your thoughts about the quality of the data, code, and workflow and the potential for reproducibility of the work.