Artifact Evaluation

Authors of accepted papers with computational results will be invited to submit an Artifact Evaluation Package (AEP) to be reviewed and evaluated for reproducibility and usability by the Artifact Evaluation (AE) Committee.

The main objectives of the AE initiative are to:

  • carry out an independent reproduction of the results stated in the papers and thereby strengthen the quality of the published papers
  • find possible errors in the software implementation
  • provide feedback on the artifacts’ documentation and reusability
  • raise the profile of the FORMATS conference series by recognising the efforts made by authors towards open and reproducible research.

Papers that pass the AE will receive a badge that will appear on the first page of the published version; these papers will also be listed on the conference webpage and in the final proceedings as having passed the AE. The AE will also feature a Best Artifact award. Papers that do not pass the AE will simply be treated as papers that did not submit an AEP.

Submission Guidelines

AEP submission will be through EasyChair.

AEPs must include:

  1. a pdf of the submitted paper
  2. a document (webpage or pdf) indicating:
    • which specific results of the paper (tables, figures, etc.) are part of the AEP
    • the necessary instructions for installing and running the software so as to reproduce the specified results. Ideally, authors should provide push-button scripts that automatically generate the figures or tables from the paper (see the sketch after this list).
  3. all necessary software and data made available for the entire duration of the AE process in one of several ways (in decreasing order of preference):
    • a virtual machine, e.g., using VirtualBox or VMware
    • a Docker image along with host OS info in the accompanying document
    • source code that does not require local installation, e.g.,
      • a Code Ocean capsule
      • for submissions in the MATLAB ecosystem, a shared MATLAB Drive folder that can be run using MATLAB Online
    • source code that requires a local installation, e.g., a downloadable zip archive.
    • If none of the above alternatives is viable, please contact the AE Chairs: Akshay Rajhans and Paolo Zuliani (formats2022ae@easychair.org).
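
To illustrate what a push-button script might look like, the sketch below regenerates one table of a paper from scratch. It is only an example of the general shape, not a required format: the tool invocation (./verify), the benchmark file names, the "Table 2" label, and the output path are all hypothetical placeholders that would be replaced by whatever the artifact actually provides.

  #!/usr/bin/env python3
  """Hypothetical push-button script: rerun the benchmarks behind 'Table 2'
  of the paper and write the reproduced table to results/table2.csv."""

  import csv
  import pathlib
  import subprocess
  import time

  BENCHMARKS = ["train_gate.xml", "scheduler.xml", "fischer_4.xml"]  # placeholder models
  TOOL = ["./verify", "--quiet"]  # placeholder invocation of the artifact's tool

  def run_one(model: str) -> dict:
      """Run the tool on a single model, recording its verdict and wall-clock time."""
      start = time.perf_counter()
      proc = subprocess.run(TOOL + [f"benchmarks/{model}"],
                            capture_output=True, text=True, check=True)
      return {
          "model": model,
          "verdict": proc.stdout.strip().splitlines()[-1],  # assume the last output line is the verdict
          "time_s": round(time.perf_counter() - start, 2),
      }

  def main() -> None:
      out_dir = pathlib.Path("results")
      out_dir.mkdir(exist_ok=True)
      rows = [run_one(m) for m in BENCHMARKS]
      with open(out_dir / "table2.csv", "w", newline="") as f:
          writer = csv.DictWriter(f, fieldnames=["model", "verdict", "time_s"])
          writer.writeheader()
          writer.writerows(rows)
      print(f"Wrote {out_dir / 'table2.csv'} -- compare against Table 2 in the paper.")

  if __name__ == "__main__":
      main()

With a script along these lines, a reviewer can run a single command inside the VM or container and compare the generated table or figure directly against the paper, which greatly simplifies the evaluation.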

We strongly encourage authors to arrange for an independent test of their AEPs before the final submission. This could be performed, for example, by collaborators or colleagues who are not familiar with the artifact in question.

We also suggest that authors add a suitable license if their artifact is to be made publicly available.

Evaluation Process

The AE process will be single-blind. 

  • AEPs will be reviewed by at least three anonymous members of the AE Committee. Reviewers will evaluate AEPs based on the Evaluation Criteria listed below.
  • To maintain the single-blind review process, authors must not try to discover the identity of the reviewers and must turn off any kind of analytics in their AEPs.
  • AEPs will be treated as confidential material. The AE Committee members will not
    • use the AEPs for any purpose except reviewing for FORMATS
    • share the AEPs under evaluation
    • store the AEPs after the AE process has concluded
  • If evaluating an AEP would incur a financial cost (e.g., running it on a cloud service), it might not be evaluated.

Evaluation Criteria

Reproducibility and usability are the two main criteria used by the reviewers. Please see this ACM page for a description of AE terminology.

Reproducibility essentially means that the results of the paper have been independently obtained by a team other than the authors using artifacts provided by the authors. Reviewers will assign a score and provide feedback based on:

  • how many of the results in the paper have been reproduced, and to what degree, by following the instructions supplied by the authors
  • how easy it is to reproduce the results of the paper, which includes downloading, installing, and running the AEP
  • the quality and the extent of the supplied documentation for using the AEP.

Usability covers all aspects of good practice regarding documentation, the software development process and/or certification, and the usage and extension of the software. Reviewers will give special consideration in their written feedback to packages that include the following nice-to-haves:

  • how well-architected and easy to follow the source code is (if provided)
  • any additional efforts taken by the authors for ensuring correctness of their implementation of algorithms (e.g., unit tests, statistical analyses, formal verification)

Example AEP 

  • Citation: Akshay Rajhans, Srinath Avadhanula, Alongkrit Chutinan, Pieter J. Mosterman, and Fu Zhang, “Graphical Modeling of Hybrid Dynamics with Simulink and Stateflow,” In Proceedings of the 21st ACM International Conference on Hybrid Systems: Computation and Control, 2018. Best Repeatability Evaluation Award Finalist.
  • Official version: https://doi.org/10.1145/3178126.3178152
  • Author postprint: https://arajhans.github.io/files/papers/RajhansAC+_HSCC18.pdf
  • AEP files
    • Shared MATLAB Drive folder: Link
    • Accompanying instructions in PDF: Link
      • Note that these instructions were originally intended to work with MATLAB Release R2017b. The code in the MATLAB Drive folder has been slightly modified to directly work with the latest MATLAB release (R2021b at the time of posting this) in MATLAB Online.