Artifact Evaluation Committee Co-Chairs

Rezaul Chowdhury

Stony Brook University, U.S.

Bin Ren

College of William & Mary, U.S.

Artifact Evaluation Committee

Daniel DeLayo

Stony Brook University, U.S.

Xiaojun Dong

University of California Riverside, U.S.

Sophia Heck

University of Vienna, Austria

Florian Kurpicz

Karlsruhe Institute of Technology, Germany

Hans-Peter Lehmann

Karlsruhe Institute of Technology, Germany

Yuchen Ma

College of William & Mary, U.S.

Magdalen Manohar

Carnegie Mellon University, U.S.

Lara Ost

University of Vienna, Austria

Zhen Peng

Pacific Northwest National Laboratory, U.S.

Steffan Sølvsten

Aarhus University, Denmark

Zheqi Shen

University of California Riverside, U.S.

Kyle Singer

MIT, U.S.

Rahul Varki

University of Florida, U.S.

Gaurav Verma

Stony Brook University, U.S.

Letong Wang

University of California Riverside, U.S.

Yihua Wei

University of Iowa, U.S.

Yuanhao Wei

MIT, U.S.

Duo Zhang

University of California Merced, U.S.

ALENEX encourages authors of accepted papers (and accepted papers only) to submit their artifacts (code, data, and any scripts for running experiments, collecting data, and analyzing data) for Artifact Evaluation (AE). This process, conducted by a separate committee after the papers are accepted by ALENEX, verifies the extent to which the artifacts support the work described in those papers. Participation in AE is not mandatory. All papers already accepted by the original ALENEX PC will appear in the ALENEX proceedings irrespective of their participation/success in AE. However, AE results will be considered when inviting top papers to journal special issues. A paper that passes AE will receive SIAM badges for availability and reproducibility, which will be printed prominently on the title page of the paper, giving readers greater confidence in the presented results. AE participation will be publicized only for papers that receive badges.

Important Dates

  • Artifacts submission deadline:
  • Author rebuttal period: October 10 through
  • AE results notification by:

Submission Site

The submission site is located at https://alenex25ae.hotcrp.com/.

Badges Offered

Authors of accepted papers may apply for either or both of the following badges offered by SIAM:

  1. Availability badge
  2. Reproducibility badge

For evaluation by the AE committee, the authors must submit, through the ALENEX AE submission site, the code and data that implement the computational methods proposed in the paper.

  1. The SIAM/ALENEX Availability Badge

    The availability badge indicates that the artifacts associated with the paper are publicly available for long-term access and can be found by readers.

    The AE committee will validate the request for the badge based on code/data availability and sufficient (but not necessarily complete) coverage of the computational methods. The committee will also check whether a detailed README file is provided, describing the material and how to use it. The committee, however, will not verify whether the artifacts reproduce the results described in the paper.

    If the committee decides to award the badge, the evaluated artifacts (with modifications, if any, as deemed necessary by the committee) must be made publicly available on a permanent and immutable archival repository approved by ALENEX, e.g., Zenodo, figshare, or Dryad. A link to the repository must be included in the camera-ready version of the paper, and the paper title and author names must be included in the main README file of the repository.

  2. The SIAM/ALENEX Reproducibility Badge

    The reproducibility badge indicates that the AE committee was able to reproduce the main results of the paper within a reasonable tolerance limit using the artifacts submitted by the authors.

    The authors must include detailed descriptions of the submitted code and data files, as well as detailed instructions on how to use the submitted artifacts to reproduce the results presented in the paper. Following these instructions, the committee must be able to reproduce a substantial share of the paper’s main results. While exact reproduction is not required, any differences must not change the paper’s main claims. It is highly recommended that authors include a single high-level “runme.sh” script that automatically compiles the artifact, runs it (printing noteworthy events to the console), collects data (e.g., performance data), and produces files such as graphs or charts similar to the ones used in the paper; a sketch of what such a driver might automate appears after the badge descriptions below.

    If awarded, this badge does not require the artifacts to be made public unless the paper receives the availability badge as well. In that case, the version uploaded to the public repository must include detailed instructions for reproduction along with any modifications, as deemed necessary by the committee.
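
To make the “single high-level script” recommendation concrete, the following is a minimal sketch of what such a driver might automate, written here in Python purely for illustration (a plain shell runme.sh, as mentioned above, is equally suitable). The CMake build, the “benchmark” binary, its “--n” flag, and the output file names are hypothetical placeholders, not requirements of the AE process.

    #!/usr/bin/env python3
    """Hypothetical top-level driver for an ALENEX artifact (illustrative only).

    The project layout, the CMake build, the "benchmark" binary, and its "--n"
    flag are placeholders; substitute whatever your artifact actually uses.
    """
    import csv
    import subprocess
    from pathlib import Path

    import matplotlib
    matplotlib.use("Agg")                      # headless plotting on remote machines
    import matplotlib.pyplot as plt

    BUILD_DIR = Path("build")
    RESULTS = Path("results.csv")


    def build() -> None:
        """Compile the artifact (assumed here to be a CMake project)."""
        subprocess.run(["cmake", "-S", ".", "-B", str(BUILD_DIR)], check=True)
        subprocess.run(["cmake", "--build", str(BUILD_DIR)], check=True)


    def run_experiments() -> None:
        """Run the benchmark on several input sizes and collect timings in a CSV."""
        with RESULTS.open("w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["n", "seconds"])
            for n in (10**5, 10**6, 10**7):
                out = subprocess.run([str(BUILD_DIR / "benchmark"), "--n", str(n)],
                                     capture_output=True, text=True, check=True)
                seconds = float(out.stdout.strip())  # binary is assumed to print its runtime
                print(f"n = {n}: {seconds:.3f} s")   # progress for the reviewer's console
                writer.writerow([n, seconds])


    def plot() -> None:
        """Turn the collected CSV into a figure comparable to the paper's plots."""
        ns, ts = [], []
        with RESULTS.open() as f:
            for row in csv.DictReader(f):
                ns.append(int(row["n"]))
                ts.append(float(row["seconds"]))
        plt.plot(ns, ts, marker="o")
        plt.xscale("log")
        plt.xlabel("input size n")
        plt.ylabel("running time (s)")
        plt.savefig("figure1.pdf")


    if __name__ == "__main__":
        build()
        run_experiments()
        plot()

The important property is that a single command takes a reviewer from source code to console progress output, a data file, and figures comparable to those in the paper, without manual intervention.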

Submission and Packaging Guidelines

We recommend (but do not require) packaging the artifacts as a Docker image or in a virtual machine, particularly when the reproducibility badge is sought. This allows the reviewers to run the code in the execution environment intended by the authors, with minimal system-specific configuration on their part.

In the absence of a Docker/VM image, the reviewers will run the code on machines available to them for their own research using software/libraries that are already installed on them or that can be installed for free. We will try our best to match submissions with reviewers who have access to the required software/hardware resources. We strongly recommend testing the artifact on a fresh machine before submission to find and fix any missing dependencies.

In case the experiments must be run on specific/rare hardware (e.g., a machine with very large RAM or a CPU with a specific new feature) or require proprietary software/data that cannot be distributed to the AEC, please contact the AEC chairs ahead of time to make necessary arrangements for evaluation. Potential solutions include (with approval from the AEC chairs) giving the AEC access to the authors’ machines on which the original experiments were conducted or to suitable hardware through a cloud provider.

Confidentiality

AE submissions are confidential. AE committee members/reviewers will not retain/share any part of any artifact submitted for evaluation.

Communication with Reviewers

Throughout the review period, reviewers will be able to communicate anonymously with the authors via HotCRP to ask for clarifications and system-specific patches, and to resolve logistical problems. The goal of this continuous interaction is to prevent the rejection of artifacts due to minor issues. Reviews will be visible to the authors as soon as they are submitted, and the authors will have the opportunity to respond to reviewer questions/concerns and resolve any misunderstandings before the AE decision deadline.

For questions, please contact ALENEX’25 AE co-chairs, Rezaul Chowdhury ([email protected]) or Bin Ren ([email protected]).

Acknowledgment: Instructions on this webpage are partly based on well-established AE protocols used by other journals and conferences (e.g., SIAM SISC and ACM PPoPP).