ISSTA 2025
Wed 25 - Sat 28 June 2025, Trondheim, Norway

The Artifact Evaluation process is a service provided by the community that helps authors of accepted papers offer more substantial supplements to their papers so that future researchers can more effectively build on and compare with previous work. ISSTA invites submissions for artifact evaluation. Research artifacts are digital objects that were either created by the authors of a research article for use in their study or generated by their experiments.

The artifact evaluation (AE) process aims to foster reproducibility and reusability. Reproducibility refers to researchers or practitioners being able to validate the paper’s results using the provided artifact. Reusability means that researchers can extend the artifact or use it in a different context or for a different use case. Overall, the artifact evaluation process allows our field to progress by incentivizing and supporting authors to make their artifacts openly available and to improve their quality. Furthermore, a formal artifact evaluation documents the quality of the published research through widely recognized badges stamped directly on the published papers. It is therefore common for high-quality conferences, such as ISSTA, to offer the authors of accepted papers an artifact evaluation service before publication.

More details can be found in the ACM guidelines on Artifact Review and Badging (Version 1.1).

Call for Reviewers

We are pleased to invite you to contribute your expertise as a reviewer on the Artifact Evaluation Committee for ISSTA. This role is an excellent opportunity for senior PhD students, postdocs, and professionals who have previously engaged in the AE process as authors or reviewers. While such experience is highly valued, it is not a strict requirement.

What we are looking for

  • Expertise in Software Testing and Analysis: Your specialized knowledge in these fields is crucial for providing high-quality reviews.
  • Commitment to constructive evaluations: We value your dedication to providing detailed and constructive feedback.
  • Availability: Ability to review 2-3 artifacts from April 12th to May 12th, 2025.

Reviewer responsibilities

  • Evaluate artifacts thoroughly: Assess the submissions’ quality, reproducibility, and relevance.
  • Provide clear, constructive feedback: Your insights will significantly assist authors in improving their work.
  • Participate in committee discussions: Engage in deliberations to determine the allocation of artifact badges.

Benefits of being a reviewer

  • Stay informed on emerging research: Keep abreast of the latest developments in software testing and analysis.
  • Network with peers: Connect with other senior researchers and professionals in the field.
  • Earn recognition: Your contributions will be acknowledged on the ISSTA website and in conference proceedings.

Application Process

  • Self-Nomination: Please express your interest by filling out this Google form.
  • Application Deadline: March 23, 2025
  • Decision Notification: Reviewer selections will be communicated at the beginning of April 2025.

Your expertise and contributions as a reviewer are invaluable to the success of ISSTA. We eagerly anticipate your participation and look forward to receiving your application.

Call for Artifacts

The following instructions provide an overview of preparing an artifact for submission. Please also read the instructions and explanations in the subsequent sections on this page before submission.

  1. Prepare your artifact: Along with the artifact itself, prepare a README file (with a .txt, .md, or .html extension) containing two sections (a minimal skeleton is sketched after this list):
    • Getting started: demonstrates how to set up the artifact and validate its general functionality (e.g., based on a small example) in less than 30 minutes.
    • Detailed description: describes how to validate the paper’s claims and results.
  2. Include a preprint of your paper: A preprint of the paper associated with the artifact must be included in the submission package. This is essential to assess whether the artifact adequately supports the claims made in the paper. The preprint should be the accepted version of the paper to ease the artifact evaluation process, allowing reviewers to directly correlate the claims with the provided artifact.
  3. Upload the artifact: Use Zenodo or a similar service to acquire a DOI for your artifact. This step ensures that your artifact is accessible and can be appropriately cited.
  4. Submit through HotCRP: Provide the DOI and additional information about your artifact, including the paper abstract and the included preprint, using HotCRP.
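
As a rough illustration, a README for a typical artifact might be structured as follows. This is only a hedged sketch: the commands, file names, and numbers are placeholders to be adapted to your artifact.

```
# Artifact for "<paper title>"

## Getting started (~30 minutes)
1. Requirements: Docker 24+, roughly 10 GB of free disk space.
2. Load the container image:   docker load -i artifact-image.tar
3. Run the smoke test:         docker run artifact ./smoke_test.sh
   Expected output: "12/12 checks passed" (takes about 5 minutes).

## Detailed description
- Claim 1 (Table 2): run ./run_experiment.sh table2 (~2 hours);
  the results are written to results/table2.csv.
- Claim 2 (Figure 5): run ./run_experiment.sh figure5 (~45 minutes);
  the plot is written to results/figure5.pdf.
```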

The Artifact Evaluation Process

The following provides a detailed explanation of the scope of artifacts, the goal of the evaluation process, and the submission instructions. Artifacts can be of a variety of types, including (but not limited to):

  • Tools, which are standalone systems.
  • Data repositories storing, for example, logging data, system traces, or survey raw data.
  • Frameworks or libraries, which are reusable components.
  • Machine-readable proofs (see the Guide on Proof Artifacts by Marianna Rapoport).

If you are in doubt whether your artifact can be submitted to the AE process, please contact the AE chairs.

Evaluation Objectives and Badging

The evaluation of the artifacts targets three objectives defined in the ACM guidelines on Artifact Review and Badging:

  • Availability (badge: Artifacts Available v1.1): The artifact should be available and accessible to everyone interested in inspecting or using it. As detailed below, an artifact has to be uploaded to Zenodo to obtain this badge.
  • Functionality (badge: Artifacts Evaluated - Functional v1.1): The main claims of the paper should be backed up by the artifact.
  • Reusability (badge: Artifacts Evaluated - Reusable v1.1): Other researchers or practitioners should be able to inspect, understand, and extend the artifact.

Each objective is assessed as part of the evaluation process, and each successful outcome is awarded the corresponding ACM badge.

Availability

Your artifact should be available via Zenodo, a publicly funded platform that supports open science. The artifact needs to be self-contained. During the upload, you must select a license and provide additional information, such as a description of the artifact. Zenodo will generate the DOI necessary for the artifact evaluation submission (HotCRP). Note that, once published, the artifact is immediately public and can no longer be modified or deleted. However, it is possible to upload an updated version of the artifact that receives a new DOI (e.g., to address reviewer comments during the kick-the-tires response phase).
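
If you prefer to script the upload, Zenodo also offers a REST API. Below is a minimal sketch in Python; the token, file name, and metadata values are placeholders, and since API details may change, please consult the current Zenodo API documentation. Testing against the Zenodo sandbox (sandbox.zenodo.org) first avoids accidentally publishing a draft.

```python
import requests

ZENODO = "https://zenodo.org/api"
TOKEN = "YOUR-ZENODO-ACCESS-TOKEN"  # placeholder: a personal access token

# 1. Create an empty deposition (a draft record).
r = requests.post(f"{ZENODO}/deposit/depositions",
                  params={"access_token": TOKEN}, json={})
r.raise_for_status()
dep = r.json()

# 2. Upload the self-contained artifact archive into the deposition's file bucket.
with open("artifact.zip", "rb") as fp:
    requests.put(f"{dep['links']['bucket']}/artifact.zip",
                 data=fp, params={"access_token": TOKEN}).raise_for_status()

# 3. Attach the required metadata: title, type, description, license, authors.
meta = {"metadata": {
    "title": "Artifact for <paper title>",
    "upload_type": "software",
    "description": "Artifact accompanying our ISSTA 2025 paper.",
    "license": "mit",               # placeholder: choose your actual license
    "creators": [{"name": "Doe, Jane"}],
}}
requests.put(dep["links"]["self"],
             params={"access_token": TOKEN}, json=meta).raise_for_status()

# 4. Publish: this step is irreversible and mints the DOI.
r = requests.post(dep["links"]["publish"], params={"access_token": TOKEN})
r.raise_for_status()
print("DOI:", r.json()["doi"])
```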

Zenodo’s default storage is currently limited to 50 GB per artifact but can be extended on request (see the Zenodo FAQ, section “Policies”). Still, please keep the size reasonably small to support reviewers in the process.

Functionality

To judge the functionality and reusability of an artifact, two to three reviewers will evaluate every submission. The reviewers will evaluate the artifact in detail and validate that it backs up the paper’s important claims.

The README file is crucial for guiding reviewers through the evaluation process and should include:

  • Getting started: This section must outline the necessary steps to set up the artifact and verify its general functionality. This could involve:
    • Listing the artifact’s requirements, with considerations for different operating systems or environments.
    • Detailed instructions for initializing the artifact, whether it involves compiling source code, running a virtual machine or container, or other setup processes.
    • Specific commands or actions reviewers should perform to verify basic functionality, including expected outcomes and approximate time required for these steps. The goal is for reviewers to complete this part within 30 minutes.
  • Detailed description: This section should demonstrate how the artifact supports each claim and result presented in the paper. It may include:
    • Step-by-step instructions to replicate the experiments or analyses.
    • Explanation of how the artifact’s outputs validate the paper’s claims.
    • Any necessary background information or context to understand the artifact’s operation and significance.
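
As a concrete (hypothetical) example of pairing a verification step with its expected outcome, the “Getting started” section could point reviewers to a small self-checking script along the following lines; the tool name, input file, and expected output are placeholders:

```python
#!/usr/bin/env python3
"""Smoke test: checks the artifact's basic functionality in about a minute."""
import subprocess
import sys

# Placeholder: run the (hypothetical) tool on a small bundled example.
result = subprocess.run(
    ["./mytool", "--analyze", "examples/toy.c"],
    capture_output=True, text=True, timeout=60,
)

# Expected outcome: exit code 0 and a report of exactly 3 detected issues.
ok = result.returncode == 0 and "3 issues found" in result.stdout
print("PASS" if ok else f"FAIL\n{result.stdout}\n{result.stderr}")
sys.exit(0 if ok else 1)
```
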
Reusability

For the “Artifacts Evaluated - Reusable” badge, all requirements for the “Artifacts Evaluated - Functional” badge need to be met as a prerequisite. When submitting your artifact to HotCRP, you must argue whether and why your artifact should receive the “Artifacts Evaluated - Reusable” badge. A typical reusable artifact is expected to exhibit one or more of the following characteristics:

  • The artifact is highly automated and easy to use (see the sketch after this list).
  • It is comprehensively documented, and the documentation describes plausible scenarios for extending it.
  • The artifact contains everything necessary for others to extend it. For example, a tool artifact includes its source code, all dependencies that are not commonly available, and working instructions for compiling it. Containers or virtual machines that bundle all requirements are preferred.
  • The README should contain, or point to, other documentation that is part of the artifact and describes use-case scenarios or details beyond the scope of the paper. Such documentation is not limited to text; for example, a video tutorial could demonstrate how the artifact can be used and evaluated more generally.
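
To illustrate the automation point above: some artifacts ship a single driver script that re-runs every experiment and regenerates the paper’s tables and figures in one command. The following sketch is hypothetical; the experiment names, helper scripts, and output paths are placeholders.

```python
#!/usr/bin/env python3
"""reproduce.py: re-run all experiments and regenerate the paper's results.

Usage: ./reproduce.py [--quick]   # --quick uses reduced inputs for a fast run
"""
import argparse
import subprocess

# Placeholder mapping from paper claims to the scripts reproducing them.
EXPERIMENTS = {
    "table2": ["./run_experiment.sh", "table2"],
    "figure5": ["./run_experiment.sh", "figure5"],
}

def main() -> None:
    parser = argparse.ArgumentParser(description=__doc__)
    parser.add_argument("--quick", action="store_true",
                        help="use reduced inputs for a fast sanity run")
    args = parser.parse_args()
    for name, cmd in EXPERIMENTS.items():
        if args.quick:
            cmd = cmd + ["--quick"]
        print(f"[reproduce] {name}: {' '.join(cmd)}")
        subprocess.run(cmd, check=True)  # abort on the first failure
    print("[reproduce] done; regenerated outputs are under results/")

if __name__ == "__main__":
    main()
```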

The wide variety of artifacts makes it difficult to come up with an exact list of expectations. The points above should be used as a guideline for authors and reviewers of what to provide and what to expect. Feel free to contact the Artifact Evaluation Chairs if you have any doubts.

Distinguished Artifact Awards

Artifacts that go above and beyond the expectations of the Artifact Evaluation Committee will receive a Distinguished Artifact Award.

FAQ

  • Is the reviewing process double-blind? No, the reviewing process is single-blind. The reviewers will know the authors’ identities, while the reviewers’ identities are kept hidden from the authors. Authors can thus submit artifacts that reveal their identities.
  • How can we submit an artifact that contains private components (e.g., a commercial benchmark suite)? One option is to upload only the public part of the artifact to Zenodo and to share a link to the private component that is visible only to the reviewers, by specifying the link in the “Bidding Instructions and Special Hardware Requirements” HotCRP field. If this is not possible, another option is to provide reviewers access to a machine that allows them to interact with the artifact’s private component. Both options must adhere to the single-blind reviewing process (i.e., they must not reveal the reviewers’ identities). Whether an “Availability” badge will be awarded for partially available artifacts will be determined based on the AEC’s evaluation.

Questions? Use the ISSTA Artifact Evaluation contact form.