ICSA 2024
Tue 4 - Sat 8 June 2024 Hyderabad, Telangana, India

Call for Artifacts

ICSA 2024 will be the fourth ICSA featuring an Artifact Evaluation Track (AET). The AET aims to review and promote the research artifacts of accepted papers. For presentations of tools in general, see the Demonstrations Track.

Artifacts can be software systems, scripts, or datasets related to accepted papers at ICSA 2024. High-quality artifacts of published research papers are a foundation for the research results to be reproduced by other researchers and are thus a desirable part of the publication itself. Tools, in the context of this call, are software systems supporting software architects or software architecture researchers, independently of a previously accepted research paper.

The artifact and tool evaluation system is based on the upcoming NISO Recommended Practice on Reproducibility Badging and Definitions, which is supported by our publisher, IEEE. The evaluation criteria are also inspired by ACM’s artifact review and badging system as well as the criteria used by ICSE 2023.

General Information

Authors of the papers accepted to the following ICSA 2024 tracks are expected to submit an artifact for evaluation for the Research Object Reviewed (ROR) badges and the Open Research Object (ORO) badge: Technical Track, Journal-First Track, Software Architecture in Practice Track, and New and Emerging Ideas Track. All authors of papers related to the topics mentioned in the call for papers of the ICSA technical track are invited to submit studies for the Results Reproduced (ROR-R) and Results Replicated (RER) badges.

Please note that we require one author of each artifact submission to peer-review 2-3 other artifacts!

Candidate Artifacts

Artifacts of interest include (but are not limited to) the following:

  • Software, i.e., implementations of systems or algorithms potentially useful in other studies
  • Data repositories, which are data (e.g., logging data, system traces, survey raw data) that can be used for multiple software engineering approaches
  • Frameworks, which are tools and services illustrating new approaches to software engineering that could be used by other researchers in different contexts

This list is not exhaustive, so authors are asked to email the chairs before submitting if their proposed artifact is not on this list. Further information on data sharing principles and approaches, along with an introduction to the general notion of open science, can be found in the book chapter Open Science in Software Engineering by Méndez, Graziotin, Wagner, and Seibold: https://arxiv.org/abs/1904.06499.

The best artifact selected by the reviewers will receive the Best Artifact Award.

For accepted ICSA 2024 papers, the awarded badges will be integrated on the papers in the official IEEE proceedings.

Evaluation Criteria and Badges

Evaluation criteria for badges and additional information have been taken from ICSE 2023, which based them on the ACM policy on Artifact Review and Badging Version 1.1, and from the upcoming NISO Recommended Practice on Reproducibility Badging and Definitions, which is supported by our publisher, IEEE. The ICSA artifact evaluation track uses a single-blind review process. The reviewers for the artifacts will be a combination of artifact authors and ICSA program committee members. The badges to be awarded, together with their evaluation criteria and evaluation process, are the following. Research Object Reviewed (ROR) is awarded at two levels, Functional and Reusable; Results Reproduced (ROR-R) and Results Replicated (RER) together form the Results Validated category.

  • Research Object Reviewed (ROR) – Functional. Criteria: artifacts are documented, consistent, complete, exercisable, and include appropriate evidence of verification and validation (corresponds to the ACM “Functional” badge). Process: authors submit their artifact; reviewers can execute the artifact and obtain initial results that suggest the artifact can be used to reproduce the results of the paper (not necessarily all, but as claimed in the paper and artifact).
  • Research Object Reviewed (ROR) – Reusable. Criteria: Functional + very carefully documented and well-structured to the extent that reuse and repurposing is facilitated; in particular, norms and standards of the research community for artifacts of this type are strictly adhered to (corresponds to the ACM “Reusable” badge). Process: authors submit their artifact; reviewers find that it is functional and, additionally, that the documentation and structure of the artifact allow it to be easily reused.
  • Open Research Object (ORO). Criteria: author-created artifacts relevant to this paper have been placed in a publicly accessible archival repository; a DOI or link to this repository along with a unique identifier for the object is provided (corresponds to the ACM “Available” badge). Process: authors submit their artifact; reviewers confirm that all artifacts relevant to this paper are available.
  • Results Reproduced (ROR-R). Criteria: Functional + the main results of the paper have been obtained in a subsequent study by a person or team other than the authors, using, in part, artifacts provided by the authors (corresponds to the ACM “Reproduced” badge). Process: (1) independent researchers submit evidence that results of previous ICSA papers were independently reproduced, OR (2) authors submit their artifact and reviewers were able to reproduce all results in the paper (only if doable with low effort).
  • Results Replicated (RER). Criteria: ROR + ORO + the main results of the paper have been independently obtained in a subsequent study by a person or team other than the authors, without the use of author-supplied artifacts (corresponds to the ACM “Replicated” badge). Process: independent researchers submit evidence that results of previous ICSA papers were independently replicated.


Papers with such badges contain functional and reusable products that other researchers can use to bootstrap their research. Experience shows that such papers earn increased citations and greater prestige in the research community.

General information and further suggested reading about artifact evaluation can be found in a public draft standard by Baldassarre, Ernst, Hermann, and Menzies.

Important Dates

  • Artifact Evaluation Registration Deadline: February 20th, 2024
  • Artifact Evaluation Submissions Deadline: February 27th, 2024
  • Artifact Evaluation Notification: March 20th, 2024

Note: all deadlines are 23:59 AoE (Anywhere on Earth).

Submission and Review

The submission and review criteria are mainly taken from ICSE 2023, with some adaptations. Different badges have different submission procedures, as described below.

Submission Process for All Artifacts

Authors must perform the following steps to submit an artifact (including tools):

  • Preparing the artifact
  • Documenting the artifact
  • Making the artifact publicly available for reviewers (ideally by using repositories granting public access)
  • Submitting the artifact
  • Clarification period

1. Preparing the Artifact

There are two options depending on the nature of the artifacts: Installation Package or Simple Package. In both cases, the configuration and installation of the artifact should take less than 30 minutes. Otherwise, the artifact is unlikely to be endorsed simply because the committee will not have sufficient time to evaluate it.

Installation Package. If the artifact consists of a tool or software system, the authors need to prepare an installation package so that the tool can be installed and run in the evaluator’s environment. Provide enough associated instructions, code, and data such that a computer scientist with reasonable knowledge of scripting, build tools, etc. could install, build, and run the code. If the artifact contains or requires a special tool or any other non-trivial piece of software, the authors must provide a virtual machine in the Open Virtual Appliance (OVA) format (e.g., created with Oracle VirtualBox or VMware), or a Docker container image with a working environment containing the artifact and all the necessary tools. Alternatively, authors can provide a link to a web service that allows reviewers to execute the artifact (e.g., using Jupyter Notebooks or codeocean.com). Otherwise, the submission may be desk-rejected.

Simple Package. If the artifact only contains documents that can be used with a simple text editor, a PDF viewer, or some other common tool (e.g., a spreadsheet program in its basic configuration) the authors can just save all documents in a single package file (zip or tar.gz).
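For illustration only (the file names are hypothetical), assembling a simple package amounts to collecting the documents and creating one archive, e.g. in a shell:

```shell
# Collect the artifact documents (hypothetical example files).
mkdir -p artifact
printf 'question,answer\n' > artifact/survey-data.csv
printf 'Artifact documentation\n' > artifact/README.md

# Save all documents in a single package file (tar.gz), as required.
tar -czf artifact.tar.gz artifact

# List the package contents to double-check nothing is missing.
tar -tzf artifact.tar.gz
```

A zip file (`zip -r artifact.zip artifact`) works equally well; the point is that reviewers receive one self-contained package.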

2. Documenting the Artifact

The authors need to write and submit documentation explaining how to obtain the artifact package, how to unpack it, how to get started, and how to use the artifact in more detail. The artifact submission should only describe the technicalities and uses of the artifact that are not already described in the paper.

On the submission site, the following documents should be provided (in Markdown, plain-text, or PDF format):

  • A README main file describing what the artifact does and where it can be obtained (with hidden links and access password if necessary). Also, there should be a clear step-by-step description about how to reproduce the results presented in the paper (i.e., tables, figures, and other reported results), including links to all necessary scripts and input data. If the results presented in the paper take more than 1 hour on a standard laptop to be reproduced, authors are expected to additionally provide simplified examples with runtimes of less than one hour. If the results in the paper have been generated with an artifact using data that underlies non-disclosure agreements, authors should provide artificial replacement data that allows reviewers to assess the artifact independently of that data.
  • An INSTALL file with step-by-step installation instructions (if any). These instructions should include notes illustrating a very basic usage example or a method to test the installation, for instance, what output to expect that confirms the code is installed, working, and doing something interesting and useful.
  • A copy of the accepted paper in PDF format, if any.

Note that for the badge Research Object Reviewed (ROR) – Reusable, extensive documentation of the artifact is needed to allow other researchers to repurpose the tool. For software artifacts, this requires including an overview of the artifact source code and its architecture.

3. Making the Artifact Available

The authors need to make the packaged artifact (installation package or simple package, including its documentation) available so that the Evaluation Committee can access it. We suggest a link to a public repository (e.g., GitHub) or to a single archive file in a widely available archive format. If the authors are aiming for the badge Open Research Object (ORO) or beyond, the artifact needs to be publicly accessible in an archival repository that guarantees long-term storage. We suggest using Zenodo, figshare, or similar free services that provide Digital Object Identifiers (DOIs). For other badges, the artifacts do not necessarily have to be publicly accessible for the review process; in this case, the authors are asked to provide a private link or a password-protected link. In any case, we encourage the authors to use permanent repositories dedicated to data sharing where no registration is necessary for those accessing the artifacts (e.g., please avoid services such as Google Drive).

Note that to score Open Research Object (ORO) or higher, the artifact needs to be released under some form of open-source license.

4. Submitting the Artifact

By the artifact registration deadline (see above), register your research artifact at the ICSA AE HotCRP site - https://nfdixcs.sdq.kastel.kit.edu/. Please use the submission form fields as follows:
  • The title field should contain the name of the original paper to which the artifact belongs, preceded or followed by the name of the artifact if any.
  • The abstract should describe the purpose of the artifact so that reviewers can bid for it.
  • Additional information on technical requirements, including, at the end, the required OS (if any), used virtualization technique, and hardware requirements (RAM and disk space) should be given in the respective fields of the submission form.
  • The badges authors apply for can be selected on the submission site.

The remaining submission fields are explained at the submission site.

By the artifact submission deadline, complete your artifact submission by providing all requested information. Before the actual evaluation, reviewers will check the integrity of the artifact and look for any possible setup problems that may prevent it from being properly evaluated (e.g., corrupted or missing files, VM won’t start, immediate crashes on the simplest example, etc.).
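One optional way to make this integrity check easy for reviewers (not required by this call; the file names are hypothetical) is to publish a checksum next to the package:

```shell
# Stand-in for the real artifact package (hypothetical file name).
printf 'packaged artifact contents\n' > artifact.zip

# Author side: record a SHA-256 checksum when uploading the package.
sha256sum artifact.zip > artifact.zip.sha256

# Reviewer side: verify the downloaded package is not corrupted.
# On success this prints "artifact.zip: OK".
sha256sum -c artifact.zip.sha256
```

If a download is truncated or altered, the check fails with a non-zero exit status, which catches "corrupted or missing files" before the actual evaluation starts.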

5. Clarification period

During the review, the reviewers may contact the authors to request clarifications on the basic installation and start-up procedures, to resolve simple installation problems or other questions. If a new question is asked by a reviewer via HotCRP, authors will be notified by e-mail. Please make sure that at least one author is available during this time to respond to reviewer questions. More details on this procedure can be found below in Section Communication during the Clarification Period. Given the short review time available, the authors are expected to respond within 48 hours. Authors may update their research artifacts after submission only for changes requested by reviewers in the clarification period.

Submitting Results Reproduced (ROR-R) and Results Replicated (RER) Badges

For the Results Reproduced (ROR-R) and Results Replicated (RER) badges, authors need to provide appropriate evidence in their abstract that their artifacts have reached that stage. The abstract should include the paper title, the purpose of the research artifact, the badge(s) you are claiming, and the technology skills assumed of the reviewer evaluating the artifact. Please also mention if running your artifact requires specific operating systems or other environments.

TITLE: A (Partial)? (Replication or Reproduction) of XYZ. Please add the term partial to your title if only some of the original work could be replicated/reproduced.

  • WHO: name the original authors (and paper) and the authors that performed the replication/reproduction. Include contact information and mark one author as the corresponding author
  • WHERE: include a web link to a publicly available URL directory containing (a) the original paper (that is being reproduced) and (b) any subsequent paper(s)/documents/reports that do the reproduction
  • WHAT: describe the “thing” being replicated/reproduced
  • WHY: clearly state why that “thing” is interesting/important
  • HOW: describe how the original work was done
  • STATE: describe the replication/reproduction. If the replication/reproduction was only partial, then explain what parts could be achieved or had to be missed
  • DISCUSSION: What aspects of this “thing” made it easier/harder to replicate/reproduce? What are the lessons learned from this work that would enable more replication/reproduction in the future for other kinds of tasks or other kinds of research?

Two reviewers will review each abstract, possibly reaching out to the authors of the abstract or of the original paper. Abstracts will be ranked as follows:

  • If the reviewers do not find sufficient substantive evidence for replication/reproduction, the abstract will be rejected
  • Any abstract that is judged to be unnecessarily critical of prior work will be rejected (*)
  • The remaining abstracts will be sorted according to (a) how interesting they are to the community and (b) their correctness
  • The top-ranked abstracts will be invited to give lightning talks

(*) Our goal is to foster a positive environment that supports and rewards researchers for conducting replications and reproductions. To that end, we require that all abstracts and presentations pay due respect to the work they are reproducing/replicating. Criticism of prior work is acceptable only as part of a balanced and substantive discussion of prior accomplishments.

Reviewer Guidelines

1. Artifact bidding

Please bid on artifacts during the bidding phase using HotCRP. Please consider the platform and OS in the information provided by the authors when making your bids.

2. Artifact review

Please assess the artifacts based on the criteria for the badges the authors have applied for (see table above). If in doubt about how to assess a property, please use comments in HotCRP (after submitting a draft review or even an almost empty review) to contact your fellow reviewers as well as the track chairs to discuss. Please try to install the artifacts soon after the assignment but before March 08, 2024, so that the authors have time to respond to potential problems.

3. Clarification period and channel

During the review phase, reviewers can communicate with the authors via HotCRP. If you as a reviewer need clarifications on the basic installation and start-up procedures, or need to resolve simple installation problems or other questions, please use the comments in HotCRP (for a paper, select “Main” (top left) -> “Add comment” (at the bottom of the page)) and make the comment visible to authors using the drop-down menu. You can already add comments to clarify issues before submitting your initial review.
If you notice any potential ethical or legal issues, such as unfit licensing (too restrictive? too permissive?), missing institutional review of human subjects research (if applicable, the IRB certificate should form part of the submission), data provenance issues (is there sufficient information on how and where the data was collected? is the dataset biased?), or privacy protection issues (can individual information be obtained?), please ask the authors to clarify and additionally contact the track chairs.

Reviewers are requested to consider at least one update to the artifact or the usage instructions by the authors but are invited to consider as many updates and replies by the authors as needed to clarify an issue.

4. Final artifact review

Please enter your final review into HotCRP and comment on whether the artifact shall receive the badges the authors have applied for. If the authors have applied for multiple badges, please comment on these individually. A possible outcome of your reviews may be that an artifact only receives a subset of the badges the authors applied for.

Artifact Evaluation Track Co-Chairs

Joshua Garcia, University of California, Irvine, USA
Anne Koziolek, Karlsruhe Institute of Technology, Germany