Call for Artifacts Submissions
The Artifact Evaluation Track reviews, promotes, and catalogs the research artifacts of accepted ICSE papers. Artifacts are evaluated according to the ACM Artifact Review and Badging Version 1.1 standard (see ACM Badge Definitions).
- Authors of papers accepted to the Research, SEIP, SEET, NIER, SEIS, and Demonstrations tracks can submit an artifact for the Artifacts Available and Artifacts Evaluated badges.
- Authors of any prior SE work (published in the ICSE tracks above) are also invited to submit their work for the Results Validated badges (Results Reproduced / Results Replicated). A peer-reviewed publication that reports the replication or reproduction must also be submitted as evidence, and if awarded, the badge will contain a link to this paper.
Our primary goal is to help authors make their artifacts available and reusable or functional. To this end, we strongly encourage the use of clean, containerized environments (e.g., Docker or similar) and rich metadata for reproducibility.
Because the reusability standard requires high-quality documentation and structure, we will enable PC/author discussion during the review period so that all submitted artifacts can be brought up to that standard.
Artifacts Evaluated
This badge is applied to papers whose associated artifacts have successfully completed an independent audit. Artifacts do not need to be made publicly available to be considered for this badge. However, they do need to be made available to reviewers. Two levels are distinguished, only one of which should be applied in any instance:
Functional
The artifacts associated with the research are found to be documented, consistent, complete, exercisable, and include appropriate evidence of verification and validation.
- Documented: At minimum, an inventory of artifacts is included, and a sufficient description is provided to enable the artifacts to be exercised.
- Consistent: The artifacts are relevant to the associated paper, and contribute in some inherent way to the generation of its main results.
- Complete: To the extent possible, all components relevant to the paper in question are included. (Proprietary artifacts do not need to be included. If they are required to exercise the package, this should be documented, along with instructions on how to obtain them. Proxies for proprietary data should be included to demonstrate the analysis.)
- Exercisable: Included scripts and/or software used to generate the results in the associated paper can be successfully executed, and included data can be accessed and appropriately manipulated.
Reusable
The artifacts associated with the paper are of a quality that significantly exceeds minimal functionality. That is, they have all the qualities of the Artifacts Evaluated - Functional level, but, in addition, they are very carefully documented and well-structured to the extent that reuse and repurposing are facilitated. In particular, norms and standards of the research community for artifacts of this type are strictly adhered to.
Artifacts Available
This badge is applied to papers in which associated artifacts have been made permanently available for retrieval.
Author-created artifacts relevant to this paper have been placed on a publicly accessible archival repository. A DOI or link to this repository, along with a unique identifier for the object, is provided.
- We do not mandate the use of specific repositories. Publisher repositories (such as the ACM Digital Library), institutional repositories, or open commercial repositories (e.g., figshare or Dryad) are acceptable. In all cases, repositories used to archive data should have a declared plan to enable permanent accessibility. Personal web pages are not acceptable for this purpose.
- Artifacts do not need to have been formally evaluated for an article to receive this badge. In addition, they do not need to be complete in the sense described above. They simply need to be relevant to the study and add value beyond the text in the article. Such artifacts could be something as simple as the data from which the figures are drawn, or as complex as a complete software system under study.
Results Validated
This badge is applied to papers in which the main results have been successfully replicated by a person or team other than the author. Two levels are distinguished:
Results Reproduced
The main results of the paper have been reproduced in a subsequent study by a person or team other than the authors, using, in part, artifacts provided by the author.
Results Replicated
The main results of the paper have been independently obtained in a subsequent study by a person or team other than the authors, without the use of author-supplied artifacts.
In each case, exact replication or reproduction of results is neither required nor expected. Instead, the results must be in agreement to within a tolerance deemed acceptable for experiments of the given type. In particular, differences in the results should not change the main claims made in the paper.
Important Dates
22 Jan 2027 Artifact registration deadline
29 Jan 2027 Artifact submission deadline
30 Jan 2027 - 19 Feb 2027 Review period (PC/authors discussion)
26 Feb 2027 Notifications
Important Notes for Authors
Between 30 Jan and 15 Feb, the review cycle will be iterative, and authors should respond to reviewers' information requests within 2 working days.
Submission link
https://icse2027-artifact.hotcrp.com/
Best Artifact Awards
There will be two ICSE 2027 Best Artifact Awards to recognize the effort of authors creating and sharing outstanding research artifacts.
Submission Guidelines
Who may submit
- Artifacts Available / Artifacts Evaluated (Functional or Reusable): Authors of papers accepted to the Research, SEIP, NIER, SEIS, Doctoral Symposium, and Demonstrations tracks in ICSE 2027.
- Results Validated (Reproduced or Replicated): Authors of any prior ICSE publication in the above tracks. Submissions must include a peer-reviewed publication that reports the reproduction or replication; if awarded, the badge will link to that paper. (ICSE 2027 accepted papers are not eligible for Results Validated.)
Submission for Artifacts Available / Artifacts Evaluated (Functional or Reusable) Badges
What to Submit
Abstract (max 2 pages, PDF)
By the abstract submission deadline (see Important Dates), upload an abstract of at most two pages via the submission site (https://icse2027-artifact.hotcrp.com/). The abstract should include:
- Paper title and track, claimed badge(s), and assumed reviewer skills.
- Clear access/download instructions (URLs/DOIs, credentials if needed for private review - note that if you share your artifacts via private links, you cannot get the Available badge).
- Any special runtime needs (e.g., OS, GPUs, large memory/disk, HPC).
Artifact package (what reviewers use)
Authors must perform the following steps to submit an artifact:
- Prepare the artifact
- Make the artifact available
- Document the artifact
- Submit the artifact
1. Prepare the artifact
Both executable and non-executable artifacts may be submitted. The preparation process depends on the type of artifact and its components. Authors should ensure that a typical CS professional can build, install, and run (if applicable) the artifact within a reasonable time frame, following the principles below.
Executable Artifacts
Executable artifacts consist of a tool, prototype, framework, or software system. Authors should prepare an installation package that can be installed and executed in the evaluator’s environment. As a practical guideline, if installation and configuration require more than 30 minutes, the artifact may not be accepted, since the PC may not have sufficient time to evaluate it.
Authors are expected to test the full installation process on a clean machine before submission to ensure that setup can be completed smoothly.
- Containers / VMs strongly encouraged. Provide a Docker image or a VirtualBox/VMware VM.
- If the artifact relies on special tools or non-trivial software, a VM or container with everything pre-installed is required.
Artifacts should be self-contained, reproducible, and operable without network connectivity whenever possible. Any reliance on external APIs or online data sources should be clearly documented.
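As an illustration of the container guidance above, a recipe along the following lines can make an executable artifact self-contained; the base image, dependency list, and script name are placeholders, not requirements of the track:

```dockerfile
# Hypothetical Dockerfile sketch; all names below are placeholders.
FROM ubuntu:22.04

# Install system dependencies at build time so the container runs offline.
RUN apt-get update && apt-get install -y --no-install-recommends \
        python3 python3-pip \
    && rm -rf /var/lib/apt/lists/*

# Copy the artifact (code, sample data, scripts) into the image.
COPY . /artifact
WORKDIR /artifact

# Pin dependencies from a requirements file shipped with the artifact.
RUN pip3 install --no-cache-dir -r requirements.txt

# Default command runs a small smoke test rather than the full experiments.
CMD ["python3", "run_smoke_test.py"]
```

A reviewer could then build and run the artifact with `docker build -t artifact .` followed by `docker run --rm artifact`, without modifying their host environment.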
Non-Executable Artifacts
Non-executable artifacts include datasets, documents, scripts, survey/interview packages (survey instruments, interview protocols, qualitative coding schemes, study scripts, analysis notebooks, etc.), or supplementary materials that can be used directly with common tools such as text editors, spreadsheet programs, or PDF viewers. These artifacts should be submitted as a single, optionally compressed package (e.g., .tar, .zip, or .tar.gz). The package should open without requiring proprietary software or additional installations.
Non-executable artifacts must include sufficient documentation describing their structure, format, and intended use. If the data includes large files, authors should provide a sample subset to enable easy inspection. Metadata describing the data source, licensing, and any ethical or legal considerations should also be included.
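For instance, a non-executable package can be assembled and checked with standard tools; the directory and file names below are purely illustrative:

```shell
# Illustrative layout; replace the names with your actual files.
mkdir -p artifact/data artifact/docs
echo "id,value" > artifact/data/sample.csv          # small sample subset for easy inspection
echo "Structure and usage notes." > artifact/README.md
tar -czf artifact.tar.gz artifact                   # single compressed package
tar -tzf artifact.tar.gz                            # verify the archive lists and opens cleanly
```

Listing the archive before submission is a quick check that it opens without proprietary software or additional installations.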
Hybrid Artifacts
Hybrid artifacts combine executable and non-executable components (for example, a software tool accompanied by datasets or configuration files). Authors should:
- Follow the executable artifact guidelines for packaging and reproducibility.
- Include clear instructions showing how the non-executable components (e.g., data or scripts) integrate with the executable part.
- Provide small test datasets or example runs that demonstrate the full workflow end-to-end.
In all cases, authors should ensure that the artifact:
- Can be easily unpacked and understood without special permissions or system changes.
- Uses standard formats and open tools wherever possible.
- Includes a README detailing the environment setup, data structure (if applicable), and usage examples.
2. Make the artifact available
The authors need to make the packaged artifact available so that the PC can access it.
To be eligible for the Artifacts Available badge, the artifact must be placed in a repository with a permanence plan and a stable identifier (e.g., a DOI). We do not mandate specific repositories; suitable options include publisher or institutional repositories and trusted open services (e.g., Zenodo, Figshare, Dryad). For source code, Software Heritage can archive and identify precise versions (see their submission guide).
Please note that platforms that do not guarantee long-term archival, which presently include GitHub, generally do not qualify. However, open-source software hosted on GitHub that has a history of regular updates over at least five years and has formal releases archived with a DOI through a service such as Zenodo will be considered.
3. Document the artifact
The authors need to prepare and submit clear documentation explaining how to obtain, unpack, install (if applicable), and use their artifact in detail. The documentation should focus on technical details and usage instructions that are not already described in the paper, so that reviewers can understand, install, and evaluate the artifact efficiently.
Each artifact repository must include the following materials:
- Accepted paper (PDF) - A copy of the accepted paper that includes a link to the archival repository where the artifact is hosted.
- LICENSE file - A clear description of the distribution rights.
  - For artifacts seeking the Available badge, the license must permit public access and reuse.
  - In line with the ICSE Open Science Policy, authors are encouraged to use an open-source license (e.g., MIT, Apache 2.0, GPL) for software or an open-data license (e.g., CC-BY, CC0, Open Data Commons) for data artifacts.
- README file - Provided in Markdown, plain text, or PDF format, the README should be comprehensive, well-organized, and cover all the information needed for setup and use. It may include the following sections as appropriate:
  - Purpose - A short description of what the artifact does, its main components, and its relationship to the paper.
  - Badges Claimed - List the badge(s) being applied for and justify how the artifact meets the corresponding criteria.
  - Provenance - Specify where the artifact can be obtained (DOI, repository URL, or persistent identifier) and, if available, link to a public preprint of the paper.
  - Data (for artifacts that focus on data or include a nontrivial dataset) - Describe the dataset’s source, structure, provenance, size, and any ethical or legal constraints. Include statements about consent, anonymization, IRB approval, or licensing if relevant.
  - Setup (for executable artifacts) - Provide clear, step-by-step installation and preparation instructions, including:
    - Hardware requirements (e.g., CPU, RAM, disk, GPU).
    - Software requirements (e.g., Docker, Apptainer, or VM configuration, or OS and library dependencies if not containerized).
    - Clear instructions for building and running the tool. Any non-standard configurations should be justified. Providing a tested container image or VM setup is strongly encouraged.
  - Usage (for executable artifacts) - Explain how to execute and test the artifact. Include:
    - A basic test or quick-start example showing expected behavior after installation.
    - Detailed commands or scripts to replicate the main results of the paper, along with any expected outputs or result files.
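The README sections described above can be sketched as a skeleton; the headings and placeholder values are illustrative, and authors should adapt them to their artifact:

```markdown
# Artifact for "<Paper Title>" (ICSE 2027, <track>)

## Purpose
What the artifact does, its main components, and how it relates to the paper.

## Badges Claimed
Which badge(s) are applied for, and how the artifact meets the criteria.

## Provenance
DOI / archival URL of the artifact; link to a public preprint if available.

## Data
Source, structure, provenance, size; consent, anonymization, IRB, or licensing notes.

## Setup
Hardware and software requirements; step-by-step installation (e.g., container build).

## Usage
Quick-start smoke test, then commands to reproduce the paper's main results.
```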
To support discoverability and reuse, authors are also encouraged to provide machine-readable metadata that describes their artifact:
- A CITATION.cff file containing citation information such as title, authors, version, and DOI, so that others can cite the software or data correctly and platforms such as GitHub and Zenodo can generate citations automatically.
- Optionally, metadata in CodeMeta or DataCite format. These schemas define structured fields such as creator affiliations, programming languages, dependencies, funding sources, and related works. Including such metadata ensures repositories and indexers can accurately catalog the artifact, making it easier for others to find, cite, and build upon.
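As an illustration, a minimal CITATION.cff might look like the following; every field value is a placeholder to be replaced with your own details:

```yaml
# Minimal CITATION.cff sketch; all values below are placeholders.
cff-version: 1.2.0
message: "If you use this artifact, please cite it as below."
title: "Example Artifact for an ICSE 2027 Paper"
authors:
  - family-names: "Doe"
    given-names: "Jane"
version: "1.0.0"
doi: "10.5281/zenodo.0000000"
date-released: "2027-01-22"
```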
4. Submit the artifact
By the abstract submission deadline (see Important Dates), register your research artifact at the HotCRP site by submitting the 2-page abstract describing your artifact.
The PC may contact the authors via the submission system during the entire review period to request clarifications on the basic installation and start-up procedures or to resolve simple installation problems. Reviewers will be encouraged to attempt to execute submitted software artifacts early on, to minimize the time spent iterating on making the artifact functional, and in turn provide enough time to ensure that all artifacts can be made reusable. Given the short review time available, the authors are expected to respond within a 72-hour period. Authors may update their research artifact after submission only for changes requested by reviewers during this time. Information on this phase is provided in the Submission and Reviewing Guidelines.
Submission for Results Validated (Results Reproduced and Results Replicated) Badges
ICSE 2027 accepted papers are not eligible for these badges.
The Results Validated badges recognize papers whose main results have been successfully obtained by a person or team other than the original authors. These badges are awarded to the original paper whose results were validated, not to the replication paper itself.
Eligibility
- Any previously published ICSE paper (from any track eligible for the Artifact Evaluation process) may be considered for a Results Validated badge.
- The validation must be demonstrated through a peer-reviewed publication (e.g., a paper accepted to ICSE 2027 or another reputable venue) that reports the reproduction or replication of the original results. Differences are acceptable within reasonable tolerances that do not alter the original paper’s claims.
- The badge is awarded to the original paper, and it will include a link to the validating publication.
Submission Process
Authors seeking a Results Validated badge for a prior ICSE publication should submit:
- A 2-page abstract (PDF) through the submission site: https://icse2027-artifact.hotcrp.com/. The abstract should include:
- Full reference to the original ICSE paper and the peer-reviewed replication/reproduction paper serving as evidence.
- A concise summary of what was reproduced or replicated, the extent of agreement with the original findings, and the key methodological differences (if any).
- An explanation of which badge level (Results Reproduced or Results Replicated) applies and why.
- Supporting documentation or repository links may be included if relevant, though the primary evidence should be the published replication study.
Notes
- The authors of the replication study (for example, those publishing at ICSE 2027) may coordinate with the authors of the original work to ensure the badge is correctly applied.
- The ICSE Artifact Evaluation Committee will verify the replication paper’s peer-reviewed status and assess the strength of the validation claim.
- Once approved, the Results Validated badge will be attached to the original ICSE paper, with a direct link to the replication publication as evidence.
Please do not hesitate to contact the chairs for any questions.