ASE 2021
Mon 15 - Fri 19 November 2021 Melbourne, Australia

Call for Papers

Authors of papers accepted in any of the tracks of ASE 2021 are invited to submit artifacts associated with those papers to the ASE Artifact Track.

Artifacts will be evaluated as candidates for the Reusable, Available, Replicated, or Reproduced badges. Each accepted artifact will receive one (and only one) of the badges below, displayed on the front page of the authors’ paper and in the proceedings.

In addition, authors of any prior SE work (published at ASE or elsewhere) are invited to submit an artifact to the ASE Artifact Track for evaluation as a candidate replicated or reproduced artifact. If the artifact is accepted:

  • Authors will be invited to give lightning talks on this work at ASE’21
  • We will do our best to work with the IEEE Xplore and ACM Portal administrators to add badges to the electronic versions of the authors’ paper(s).


The badges are defined as follows:

  • Functional: Artifacts are documented, consistent, complete, exercisable, and include appropriate evidence of verification and validation.
  • Reusable: Functional + the artifact is very carefully documented and well-structured to the extent that reuse and repurposing is facilitated; in particular, the norms and standards of the research community for artifacts of this type are strictly adhered to.
  • Available: Functional + the artifact has been placed on a publicly accessible archival repository, and a DOI or link to this repository, along with a unique identifier for the object, is provided.
  • Replicated: Available + the main results of the paper have been obtained in a subsequent study by a person or team other than the authors, using, in part, artifacts provided by the authors.
  • Reproduced: Available + the main results of the paper have been independently obtained in a subsequent study by a person or team other than the authors, without the use of author-supplied artifacts.

Papers with such badges contain reusable products that other researchers can use to bootstrap their own research. Experience shows that such papers earn increased citations and greater prestige in the research community. Artifacts of interest include (but are not limited to) the following.

  • Software: implementations of systems or algorithms potentially useful in other studies.
  • Data repositories: data (e.g., logging data, system traces, raw survey data) that can be used in multiple software engineering approaches.
  • Frameworks: tools and services illustrating new approaches to software engineering that could be used by other researchers in different contexts.

This list is not exhaustive; authors whose proposed artifact does not fit one of these categories are asked to email the chairs before submitting.


Submission and Review

The ASE artifact evaluation track uses a single-blind review process. All submitters must make their repositories available using the following steps:

  • Create a GitHub repository.
  • Register the repository with Zenodo.org. For details on that process, see https://guides.github.com/activities/citable-code/
  • Make a release on GitHub, at which point Zenodo will automatically archive a copy of the repository and issue a Digital Object Identifier (DOI).

Authors are to submit the DOI to the HotCRP website here: https://ase20201-artifact-evaluation.hotcrp.com
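
Before entering the DOI into HotCRP, it is worth confirming that it resolves. The snippet below is an optional, minimal sketch and not part of the official submission process; the DOI string is a placeholder for whatever Zenodo assigns to your release, and the script only checks that the DOI leads to a publicly reachable page.

```python
import requests

# Placeholder DOI: replace with the one Zenodo assigned to your release.
doi = "10.5281/zenodo.0000000"

# DOIs resolve through the central doi.org resolver; a successful response
# after redirects indicates that the archived record is publicly reachable.
response = requests.get(f"https://doi.org/{doi}", allow_redirects=True, timeout=30)

if response.ok:
    print(f"DOI resolves to: {response.url}")
else:
    print(f"DOI did not resolve (HTTP {response.status_code}); check the release on Zenodo.")
```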

Prior to reviewing, there may be some interaction with authors to handle setup and installation. Before the actual evaluation, reviewers will check the integrity of the artifact and look for any possible setup problems that may prevent it from being properly evaluated (e.g., corrupted or missing files, a VM that won’t start, immediate crashes on the simplest example). The Evaluation Committee may contact the authors to request clarifications on the basic installation and start-up procedures or to resolve simple installation problems. Artifacts whose configuration and installation take an undue amount of time may be rejected.

Authors may update their research artifacts after submission only for changes requested by reviewers in the rebuttal phase.

To update artifacts

  • Go to GitHub.
  • Make your changes.
  • Make a new release (one way to script this step is sketched after this list).
  • Add a comment in HotCRP along the lines of “in response to comment XYZ we have made a new release that addresses the issue as follows: ABC”.
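
Authors who prefer to script the release step rather than use the GitHub web interface can create a release through GitHub’s REST releases API. The sketch below is only illustrative: the repository name, tag, and token are placeholders, and using the GitHub web interface achieves the same result.

```python
import os

import requests

# Placeholders: substitute your own repository, tag name, and access token.
OWNER, REPO = "your-user", "your-artifact-repo"
TAG = "v1.1-rebuttal"
TOKEN = os.environ["GITHUB_TOKEN"]  # a personal access token with access to the repository

# Creating a release is the event that triggers Zenodo's automatic archiving
# for repositories that have already been registered with Zenodo.
response = requests.post(
    f"https://api.github.com/repos/{OWNER}/{REPO}/releases",
    headers={
        "Authorization": f"token {TOKEN}",
        "Accept": "application/vnd.github+json",
    },
    json={
        "tag_name": TAG,
        "name": TAG,
        "body": "Revised artifact in response to reviewer comments.",
    },
    timeout=30,
)
response.raise_for_status()
print("Created release:", response.json()["html_url"])
```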

IMPORTANT NOTE: different badges have different instructions on what needs to be submitted to GitHub. See below.

Reusable and Available Badges

Authors need to write and submit documentation files explaining, in sufficient detail, how to obtain the artifact package, how to unpack the artifact, how to get started, and how to use the artifact. The artifact submission must describe only the technicalities of the artifact and any uses of the artifact not already described in the paper. The submission should contain the following documents (in Markdown plain-text format).

  • A main README.md file describing what the artifact does and where it can be obtained (with hidden links and an access password if necessary).
  • A LICENSE.md file describing the distribution rights. Note that to earn the Available badge or higher, the license must be some form of open source license.
  • An INSTALL.md file with installation instructions. These instructions should include notes illustrating a very basic usage example or a method to test the installation, for instance, what output to expect that confirms the code is installed, working, and doing something interesting and useful (a minimal smoke-test sketch follows this list). IMPORTANT: there must also be a clear description of how to reproduce the results presented in the paper.
  • A copy of the accepted paper in PDF format.
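
As a concrete illustration of the basic usage example that INSTALL.md should point to, a small smoke-test script is often sufficient. The sketch below assumes a hypothetical artifact package named myartifact with an analyze entry point and a bundled example input; adapt it to whatever interface and examples your artifact actually provides.

```python
"""Smoke test referenced from INSTALL.md: checks that the artifact is
installed and produces sensible output on a trivial example input."""

# Hypothetical package, entry point, and example file; substitute the real
# API and a small input shipped with your artifact.
from myartifact import analyze

result = analyze("examples/minimal_input.txt")

# Expected behaviour for the minimal example, as documented in INSTALL.md.
assert result is not None, "analyze() returned nothing; the installation is likely broken"
print("Smoke test passed. Example output:", result)
```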

Replicated and Reproduced Badges

Authors will need to offer appropriate documentation that their artifacts have reached that stage; specifically, a one-page (max) abstract in PDF format containing:

  • TITLE: A (Partial)? (Replication|Reproduction) of XYZ. Please add the term partial to your title if only some of the original work could be replicated/reproduced.
  • WHO: name the original authors (and paper) and the authors that performed the replication/reproduction. Include contact information (emails). Mark one author as the corresponding author.
  • IMPORTANT: include also a web link to a publicly available URL directory containing (a) the original paper (that is being reproduced) and (b) any subsequent paper(s)/documents/reports that do the reproduction.
  • WHAT: describe the “thing” being replicated/reproduced;
  • WHY: clearly state why that “thing” is interesting/important;
  • PLATFORM: the operating system on which this artifact was mostly developed;
  • HOW: say how the “thing” was done in the original work;
  • WHERE: describe the replication/reproduction. If the replication/reproduction was only partial, then explain which parts could be achieved and which had to be omitted.
  • DISCUSSION: What aspects of this “thing” made it easier/harder to replicate/reproduce? What lessons were learned from this work that would enable more replication/reproduction in the future, for other kinds of tasks or other kinds of research?

Two PC members will review each abstract, possibly reaching out to the authors of the abstract or original paper. Abstracts will be ranked as follows.

  • If PC members do not find sufficient substantive evidence for replication/reproduction, the abstract will be rejected.
  • Any abstract that is judged to be unnecessarily critical of prior work will be rejected (*).
  • The remaining abstracts will be sorted according to (a) interestingness and (b) correctness.
  • The top ranked abstracts will be invited to give lightning talks.

(*) Our goal is to foster a positive environment that supports and rewards researchers for conducting replications and reproductions. To that end, we require that all abstracts and presentations pay due respect to the work they are reproducing/replicating. Criticism of prior work is acceptable only as part of a balanced and substantive discussion of prior accomplishments.

Accepted Papers

After acceptance, the list of paper authors cannot be changed under any circumstances, and the list of authors on camera-ready papers must be identical to the list on the submitted papers. After acceptance, paper titles cannot be changed except with the permission of the Track Chairs, and only when the referees have recommended a change for clarity or accuracy with respect to the paper’s content.