ASE 2024
Sun 27 October – Fri 1 November 2024, Sacramento, California, United States

Call for Papers

The ASE 2024 Demonstrations Track invites researchers and practitioners to present and discuss the most recent advances, experiences, and challenges in the field of software engineering supported by live presentations of new research tools, data, and other artifacts. We encourage innovative research demonstrations, which show early implementations of novel software engineering concepts, as well as mature prototypes. The research demonstrations are intended to highlight underlying scientific contributions.

Whereas a regular research paper points out the scientific contribution of a new software engineering approach, a demonstration paper provides the opportunity to show how a scientific contribution has been transferred into a working tool or data set. Authors of regular research papers are thus encouraged to submit an accompanying demonstration paper. Submissions of independent tools that are not associated with any research papers are welcome.

Papers submitted to the tool demonstration track should describe (a) novel early tool prototypes or (b) novel aspects of mature tools. The submissions must clearly communicate the following information to the audience:

  • the envisioned users;
  • the software engineering challenge the tool addresses;
  • the methodology it prescribes for its users;
  • the results of validation studies already conducted (for mature tools) or the design of planned studies (for early prototypes).

Submission

Papers must be submitted electronically through the HotCRP submission site by Fri 26 June 2024. Submissions must meet the following requirements:

  • All submissions must be in PDF format and conform, at the time of submission, to the ACM Proceedings Template: https://www.acm.org/publications/proceedings-template. LaTeX users must use the \documentclass[sigconf,review,anonymous]{acmart} option.
  • All submissions must be in English.
  • A demonstration submission must not exceed four pages (including all text, references, and figures).
  • Authors are encouraged to submit a screencast of the tool, with the video link appended to the end of the abstract.
  • Authors are encouraged to make their code and datasets open source, with links to the code and datasets appended to the end of the abstract.
  • A submission must not have been previously published in a demonstration form and must not be simultaneously submitted to any venue other than ASE.
  • Submissions for the tool track DO NOT follow a double-blind review process. If a tool track submission accompanies a submission to the research track (which is double-blind), please make sure to click “Yes” in the “Connection with research track” section on HotCRP during submission.

By submitting your article to an ACM Publication, you are hereby acknowledging that you and your co-authors are subject to all ACM Publications Policies, including ACM’s new Publications Policy on Research Involving Human Participants and Subjects. Alleged violations of this policy or any ACM Publications Policy will be investigated by ACM and may result in a full retraction of your paper, in addition to other potential penalties, as per ACM Publications Policy.

Please ensure that you and your co-authors obtain an ORCID ID so you can complete the publishing process for your accepted paper. ACM has been involved in ORCID from the start, and we have recently committed to collecting ORCID IDs from all of our published authors. The collection process has started and will roll out as a requirement throughout 2022. We are committed to improving author discoverability, ensuring proper attribution, and contributing to ongoing community efforts around name normalization; your ORCID ID will help in these efforts.

Tools and Data Availability

To promote replicability and to disseminate the advances achieved with the research tools and data sets, we require that data sets are publicly available for download and use. We strongly encourage the same for tools, ideally through their distribution with an open-source software license. Whenever the tool is not made publicly available, the paper must include a clear explanation for why this was not possible.

Authors are also encouraged to distribute their demonstration in a form that can be easily used, such as a virtual machine image, a software container (e.g., Docker), or a system configuration (e.g., Puppet, Ansible, Salt, CFEngine).

Screencast

Authors are required to prepare a video of up to 5 minutes demonstrating the tool. For consistency, we require ALL videos to be uploaded to YouTube and made available by the time of submission. The URL of the YouTube video should be added at the end of the abstract.

The video should:

  • provide an overview of the tool’s capabilities and show the major tool features in detail;
  • provide clarifying voice-over and/or annotation highlights;
  • be engaging and exciting for the audience!

Please note that authors of successful submissions will have the opportunity to revise the paper, the video (and its hosting location), the code, and the datasets by the camera-ready deadline.

Submissions that do not comply with the instructions will be rejected without review.

Evaluation

Each submission will be reviewed by at least three members of the tool demonstrations program committee. The evaluation criteria include:

  • Presentation, i.e., the extent to which the presentation meets the high standards of ASE;
  • Relevance, i.e., the pertinence of the proposed tool to the ASE audience;
  • Positioning, i.e., the degree to which the submission considers differences from related tools (pros and cons);
  • Demo quality, i.e., the quality and usefulness of the accompanying artifacts: video, tool, code, and evaluation datasets.

For further information, please feel free to contact the track chairs.

Accepted Papers

After acceptance, the list of paper authors cannot be changed under any circumstances; the list of authors on camera-ready papers must be identical to those on submitted papers. Paper titles cannot be changed except by permission of the Track Chairs and only when referees recommend a change for clarity or accuracy with respect to the paper content.