APLAS 2022
Mon 5 - Sat 10 December 2022 Auckland, New Zealand
co-located with SPLASH 2022

Background

A research paper is a peer-reviewed description of a body of work. Our research output, however, is much more than the page-limited paper that describes it. For example, as part of our research we write tech reports containing full descriptions of the work, software that realises the work, proofs that verify the work’s correctness, models that encapsulate its ideas, test suites and benchmarks that document empirical evidence, and so on. The quality of these research artifacts is just as important as that of the paper itself, perhaps even more so. Yet many of our conferences offer no formal means to submit and evaluate anything but the paper. This should change!

Artifact Evaluations [1] have steadily become a common sight in our community. This year, the 20th Asian Symposium on Programming Languages and Systems (APLAS’22) is excited to launch its own Artifact Evaluation process, which will allow authors of accepted papers to optionally submit supporting artifacts. The goal of artifact evaluation is two-fold: to probe further into the claims and results presented in a paper, and to reward authors who take the trouble to create useful artifacts to accompany the work in their paper. Although artifact evaluation is optional, we strongly encourage authors of accepted papers to participate in this process.

The evaluation and dissemination of artifacts improves reproducibility and enables authors to build on top of each other’s work. Beyond helping the community, the evaluation and dissemination of artifacts confers several direct and indirect benefits on the authors themselves.

The ideal outcome of the artifact evaluation process is to accept every submitted artifact, provided it meets the evaluation criteria set out in the Call for Artifacts. We will strive to remain as close as possible to that ideal. Even so, some artifacts may not pass muster and may be rejected; nonetheless, we will evaluate every submission in earnest and make our best attempt to follow the authors’ evaluation instructions.

[1] https://www.artifact-eval.org/

The Process

To maintain the separation of paper and artifact review, authors will only be asked to upload their artifacts after their papers have been accepted. Authors planning to submit to the artifact evaluation should prepare their artifacts well in advance of the paper notification date to ensure adequate time for packaging and documentation.

Throughout the artifact review period, submitted reviews will be (approximately) continuously visible to authors. Reviewers will be able to interact (anonymously) with authors throughout for clarifications, system-specific patches, and other logistical help in making the artifact evaluable. The goal of continuous interaction is to prevent artifacts from being rejected over minor issues unrelated to the research itself, such as a “wrong library version”-type problem. The conference proceedings will include a discussion of the continuous artifact evaluation process.

Types of Artifacts

The artifact evaluation will accept any artifact that authors wish to submit, broadly defined. A submitted artifact might be:

  • software
  • mechanized proofs
  • test suites
  • data sets
  • hardware (if absolutely necessary)
  • a video of a difficult- or impossible-to-share system in use
  • any other artifact described in a paper

When in doubt, authors are encouraged to contact the AEC Co-chairs for guidance.

Artifact Evaluation Committee

By design, members of the Artifact Evaluation Committee (AEC) represent a broad church of experience, ranging from senior graduate students and research associates to lecturers and professors. All are welcome! The AEC is formed through an open call that encourages those from underrepresented and distant groups to become involved with the APLAS community.

A broad church is necessary because, among researchers, experienced graduate students are often in the best position to handle the diversity of systems the AEC will encounter. In addition, graduate students represent the future of the community, so involving them in the AEC process early will help push this process forward. The AEC chairs devote considerable attention to both mentoring and monitoring, helping to educate the students on their responsibilities and privileges.

This text was adapted from text used by the ESOP’22 & PLDI’22 AECs.

Call for Artifacts

APLAS 2022 will have voluntary post-paper-acceptance artifact evaluation (new in 2022!). Authors of accepted papers will be welcome to submit artifacts for evaluation after paper notification. The outcome will not alter the paper acceptance decision. Details of the Artifact Evaluation will be made available later.

The 20th Asian Symposium on Programming Languages and Systems (APLAS’22) is forming its first Artifact Evaluation Committee (AEC). The artifact evaluation process aims to promote, share, and catalogue the research artifacts of papers accepted to the APLAS research track. We are looking for motivated researchers at all academic stages (PhD students, researchers, lecturers, and professors) to join us on the inaugural APLAS’22 AEC.

Nomination Forms

To self-nominate, please use the self-nomination form.

To nominate a colleague, please use this form.

As a committee member, your primary responsibility will be to review artifacts submitted by authors of accepted papers and to ensure that each artifact is a faithful representation of the accepted paper’s results. This will involve interacting with tooling provided by the authors and checking that the results are consistent with the claims made in the paper and reproducible by researchers to come. APLAS will use a three-phase artifact review process: Kick-The-Tyres, Full Review, and Iron-out-the-Wrinkles. Instructions for chosen committee members will be made available once the committee has been formed.

We will close nominations on:

  • Friday 8th July 2022 (AOE)

and notify the selected committee members on:

  • Friday 15th July 2022 (AOE)

Important Dates (AOE)

  • Author Artifact Submission: Thursday 18th August 2022
  • Reviewer Preferences Due: Tuesday 23rd August 2022
  • Review Process:
    • Phase 1 ‘Kick-The-Tyres’ Review Due: Wednesday 31st August 2022
    • Phase 2 ‘Full Review’ Due: Monday 12th September 2022
    • Phase 3 ‘Iron-out-the-Wrinkles’ Due: Monday 19th September 2022
  • Author Notification: Thursday 22nd September 2022

We expect the majority of the reviewing process to be performed between 22nd August 2022 and 19th September 2022.

Reviewing Process

We expect each artifact to take, on average, eight hours to review, and we will look to assign each reviewer three to four reviews. For each artifact we will assign a Lead Reviewer to lead the reviewing process.

The review process is highly interactive: you will communicate anonymously with the authors, and you will know the identity of your fellow reviewers.

All communications will happen using the APLAS’22 AEC HotCRP instance.

Phase 1 ‘Kick-The-Tyres’

The aim of the first phase is to ensure that the artifacts are ready for reviewing. The first phase of the review process will require reviewers to check that they can:

  1. Obtain the artifact using the provided instructions.
  2. Go through a ‘Getting Started Guide’ to ensure the artifact is fit for the main review.

Each reviewer will be asked to submit a short review based on these checks. These initial reviews will be immediately available to authors, who will be able to communicate with the reviewers to address any issues found.
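As an illustration of what a ‘Kick-The-Tyres’ check might look like in practice, the sketch below simulates obtaining an artifact archive and verifying its integrity before unpacking. The file names and checksum file are hypothetical placeholders, not part of the official APLAS process; the archive is created locally here only so the example is self-contained.

```shell
set -eu

# Simulate a downloaded artifact archive (placeholder content).
mkdir -p artifact
echo "hello from the artifact" > artifact/README.txt
tar -czf artifact.tar.gz artifact

# Authors would publish a checksum alongside the archive;
# here we generate it locally to keep the sketch runnable.
sha256sum artifact.tar.gz > artifact.tar.gz.sha256

# A reviewer verifies the archive's integrity before unpacking it
# and working through the Getting Started Guide.
sha256sum -c artifact.tar.gz.sha256
tar -xzf artifact.tar.gz
cat artifact/README.txt
```

In a real evaluation the checksum would come from the authors’ submission, so a mismatch at this stage is exactly the kind of minor, non-research issue the interactive Phase 1 is designed to catch and resolve.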

Phase 2 ‘Full Review’

The aim of the second phase is to conduct a thorough assessment of the artifact against the paper, and to submit full, complete reviews that extend and expand upon the initial Phase 1 reviews as necessary. As before, these reviews will be immediately available to authors, and they can communicate with you through HotCRP.

During this phase you will decide whether or not the submitted artifact satisfies the main criteria for Badges.

Phase 3 ‘Iron-out-the-Wrinkles’

We expect the majority of evaluations to be complete after the first two phases. The third phase, however, is for artifacts whose review process still has open issues after Phase 2. This additional phase gives authors and reviewers extra time to discuss and address any pertinent issues that stop the artifact from being evaluated.

Evaluation Guidelines

SIGPLAN has produced some guidance for reviewing empirical evaluations.

https://www.sigplan.org/Resources/EmpiricalEvaluation/

The ECOOP 2018 Committee produced some guidance for reviewing proof artifacts:

https://proofartifacts.github.io/guidelines/ecoop_guidelines.html

Some more general guidance for proof artifacts is available at:

https://proofartifacts.github.io/guidelines/

This call was adapted from the PLDI’22/ESOP’22 AEC reviewer information guides.