Requirements Engineering 2021
Mon 20 - Fri 24 September 2021
Authors of articles accepted to the RE conference are encouraged to upload their research artifacts to a publicly accessible archival repository and to submit them to this track for review. Artifact types range from interview scripts and CSV data to statistical scripts and full machine-learning pipelines: any artifact used to create the results presented in a research article.

The RE AE Track aims to promote and celebrate open science. Its goal is to reward authors whose work satisfies the track’s criteria. Given the emerging nature of open science and AE tracks in software engineering research, we encourage discussion and patience as a community when reviewing submissions. The more accepted artifacts the better, provided the review process brings each submission to an acceptable state.

Call for Artifacts

Authors of accepted papers in the research, RE@Next!, and industry tracks of RE’21 (Available and Reusable badges), as well as authors of papers accepted to any past RE conference (Validated badge), are invited to submit an artifact to the Artifact Track. Research papers with accepted artifacts receive a “Badge” on the front page of the paper in the proceedings.


The Badges

The badges fall into three categories: Available, Reusable, and Validated. The categories are independent; an article can receive any one, two, or all three badges. Within the Validated category, an article can be assigned only one of Reproduced or Replicated. (The badges are loosely adapted from the ACM badges.)


Artifacts Available (open to RE’21 submissions)

The artifacts associated with the research are permanently available for retrieval. Author-created artifacts relevant to the article have been placed in a publicly accessible archival repository (such as Zenodo or FigShare). A DOI for the artifact is provided by the archival repository and is referenced in both the article and the artifact.

Artifacts Reusable (open to RE’21 submissions)

The artifacts associated with the research are documented, exercisable, complete, and include appropriate evidence of verification. The artifacts are very carefully documented and well-structured to the extent that reuse and repurposing is facilitated; in particular, the norms and standards of the research community for artifacts of this type are strictly adhered to.

Results Validated (open to all past RE submissions)

The main results of the article have been obtained in a subsequent study that was peer-reviewed and published by a person or team other than the authors, ...

... using, in part, artifacts provided by the authors (Reproduced), or

... without the use of author-supplied artifacts (Replicated).



Eligibility and Evaluation Criteria

The purpose of this section is to communicate submission expectations to authors and reviewing guidelines to reviewers. Failure to meet these guidelines does not automatically mean rejection, and adhering fully to them does not automatically mean acceptance. Ambiguity is certain to exist, so reviewers must use their academic knowledge and skills to fully consider the eligibility of each submission; scientific integrity is key to a successful and amicable process.



Artifacts Available - Available Badge

Artifacts are hosted online.
The URL to access the artifacts is immutable (cannot be altered by the author).

Please avoid services like Dropbox, Google Drive, OneDrive, and institutional websites, as their URLs and the data behind them can easily change.

The organisation hosting the URL plans to maintain it for the foreseeable future.

To verify this, check the mission statement of the hosting organisation. Currently, only a few organisations are known to have such a mission: arXiv for articles, and Zenodo and FigShare for data.
Artifacts have a Digital Object Identifier (DOI) redirecting to the immutable URL (a README excerpt referencing such a DOI follows this list).
Anyone can access the artifacts, without the need for registration.
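
As an illustration, a README excerpt referencing an archival DOI might look as follows; this is a minimal sketch, and the DOI and wording below are placeholders rather than a real record or a prescribed format:

```markdown
## Artifact Location

This artifact is permanently archived on Zenodo:

- DOI: https://doi.org/10.5281/zenodo.0000000  <!-- placeholder DOI -->

The same DOI is cited in the article, so the article and the artifact
reference each other.
```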



Artifacts Reusable - Reusable Badge

Artifacts can be obtained via the “Artifact Location” README section.
Artifacts are documented in the “Descriptions of Artifacts” README section.
Artifacts can be installed and run via the “Installation Instructions” README section.

There is an expectation that the submitted artifacts can be run on any machine. Where this is not possible, it is the authors’ responsibility to provide a virtual environment in which to run the artifacts, for example a Python virtual environment, a Docker image, or a VirtualBox VM. Maximum reasonable installation time: 60 minutes.
If installation takes longer than 60 minutes, you must make this clear in your Installation section and explain why. Some scripts take a long time to produce results; in these cases, the authors must provide a minimum working example and its expected output. This can be done via a smaller dataset, intermediate script data saved by the authors, a truncated script, etc. (see the README sketch after this list).

Artifacts generate the results presented in the article following the “Steps to Reproduce” README section.

All known deviations from the results presented in the article must be explicitly outlined (e.g., when a table or figure is not produced, or the produced results differ from those in the paper). Maximum reasonable reproduction time: 60 minutes.
Artifacts have a proper open-source license attached as a LICENSE.md file.

If a license file exists under a different name, a LICENSE.md file must point to the actual license.

The artifacts are very carefully documented and well-structured to the extent that reuse and repurposing is facilitated. Norms and standards of the research community for artifacts of this type are strictly adhered to.
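
To make the installation and reproduction expectations above concrete, here is a minimal sketch of the corresponding README sections. Every file name, command, and timing below is a hypothetical placeholder, not a required layout:

```markdown
## Installation Instructions

1. Install Python 3.8 or later.
2. Create a virtual environment and install the pinned dependencies:
   python -m venv venv && source venv/bin/activate && pip install -r requirements.txt
   Estimated installation time: ~10 minutes.

## Steps to Reproduce

1. python analysis.py --data data/sample.csv   # regenerates Table 2
   The full dataset takes ~4 hours; the bundled sample reproduces Table 2 in ~5 minutes.
2. python plots.py                              # regenerates Figures 3 and 4

Known deviation: Figure 5 was drawn manually and is not regenerated by any script.
```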



Artifacts Validated - Reproduced & Replicated Badges

The article applying for the badge was accepted to a past iteration of the RE conference.
A subsequent article was published that …
… reproduced/replicated the same results as the previous article.
… has no overlap of authorship.
… used, in part, artifacts provided by the original authors (Reproduced only).
… did not make use of artifacts provided by the original authors (Replicated only).
The author-supplied abstract summarises the validation thoroughly.
The results were, in fact, validated as correct (as originally stated).

If the claims were falsified, this is good science, but the original article does not receive a Validated badge. (It is sufficient if the results are within a margin of tolerance and deviate slightly from those of the original study, as long as the main claims of the original article are unchanged.)



Review Process

The review process has two primary objectives: to encourage the improvement of artifacts through proper documentation, and to verify that the artifacts meet the aforementioned badge criteria. For this reason, the RE AE Track review process is more of a discussion, and less of a traditional conference review.

The review process will take place via GitHub. Each submission will have a folder under “submissions” with the required submission documents. Additionally, each submission will have an associated GitHub issue where reviewers will interact with the authors.

The entire review process is conducted over a two-week period. During this time, the reviewers will check the submitted artifacts against the badge guidelines. Reviewers are encouraged to start the review early, as it can take time for reviewers and authors to sort out unforeseen issues in the artifacts. If reviewers encounter problems or simply need clarification, they will communicate via the GitHub issue. Authors must reply as soon as possible to ensure a timely review; GitHub lets you subscribe to issue updates so that you are notified of new messages immediately via email.

Once the reviewer has checked all badge guidelines and feels there is no further reasonable improvement to be made by the authors, they will post their final review on the GitHub issue. We recommend a clear statement such as “Recommended Badges: Available & Reusable” near the end of the full review. Additional information in the review may include a summary of the artifact and compliments regarding the artifact itself.
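
For illustration, a reviewer’s closing comment on the GitHub issue might take the following shape; the wording, artifact summary, and badge choice are purely illustrative, not a prescribed template:

```markdown
## Final Review

Summary: the artifact packages the survey instrument, anonymised responses, and
the analysis scripts used to produce Tables 2-4 of the article.

All installation and reproduction steps completed within the stated time limits,
and the documentation follows the track's README structure.

Recommended Badges: Available & Reusable
```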



Submission Instructions

What to Create (Reusable Badge Only)

The Reusable badge requires additional documents to be created and stored with your artifact. These will improve your artifact greatly and guide the reviewer through your work. Note: Do Not Submit these files. They should remain with your artifact at all times.

README.md Standard file describing the artifact in great detail (a skeleton follows this list). Minimum sections required:
Summary of Artifacts (why does this artifact exist and what does it do?)
Author Information (List all authors and how to cite work that uses this artifact)
Description of Artifacts (describe each of the files, including what was not included)
System Requirements (required system, programs, etc. to run the artifact)
Installation Instructions (how to go from nothing to a running artifact)
Steps to Reproduce (which commands produce which tables, figures, etc.)
LICENSE.md Attach the license applied to this artifact. Note that open science requires open-source licenses.
Artifacts may be rejected if the license is too restrictive.
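
A skeleton README.md covering the required sections might look like the following; the title and section contents are placeholders to adapt to your artifact:

```markdown
# Artifact for "Paper Title Here"

## Summary of Artifacts
One paragraph on why this artifact exists and what it does.

## Author Information
Authors, contact details, and a suggested citation for work that uses this artifact.

## Description of Artifacts
What each file and folder contains, and what (if anything) was not included and why.

## System Requirements
OS, runtime versions, memory/disk needs, and any required external programs.

## Installation Instructions
How to go from a fresh download to a running artifact, with an estimated time.

## Steps to Reproduce
Which commands regenerate which tables and figures, with expected runtimes.
```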

What to Submit:

SUBMISSION.md This file describes your submission (a skeleton follows this list). Required sections:
Requested Badges (each requested badge must include a justification of why the artifact qualifies)
Artifacts Location (link to an upload of your artifact online: Zenodo, FigShare, GitHub, institutional repo, etc.) If you like GitHub for open-source collaboration, consider integrating it with Zenodo to create a permanent, citable archive.
Pre-Print Location (reviewers need to check details in the pre-print)
1-Page Report (Validated badges only) Describe the validation: summarise it in an abstract, then detail the reuse by describing the who, what, why, and how of the validation work. Include a short discussion of the findings of the validation work and how they pertain to the original article.
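
A minimal SUBMISSION.md might be structured as follows; the links, badge choices, and justification text are placeholders:

```markdown
# RE'21 Artifact Evaluation Submission

## Requested Badges
Available and Reusable. Justification: the artifact is archived with a DOI on
Zenodo, and the README documents installation and reproduction steps that
complete within the track's time limits.

## Artifacts Location
https://doi.org/10.5281/zenodo.0000000  <!-- placeholder DOI -->

## Pre-Print Location
https://example.org/preprint.pdf  <!-- placeholder link -->

<!-- For Validated badges, also attach the 1-page report described above. -->
```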

How to Submit:

The RE AE process will be conducted via the RE’21 AE Track GitHub repository. Submissions are made by authors via pull requests, and review discussions take place in a single GitHub issue for each submission.

  1. Fork the RE’21 AE Track GitHub repository.
  2. Create a folder titled “firstAuthorLastName_RE_AE” (e.g. “Montgomery_RE_AE”) under the “submissions” folder. See example in RE AE Track repository.
  3. Add the required submission documents (listed above) to that folder.
  4. Open a Pull Request to merge your fork back into the RE AE Track GitHub repository.