Openness in science is key to fostering progress through transparency and the availability of all outputs produced at each investigative step. Transparency and availability of research outputs enable better reproducibility and replicability of quantitative studies, and recoverability of qualitative studies. Open science builds the core for excellence in evidence-based research.
As an internationally renowned forum for researchers, practitioners, and educators to present and discuss the most recent innovations, trends, experiences, and challenges in the field of software engineering, ICSE 2023 has decided to actively support setting standards for how we conduct this kind of research.
To this end, we have explicitly committed ourselves to fostering openness in our research outcomes. In particular, we support the adoption of open data and open source principles, and we encourage all contributing authors to disclose their (anonymized and curated) data to increase the reproducibility, replicability, and/or recoverability of their studies.
Research output should be publicly and freely accessible by anyone, permanently.
Artifacts related to a study (which include, but are not limited to, raw and transformed data, extended proofs, appendices, analysis scripts, software, virtual machines and containers, and qualitative codebooks) and the paper itself should, in principle, be made available on the Internet:
- without any barrier (e.g., paywalls, registration forms, request mechanisms),
- under a proper open license that specifies purposes for re-use and repurposing,
- properly archived and preserved,
provided that there are no ethical, legal, technical, economic, or privacy-related barriers preventing the disclosure.
Artifacts should be made available as open data and open source as follows:
- Archive them in preserved digital repositories such as zenodo.org, figshare.com, www.softwareheritage.org, osf.io, or institutional repositories.
  - GitHub, GitLab, and similar version control services do not offer proper archival and preservation.
  - Neither do personal or institutional websites, consumer cloud storage such as Dropbox, or services such as Academia.edu and Researchgate.net.
- Release data under a proper open data license such as the CC0 dedication or the CC-BY 4.0 license.
- Release software under an open source license.
- Different open licenses, if mandated by institutions or regulations, are also permitted.
We encourage authors to make artifacts available upon submission (either privately or publicly) and to make them publicly available upon acceptance.
We ask authors to include a section named Data Availability after the Conclusion section of their submitted papers, with a supporting statement on how their data can be accessed. Authors who cannot disclose their data for the reasons stated in the principles above should instead provide a short statement explaining why in that same section.
Please note that the success of the open science initiative depends on the willingness (and ability) of authors to disclose their data, and that all submissions will undergo the same review process regardless of whether they disclose their analysis code or data.
A step-by-step approach to disclosing artifacts for (double-blind) peer review and making them open data upon acceptance is available at: https://ineed.coffee/5205/how-to-disclose-data-for-double-blind-review-and-make-it-archived-open-data-upon-acceptance/.
A step-by-step approach to automatically archive a GitHub repository to Zenodo.org is available at https://guides.github.com/activities/citable-code/.
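The Zenodo–GitHub integration linked above archives a snapshot of a repository each time a GitHub release is published and assigns it a DOI. As an optional, illustrative sketch: a `.zenodo.json` file at the repository root can supply the deposit metadata Zenodo uses for the archived release. The field names below follow Zenodo's documented deposit metadata, but all concrete values (title, name, affiliation, ORCID) are purely illustrative placeholders.

```json
{
  "title": "Replication package for: An Illustrative ICSE Study",
  "upload_type": "software",
  "description": "Scripts, anonymized data, and codebook accompanying the paper.",
  "creators": [
    {
      "name": "Doe, Jane",
      "affiliation": "Example University",
      "orcid": "0000-0000-0000-0000"
    }
  ],
  "license": "CC-BY-4.0",
  "keywords": ["open science", "replication package"]
}
```

With the repository enabled under the GitHub settings on zenodo.org, tagging and publishing a release (e.g., `v1.0`) triggers the archival; the resulting DOI can then be cited in the paper's Data Availability section.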
A step-by-step approach to automatically archive a GitHub repository to figshare.com is available at https://knowledge.figshare.com/articles/item/how-to-connect-figshare-with-your-github-account.
A proposal for artifact evaluation by SIGSOFT is available at https://github.com/acmsigsoft/artifact-evaluation.
A proposal for open science in software engineering, including explanations for structuring an open artifact, is available at https://arxiv.org/abs/1904.06499.
We encourage ICSE 2023 authors to self-archive their pre- and post-prints in open and preserved repositories. Self-archiving is legal and allowed by most publishers (granted in the copyright transfer agreement), and it enables anybody in the world to access the papers barrier-free.
Upon acceptance to ICSE 2023, we encourage authors to revise their article according to the peers’ comments, generate a PDF version of it (post-print), and submit it to arXiv.org or their institutional repository.
Note: Authors are not allowed to self-archive the PDF of the published article as typeset by the publisher (a.k.a. “publisher proof,” “published paper,” “the digital library version”).
A comprehensive FAQ for open access and self-archiving is available at https://avandeursen.com/2016/11/06/green-open-access-faq/.
ICSE 2023 has adopted an open science stance and introduced guidelines for authors (available at https://conf.researchr.org/track/icse-2023/icse-2023-open-science-policies). The policies invite authors to provide all research artifacts for peer review, self-archive their pre- and post-prints, and archive artifacts as open data upon acceptance. We kindly ask you to pay attention to the following while reviewing:
- All open science steps are optional for authors and reviewers. You are invited, but not required, to inspect the provided artifacts as part of your review efforts.
- Reviewers should trust the reasons authors give for partially disclosing their data (or for not disclosing it at all).
- Submissions have to undergo the same review process regardless of whether they disclose their analysis code or data. You may note the absence of data in your review, but please do not let it influence your evaluation of the submission. You are welcome to encourage further disclosure of data, and to help the authors do so, through your review.
- Open science is challenging for qualitative studies. Please be welcoming of qualitative studies that open their artifacts even in a limited way. Furthermore, when evaluating artifacts from qualitative studies that were made available for peer review, please keep in mind that authors of qualitative studies might hold ontological and epistemological stances that differ from those of authors of quantitative studies. Concepts such as replicability and reproducibility might apply only partially, or not at all, to qualitative studies.
- Providing research artifacts might introduce issues with double-blind review. We ask you not to actively hunt for the identity of the authors, especially if they have self-archived a preprint of their submission.