ISSTA 2025
Wed 25 - Sat 28 June 2025 Trondheim, Norway

Welcome to the website of the ISSTA 2025 conference.

The ACM SIGSOFT International Symposium on Software Testing and Analysis (ISSTA) is the leading research symposium on software testing and analysis, bringing together academics, industrial researchers, and practitioners to exchange new ideas, problems, and experience on how to analyze and test software systems.

2025 will mark the 34th edition of ISSTA!

Research Papers: Call for Papers

View track page for all details

We invite high-quality submissions, from both industry and academia, describing original and unpublished results of theoretical, empirical, conceptual, and experimental research on software testing and analysis.

ISSTA invites three kinds of submissions. The majority of submissions are expected to be “Research Papers”, but submissions that best fit the description of “Experience Papers” or “Replicability Studies” should be submitted as such. A good Experience Paper will include lessons learned or other wisdom synthesised for the community from the reported experience. Replicability Studies shall clearly describe their purpose and value beyond the original result.

NEW THIS YEAR: The conference proceedings will be published in the Proceedings of the ACM on Software Engineering (PACMSE), Issue: ISSTA 2025.

Research Papers

Authors are invited to submit research papers describing original contributions in testing or analysis of computer software. Papers describing original theoretical or empirical research, new techniques, methods for emerging systems, in-depth case studies, infrastructures of testing and analysis, or tools are welcome.

Experience Papers

Authors are invited to submit experience papers describing a significant experience in applying software testing and analysis methods or tools. Such papers should carefully identify and discuss important lessons learned, so that other researchers and/or practitioners can benefit from the experience.

Replicability Studies

ISSTA would like to encourage researchers to replicate results from previous papers. A replicability study must go beyond simply re-implementing an algorithm and/or re-running the artefacts provided by the original paper. It should at the very least apply the approach to new, significantly broadened inputs. In particular, replicability studies are encouraged to target techniques that were previously evaluated only on proprietary subject programs or inputs. A replicability study should clearly report on results that the authors were able to replicate, as well as on aspects of the work that were not replicable. In the latter case, authors are encouraged to communicate or collaborate with the original paper’s authors to determine the cause of any observed discrepancies and, if possible, address them (e.g., through minor implementation changes). We explicitly encourage authors not to focus on a single paper/artefact, but instead to perform a comparative experiment across multiple related approaches.

Replicability studies should follow the ACM guidelines on replicability (different team, different experimental setup): the measurement can be obtained with stated precision by a different team, using a different measuring system, in a different location, on multiple trials. For computational experiments, this means that an independent group can obtain the same result using artefacts which they develop completely independently. Moreover, it is generally insufficient to focus on reproducibility (i.e., different team, same experimental setup) alone. Replicability Studies will be evaluated according to the following standards:

  • Depth and breadth of experiments
  • Clarity of writing
  • Appropriateness of conclusions
  • Amount of useful, actionable insights
  • Availability of artefacts

We expect replicability studies to clearly point out the artefacts the study is built on, and to submit those artefacts to the artefact evaluation. Artefacts evaluated positively will be eligible to obtain the prestigious Results Reproduced badge.

Major Revisions

Papers submitted by the initial deadline may be accepted, rejected, or invited to submit a major revision of the initial submission by the major revision deadline.

Submission Guidelines

Submissions must be original and should not have been published previously or be under consideration for publication while being evaluated for ISSTA. Authors are required to adhere to the ACM Policy and Procedures on Plagiarism and the ACM Policy on Prior Publication and Simultaneous Submissions.

At the time of submission, each paper should have no more than 18 pages for all text and figures, plus up to 4 pages for references, using the following templates: LaTeX, Word (Mac), or Word (Windows). Authors using LaTeX should use the sample-acmsmall-conf.tex file (found in the samples folder of the acmart package) with the acmsmall option. We also strongly encourage the use of the review, screen, and anonymous options. In sum, you want to use:

\documentclass[acmsmall,screen,review,anonymous]{acmart}

Papers may be submitted using numeric citations, but final versions of accepted papers must use the author-year citation format. The page layout is single-column. Submissions that do not comply with the above instructions will be desk-rejected without review.
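The required class options and the double-anonymous setup can be sketched in a minimal preamble. This is an illustrative skeleton only, not an official template; the title and the commented-out \citestyle line (which acmart provides for switching to author-year citations in the final version) are examples:

```latex
% Minimal sketch of an ISSTA submission setup (illustrative only).
\documentclass[acmsmall,screen,review,anonymous]{acmart}

% For the final (accepted) version, switch citations to author-year style:
% \citestyle{acmauthoryear}

\begin{document}

% Category suffix in the title, if applicable:
\title{An Example Paper Title (Replicability Study)}
% Author names/affiliations are suppressed by the `anonymous' option.

\maketitle

% ... paper body, ending with a ``Data Availability'' section
% after the Conclusion, as required by the open science policy ...

\end{document}
```

The review and anonymous options add line numbers for reviewers and hide author information, respectively, so the same source can produce both the submission and the camera-ready version by adjusting the class options.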

The page limit is strict: papers that use more than 18 pages for anything other than references (including any section, figure, text, or appendix) will be desk-rejected. Experience papers and replicability studies should clearly specify their category in the paper title upon submission: “[TITLE] (Experience Paper)” or “[TITLE] (Replicability Study)”. Papers must be submitted electronically through the ISSTA 2025 submission site.

Each submission will be reviewed by at least three members of the program committee. Submissions will be evaluated on the basis of originality, importance of contribution, soundness, evaluation, quality of presentation, appropriate comparison to related work, and verifiability/transparency of the work. Some papers may have more than three reviews, as the PC chairs may solicit additional reviews based on factors such as reviewer expertise and strong disagreement between reviewers. The program committee as a whole will make final decisions about which submissions to accept for presentation at the conference.

Double-anonymous Reviewing

ISSTA 2025 will conduct double-anonymous reviewing. Submissions should not reveal the identity of the authors in any way. Authors should leave out author names and affiliations from the body of their submission and should confirm that the generated PDF does not contain any metadata revealing their names. They should also ensure that any citations to their own related work are written in the third person, that is, “the prior work of XYZ” rather than “our prior work”.

Authors with further questions on double-anonymous reviewing are encouraged to contact the Program Chair by email.

Open Science Policy and “Data Availability” Section

ISSTA has adopted an open science policy. Openness in science is key to fostering scientific progress via transparency, reproducibility, and replicability. The steering principle is that all research results should be accessible to the public, if possible, and that empirical studies should be reproducible. In particular, we actively support the adoption of open data and open source principles and encourage all contributing authors to disclose (anonymized and curated) data to increase reproducibility and replicability.

Upon submission, authors are asked to make their code, data, etc. available to the program committee (via upload of anonymized supplemental material or a link to an anonymized private or public repository) or to comment on why this is not possible or desirable. At least one of the reviewers will check the provided data. While sharing the data is not mandatory for submission or acceptance, it will inform the program committee’s decision. Furthermore, we ask authors to provide a supporting statement on the data availability (or lack thereof) in their submitted papers in a section named “Data Availability” after the Conclusion section.

Important Dates

All dates are 23:59:59 AoE (UTC-12h):

  • Full paper submission: 31 October 2024
  • Initial notification: 19 December 2024
  • Revised manuscript submissions (major revisions only): 27 February 2025
  • Final notification for major revisions: 31 March 2025
  • Camera ready: 24 April 2025

Publication Date

The official publication date is the date the proceedings are made available in the ACM Digital Library. This date may be up to two weeks prior to the first day of the ISSTA conference. The official publication date affects the deadline for any patent filings related to published work.