Call for Papers
The Testing Tools and Data Showcase Track seeks to bridge the gap between research and practice by focusing on software testing tools, and demonstrations thereof, that advance both the state of the art and the state of the practice. The track invites submissions from both academia and industry that showcase tools at several stages of maturity: promising research prototypes, widely used research tools, and commercial tools (the latter only if they contribute to scientific knowledge). In addition, the track aims to actively promote and recognize the creation of reusable datasets that are designed and built not only for a specific research project but for the testing community as a whole. Such datasets should enable other practitioners and researchers to jumpstart their research efforts and support the reproducibility of earlier work.
The ICST 2025 Testing Tools and Data Showcase Track will accept two types of submissions: (1) tool papers and (2) data showcase papers.
1 - Tool Papers
Tool papers should:
- Fall under the topics mentioned in the ICST 2025 research track.
- Present and discuss a tool that has NOT been published before as a tool paper.
- Motivate the need for the tool and clearly describe the complexity of the addressed problem.
- Discuss the tool’s goals, envisioned users and implied use case scenarios, requirements, implemented testing process or testing technique, solved technical challenges, and maturity level.
- Explain the tool’s overall architecture and its inner workings.
- Describe the tool’s novelty and how it relates to previous industrial or research efforts.
- Report on how the tool has been validated (e.g., through previously published research work or new experiments); experience reports from using the tool in industrial settings will be highly valued. Although tool papers are NOT required to contain large-scale empirical studies, reporting any empirical results or user feedback is highly encouraged. For early prototypes, describing the design of the planned validation studies is acceptable.
- Discuss any limitations of the tool as well as potential ways in which the tool could be extended (by the authors or the community) in the future.
- Include a statement on the tool availability (see also below).
- Include (at the end of the abstract) the URL of a 3-to-5-minute screencast, either with annotations or a voice-over, that provides a concise version of the tool demo scenario. The video should be posted on YouTube (as an unlisted video) or hosted on the tool’s website. The main purpose of the screencast is to show the functionality of the tool; it will not be reviewed as a form of long-term artifact.
The tool itself should be made available (with an appropriate license notice) at the time the paper is submitted for review. At a minimum, the tool should be accessible (either freely downloadable or usable online). If possible, the source code of the tool should also be available. Exceptions can be granted only if a valid reason is provided explaining why the tool cannot be released (e.g., organizational rules or intellectual property restrictions). The tool should include clear installation instructions and an example dataset that allow the reviewers to run the tool.
Upon acceptance, authors of papers that state “publicly available” (or equivalent) under “Tool availability” should archive the tool in a persistent repository that can provide a digital object identifier (DOI), such as zenodo.org, figshare.com, IEEE DataPort, or an institutional repository. In addition, a DOI-based citation of the tool should be included in the camera-ready version of the paper.
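As an illustration only, such a DOI-based citation could be provided as a BibTeX entry along the following lines; the tool name, authors, and DOI below are placeholders, not a prescribed format:

    @misc{exampletool2025,
      author       = {Jane Doe and John Smith},
      title        = {ExampleTool: A Test Input Generator (v1.2.0)},
      year         = {2025},
      publisher    = {Zenodo},
      doi          = {10.5281/zenodo.0000000},
      howpublished = {\url{https://doi.org/10.5281/zenodo.0000000}}
    }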
2 - Data Showcase Papers
Data Showcase papers are descriptions of datasets relevant to the ICST 2025 topics, which can be used by other practitioners or researchers.
Data showcase papers should include:
- A description of the data source.
- A description of the methodology used to gather the data (including provenance and the tool used to create/generate/gather the data, if any).
- A description of the storage mechanism, including a schema if applicable.
- If the data has been used by the authors or others, a description of how this was done, including references to previously published papers.
- A description of the dataset’s originality (that is, even if the dataset has been used in a published paper, its complete description must be unpublished) and of similar existing datasets (if any).
- Ideas for future research questions that could be answered using the dataset.
- Ideas for further improvements that could be made to the dataset.
- Any limitations and/or challenges in creating or using the dataset.
The dataset should be made available (with an appropriate license notice) at the time the paper is submitted for review, and it should include detailed instructions on how to use it (e.g., how to import the data or how to access the data once it has been imported).
At a minimum, upon acceptance of the paper, the authors should archive the data in a persistent repository that can provide a digital object identifier (DOI), such as zenodo.org, figshare.com, IEEE DataPort, or an institutional repository. In addition, a DOI-based citation of the dataset should be included in the camera-ready version of the paper.
If custom tools have been used to create the dataset, we expect the paper to be accompanied by the source code of the tools, along with clear documentation on how to run the tools to recreate the dataset. The tools should be open source, accompanied by an appropriate license; the source code should be citable, i.e., refer to a specific release and have a DOI. If you cannot provide the source code or the source code clause is not applicable (e.g., because the dataset consists of qualitative data), please provide a short explanation of why this is not possible.
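As a sketch, a release-specific citation for such dataset-generation tooling might look as follows (the dataset itself can be cited with the same pattern as the tool citation shown earlier); the name, version, and DOI are placeholders:

    @misc{exampleminer2025,
      author    = {Jane Doe},
      title     = {example-miner: Extraction Scripts (release v1.0.0)},
      year      = {2025},
      publisher = {Zenodo},
      doi       = {10.5281/zenodo.1111111},
      note      = {Source code used to recreate the dataset}
    }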
Evaluation
Tool papers will be evaluated based on:
- The relevance and significance of the addressed problem.
- The innovation element of the approach.
- The availability, maturity, and adoption of the tool.
- The presence of lessons learned from developing or using the tool.
- The quality of the presentation.
Data Showcase papers will be evaluated based on:
- The value, usefulness, and reusability of the datasets.
- The quality of the presentation.
- The availability of the datasets.
Submission
Submissions will be handled via EasyChair (ICST2025 / Testing Tools and Data Showcase Track) at https://easychair.org/conferences/?conf=icst2025.
The Testing Tools and Data Showcase track of ICST 2025 uses single-anonymous reviewing, meaning authors and tools do not have to be anonymized. All submissions must:
- Be in PDF format and conform to the IEEE Conference Proceedings Formatting Guidelines (see the minimal LaTeX sketch after this list). Templates for LaTeX and Word are available at: http://www.ieee.org/conferences_events/conferences/publishing/templates.html.
- Not exceed 5 pages, including all text, figures, tables, appendices, and references.
- Not have been published elsewhere or be under review elsewhere while under review for ICST 2025.
- Comply with IEEE plagiarism policy as well as IEEE Policy on Authorship.
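As a rough starting point only, a conforming submission based on the IEEEtran LaTeX class might begin as follows; the title, author, and URLs are placeholders, and the official templates linked above remain authoritative:

    \documentclass[conference]{IEEEtran}
    \usepackage{url}

    \begin{document}

    \title{ExampleTool: A Demonstration of Automated Test Generation}

    \author{\IEEEauthorblockN{Jane Doe}
    \IEEEauthorblockA{Example University \\ jane.doe@example.org}}

    \maketitle

    \begin{abstract}
    One-paragraph summary. For tool papers, end the abstract with the
    screencast URL, e.g., \url{https://youtu.be/placeholder}.
    \end{abstract}

    \section{Introduction}
    % Body text; keep the total length within the 5-page limit.

    \end{document}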
To make research datasets accessible and citable, we further encourage authors to adhere to the FAIR principles, i.e., data should be Findable, Accessible, Interoperable, and Reusable.
Submissions that are not in compliance with the required submission format or that are out of the scope of the track will be rejected without being reviewed.
All authors, reviewers, and organizers are expected to uphold the IEEE Code of Conduct.
Publication and Presentation
Accepted papers will be published as part of the conference proceedings. Camera-ready and presentation details will be provided after notification of acceptance. At least one author of each accepted paper must register for the conference and present the paper at the conference.
In addition to delivering a presentation that will be included in the conference program, authors of accepted tool papers will have the opportunity to conduct a hands-on session where attendees of ICST 2025 can actively use and experiment with the demonstrated tools.
Important Dates
- Full paper submission: December 4, 2024
- Author notification: January 19, 2025
- Camera-ready: February 13, 2025