The Testing Tools and Demonstration Track seeks to bridge the gap between research and practice by focusing on software testing tools and their demonstrations that advance both the state of the art and the state of the practice. The track invites submissions from both academia and industry that showcase tools at several stages of maturity, including promising research prototypes, widely used research tools, and commercial tools (the latter only if they contribute to scientific knowledge).
Tool and demonstration paper submissions should clearly describe the complexity of the problem being addressed, the technical challenges that were solved, and the evidence that the tool actually works (e.g., from previously published research or from new experiments, preferably including experiences from using the tool in industry).
Wed 29 May (displayed time zone: Eastern Time, US & Canada)
15:30 - 17:00 | Fuzzing (Journal-First Papers / Research Papers / Testing Tools and Demonstration) | Room 1 | Chair(s): Sahar Tahvili (Ericsson AB)
16:50 (10m) | Demonstration | MOTIF: A tool for Mutation Testing with Fuzzing (Testing Tools and Demonstration) | Jaekwon Lee (University of Ottawa & University of Luxembourg), Enrico Viganò (University of Luxembourg), Fabrizio Pastore (University of Luxembourg), Lionel Briand (University of Ottawa, Canada; Lero centre, University of Limerick, Ireland)
15:30 - 17:00 | Testing Autonomous Driving Systems (Research Papers / Testing Tools and Demonstration) | Room 2 & 3 | Chair(s): Nargiz Humbatova (USI Lugano)
16:50 (10m) | Demonstration | U-Fuzz: A Tool for Stateful Fuzzing of IoT Protocols on COTS Devices (Testing Tools and Demonstration) | Shang Zewen, Matheus Eduardo Garbelini, Sudipta Chattopadhyay (Singapore University of Technology and Design)
Fri 31 May (displayed time zone: Eastern Time, US & Canada)
11:00 - 12:20 | Testing and Applications (Research Papers / Testing Tools and Demonstration / Industry) | Room 1 | Chair(s): Jeremy Bradbury (Ontario Tech University)
12:00 (10m) | Demonstration | MLHCBugs: A Framework to Reproduce Real Faults in Healthcare Machine Learning Applications (Testing Tools and Demonstration) | Guna Sekaran Jaganathan, Nazmul Kazi, Indika Kahanda (University of North Florida), Upulee Kanewala (University of North Florida)
12:10 (10m) | Demonstration | The GitHub Recent Bugs Dataset for Evaluating LLM-based Debugging Applications (Testing Tools and Demonstration) | Jae Yong Lee, Sungmin Kang, Juyeon Yoon (Korea Advanced Institute of Science and Technology), Shin Yoo (Korea Advanced Institute of Science and Technology)
Accepted Papers
| Title | Track |
|---|---|
| MLHCBugs: A Framework to Reproduce Real Faults in Healthcare Machine Learning Applications | Testing Tools and Demonstration |
| MOTIF: A tool for Mutation Testing with Fuzzing | Testing Tools and Demonstration |
| The GitHub Recent Bugs Dataset for Evaluating LLM-based Debugging Applications | Testing Tools and Demonstration |
| U-Fuzz: A Tool for Stateful Fuzzing of IoT Protocols on COTS Devices | Testing Tools and Demonstration |
ICST 2024 Testing Tools and Demonstration Call for Papers
The tool paper should clearly communicate the following information to the audience:
- The envisioned users of the tool and the intended use-case scenarios;
- Software engineering challenges the tool addresses;
- Testing process or testing technique implemented by the tool;
- Results of validation studies already conducted (for mature tools) or the design of the planned studies (for early prototypes);
- Potential ways that the tool could be extended (by the authors or the community) in the future.
The tool itself should be made available for evaluation. At a minimum, the tool should be accessible (free to download, accessible online, or available for purchase). If possible, the source code of the tool should also be available. Exceptions can be granted only if a valid reason is provided explaining why the tool cannot be released (e.g., organizational rules, or the tool was developed for internal use in a company and cannot be made public).
Evaluation:
Each submission will be evaluated based on:
- The relevance and significance of the addressed problem.
- The novelty of the approach.
- The availability, maturity, and adoption of the tool.
- The presence of lessons learned from developing or using the tool.
- The quality of the presentation.
Submission:
Submissions will be handled via EasyChair (ICST2024 / Testing Tools and Demonstration Track).
The Testing Tools and Demonstration Track of ICST 2024 uses single-anonymous reviewing, meaning that authors and tools do not have to be anonymized. All submitted papers must conform to the two-column IEEE conference publication format. Templates for LaTeX and Word are available at http://www.ieee.org/conferences_events/conferences/publishing/templates.html.
- It must conform to the IEEE Conference Proceedings Formatting Guidelines (please use the letter format template and the conference option; see the LaTeX sketch after this list).
- Testing tools and demonstration papers must be submitted as PDFs.
- The length of testing tools and demonstration papers must not exceed 2 pages, including all text, figures, tables, and appendices; one additional page containing only references is permitted. Submissions can be either short papers describing a testing tool or extended abstracts accompanying a demonstration.
- The submission must comply with the ACM plagiarism policy and procedures. In particular, the same content must not have been published elsewhere and must not be under review elsewhere while under review for ICST. The submission must also comply with the IEEE Policy on Authorship.
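For authors starting from the LaTeX template, the following is a minimal sketch of what the required two-column, letter-format setup can look like. It assumes the standard IEEEtran class distributed via the IEEE template page linked above; the title, author, and affiliation shown are placeholders only, and the downloaded template remains authoritative.

```latex
% Minimal sketch of an IEEEtran skeleton (assumption: the standard IEEEtran
% class from the IEEE template page; check the downloaded template for the
% exact class file and options).
\documentclass[conference]{IEEEtran}  % two-column IEEE conference format, US letter by default

\title{MyTestingTool: A Demonstration}          % placeholder title
\author{\IEEEauthorblockN{First Author}         % placeholder author
        \IEEEauthorblockA{Example University}}  % placeholder affiliation

\begin{document}
\maketitle

\begin{abstract}
One-paragraph summary of the tool, its envisioned users, and its validation.
\end{abstract}

\section{Introduction}
% Body (at most 2 pages): problem, technical challenges, tool design,
% validation results, availability, and planned extensions.

% References may use one additional page.
\bibliographystyle{IEEEtran}
% \bibliography{references}

\end{document}
```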
Accepted papers will be published in ICST conference proceedings.
Special issue as part of the SCP journal:
The ICST 2024 Testing Tools and Demonstration Track will organize a special issue in the Software Track of the Science of Computer Programming (SCP) journal and invite the authors of the best tools to publish their research software. In particular, we actively support researchers who invest substantial effort in developing tools that support research: authors can have their refereed software published, just like a research paper.
Invitation of selected submissions: 1 July 2024
Important Dates
- Submission: Monday, 29 Jan 2024
- Author Notification: Monday, 26 Feb 2024
- Camera Ready: Friday, 1 March 2024
Poster Presentation:
Authors of tool and demonstration papers will also be given the option to present a poster.