ICST 2023
Sun 16 - Thu 20 April 2023 Dublin, Ireland
Mon 17 Apr 2023 11:00 - 11:20 at Grand canal - Session 1: Automated Testing Chair(s): Gilles Perrouin

Researchers and practitioners have designed and implemented various automated test case generators to support effective software testing. Such generators exist for various languages (e.g., Java, C#, or Python) and various platforms (e.g., desktop, web, or mobile applications). The generators exhibit varying effectiveness and efficiency, depending on the testing goals they aim to satisfy (e.g., unit-testing of libraries versus system-testing of entire applications) and the underlying techniques they implement. In this context, practitioners need to be able to compare different generators to identify the one best suited to their requirements, while researchers seek to identify future research directions. This can be achieved by systematically executing large-scale evaluations of different generators. However, executing such empirical evaluations is not trivial and requires substantial effort to select appropriate benchmarks, set up the evaluation infrastructure, and collect and analyse the results. In this Software Note, we present our JUnit Generation Benchmarking Infrastructure (JUGE), which supports generators (search-based, random-based, symbolic execution, etc.) seeking to automate the production of unit tests for various purposes (validation, regression testing, fault localization, etc.). The primary goal is to reduce the overall benchmarking effort, ease the comparison of several generators, and enhance the knowledge transfer between academia and industry by standardizing the evaluation and comparison process. Since 2013, several editions of a unit testing tool competition, co-located with the Search-Based Software Testing Workshop, have taken place where JUGE was used and evolved. As a result, an increasing number of tools (over ten) from academia and industry have been evaluated with JUGE, have matured over the years, and have enabled the identification of future research directions.
Based on the experience gained from the competitions, we discuss the expected impact of JUGE in improving the knowledge transfer of tools and approaches for test generation between academia and industry. Indeed, JUGE's implementation design is flexible enough to enable the integration of additional unit test generation tools, which is practical for developers and allows researchers to experiment with new and advanced unit testing tools and approaches.
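To make the idea of a standardized evaluation concrete, the sketch below illustrates (in Java, the language JUGE targets) how a benchmarking infrastructure can drive heterogeneous generators through one uniform contract. The interface, class names, and benchmark entries here are hypothetical, invented for illustration only; they are not JUGE's actual API, whose integration protocol is documented in its repository.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical adapter contract: a benchmarking infrastructure invokes every
// generator through the same interface, so their results become comparable.
interface UnitTestGenerator {
    // Given a fully qualified class under test and a time budget in seconds,
    // return the names of the generated test classes.
    List<String> generateTests(String classUnderTest, int budgetSeconds);
}

// Toy stand-in generator, used only to exercise the benchmarking loop.
class StubGenerator implements UnitTestGenerator {
    public List<String> generateTests(String classUnderTest, int budgetSeconds) {
        List<String> tests = new ArrayList<>();
        tests.add(classUnderTest + "_Test0"); // pretend one test class was generated
        return tests;
    }
}

public class BenchmarkSketch {
    public static void main(String[] args) {
        // Hypothetical benchmark: the classes every generator is run against.
        List<String> benchmark = List.of("org.example.Stack", "org.example.Queue");
        UnitTestGenerator generator = new StubGenerator();
        for (String cut : benchmark) {
            // Same class list and same time budget for every tool under comparison.
            List<String> tests = generator.generateTests(cut, 60);
            System.out.println(cut + " -> " + tests.size() + " test class(es): " + tests);
        }
    }
}
```

Because each tool is wrapped behind the same interface and given the same benchmark classes and time budget, the collected results (e.g., number of generated tests, later their coverage and mutation scores) can be compared fairly across tools, which is the kind of standardization the abstract describes.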

Mon 17 Apr

Displayed time zone: Dublin

11:00 - 12:30
Session 1: Automated Testing (Journal-First Papers / Research Papers / Previous Editions / Testing Tools / Tool Demo) at Grand canal
Chair(s): Gilles Perrouin Fonds de la Recherche Scientifique - FNRS & University of Namur
JUGE: An infrastructure for benchmarking Java unit test generators
Journal-First Papers
Xavier Devroey University of Namur, Alessio Gambi IMC University of Applied Sciences Krems, Juan Pablo Galeotti University of Buenos Aires, René Just University of Washington, Fitsum Kifetew Fondazione Bruno Kessler, Annibale Panichella Delft University of Technology, Sebastiano Panichella Zurich University of Applied Sciences
Metamorphic Testing with Causal Graphs
Research Papers
Andrew Graham Clark University of Sheffield, Michael Foster University of Sheffield, Neil Walkinshaw University of Sheffield, Robert Hierons University of Sheffield
QEX: Automated Testing Observability and QA Developer Experience Framework
Testing Tools
Luohua Huang Shopee, Joseph Chu Shopee, Keshia Yap Shopee, Hock Yao Chua Shopee
ASDF - A Differential Testing Framework for Automatic Speech Recognition Systems
Tool Demo
Daniel Hao Xian Yuen School of Information Technology, Monash University Malaysia, Andrew Yong Chen Pang School of Information Technology, Monash University Malaysia, Zhou Yang Singapore Management University, Chun Yong Chong Monash University Malaysia, Mei Kuan Lim Monash University Malaysia, David Lo Singapore Management University
A Framework for Automated API Fuzzing at Enterprise Scale
Previous Editions
Riyadh Mahmood The Aerospace Corporation, Jay Pennington The Aerospace Corporation, Danny Tsang The Aerospace Corporation, Tan Tran The Aerospace Corporation, Andrea Bogle The Aerospace Corporation