Search-based Hyperparameter Tuning for Python Unit Test Generation
This program is tentative and subject to change.
Search-based test-generation algorithms have countless configuration options. Users rarely adjust these options and usually stick to the default values, which may not lead to the best possible results. Tuning an algorithm's hyperparameters is a way to find better hyperparameter values, but it typically requires substantial computational resources. Meta-heuristic search algorithms, which effectively solve the test-generation problem itself, have also been proposed as a means to tune parameters efficiently. In this work, we explore the use of differential evolution for tuning the hyperparameters of the DynaMOSA and MIO many-objective search algorithms as implemented in the Pynguin framework. Our results show that the tuned DynaMOSA algorithm yields significant improvements in the resulting test suites' coverage, and that differential evolution is more efficient than basic grid search.
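The paper's actual Pynguin tuning setup is not reproduced here, but the core idea of the abstract, using differential evolution to search a box-bounded hyperparameter space, can be sketched with the classic DE/rand/1/bin scheme. This is a minimal illustration: the sphere function stands in for the real coverage-based fitness, and all parameter values (population size, F, CR) are illustrative defaults, not the paper's configuration.

```python
import random

def differential_evolution(fitness, bounds, pop_size=10, F=0.5, CR=0.9,
                           generations=20, seed=0):
    """Minimise `fitness` over box bounds with DE/rand/1/bin."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Initialise the population uniformly at random within the bounds.
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [fitness(ind) for ind in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct individuals other than the target vector.
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            # Mutation: perturb `a` with the scaled difference of `b` and `c`,
            # then clip each component back into its bounds.
            mutant = [min(max(a[d] + F * (b[d] - c[d]), lo), hi)
                      for d, (lo, hi) in enumerate(bounds)]
            # Binomial crossover; index j_rand guarantees at least one
            # component is inherited from the mutant.
            j_rand = rng.randrange(dim)
            trial = [mutant[d] if (rng.random() < CR or d == j_rand)
                     else pop[i][d] for d in range(dim)]
            # Greedy selection: keep the trial if it is no worse.
            s = fitness(trial)
            if s <= scores[i]:
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=lambda i: scores[i])
    return pop[best], scores[best]

# Toy run: "tune" two mock hyperparameters on the sphere function.
best, score = differential_evolution(lambda x: sum(v * v for v in x),
                                     bounds=[(-5.0, 5.0), (-5.0, 5.0)])
```

In a real tuning setup, the fitness function would run the test generator with the candidate hyperparameters and return a quality measure such as negated coverage, which makes each fitness evaluation expensive and motivates the abstract's comparison of differential evolution against grid search.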
Sun 16 Nov (time zone: Seoul)
08:30 - 10:00

- 08:30 (10m, Talk): Opening Keynote. Shin Hong (Chungbuk National University)
- 08:40 (20m, Talk): Search-based Hyperparameter Tuning for Python Unit Test Generation. Research Papers. Pre-print available.
- 09:00 (20m, Talk): Constraint-Guided Unit Test Generation for Machine Learning Libraries. Research Papers. Lukas Krodinger (University of Passau), Altin Hajdari (University of Passau), Stephan Lukasczyk (JetBrains Research), Gordon Fraser (University of Passau). Pre-print available.
- 09:20 (20m, Talk): LLM-Guided Fuzzing for Pathological Input Generation. Research Papers.
- 09:40 (20m, Talk): The Pursuit of Diversity: Multi-Objective Testing of Deep Reinforcement Learning Agents. Research Papers. Antony Bartlett (TU Delft, The Netherlands), Cynthia C. S. Liem (Delft University of Technology), Annibale Panichella (Delft University of Technology)