ISSTA 2020
Sat 18 - Wed 22 July 2020
Tue 21 Jul 2020 14:50 - 15:10 at Zoom - REGRESSION TESTING Chair(s): Alex Orso

Developers check their changes using regression testing techniques. Unfortunately, regression testing techniques suffer from flaky tests, which can both pass and fail when run multiple times on the same version of code and tests. While many types of flaky tests exist, one prominent type is dependent tests: tests that pass when run in one order but fail when run in another order. Although dependent tests can cause flaky test failures, they can also help developers run their tests faster. Since developers may still want dependent tests, we propose making regression testing techniques dependent-test-aware to reduce flaky test failures.
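As an illustration of what a dependent test looks like (a hypothetical JUnit example, not taken from the paper's subject programs), consider two tests that share static state, so that their outcome depends on the order in which they run:

import static org.junit.Assert.assertEquals;
import org.junit.Test;

public class ConfigTest {
    // Shared mutable state: the source of the order dependency.
    static String config = "default";

    @Test
    public void testOverrideConfig() {
        config = "custom";               // pollutes the shared state
        assertEquals("custom", config);
    }

    @Test
    public void testDefaultConfig() {
        // Passes only if testOverrideConfig has not run earlier in the same JVM;
        // if a regression testing technique schedules testOverrideConfig first,
        // this test fails even though the code under test is unchanged.
        assertEquals("default", config);
    }
}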

To understand the necessity of dependent-test-aware regression testing techniques, we conduct the first study on the impact of dependent tests on three regression testing techniques: test prioritization, test selection, and test parallelization. In particular, we implement 4 test prioritization, 6 test selection, and 2 test parallelization algorithms, and we evaluate them on 11 Java modules with dependent tests. When we run the orders produced by the traditional, dependent-test-unaware regression testing algorithms, 90% of the human-written test suites with dependent tests have at least one flaky test failure, and 100% of the automatically-generated test suites do.

We develop a general approach for enhancing regression testing algorithms to make them dependent-test-aware, and we apply our approach to enhance 12 algorithms. Compared to traditional, unenhanced regression testing algorithms, the enhanced algorithms use provided test dependencies to produce different orders or orders with extra tests. Our evaluation shows that, in comparison to the orders produced by the unenhanced algorithms, the orders produced by the enhanced algorithms (1) have overall 65% fewer flaky test failures due to dependent tests, and (2) may add extra tests but run only <1% slower on average. Our results suggest that enhancing regression testing algorithms to be dependent-test-aware can substantially reduce flaky test failures with only a minor slowdown to run the tests.
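To make the notion of "different orders or orders with extra tests" concrete, the following is a minimal sketch (our illustration, not the authors' implementation) of enforcing provided test dependencies on an order produced by a traditional algorithm: each dependee is placed before the test that depends on it, and is added to the order if a technique such as test selection had dropped it.

import java.util.*;

public class DependencyAwareOrdering {
    // deps maps a dependent test to the tests it must run after (its dependees).
    public static List<String> enforce(List<String> order, Map<String, List<String>> deps) {
        List<String> result = new ArrayList<>();
        Set<String> placed = new HashSet<>();
        for (String test : order) {
            place(test, deps, result, placed);
        }
        return result;
    }

    private static void place(String test, Map<String, List<String>> deps,
                              List<String> result, Set<String> placed) {
        if (placed.contains(test)) return;
        placed.add(test);
        // Recursively place dependees first; this may introduce extra tests that
        // the original order omitted (e.g., after test selection).
        for (String dependee : deps.getOrDefault(test, Collections.emptyList())) {
            place(dependee, deps, result, placed);
        }
        result.add(test);
    }
}

Applied as a post-processing step to a prioritized, selected, or parallelized order, a transformation along these lines keeps the original algorithm's intent while respecting the provided dependencies, which is the spirit of the enhancement the abstract describes.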

Tue 21 Jul
Times are displayed in time zone: (GMT-07:00) Tijuana, Baja California

14:50 - 15:50: Technical Papers - REGRESSION TESTING at Zoom
Chair(s): Alex Orso (Georgia Institute of Technology)

Public Live Stream/Recording. Registered participants should join via the Zoom link distributed in Slack.

14:50 - 15:10
Talk
Wing Lam (University of Illinois at Urbana-Champaign), August Shi (The University of Texas at Austin), Reed Oei, Sai Zhang (Google Cloud), Michael D. Ernst (University of Washington, USA), Tao Xie (Peking University)
15:10 - 15:30
Talk
Patrice Godefroid (Microsoft Research), Daniel Lehmann (University of Stuttgart), Marina Polishchuk (Microsoft)
15:30 - 15:50
Talk
Qianyang Peng, August Shi (The University of Texas at Austin), Lingming Zhang (The University of Texas at Dallas)