ISSTA 2022
Mon 18 - Fri 22 July 2022 Online
Wed 20 Jul 2022 01:20 - 01:40 at ISSTA 2, Session 1-2: Test Generation and Mutation A. Chair(s): Raghavan Komondoor
Thu 21 Jul 2022 07:20 - 07:40 at ISSTA 1, Session 2-7: Test Generation and Mutation B. Chair(s): Christoph Csallner

A common practice in computer science courses is to evaluate student-written test suites either against a set of manually-seeded faults (handwritten by an instructor) or against all other student-written implementations ("all-pairs" grading). However, manually seeding faults is a time-consuming and potentially error-prone process, and the all-pairs approach requires significant manual and computational effort to apply fairly and accurately. Mutation analysis, which automatically seeds potential faults in an implementation, is a possible alternative to these test suite evaluation approaches. Although there is evidence in the literature that mutants are a valid substitute for real faults in large open-source software projects, it is unclear whether mutants are representative of the kinds of faults that students make. If mutants are a valid substitute for the faults found in student-written code, and if mutant detection correlates with both manually-seeded fault detection and faulty student implementation detection, then instructors can instead evaluate student test suites using mutants generated by open-source mutation analysis tools.
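To make the idea concrete, here is a minimal, hypothetical sketch of what mutation analysis does. The function, the mutation operator, and the student test below are illustrative assumptions, not examples drawn from the paper or its dataset.

```python
# Hypothetical sketch: a mutant is a copy of the implementation with a
# small, automatically seeded change; a test suite "kills" the mutant if
# some test passes on the original but fails on the mutant.

def is_adult(age):
    """Original implementation under test."""
    return age >= 18

def is_adult_mutant(age):
    """Mutant a tool might generate by replacing '>=' with '>'."""
    return age > 18

def boundary_test(impl):
    """A boundary-value check a student test suite might contain."""
    assert impl(18) is True

boundary_test(is_adult)            # passes on the original
try:
    boundary_test(is_adult_mutant)
except AssertionError:
    print("mutant killed")         # the seeded fault is detected
```

A suite without the boundary test (say, one that only checks ages 10 and 30) would pass on both versions, leaving this mutant alive; that is precisely the kind of test-suite weakness mutation analysis is meant to expose.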

Using a dataset of 2,711 student assignment submissions, we empirically evaluate whether mutation score is a good proxy for manually-seeded fault detection rate and faulty student implementation detection rate. Our results show a strong correlation between mutation score and manually-seeded fault detection rate, and a moderately strong correlation between mutation score and faulty student implementation detection rate. We identify a handful of faults in student implementations that, to be coupled to a mutant, would require either new or stronger mutation operators or the application of mutation operators to an implementation structured differently from the instructor-written one. We also find that this correlation is limited by the fact that faults are not distributed evenly throughout student code, a known drawback of all-pairs grading. Our results suggest that mutants produced by open-source mutation analysis tools are of equal or higher quality than manually-seeded faults and are a reasonably good stand-in for real faults in student implementations. Our findings have implications for software testing researchers, educators, and tool builders alike.
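For reference, the three quantities being correlated share one underlying formula. The sketch below, with invented names and toy data rather than the paper's artifact, computes a suite's detection rate over a set of faulty program variants; when the variants are tool-generated mutants, this ratio is the mutation score.

```python
# Hypothetical sketch of the metrics being compared. A variant (mutant,
# manually-seeded fault, or faulty student implementation) counts as
# detected if the student's test suite fails on it.

def detection_rate(suite_passes, variants):
    """Fraction of variants the test suite detects.

    suite_passes(v) is assumed to run the suite against variant v and
    return True iff every test passes. Over tool-generated mutants this
    ratio is the mutation score; over seeded faults or faulty student
    implementations it is the corresponding fault detection rate.
    """
    detected = sum(1 for v in variants if not suite_passes(v))
    return detected / len(variants)

# Toy data: a suite that fails on 3 of 4 mutants has mutation score 0.75.
mutants = ["m1", "m2", "m3", "m4"]
suite_passes = lambda v: v == "m4"   # suite only passes on m4
print(detection_rate(suite_passes, mutants))  # 0.75
```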

Wed 20 Jul

Displayed time zone: Seoul

01:20 - 02:20
Session 1-2: Test Generation and Mutation A (Technical Papers) at ISSTA 2
Chair(s): Raghavan Komondoor IISc Bengaluru
01:20
20m
Talk
On the Use of Mutation Analysis For Evaluating Student Test Suite Quality
Technical Papers
James Perretta Northeastern University, Andrew DeOrio University of Michigan, Arjun Guha Northeastern University, Jonathan Bell Northeastern University
01:40
20m
Talk
Automated Test Generation for REST APIs: No Time to Rest Yet
Technical Papers
Myeongsoo Kim, Qi Xin Wuhan University, Saurabh Sinha IBM Research, Alessandro Orso Georgia Tech
02:00
20m
Talk
One Step Further: Evaluating Interpreters Using Metamorphic Testing
Technical Papers
Ming Fan Xi'an Jiaotong University, Jiali Wei Xi'an Jiaotong University, Wuxia Jin Xi'an Jiaotong University, Zhou Xu Wuhan University, Wenying Wei Xi'an Jiaotong University, Ting Liu Xi'an Jiaotong University

Thu 21 Jul

Displayed time zone: Seoul

07:00 - 08:00
Session 2-7: Test Generation and Mutation B (Technical Papers) at ISSTA 1
Chair(s): Christoph Csallner University of Texas at Arlington
07:00
20m
Talk
Automated Test Generation for REST APIs: No Time to Rest Yet
Technical Papers
Myeongsoo Kim, Qi Xin Wuhan University, Saurabh Sinha IBM Research, Alessandro Orso Georgia Tech
07:20
20m
Talk
On the Use of Mutation Analysis For Evaluating Student Test Suite Quality
Technical Papers
James Perretta Northeastern University, Andrew DeOrio University of Michigan, Arjun Guha Northeastern University, Jonathan Bell Northeastern University
07:40
20m
Talk
Test Mimicry to Assess the Exploitability of Library Vulnerabilities
Technical Papers
Hong Jin Kang Singapore Management University, Singapore, Truong Giang Nguyen School of Computing and Information Systems, Singapore Management University, Xuan Bach D. Le The University of Melbourne, Corina S. Pasareanu Carnegie Mellon University Silicon Valley, NASA Ames Research Center, David Lo Singapore Management University