Parallelization in System-level Testing: Novel Approaches to Manage Test Suite Dependencies
System-level testing is fundamental to ensure the reliability of software systems. However, the execution time for system tests can be quite long, sometimes prohibitively long, especially in a regimen of continuous integration and deployment. One way to speed things up is to run the tests in parallel, provided that the execution schedule respects any dependency between tests. We present two novel approaches to detect dependencies in system-level tests, namely Pfast and Mem-Fast, which are highly parallelizable and optimistically run test schedules to exclude many dependencies when there are no failures. We evaluated our approaches both asymptotically and practically, on six Web applications and their system-level test suites, as well as on MySQL system-level tests. Our results show that, in general, Pfast is significantly faster than the state-of-the-art PraDet dependency detection algorithm, while producing parallelizable schedules that achieve a significant reduction in the overall test suite execution time.
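The abstract does not spell out how Pfast works internally; purely to illustrate the "optimistic" idea it mentions (run a test schedule, and if nothing fails, rule out at once all the candidate dependencies that schedule could have exposed), here is a toy sketch. All names (`run_schedule`, `detect_dependencies`) and the single-reversed-run strategy are hypothetical simplifications, not the authors' algorithm.

```python
# Hypothetical sketch (NOT the paper's Pfast implementation) of optimistic
# dependency exclusion: execute a reordered schedule and, when every test
# passes, discard all candidate dependencies that the reordering inverted.

def run_schedule(schedule, tests, state):
    """Run tests in the given order against a shared state dict;
    return the name of the first failing test, or None if all pass."""
    for name in schedule:
        if not tests[name](state):
            return name
    return None

def detect_dependencies(schedule, tests):
    # Candidate dependencies: every ordered pair (earlier, later)
    # in the original schedule could in principle be a dependency.
    candidates = {(a, b) for i, a in enumerate(schedule)
                         for b in schedule[i + 1:]}
    # Optimistic run of the reversed schedule: a passing run excludes
    # every candidate pair this reordering inverted, all at once.
    failed = run_schedule(list(reversed(schedule)), tests, {})
    if failed is None:
        return set()  # no dependency manifested in this run
    # Otherwise keep only the pairs whose inversion could explain the failure.
    return {(a, b) for (a, b) in candidates if b == failed}

# Toy suite: t2 depends on t1 having initialized shared state.
tests = {
    "t1": lambda s: s.setdefault("key", 1) == 1,
    "t2": lambda s: s.get("key") == 1,
}
print(detect_dependencies(["t1", "t2"], tests))  # {('t1', 't2')}
```

A real detector would iterate over many schedules to narrow the candidate set further; the sketch only shows why a single failure-free run can eliminate many candidate dependencies in one step.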
Wed 15 Apr (time zone: Brasilia, Distrito Federal, Brazil)
14:00 - 15:30 | Testing and Analysis 5 | SE In Practice (SEIP) / Research Track / Journal-first Papers | Oceania II | Chair(s): Gabriele Bavota (Software Institute @ Università della Svizzera Italiana)
14:00 (15m) Talk | Parallelization in System-level Testing: Novel Approaches to Manage Test Suite Dependencies | Journal-first Papers | Pasquale Polverino (USI Università della Svizzera italiana), Fabio Di Lauro (USI Università della Svizzera italiana), Matteo Biagiola (University of St. Gallen and Università della Svizzera italiana), Paolo Tonella (USI Lugano), Antonio Carzaniga (Università della Svizzera italiana) | DOI, Pre-print
14:15 (15m) Talk | Automated Network-Level Fault Injection Testing of Microservice Architectures | Research Track | Delano Flipse (Delft University of Technology (TU Delft)), Hakan Simsek (ASML), Jérémie Decouchant (Delft University of Technology (TU Delft)), Burcu Kulahcioglu Ozkan (Delft University of Technology)
14:30 (15m) Talk | Predicting Failures in Smart Human-Centric EcoSystems | Research Track | Niccolò Puccinelli (Università della Svizzera Italiana), Davide Molinelli (Constructor Institute of Technology), Noura El Moussa (USI Lugano; Schaffhausen Institute of Technology), Matteo Ciniselli (Università della Svizzera Italiana), Mauro Pezzè (Università della Svizzera italiana (USI) and Università degli Studi di Milano Bicocca)
14:45 (15m) Talk | PerfScout: An Adaptive Workload Generator in Software Performance Testing | SE In Practice (SEIP) | Yongqian Sun (Nankai University), Qingliang Zhang (Nankai University), Xiao Xiong (Nankai University), Mengyao Li (Nankai University), Yimin Zuo (Nankai University), Shenglin Zhang (Nankai University), Xidao Wen (BizSeer), Wenwei Gu (Nankai University), Huandong Zhuang (Huawei Cloud), Bowen Deng (Huawei Cloud), Ruiyuan Wan, Dan Pei (Tsinghua University)
15:00 (15m) Talk | Scaling Mobile Chaos Testing with AI-Driven Test Execution | SE In Practice (SEIP) | Juan Marcano, Ashish Samant, Kai Song, Lingchao Chen, Kaelan Mikowicz, Tim Smyth, Mengdie Zhang, Ali Zamani, Arturo Bravo Rovirosa, Sowjanya Puligadda, Srikanth Prodduturi, Mayank Bansal (all Uber Technologies, Inc.)
15:15 (15m) Talk | CAST: Automated Resilience Testing for Production Cloud Service Systems | SE In Practice (SEIP) | Zhuangbin Chen (Sun Yat-sen University), Zhiling Deng (School of Software Engineering, Sun Yat-sen University), Kaiming Zhang (School of Software Engineering, Sun Yat-sen University), Yang Liu (Nanyang Technological University), Cheng Cui (Huawei Cloud), Jinfeng Zhong (Huawei Cloud), Zibin Zheng (Sun Yat-sen University)