ICSE 2021
Mon 17 May - Sat 5 June 2021

Using black-box performance models to detect performance regressions under varying workloads: an empirical study

Performance regressions in large-scale software systems often lead to both financial and reputational losses. To detect such regressions, performance tests are typically conducted in an in-house (non-production) environment using test suites with predefined workloads, and performance analysis then checks whether a software version shows a performance regression relative to an earlier version. However, the real workloads in the field are constantly changing, making it unrealistic to reproduce field workloads in predefined test suites. More importantly, performance testing is usually very expensive, as it requires extensive resources and lasts for an extended period. In this work, we leverage black-box machine learning models to automatically detect performance regressions in the field operations of large-scale software systems. Practitioners can use our approaches to complement or replace resource-demanding performance tests that may not even be realistic in a fast-paced environment. Our approaches use black-box models to capture the relationship between the performance of a software system (e.g., its CPU usage) under varying workloads and the runtime activities recorded in readily available logs. Our approaches then compare the black-box model derived from the current software version with the model derived from an earlier version to detect performance regressions between the two versions. We performed empirical experiments on two open-source systems and applied our approaches to a large-scale industrial system. Our results show that such black-box models can detect both real and injected performance regressions effectively and in a timely manner, even under varying workloads that were unseen when training the models. Our approaches have been adopted in practice to detect performance regressions of a large-scale industrial system on a daily basis.
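To make the approach concrete, the core idea can be sketched in a few lines: fit a black-box model that predicts a performance metric (here, CPU usage) from log-derived workload features of the old version, then flag a regression if the model's prediction errors on the new version's data are significantly larger than its errors on held-out old-version data. The sketch below is in Python and rests on illustrative assumptions: scikit-learn's RandomForestRegressor as the black-box model, per-interval log-event counts as features, and a one-sided Mann-Whitney U test on absolute errors. None of these choices is claimed to be the paper's exact pipeline.

# Illustrative sketch (not the authors' exact pipeline) of detecting a
# performance regression with a black-box performance model.
# Assumptions: features are per-interval counts of log events, the target
# is CPU usage over the same intervals, RandomForestRegressor is the
# black-box model, and a Mann-Whitney U test compares error distributions.
import numpy as np
from scipy.stats import mannwhitneyu
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split


def detect_regression(old_features, old_cpu, new_features, new_cpu, alpha=0.01):
    """Return True if the new version shows a candidate performance regression.

    old_features/new_features: 2-D arrays of log-derived workload features.
    old_cpu/new_cpu: 1-D arrays of observed CPU usage per interval.
    """
    # Train the black-box model on the old version, holding out data to
    # establish the model's baseline prediction error.
    X_train, X_hold, y_train, y_hold = train_test_split(
        old_features, old_cpu, test_size=0.3, random_state=0)
    model = RandomForestRegressor(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)

    baseline_err = np.abs(model.predict(X_hold) - y_hold)
    new_err = np.abs(model.predict(new_features) - new_cpu)

    # One-sided test: are the new version's errors stochastically larger?
    # A significantly worse fit means the workload-performance relationship
    # changed, i.e., a candidate performance regression.
    _, p_value = mannwhitneyu(new_err, baseline_err, alternative="greater")
    return p_value < alpha

The design mirrors the two steps in the abstract: first model the workload-performance relationship from readily available logs, then compare behavior across versions; a significantly worse fit on the new version suggests it consumes resources differently for the same workload.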

Fri 28 May

Displayed time zone: Amsterdam, Berlin, Bern, Rome, Stockholm, Vienna

16:40 - 18:00
4.4.2. Defect Prediction: Modeling and Performance (Journal-First Papers / Technical Track) at Blended Sessions Room 2 (mirrored +12h)
Chair(s): Ayse Tosun (Istanbul Technical University)
16:40
20m
Paper
On the Need of Preserving Order of Data When Validating Within-Project Defect Classifiers
Journal-First Papers
Davide Falessi (California Polytechnic State University), Jacky Huang (California Polytechnic State University, USA), Likhita Narayana (California Polytechnic State University, USA), Jennifer Fong Thai (California Polytechnic State University, USA), Burak Turhan (Monash University)
17:00
20m
Paper
Using black-box performance models to detect performance regressions under varying workloads: an empirical study
Journal-First Papers
Lizhi Liao (Concordia University), Jinfu Chen (Centre for Software Excellence, Huawei, Canada), Heng Li (Polytechnique Montréal), Yi Zeng (Concordia University), Weiyi Shang (Concordia University), Jianmei Guo (Alibaba Group), Catalin Sporea (ERA Environmental Management Solutions), Andrei Toma (ERA Environmental Management Solutions), Sarah Sajedi (ERA Environmental Management Solutions)
17:20
20m
Paper
Predicting Performance Anomalies in Software Systems at Run-time
Journal-First Papers
Guoliang Zhao (Computer Science, Queen's University), Safwat Hassan (Thompson Rivers University), Ying Zou (Queen's University, Kingston, Ontario), Derek Truong (IBM Canada), Toby Corbin (IBM UK)
17:40
20m
Paper
How Developers Optimize Virtual Reality Applications: A Study of Optimization Commits in Open Source Unity Projects
Technical Track
Fariha Nusrat (University of Texas at San Antonio), Foyzul Hassan (University of Michigan - Dearborn), Hao Zhong (Shanghai Jiao Tong University), Xiaoyin Wang (University of Texas at San Antonio)

Sat 29 May

Displayed time zone: Amsterdam, Berlin, Bern, Rome, Stockholm, Vienna

04:40 - 06:00
4.4.2. Defect Prediction: Modeling and Performance (Journal-First Papers / Technical Track) at Blended Sessions Room 2 (+12h mirror of the Fri 28 May session)
04:40
20m
Paper
On the Need of Preserving Order of Data When Validating Within-Project Defect Classifiers
Journal-First Papers
Davide Falessi (California Polytechnic State University), Jacky Huang (California Polytechnic State University, USA), Likhita Narayana (California Polytechnic State University, USA), Jennifer Fong Thai (California Polytechnic State University, USA), Burak Turhan (Monash University)
05:00
20m
Paper
Using black-box performance models to detect performance regressions under varying workloads: an empirical study
Journal-First Papers
Lizhi Liao (Concordia University), Jinfu Chen (Centre for Software Excellence, Huawei, Canada), Heng Li (Polytechnique Montréal), Yi Zeng (Concordia University), Weiyi Shang (Concordia University), Jianmei Guo (Alibaba Group), Catalin Sporea (ERA Environmental Management Solutions), Andrei Toma (ERA Environmental Management Solutions), Sarah Sajedi (ERA Environmental Management Solutions)
05:20
20m
Paper
Predicting Performance Anomalies in Software Systems at Run-time
Journal-First Papers
Guoliang Zhao (Computer Science, Queen's University), Safwat Hassan (Thompson Rivers University), Ying Zou (Queen's University, Kingston, Ontario), Derek Truong (IBM Canada), Toby Corbin (IBM UK)
05:40
20m
Paper
How Developers Optimize Virtual Reality Applications: A Study of Optimization Commits in Open Source Unity Projects
Technical Track
Fariha Nusrat (University of Texas at San Antonio), Foyzul Hassan (University of Michigan - Dearborn), Hao Zhong (Shanghai Jiao Tong University), Xiaoyin Wang (University of Texas at San Antonio)