FAST: Boosting Uncertainty-based Test Prioritization Methods for Neural Networks via Feature Selection
Due to the vast testing space, the increasing demand for effective and efficient testing of deep neural networks (DNNs) has led to the development of various DNN test case prioritization techniques. However, the fact that DNNs can deliver high-confidence predictions for incorrectly predicted examples, known as the over-confidence problem, causes these methods to fail to reveal high-confidence errors. To address this limitation, in this work, we propose FAST, a method that boosts existing prioritization methods through guided FeAture SelecTion. FAST is based on the insight that certain features may introduce noise that affects the model’s output confidence, thereby contributing to high-confidence errors. It quantifies the importance of each feature for the model’s correct predictions, and then dynamically prunes the information from the ‘noisy’ features during inference to derive a new probability vector for the uncertainty estimation. With the help of FAST, high-confidence errors and correctly classified examples become more distinguishable, resulting in higher APFD (Average Percentage of Fault Detection) values for test prioritization. We conduct extensive experiments across a diverse set of model structures on multiple benchmark datasets to validate the effectiveness, efficiency, and scalability of FAST compared to state-of-the-art techniques.
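The pipeline described in the abstract, quantifying per-feature importance, pruning the least important ("noisy") features at inference time, and rescoring uncertainty on the resulting probability vector, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the toy linear model, the leave-one-feature-out importance measure, and the `keep_ratio` parameter are all assumptions made for the example.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def feature_importance(feats, logits_fn, labels):
    """Leave-one-feature-out importance (an assumed proxy for 'importance
    for the model's correct predictions'): mean drop in the correct-class
    probability when a single feature is zeroed."""
    idx = np.arange(len(labels))
    base = softmax(logits_fn(feats))[idx, labels]
    imp = np.zeros(feats.shape[1])
    for j in range(feats.shape[1]):
        masked = feats.copy()
        masked[:, j] = 0.0
        imp[j] = (base - softmax(logits_fn(masked))[idx, labels]).mean()
    return imp

def fast_uncertainty(feats, logits_fn, imp, keep_ratio=0.8):
    """Zero out the least important features, recompute the probability
    vector, and score uncertainty as 1 - max probability."""
    keep = np.argsort(imp)[::-1][: int(keep_ratio * len(imp))]
    pruned = np.zeros_like(feats)
    pruned[:, keep] = feats[:, keep]
    return 1.0 - softmax(logits_fn(pruned)).max(axis=1)

def apfd(fault_flags):
    """APFD for a prioritized order: 1 - sum(fault ranks)/(n*m) + 1/(2n),
    where fault_flags[i] is truthy iff the i-th prioritized test fails."""
    n, m = len(fault_flags), sum(fault_flags)
    ranks = [i + 1 for i, f in enumerate(fault_flags) if f]
    return 1.0 - sum(ranks) / (n * m) + 1.0 / (2 * n)

# Toy linear 'model' on 5 features and 3 classes (purely illustrative).
rng = np.random.default_rng(0)
W = rng.normal(size=(5, 3))
logits_fn = lambda f: f @ W
feats = rng.normal(size=(20, 5))
labels = logits_fn(feats).argmax(axis=1)

imp = feature_importance(feats, logits_fn, labels)
order = np.argsort(-fast_uncertainty(feats, logits_fn, imp))  # most uncertain first
```

Tests are then executed in the `order` produced above, and the ranking is judged by APFD; for instance, `apfd([1, 1, 0, 0])` evaluates to 0.75, since surfacing both faults in the first two positions is rewarded.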
Tue 29 Oct (displayed time zone: Pacific Time, US & Canada)
10:30 - 12:00 | Test selection and prioritization | Research Papers / Journal-first Papers / NIER Track | Room: Camellia | Chair(s): Wing Lam (George Mason University)
10:30 (15m) Talk | Towards Exploring the Limitations of Test Selection Techniques on Graph Neural Networks: An Empirical Study | Journal-first Papers | Xueqi Dang (University of Luxembourg, SnT), Yinghua Li (University of Luxembourg), Wei Ma (Nanyang Technological University), Yuejun Guo (Luxembourg Institute of Science and Technology), Qiang Hu (The University of Tokyo), Mike Papadakis (University of Luxembourg), Maxime Cordy (University of Luxembourg), Yves Le Traon (University of Luxembourg)
10:45 (15m) Talk | Prioritizing Test Cases for Deep Learning-based Video Classifiers | Journal-first Papers | Yinghua Li (University of Luxembourg), Xueqi Dang (University of Luxembourg, SnT), Lei Ma (The University of Tokyo & University of Alberta), Jacques Klein (University of Luxembourg), Tegawendé F. Bissyandé (University of Luxembourg)
11:00 (15m) Talk | Neuron Sensitivity Guided Test Case Selection | Journal-first Papers | Dong Huang (The University of Hong Kong), Qingwen Bu (Shanghai Jiao Tong University), Yichao Fu (The University of Hong Kong), Yuhao Qing (University of Hong Kong), Xiaofei Xie (Singapore Management University), Junjie Chen (Tianjin University), Heming Cui (University of Hong Kong)
11:15 (15m) Talk | FAST: Boosting Uncertainty-based Test Prioritization Methods for Neural Networks via Feature Selection | Research Papers | Jialuo Chen (Zhejiang University), Jingyi Wang (Zhejiang University), Xiyue Zhang (University of Oxford), Youcheng Sun (University of Manchester), Marta Kwiatkowska (University of Oxford), Jiming Chen (Zhejiang University; Hangzhou Dianzi University), Peng Cheng (Zhejiang University)
11:30 (15m) Talk | Hybrid Regression Test Selection by Integrating File and Method Dependences | Research Papers | Guofeng Zhang (College of Computer, National University of Defense Technology), Luyao Liu (College of Computer, National University of Defense Technology), Zhenbang Chen (College of Computer, National University of Defense Technology), Ji Wang (National University of Defense Technology)
11:45 (10m) Talk | Prioritizing Tests for Improved Runtime | NIER Track | Abdelrahman Baz (The University of Texas at Austin), Minchao Huang (The University of Texas at Austin), August Shi (The University of Texas at Austin)