Metamorphic Object Insertion for Testing Object Detection Systems
Recent advances in deep neural networks (DNNs) have produced object detectors (ODs) that can rapidly process images or videos and recognize the objects they contain. Despite the promising progress of industry vendors such as Amazon and Google in commercializing deep learning-based ODs as standard computer vision services, ODs, like traditional software, may still produce incorrect results. These errors, in turn, can lead to severe negative outcomes for users. For instance, an autonomous driving system that fails to detect pedestrians can cause accidents or even fatalities. Despite their importance, however, principled, systematic methods for testing ODs do not yet exist.
To fill this critical gap, we present the design and implementation of MetaOD, a metamorphic testing system specifically designed to uncover erroneous detection results in ODs. To this end, we (1) synthesize natural-looking images by inserting extra object instances into background images, and (2) design metamorphic conditions asserting the equivalence of OD results on the original and synthetic images after excluding predictions on the inserted objects. MetaOD is designed as a streamlined workflow that performs object extraction, selection, and insertion. We develop a set of practical techniques to realize this workflow effectively and to generate diverse, natural-looking images for testing. Evaluated on four commercial OD services and four pretrained models provided by the TensorFlow API, MetaOD found tens of thousands of detection failures. To further demonstrate its practical value, we use the synthetic images that trigger erroneous detections to retrain the model. Our results show a significant improvement in model performance, from an mAP of 9.3 to an mAP of 10.5.
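The metamorphic condition described above can be sketched in code: after inserting an object into a background image, the detector's results on the synthetic image, minus predictions on the inserted object, should match its results on the original image. The following is a minimal, hypothetical illustration (the function names, detection format, and IoU thresholds are our own assumptions, not MetaOD's actual interface).

```python
# Illustrative sketch of a metamorphic relation check for object detectors.
# A detection is modeled as {"label": str, "box": (x1, y1, x2, y2)}.
# All names and thresholds here are assumptions for illustration only.

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def violates_metamorphic_relation(orig_dets, synth_dets, inserted_box,
                                  match_iou=0.5, insert_iou=0.5):
    """Return True if detections on the synthetic image, after excluding
    predictions covering the inserted object, differ from the original."""
    # Exclude synthetic detections that overlap the inserted object.
    remaining = [d for d in synth_dets
                 if iou(d["box"], inserted_box) < insert_iou]
    if len(remaining) != len(orig_dets):
        return True
    # Every original detection must be matched by a same-label detection.
    for o in orig_dets:
        if not any(d["label"] == o["label"] and
                   iou(d["box"], o["box"]) >= match_iou
                   for d in remaining):
            return True
    return False
```

For example, if the original image yields one "car" detection and the synthetic image yields the same "car" plus a detection on the inserted "person", the relation holds; if the insertion makes the detector lose the "car", the check flags a failure.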
Session: Thu 24 Sep, 09:10 - 10:10 (UTC)