Verifying Low-dimensional Input Neural Networks via Input Quantization (Virtual)
Sun 17 Oct 2021, 17:30 - 17:45, at Zurich B (Session 1A). Chair(s): Kedar Namjoshi
Deep neural networks are an attractive tool for compressing the control policy lookup tables in systems such as the Airborne Collision Avoidance System (ACAS). It is vital to ensure the safety of such neural controllers via verification techniques. The problem of analyzing ACAS Xu networks has motivated many successful neural network verifiers. These verifiers typically analyze the internal computation of neural networks to decide whether a property regarding the input/output holds. The intrinsic complexity of neural network computation renders such verifiers slow to run and vulnerable to floating-point error.
This paper revisits the original problem of verifying ACAS Xu networks. The networks take low-dimensional sensory inputs with training data extracted from a lookup table. We propose to prepend an input quantization layer to the network. Quantization allows efficient verification via input state enumeration, whose complexity is bounded by the size of the quantization space. Quantization is equivalent to nearest-neighbor interpolation at run time, which has been shown to provide acceptable accuracy for ACAS in simulation. Moreover, our technique can deliver exact verification results immune to floating-point error if we directly enumerate the network outputs on the target inference implementation or on an accurate simulation of the target implementation.
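To make the enumeration idea concrete, the following is a minimal sketch (not taken from the paper) of verification by input quantization: inputs are snapped to a finite per-dimension grid, so checking the property on every grid point covers the whole quantized input space. The names `network`, `property_holds`, and `grid` are placeholders, and the nearest-neighbor quantizer is one plausible choice of quantization scheme.

```python
import itertools
import numpy as np

def quantize(x, grid):
    """Snap each coordinate of x to the nearest value in that dimension's grid
    (nearest-neighbor interpolation, as applied at run time)."""
    return np.array([g[np.argmin(np.abs(g - xi))] for xi, g in zip(x, grid)])

def verify_by_enumeration(network, property_holds, grid):
    """Check the property on every point of the finite quantization grid.

    Because the prepended quantization layer maps every possible input onto
    this grid, exhaustively evaluating the grid points verifies the property
    over the entire input space; the cost is bounded by the grid size.
    """
    for point in itertools.product(*grid):
        x = np.array(point)
        y = network(x)            # run the actual inference code on a grid point
        if not property_holds(x, y):
            return False, x       # concrete counterexample
    return True, None
```

If `network` here invokes the deployed inference implementation (or an exact simulation of it), every reported result corresponds to a concrete execution, which is how the abstract's claim of exact, floating-point-error-immune verification can be realized.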
Sun 17 Oct (displayed time zone: Central Time, US & Canada)

09:00 - 10:20 | Session 1A (SAS, Virtual)
09:00 | 15m Talk | Accelerating Program Analyses in Datalog by Merging Library Facts | Yifan Chen (Peking University), Chenyang Yang, Xin Zhang (Peking University), Yingfei Xiong (Peking University), Hao Tang (Peking University), Xiaoyin Wang (University of Texas at San Antonio), Lu Zhang (Peking University)
09:15 | 15m Talk | Exploiting Verified Neural Networks via Floating Point Numerical Error | Pre-print
09:30 | 15m Talk | Verifying Low-dimensional Input Neural Networks via Input Quantization | Kai Jia (Massachusetts Institute of Technology), Martin C. Rinard (Massachusetts Institute of Technology); Pre-print
09:45 | 15m Talk | A Multi-Language Static Analysis of Python Programs with Native C Extensions | Raphaël Monat (Sorbonne Université — LIP6), Abdelraouf Ouadjaout (Sorbonne Université), Antoine Miné (Sorbonne Université); Pre-print, Media Attached
10:00 | 20m Live Q&A | Session 1A Discussion, Questions and Answers
17:00 - 18:20 | Session 1A (SAS, Virtual)
17:00 | 15m Talk | Accelerating Program Analyses in Datalog by Merging Library Facts | Yifan Chen (Peking University), Chenyang Yang, Xin Zhang (Peking University), Yingfei Xiong (Peking University), Hao Tang (Peking University), Xiaoyin Wang (University of Texas at San Antonio), Lu Zhang (Peking University)
17:15 | 15m Talk | Exploiting Verified Neural Networks via Floating Point Numerical Error | Pre-print
17:30 | 15m Talk | Verifying Low-dimensional Input Neural Networks via Input Quantization | Kai Jia (Massachusetts Institute of Technology), Martin C. Rinard (Massachusetts Institute of Technology); Pre-print
17:45 | 15m Talk | A Multi-Language Static Analysis of Python Programs with Native C Extensions | Raphaël Monat (Sorbonne Université — LIP6), Abdelraouf Ouadjaout (Sorbonne Université), Antoine Miné (Sorbonne Université); Pre-print, Media Attached
18:00 | 20m Live Q&A | Session 1A Discussion, Questions and Answers