Exploiting Verified Neural Networks via Floating Point Numerical Error (Virtual)
Sun 17 Oct 2021, 17:15 - 17:30, at Zurich B (Session 1A). Chair(s): Kedar Namjoshi
Motivated by the need to reliably characterize the robustness of deep neural networks, researchers have developed neural network verification algorithms. These verifiers aim to answer whether certain properties of a neural network are guaranteed for all inputs in a given space. However, many verifiers model floating point arithmetic inaccurately without fully discussing the consequences.
We show that neglecting floating point error leads to unsound verification that can be systematically exploited in practice. For a pretrained neural network, we present a method that efficiently searches for inputs that witness the incorrectness of robustness claims made by a complete verifier. We also present a method to construct neural network architectures and weights that cause an incomplete verifier to produce incorrect results. Our results highlight that, to achieve practically reliable verification of neural networks, any verification system must accurately (or conservatively) model the effects of all floating point computations in both the network inference and the verification system itself.
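For intuition, here is a minimal illustrative sketch (not the paper's method, all variable names hypothetical) of the underlying gap: in floating point, the value of a dot product depends on summation order, so the score a verifier reasons about in exact or higher-precision arithmetic can differ from what the deployed float32 network actually computes. Near a decision boundary, that discrepancy is exactly the slack a witness-search exploit can target.

```python
# Illustrative sketch only (not the paper's attack): floating point rounding
# makes a dot product's value depend on summation order, so a score analyzed
# in exact arithmetic can differ from the deployed float32 computation.
import numpy as np

rng = np.random.default_rng(0)
n = 1024

# One linear "layer": score(x) = w . x, predicted class = sign(score).
w = rng.standard_normal(n).astype(np.float32)
x = rng.standard_normal(n).astype(np.float32)

# Nudge x so that the high-precision score lies (almost) exactly on the
# decision boundary, the regime a robustness verifier cares about.
exact = np.dot(w.astype(np.float64), x.astype(np.float64))
x[-1] -= np.float32(exact / np.float64(w[-1]))

# Evaluate the same dot product in float32 with two different summation orders.
forward = np.float32(0.0)
for wi, xi in zip(w, x):
    forward += wi * xi

backward = np.float32(0.0)
for wi, xi in zip(w[::-1], x[::-1]):
    backward += wi * xi

reference = np.dot(w.astype(np.float64), x.astype(np.float64))
print(f"float64 reference score: {reference:+.3e}")
print(f"float32 forward sum:     {forward:+.3e}")
print(f"float32 backward sum:    {backward:+.3e}")
print(f"order-dependent gap:     {abs(float(forward) - float(backward)):.3e}")
# Near the boundary, a gap of this magnitude can flip the predicted class,
# i.e. contradict a robustness claim established in exact arithmetic.
```

Running the sketch typically shows the two float32 summation orders differing by far more than the distance of the exact score from zero, which is the kind of mismatch the abstract refers to as exploitable.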
Sun 17 Oct
Displayed time zone: Central Time (US & Canada)
09:00 - 10:20 | Session 1A
09:00 | 15m | Talk | Accelerating Program Analyses in Datalog by Merging Library Facts (Virtual, SAS) | Yifan Chen (Peking University), Chenyang Yang, Xin Zhang (Peking University), Yingfei Xiong (Peking University), Hao Tang (Peking University), Xiaoyin Wang (University of Texas at San Antonio), Lu Zhang (Peking University)
09:15 | 15m | Talk | Exploiting Verified Neural Networks via Floating Point Numerical Error (Virtual, SAS) | Pre-print
09:30 | 15m | Talk | Verifying Low-dimensional Input Neural Networks via Input Quantization (Virtual, SAS) | Kai Jia (Massachusetts Institute of Technology), Martin C. Rinard (Massachusetts Institute of Technology) | Pre-print
09:45 | 15m | Talk | A Multi-Language Static Analysis of Python Programs with Native C Extensions (Virtual, SAS) | Raphaël Monat (Sorbonne Université — LIP6), Abdelraouf Ouadjaout (Sorbonne Université), Antoine Miné (Sorbonne Université) | Pre-print, Media Attached
10:00 | 20m | Live Q&A | Session 1A Discussion, Questions and Answers (Virtual, SAS)
17:00 - 18:20 | Session 1A
17:00 | 15m | Talk | Accelerating Program Analyses in Datalog by Merging Library Facts (Virtual, SAS) | Yifan Chen (Peking University), Chenyang Yang, Xin Zhang (Peking University), Yingfei Xiong (Peking University), Hao Tang (Peking University), Xiaoyin Wang (University of Texas at San Antonio), Lu Zhang (Peking University)
17:15 | 15m | Talk | Exploiting Verified Neural Networks via Floating Point Numerical Error (Virtual, SAS) | Pre-print
17:30 | 15m | Talk | Verifying Low-dimensional Input Neural Networks via Input Quantization (Virtual, SAS) | Kai Jia (Massachusetts Institute of Technology), Martin C. Rinard (Massachusetts Institute of Technology) | Pre-print
17:45 | 15m | Talk | A Multi-Language Static Analysis of Python Programs with Native C Extensions (Virtual, SAS) | Raphaël Monat (Sorbonne Université — LIP6), Abdelraouf Ouadjaout (Sorbonne Université), Antoine Miné (Sorbonne Université) | Pre-print, Media Attached
18:00 | 20m | Live Q&A | Session 1A Discussion, Questions and Answers (Virtual, SAS)