SAS 2020
Wed 18 - Fri 20 November 2020 Online Conference
co-located with SPLASH 2020
Wed 18 Nov 2020 17:40 - 18:00 at SPLASH-III - 5 Chair(s): Patrick Cousot, Sukyoung Ryu
Thu 19 Nov 2020 05:40 - 06:00 at SPLASH-III - 5 Chair(s): Xavier Rival, Sukyoung Ryu

We are interested in algorithmically proving the robustness of neural networks. Various notions of robustness have been discussed in the literature; we focus on probabilistic notions of robustness that assume it is feasible to construct a statistical model of the process generating the inputs of a neural network. We find this a reasonable assumption given the rapid advances in algorithms for learning generative models of data. A neural network f is then defined to be probabilistically robust if, for a randomly generated pair of inputs, f is likely to demonstrate k-Lipschitzness, i.e., the distance between the outputs computed by f is upper-bounded by the k-th multiple of the distance between the pair of inputs. We name this property probabilistic Lipschitzness.
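The property can be made concrete with a small Monte Carlo sketch (not the paper's proof technique, which is static analysis): draw random input pairs from an assumed input distribution and count how often the k-Lipschitz inequality ||f(x) − f(y)|| ≤ k·||x − y|| fails. The function names `lipschitz_violation_rate` and `sample_input` are illustrative assumptions, not from the paper.

```python
import numpy as np

def lipschitz_violation_rate(f, sample_input, k, n_pairs=10000, seed=0):
    """Estimate how often f violates k-Lipschitzness on randomly
    generated input pairs (illustrative Monte Carlo sketch only).

    `sample_input` plays the role of the generative model: given an
    RNG, it draws one input from the assumed input distribution.
    """
    rng = np.random.default_rng(seed)
    violations = 0
    for _ in range(n_pairs):
        x, y = sample_input(rng), sample_input(rng)
        dx = np.linalg.norm(x - y)            # distance between inputs
        dy = np.linalg.norm(f(x) - f(y))      # distance between outputs
        if dy > k * dx:                       # k-Lipschitzness violated
            violations += 1
    return violations / n_pairs
```

A network is probabilistically Lipschitz when this violation probability is small; the paper's goal is to bound it by proof rather than by sampling.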

We model generative models and neural networks, together, as programs in a simple, first-order, imperative, probabilistic programming language, pcat. Inspired by a large body of existing literature, we define a denotational semantics for this language. We then develop a sound local Lipschitzness analysis for cat, a non-probabilistic sublanguage of pcat. This analysis can compute an upper bound on the "Lipschitzness" of a neural network in a bounded region of the input set. Next, we present a provably correct algorithm, PROLIP, that analyzes the behavior of a neural network in a user-specified, box-shaped input region and computes (i) lower bounds on the probabilistic mass of such a region with respect to the generative model, and (ii) upper bounds on the Lipschitz constant of the neural network in this region, with the help of the local Lipschitzness analysis. Finally, we present a sketch of a proof-search algorithm that uses PROLIP as a primitive for finding proofs of probabilistic Lipschitzness. We implement the PROLIP algorithm and empirically evaluate its computational complexity.
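To give a flavor of the kind of quantity PROLIP bounds, here is a classical, much coarser bound: for a feed-forward ReLU network, the product of the spectral norms of its weight matrices upper-bounds the global Lipschitz constant (ReLU is 1-Lipschitz). This is only an assumed illustrative sketch; the paper's analysis computes tighter *local* bounds over a box-shaped input region.

```python
import numpy as np

def lipschitz_upper_bound(weights):
    """Coarse global Lipschitz upper bound for a feed-forward ReLU
    network given as a list of weight matrices: the product of the
    layers' spectral norms. Illustrative only; a local, region-aware
    analysis like PROLIP can be much tighter.
    """
    bound = 1.0
    for W in weights:
        bound *= np.linalg.norm(W, 2)  # ord=2: largest singular value
    return bound
```

For example, a two-layer network with weights 2·I and 3·I gets the bound 2 · 3 = 6.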

Conference Day
Wed 18 Nov

Displayed time zone: Central Time (US & Canada)

17:00 - 18:20
DLS / SAS at SPLASH-III
Chair(s): Patrick Cousot (New York University), Sukyoung Ryu
17:00
20m
Research paper
Abstract Neural Networks
SAS
Matthew Sotoudeh (University of California, Davis), Aditya V. Thakur (University of California, Davis)
Pre-print · Media Attached
17:20
20m
Talk
Amalgamating Different JIT Compilations in a Meta-tracing JIT Compiler Framework
DLS
Yusuke Izawa (Tokyo Institute of Technology), Hidehiko Masuhara (Tokyo Institute of Technology)
Link to publication · DOI · Pre-print · Media Attached
17:40
20m
Research paper
Probabilistic Lipschitz Analysis of Neural Networks (Artifact)
SAS
Ravi Mangal (Georgia Institute of Technology), Kartik Sarangmath (Georgia Institute of Technology), Aditya Nori, Alessandro Orso (Georgia Institute of Technology)
Pre-print · Media Attached
18:00
20m
Talk
Pricing Python Parallelism: A Dynamic Language Cost Model for Heterogeneous Platforms
DLS
Dejice Jacob (University of Glasgow, UK), Phil Trinder (University of Glasgow), Jeremy Singer (University of Glasgow)
Link to publication · DOI · Pre-print · Media Attached

Conference Day
Thu 19 Nov

Displayed time zone: Central Time (US & Canada)

05:00 - 06:20
SAS / DLS at SPLASH-III
Chair(s): Xavier Rival (INRIA/CNRS/ENS Paris), Sukyoung Ryu
05:00
20m
Research paper
Abstract Neural Networks
SAS
Matthew Sotoudeh (University of California, Davis), Aditya V. Thakur (University of California, Davis)
Pre-print · Media Attached
05:20
20m
Talk
Amalgamating Different JIT Compilations in a Meta-tracing JIT Compiler Framework
DLS
Yusuke Izawa (Tokyo Institute of Technology), Hidehiko Masuhara (Tokyo Institute of Technology)
Link to publication · DOI · Pre-print · Media Attached
05:40
20m
Research paper
Probabilistic Lipschitz Analysis of Neural Networks (Artifact)
SAS
Ravi Mangal (Georgia Institute of Technology), Kartik Sarangmath (Georgia Institute of Technology), Aditya Nori, Alessandro Orso (Georgia Institute of Technology)
Pre-print · Media Attached
06:00
20m
Talk
Pricing Python Parallelism: A Dynamic Language Cost Model for Heterogeneous Platforms
DLS
Dejice Jacob (University of Glasgow, UK), Phil Trinder (University of Glasgow), Jeremy Singer (University of Glasgow)
Link to publication · DOI · Pre-print · Media Attached