Compiling Discrete Probabilistic Programs for Vectorized Exact Inference
Probabilistic programming languages (PPLs) are essential for reasoning under uncertainty. Even though many real-world probabilistic programs involve discrete distributions, state-of-the-art PPLs are suboptimal for a large class of tasks dealing with such distributions. In this paper, we propose BayesTensor, a tensor-based probabilistic programming framework. By generating tensor algebra code from probabilistic programs, BayesTensor takes advantage of the highly tuned vectorized implementations of tensor processing frameworks. Our experiments show that BayesTensor outperforms state-of-the-art frameworks on a variety of discrete probabilistic programs, inference over Bayesian networks, and real-world probabilistic programs employed in data processing systems.
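To illustrate the idea behind the abstract, exact inference over a discrete model amounts to sum-product computations over factors, which map directly onto tensor contractions. The following is a minimal sketch using NumPy's `einsum`; the network and its probabilities are hypothetical examples, not taken from the paper or from BayesTensor's actual generated code.

```python
import numpy as np

# A tiny hypothetical Bayesian network: Rain -> WetGrass.
p_rain = np.array([0.8, 0.2])             # P(Rain): [no, yes]
p_wet_given_rain = np.array([[0.9, 0.1],  # P(WetGrass | Rain=no)
                             [0.2, 0.8]]) # P(WetGrass | Rain=yes)

# Marginal P(WetGrass) = sum_r P(Rain=r) * P(WetGrass | Rain=r),
# expressed as one vectorized tensor contraction instead of nested
# loops over variable states.
p_wet = np.einsum("r,rw->w", p_rain, p_wet_given_rain)
print(p_wet)  # [0.76 0.24]
```

Expressing inference this way lets a tensor framework apply its vectorized, highly tuned kernels to the whole sum-product at once, which is the performance advantage the abstract describes.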
Sat 25 Feb (times in Eastern Time, US & Canada)
10:20 - 11:20 | Vector & Parallelism (Research Papers) at St. Laurent 3
Chair(s): Sebastian Hack (Saarland University, Saarland Informatics Campus)

10:20 (20m talk): Java Vector API: Benchmarking and Performance Analysis. Research Papers.
10:40 (20m talk): Compiling Discrete Probabilistic Programs for Vectorized Exact Inference. Research Papers.
11:00 (20m talk): A Multi-threaded Fast Hardware Compiler for HDLs. Research Papers. Sheng-Hong Wang, Hunter James Coffman, Kenneth Mayer, Sakshi Garg, Jose Renau (University of California).