Matthew Johnson


Name: Matthew Johnson

Matt Johnson is a research scientist at Google Brain interested in software systems powering machine learning research. When moonlighting as a machine learning researcher, he works on composing graphical models with neural networks, automatically recognizing and exploiting conjugacy structure, and model-based reinforcement learning from pixels. Matt was a postdoc with Ryan Adams in the Harvard Intelligent Probabilistic Systems Group and with Bob Datta in the Datta Lab at Harvard Medical School. He received his Ph.D. in EECS from MIT, where he worked with Alan Willsky on Bayesian time series models and scalable inference. He was an undergrad at UC Berkeley (Go Bears!).

Country: United States
Affiliation: Google Brain


FHPNC 2021: Author of "Parallelism-preserving automatic differentiation for second-order array languages" (FHPNC 2021 track)
LAFI 2021: Author of "Decomposing reverse-mode automatic differentiation" (LAFI 2021 track)
ICFP 2021: Author of "Getting to the Point: Index Sets and Parallelism-Preserving Autodiff for Pointful Array Programming" (Research Papers track)
SPLASH 2020: Author of "JAX: accelerated machine learning research via composable function transformations in Python" (REBASE track)
LAFI 2019: Committee Member, Program Committee (LAFI (né PPS) track)
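The SPLASH 2020 entry above refers to JAX's core idea: composable function transformations (differentiation, compilation, and vectorization) applied to plain Python functions. A minimal sketch using the public JAX API, with a toy loss function of my own invention for illustration:

```python
import jax
import jax.numpy as jnp

# Hypothetical toy loss: not from any of the papers above,
# just a function to transform.
def loss(w, x):
    return jnp.sum((w * x - 1.0) ** 2)

# grad differentiates loss with respect to its first argument.
dloss = jax.grad(loss)

# Transformations compose: jit-compile the gradient function.
fast_dloss = jax.jit(dloss)

# vmap maps the compiled gradient over a batch of x values,
# holding w fixed (in_axes=(None, 0)).
batched_dloss = jax.vmap(fast_dloss, in_axes=(None, 0))

w = jnp.array(2.0)
xs = jnp.array([1.0, 2.0, 3.0])
grads = batched_dloss(w, xs)  # d/dw (w*x - 1)^2 = 2*(w*x - 1)*x for each x
print(grads)
```

The point of the design is that `grad`, `jit`, and `vmap` are ordinary higher-order functions, so they can be nested in any order.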