Bayesian Neural Networks using HackPPL with Application to User Location State Prediction

Bayesian Deep Learning Workshop at NeurIPS 2018


At Facebook, we are becoming increasingly interested in incorporating uncertainty into models used for decision making. As a result, we are building an in-house universal probabilistic programming language that aims to make modeling more accessible to developers and to unify the tooling experience for existing users of Bayesian modeling. In addition, we form interfaces for common probabilistic models on top of this language and apply these to datasets within the company.

For the less familiar reader, probabilistic programming languages (PPLs) provide convenient syntax for describing generative processes composed of various sources of uncertainty. The user may then pose probabilistic queries about their world, which are resolved by an inference engine. Some of the more mature languages include domain-specific languages such as WinBUGS [1], JAGS [2] and Stan [3], which place some restrictions on the models a user may write so that inference can run more efficiently. On the other hand, newer universal PPLs such as Church [4], WebPPL [5] and Anglican [6] extend existing general-purpose languages and resolve queries through a generic inference engine. In doing so, users are constrained only by the limitations of the underlying language. However, this flexibility may not always yield the most efficient model, and the tradeoff between model expressivity and inference efficiency is an ongoing area of research.
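The split between model description and generic inference can be sketched in a few lines. The following is a language-agnostic illustration in Python, not HackPPL syntax: a generative process with one latent variable (an unknown coin bias under a uniform prior), and a query about it answered by a generic inference routine (likelihood weighting).

```python
import random

def model():
    # Latent variable: unknown coin bias drawn from a uniform prior.
    return random.random()

def likelihood(bias, data):
    # Probability of the observed flips (1 = heads) under this bias.
    p = 1.0
    for flip in data:
        p *= bias if flip == 1 else (1.0 - bias)
    return p

def posterior_mean(data, num_samples=20000):
    # Generic inference by importance sampling: weight prior draws
    # by how well they explain the observations.
    total_w = total_wx = 0.0
    for _ in range(num_samples):
        bias = model()
        w = likelihood(bias, data)
        total_w += w
        total_wx += w * bias
    return total_wx / total_w

random.seed(0)
data = [1, 1, 1, 0, 1]          # 4 heads, 1 tail
estimate = posterior_mean(data)  # close to the Beta(5, 2) mean, 5/7
```

A PPL automates exactly this separation: the user writes only `model` and the observations, and the engine supplies `posterior_mean` (or a far more sophisticated sampler).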

Traditionally, Bayesian neural networks (BNNs) are neural networks with priors on their weights and biases [7, 8]. Their main advantages include providing uncertainty over predictions rather than point estimates, built-in regularization through priors, and better performance in problem settings such as the robot arm problem [7, 9]; however, they are generally expensive in terms of compute time. While probabilistic interpretations of neural networks have been studied in the past, BNNs have seen a resurgence in popularity in recent years, particularly with alternative probabilistic approaches to dropout [10] and backpropagation [11], and the renewed investigation of variational approximations [12], which have made computation more tractable. There have also been more recent advances in combining probabilistic programming and deep learning, notably by Edward [13] and Pyro [14]. These languages are built on top of existing tensor libraries and have so far focused on variational approaches for scalable inference.
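The core BNN idea above can be shown in a minimal sketch: place Gaussian priors on the weights of a tiny one-hidden-unit network and propagate that uncertainty into the output by Monte Carlo sampling. This is illustrative only; a real BNN infers a posterior over weights from data rather than sampling the prior, and the network here is an assumption for brevity.

```python
import math
import random

def sample_network():
    # Draw one network from the prior: every weight and bias ~ Normal(0, 1).
    w1, b1, w2, b2 = (random.gauss(0.0, 1.0) for _ in range(4))
    def forward(x):
        h = math.tanh(w1 * x + b1)                      # hidden unit
        return 1.0 / (1.0 + math.exp(-(w2 * h + b2)))   # sigmoid output
    return forward

def predictive(x, num_samples=5000):
    # Mean and standard deviation of the output across sampled networks:
    # a distribution over predictions rather than a single point estimate.
    ys = [sample_network()(x) for _ in range(num_samples)]
    mean = sum(ys) / len(ys)
    var = sum((y - mean) ** 2 for y in ys) / len(ys)
    return mean, math.sqrt(var)

random.seed(0)
mean, std = predictive(1.0)  # std > 0: the prediction carries uncertainty
```

The compute cost mentioned above is visible even here: every prediction requires many forward passes, one per weight sample, which is what variational approximations and related methods aim to make tractable.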

In this study, we present HackPPL, a probabilistic programming language embedded in Facebook’s server-side language, Hack. One of the aims of our language is to support deep probabilistic modeling by providing a flexible interface for composing deep neural networks with encoded uncertainty, backed by a rich inference engine. We demonstrate the Bayesian neural network interface in HackPPL and present results of a multi-class classification problem to predict user location states using several inference techniques. Through HackPPL, we aim to provide tools for interacting with and debugging Bayesian models, and to integrate them into the Facebook ecosystem.
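For the multi-class setting, the standard way a BNN produces class probabilities is to average the softmax output over posterior weight samples (a Monte Carlo estimate of the posterior predictive). The sketch below assumes such samples are available; the location-state labels and the feature vector are hypothetical placeholders, not the categories or features used in the paper, and for brevity the "posterior" samples are drawn from a prior.

```python
import math
import random

STATES = ["home", "work", "commuting"]  # hypothetical placeholder labels

def softmax(logits):
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def predict(features, weight_samples):
    # Posterior predictive: average the softmax output over samples of
    # the (classes x features) weight matrix.
    probs = [0.0] * len(STATES)
    for W in weight_samples:
        logits = [sum(w * x for w, x in zip(row, features)) for row in W]
        for k, p in enumerate(softmax(logits)):
            probs[k] += p / len(weight_samples)
    return probs

random.seed(0)
# Stand-in for posterior samples (drawn from a Normal(0, 1) prior here).
samples = [[[random.gauss(0, 1) for _ in range(2)] for _ in STATES]
           for _ in range(1000)]
probs = predict([0.5, -1.0], samples)  # one probability per location state
```

Because the probabilities are averaged over weight samples, disagreement among samples shows up as a flatter distribution over states, which is precisely the predictive uncertainty that point-estimate networks discard.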
