
Constrained Bayesian Optimization with Noisy Experiments

Bayesian Analysis 2018


Abstract

Randomized experiments are the gold standard for evaluating the effects of changes to real-world systems. Data in these tests may be difficult to collect and outcomes may have high variance, resulting in potentially large measurement error. Bayesian optimization is a promising technique for efficiently optimizing multiple continuous parameters, but existing approaches degrade in performance when the noise level is high, limiting the applicability of Bayesian optimization to many randomized experiments. We derive an expression for expected improvement under greedy batch optimization with noisy observations and noisy constraints, and develop a quasi-Monte Carlo approximation that allows it to be efficiently optimized. Simulations with synthetic functions show that our approach outperforms existing methods on noisy, constrained problems. We further demonstrate the effectiveness of the method with two real-world experiments conducted at Facebook: optimizing a ranking system, and optimizing server compiler flags.
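The central computational idea in the abstract is a quasi-Monte Carlo (QMC) approximation of expected improvement when both observations and constraints are noisy. The sketch below is a rough illustration only, not the paper's algorithm: the function name `noisy_constrained_ei` and all simplifications are ours. In particular, the candidate's posterior is held fixed across QMC draws and incumbent feasibility is not re-sampled, whereas the paper re-conditions the Gaussian process on each draw of the denoised observations. It uses only NumPy and SciPy: scrambled Sobol points are mapped to correlated Gaussian draws of the true objective values at the observed points, a closed-form expected improvement is computed against each sampled incumbent, and the average is weighted by the probability that the candidate satisfies the constraints.

```python
import numpy as np
from scipy.stats import norm, qmc


def noisy_constrained_ei(mu_obs, cov_obs, mu_cand, sigma_cand,
                         p_feasible_cand, n_qmc=256, seed=0):
    """QMC estimate of expected improvement at one candidate (minimization).

    mu_obs, cov_obs     : GP posterior mean/covariance of the true objective
                          at the noisily observed points.
    mu_cand, sigma_cand : GP posterior mean/std of the objective at the candidate.
    p_feasible_cand     : posterior probability that the candidate satisfies
                          the noisy constraints.
    """
    d = len(mu_obs)
    # Scrambled Sobol points -> correlated Gaussian draws of the true values
    # at the observed points, via the posterior Cholesky factor.
    sobol = qmc.Sobol(d=d, scramble=True, seed=seed)
    u = np.clip(sobol.random(n_qmc), 1e-6, 1 - 1e-6)  # avoid +/- inf from ppf
    z = norm.ppf(u)                                    # standard normal QMC draws
    L = np.linalg.cholesky(cov_obs + 1e-10 * np.eye(d))
    f_samples = mu_obs + z @ L.T                       # (n_qmc, d) draws

    # For each draw, the incumbent is the best sampled value; compute the
    # closed-form EI of the candidate against that incumbent.
    f_best = f_samples.min(axis=1)                     # (n_qmc,)
    imp = f_best - mu_cand
    zz = imp / sigma_cand
    ei = imp * norm.cdf(zz) + sigma_cand * norm.pdf(zz)

    # Weight by the probability of feasibility and average over QMC draws.
    return p_feasible_cand * ei.mean()


if __name__ == "__main__":
    # Toy illustration: two observed points and one candidate.
    mu_obs = np.array([0.3, 0.1])
    cov_obs = np.array([[0.04, 0.01], [0.01, 0.09]])
    print(noisy_constrained_ei(mu_obs, cov_obs,
                               mu_cand=0.0, sigma_cand=0.2,
                               p_feasible_cand=0.8))
```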
