July 12, 2018

Efficient Bias-Span-Constrained Exploration-Exploitation in Reinforcement Learning

International Conference on Machine Learning (ICML)

In this paper, we relax the optimization problem at the core of REGAL.C, we carefully analyze its properties, and we provide the first computationally efficient algorithm to solve it.

By: Ronan Fruit, Matteo Pirotta, Alessandro Lazaric, Ronald Ortner

July 12, 2018

Improved Regret Bounds for Thompson Sampling in Linear Quadratic Control

International Conference on Machine Learning (ICML)

In this paper, we study an instance of TS in the challenging setting of the infinite-horizon linear quadratic (LQ) control, which models problems with continuous state-action variables, linear dynamics, and quadratic cost.

By: Marc Abeille, Alessandro Lazaric

July 12, 2018

Improved Large-Scale Graph Learning through Ridge Spectral Sparsification

International Conference on Machine Learning (ICML)

In this paper, we combine a spectral sparsification routine with Laplacian learning.

By: Daniele Calandriello, Alessandro Lazaric, Ioannis Koutis, Michal Valko

July 11, 2018

Convergent TREE BACKUP and RETRACE with Function Approximation

International Conference on Machine Learning (ICML)

In this work, we show that the TREE BACKUP and RETRACE algorithms are unstable with linear function approximation, both in theory and in practice, as demonstrated with specific examples.

By: Ahmed Touati, Pierre-Luc Bacon, Doina Precup, Pascal Vincent

July 11, 2018

Fitting New Speakers Based on a Short Untranscribed Sample

International Conference on Machine Learning (ICML)

We present a method designed to capture the voice of a new speaker from a short, untranscribed audio sample.

By: Eliya Nachmani, Adam Polyak, Yaniv Taigman, Lior Wolf

July 11, 2018

Learning Diffusion using Hyperparameters

International Conference on Machine Learning (ICML)

In this paper, we advocate a hyperparametric approach to learning diffusion in the independent cascade (IC) model. The sample complexity of the standard IC model is a function of the number of edges in the network, so learning becomes infeasible when the network is large.

By: Dimitris Kalimeris, Yaron Singer, Karthik Subbian, Udi Weinsberg
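
For context on why the edge-level IC model is hard to learn at scale: every directed edge (u, v) carries its own activation probability, and a cascade gives each newly activated node one independent chance to activate each inactive neighbor. The snippet below is a generic simulation of that standard cascade process — not code from the paper — with a made-up toy graph, probabilities, and seed set.

```python
import random

def simulate_ic(graph, p, seeds, rng=random.Random(0)):
    """Simulate one cascade of the independent cascade (IC) model.

    graph: dict node -> list of out-neighbors
    p: dict (u, v) -> activation probability of edge u->v
    seeds: initially active nodes
    """
    active = set(seeds)
    frontier = list(seeds)
    while frontier:
        next_frontier = []
        for u in frontier:
            for v in graph.get(u, []):
                # Each newly active node gets a single chance to activate each
                # inactive neighbor, independently with probability p[(u, v)].
                if v not in active and rng.random() < p[(u, v)]:
                    active.add(v)
                    next_frontier.append(v)
        frontier = next_frontier
    return active

# Toy example: a 4-node line graph with hand-picked edge probabilities.
graph = {0: [1], 1: [2], 2: [3], 3: []}
p = {(0, 1): 0.9, (1, 2): 0.5, (2, 3): 0.1}
print(simulate_ic(graph, p, seeds={0}))
```

Because each edge has its own parameter, fitting this model directly requires observations that scale with the number of edges; the hyperparametric approach advocated above instead ties the edge probabilities to a much smaller shared parameterization.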

July 10, 2018

Generalization without Systematicity: On the Compositional Skills of Sequence-to-Sequence Recurrent Networks

International Conference on Machine Learning (ICML)

In this paper, we introduce the SCAN domain, consisting of a set of simple compositional navigation commands paired with the corresponding action sequences.

By: Brenden Lake, Marco Baroni

July 10, 2018

Hierarchical Text Generation and Planning for Strategic Dialogue

International Conference on Machine Learning (ICML)

We introduce an approach to learning representations of messages in dialogues by maximizing the likelihood of subsequent sentences and actions, which decouples the semantics of the dialogue utterance from its linguistic realization.

By: Denis Yarats, Mike Lewis

July 10, 2018

Learning Continuous Hierarchies in the Lorentz Model of Hyperbolic Geometry

International Conference on Machine Learning (ICML)

We are concerned with the discovery of hierarchical relationships from large-scale unstructured similarity scores. For this purpose, we study different models of hyperbolic space and find that learning embeddings in the Lorentz model is substantially more efficient than in the Poincaré-ball model.

By: Maximilian Nickel, Douwe Kiela
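
For context on the Lorentz (hyperboloid) model referenced above: points are vectors x satisfying the Lorentzian constraint ⟨x, x⟩_L = -1, and the geodesic distance has the closed form d(x, y) = arccosh(-⟨x, y⟩_L). The NumPy snippet below is a minimal sketch of that distance computation only — it is not the paper's embedding code, and the example vectors are arbitrary.

```python
import numpy as np

def lorentz_inner(x, y):
    # Lorentzian inner product: -x0*y0 + sum_i xi*yi
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def lorentz_distance(x, y):
    # Geodesic distance on the hyperboloid <x, x>_L = -1
    return np.arccosh(np.clip(-lorentz_inner(x, y), 1.0, None))

def lift(v):
    # Map a Euclidean vector v onto the hyperboloid by solving for x0
    x0 = np.sqrt(1.0 + np.dot(v, v))
    return np.concatenate(([x0], v))

x, y = lift(np.array([0.1, 0.2])), lift(np.array([-0.3, 0.05]))
print(lorentz_distance(x, y))
```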

July 10, 2018

Optimizing the Latent Space of Generative Networks

International Conference on Machine Learning (ICML)

The goal of this paper is to disentangle two factors behind the success of GANs: the adversarial training procedure and the deep convolutional parameterization of the generator. In particular, we introduce Generative Latent Optimization (GLO), a framework to train deep convolutional generators using simple reconstruction losses.

By: Piotr Bojanowski, Armand Joulin, David Lopez-Paz, Arthur Szlam
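
To make the GLO idea above concrete — one freely optimized latent code per training image, learned jointly with the generator under a plain reconstruction loss — here is a minimal PyTorch sketch. The tiny generator, random stand-in data, MSE loss (the paper favors a Laplacian-pyramid-style reconstruction loss), and hyperparameters are illustrative placeholders, not the authors' setup.

```python
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    """Placeholder generator; the paper uses a DCGAN-style deconvolutional net."""
    def __init__(self, latent_dim=32, out_dim=784):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, out_dim), nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z)

n_images, latent_dim = 1000, 32
images = torch.rand(n_images, 784) * 2 - 1               # stand-in for real images
codes = nn.Parameter(torch.randn(n_images, latent_dim))  # one learnable code per image
generator = TinyGenerator(latent_dim)
optimizer = torch.optim.SGD(list(generator.parameters()) + [codes], lr=0.1)

for step in range(200):
    idx = torch.randint(0, n_images, (64,))
    reconstruction = generator(codes[idx])
    loss = ((reconstruction - images[idx]) ** 2).mean()  # simple reconstruction loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    with torch.no_grad():                                # keep codes inside the unit ball
        norms = codes[idx].norm(dim=1, keepdim=True).clamp(min=1.0)
        codes[idx] = codes[idx] / norms
```

The key design point illustrated here is that there is no discriminator and no adversarial game: the latent codes are treated as free parameters and updated by the same optimizer as the generator weights.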