634 Results

July 13, 2018

A Multi-lingual Multi-task Architecture for Low-resource Sequence Labeling

Association for Computational Linguistics (ACL)

We propose a multi-lingual multi-task architecture to develop supervised models with a minimal amount of labeled data for sequence labeling.

By: Ying Lin, Shengqi Yang, Veselin Stoyanov, Heng Ji

July 13, 2018

Analyzing Uncertainty in Neural Machine Translation

International Conference on Machine Learning (ICML)

Our study relates several issues in neural machine translation to the inherent uncertainty of the task, arising from the existence of multiple valid translations for a single source sentence, and to the extrinsic uncertainty caused by noisy training data.

By: Myle Ott, Michael Auli, David Grangier, Marc'Aurelio Ranzato

July 12, 2018

Improved Regret Bounds for Thompson Sampling in Linear Quadratic Control

International Conference on Machine Learning (ICML)

In this paper, we study an instance of Thompson sampling (TS) in the challenging setting of infinite-horizon linear quadratic (LQ) control, which models problems with continuous state-action variables, linear dynamics, and quadratic cost.

By: Marc Abeille, Alessandro Lazaric
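The excerpt above names the LQ ingredients (continuous states and actions, linear dynamics, quadratic cost). A minimal sketch of the scalar case follows; the constants `a`, `b`, `q`, `r` are toy values for illustration, not taken from the paper:

```python
# Scalar infinite-horizon LQ control: dynamics x' = a*x + b*u,
# per-step cost q*x^2 + r*u^2 (hypothetical constants).

def riccati_gain(a, b, q, r, iters=1000):
    """Iterate the scalar discrete Riccati equation to a fixed point p,
    then return the optimal feedback gain k, so the policy is u = -k*x."""
    p = q
    for _ in range(iters):
        p = q + a * a * p - (a * b * p) ** 2 / (r + b * b * p)
    return a * b * p / (r + b * b * p)

def average_cost(a, b, q, r, k, x0=1.0, horizon=500):
    """Average per-step quadratic cost of the linear policy u = -k*x."""
    x, total = x0, 0.0
    for _ in range(horizon):
        u = -k * x
        total += q * x * x + r * u * u
        x = a * x + b * u
    return total / horizon

k = riccati_gain(a=0.9, b=0.5, q=1.0, r=0.1)  # optimal feedback gain
stable = abs(0.9 - 0.5 * k) < 1               # closed loop is stable
```

TS-style algorithms for this setting sample plausible dynamics parameters and apply the corresponding optimal gain; the sketch only shows the underlying control problem being solved.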

July 12, 2018

Improved Large-Scale Graph Learning through Ridge Spectral Sparsification

International Conference on Machine Learning (ICML)

In this paper, we combine a spectral sparsification routine with Laplacian learning.

By: Daniele Calandriello, Alessandro Lazaric, Ioannis Koutis, Michal Valko
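As background for the Laplacian-based setting the excerpt mentions, the sketch below builds the graph Laplacian L = D - A of a small undirected graph and checks the identity x^T L x = sum over edges (x_u - x_v)^2, the quantity Laplacian learning penalizes. The toy graph is hypothetical, not from the paper:

```python
# Graph Laplacian L = D - A of an undirected, unit-weight graph,
# and its quadratic form (toy example, not from the paper).

def laplacian(n, edges):
    """Build L = D - A for an n-node graph with the given edge list."""
    L = [[0.0] * n for _ in range(n)]
    for u, v in edges:
        L[u][u] += 1.0
        L[v][v] += 1.0
        L[u][v] -= 1.0
        L[v][u] -= 1.0
    return L

def quadratic_form(L, x):
    """Compute x^T L x."""
    n = len(x)
    return sum(x[i] * L[i][j] * x[j] for i in range(n) for j in range(n))

edges = [(0, 1), (1, 2), (2, 3), (3, 0)]  # a 4-cycle
L = laplacian(4, edges)
x = [1.0, 2.0, 0.0, -1.0]
# quadratic_form(L, x) equals the sum of squared differences across edges.
```

Spectral sparsification keeps a small reweighted subset of edges whose Laplacian approximately preserves this quadratic form for all x, which is what makes it compatible with Laplacian learning at scale.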

July 12, 2018

Efficient Bias-Span-Constrained Exploration-Exploitation in Reinforcement Learning

International Conference on Machine Learning (ICML)

In this paper, we relax the optimization problem at the core of REGAL.C, carefully analyze its properties, and provide the first computationally efficient algorithm to solve it.

By: Ronan Fruit, Matteo Pirotta, Alessandro Lazaric, Ronald Ortner

July 11, 2018

Convergent TREE BACKUP and RETRACE with Function Approximation

International Conference on Machine Learning (ICML)

In this work, we show that the TREE BACKUP and RETRACE algorithms are unstable with linear function approximation, both in theory and in practice with specific examples.

By: Ahmed Touati, Pierre-Luc Bacon, Doina Precup, Pascal Vincent

July 11, 2018

Fitting New Speakers Based on a Short Untranscribed Sample

International Conference on Machine Learning (ICML)

We present a method that is designed to capture a new speaker from a short untranscribed audio sample.

By: Eliya Nachmani, Adam Polyak, Yaniv Taigman, Lior Wolf

July 11, 2018

Learning Diffusion using Hyperparameters

International Conference on Machine Learning (ICML)

The sample complexity of the independent cascade (IC) model is a function of the number of edges in the network, so learning diffusion becomes infeasible when the network is large. In this paper, we advocate a hyperparametric approach to learning diffusion in the IC model.

By: Dimitris Kalimeris, Yaron Singer, Karthik Subbian, Udi Weinsberg
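To illustrate the hyperparametric idea in the excerpt above, the sketch below simulates an IC cascade where each edge's activation probability comes from a small weight vector applied to edge features, instead of one free parameter per edge. The graph, features, and weights are hypothetical, not from the paper:

```python
# Independent cascade (IC) with edge probabilities parameterized by a
# shared weight vector over edge features (toy example).
import math
import random

def edge_prob(features, w):
    """Logistic link: p(u -> v) = sigmoid(w . features(u, v))."""
    z = sum(wi * fi for wi, fi in zip(w, features))
    return 1.0 / (1.0 + math.exp(-z))

def simulate_cascade(graph, feats, w, seeds, rng):
    """Run one IC cascade: each newly activated node gets a single
    chance to activate each still-inactive out-neighbor."""
    active, frontier = set(seeds), list(seeds)
    while frontier:
        nxt = []
        for u in frontier:
            for v in graph.get(u, []):
                if v not in active and rng.random() < edge_prob(feats[(u, v)], w):
                    active.add(v)
                    nxt.append(v)
        frontier = nxt
    return active

graph = {0: [1, 2], 1: [2], 2: [3]}
feats = {(0, 1): [1.0, 0.2], (0, 2): [0.5, 1.0],
         (1, 2): [0.3, 0.3], (2, 3): [1.0, 1.0]}
spread = simulate_cascade(graph, feats, w=[2.0, 2.0],
                          seeds=[0], rng=random.Random(0))
```

Because only the weight vector `w` is learned, the number of parameters no longer scales with the number of edges, which is the point of the hyperparametric formulation.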

July 10, 2018

Learning Continuous Hierarchies in the Lorentz Model of Hyperbolic Geometry

International Conference on Machine Learning (ICML)

We are concerned with the discovery of hierarchical relationships from large-scale unstructured similarity scores. For this purpose, we study different models of hyperbolic space and find that learning embeddings in the Lorentz model is substantially more efficient than in the Poincaré-ball model.

By: Maximilian Nickel, Douwe Kiela
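For context on the Lorentz model the excerpt refers to: points live on the hyperboloid {x : <x, x>_L = -1, x0 > 0}, and geodesic distance has the closed form arccosh(-<x, y>_L). A minimal sketch of these operations (toy points, not the paper's training procedure):

```python
# Distances in the Lorentz (hyperboloid) model of hyperbolic space.
import math

def lorentz_inner(x, y):
    """Lorentzian inner product: -x0*y0 + sum_i xi*yi."""
    return -x[0] * y[0] + sum(a * b for a, b in zip(x[1:], y[1:]))

def lift(v):
    """Lift a Euclidean point v onto the hyperboloid by setting
    x0 = sqrt(1 + ||v||^2), so that <x, x>_L = -1."""
    return [math.sqrt(1.0 + sum(a * a for a in v))] + list(v)

def lorentz_distance(x, y):
    """Geodesic distance: arccosh(-<x, y>_L); the max() guards against
    values slipping below 1 through floating-point error."""
    return math.acosh(max(1.0, -lorentz_inner(x, y)))

x, y = lift([0.3, 0.1]), lift([-0.5, 0.4])
d = lorentz_distance(x, y)
```

The distance gradient in this model avoids the numerical issues near the boundary of the Poincaré ball, which is one reason the paper finds Lorentz-model optimization more efficient.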

July 10, 2018

Optimizing the Latent Space of Generative Networks

International Conference on Machine Learning (ICML)

The goal of this paper is to disentangle the contributions of two factors in the success of GANs: the deep convolutional generator architecture and the adversarial training protocol. In particular, we introduce Generative Latent Optimization (GLO), a framework to train deep convolutional generators using simple reconstruction losses.

By: Piotr Bojanowski, Armand Joulin, David Lopez-Paz, Arthur Szlam
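The core GLO idea in the excerpt above is to learn one free latent code per training sample jointly with the generator by minimizing a reconstruction loss, with no discriminator. The sketch below shows this on a toy linear generator standing in for the deep convolutional one; all data, dimensions, and hyperparameters are illustrative, not from the paper:

```python
# Minimal GLO-style training loop: gradient descent on both the
# per-sample latent codes z_i and the generator weights W, under a
# squared reconstruction loss (toy linear generator).
import random

def train_glo(data, dim_z=2, steps=500, lr=0.05, seed=0):
    rng = random.Random(seed)
    n, dim_x = len(data), len(data[0])
    # One learnable code per sample, plus shared generator weights.
    z = [[rng.gauss(0, 0.1) for _ in range(dim_z)] for _ in range(n)]
    w = [[rng.gauss(0, 0.1) for _ in range(dim_z)] for _ in range(dim_x)]
    for _ in range(steps):
        for i, x in enumerate(data):
            recon = [sum(w[j][k] * z[i][k] for k in range(dim_z))
                     for j in range(dim_x)]
            err = [r - t for r, t in zip(recon, x)]
            # Gradient step on the code z_i ...
            for k in range(dim_z):
                gz = sum(err[j] * w[j][k] for j in range(dim_x))
                z[i][k] -= lr * gz
            # ... and on the generator weights W.
            for j in range(dim_x):
                for k in range(dim_z):
                    w[j][k] -= lr * err[j] * z[i][k]
    # Final mean squared reconstruction error over the dataset.
    mse = 0.0
    for i, x in enumerate(data):
        recon = [sum(w[j][k] * z[i][k] for k in range(dim_z))
                 for j in range(dim_x)]
        mse += sum((r - t) ** 2 for r, t in zip(recon, x)) / n
    return z, w, mse

data = [[1.0, 2.0], [2.0, 4.0], [0.5, 1.0]]
z, w, mse = train_glo(data)
```

Treating the codes themselves as optimizable parameters is what replaces the adversarial game: the latent space is shaped directly by the reconstruction objective.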