Collaborating on the most interesting Machine Learning Research Questions at ICML 2016
Facebook researchers are active participants at the International Conference on Machine Learning (ICML) 2016, being held in New York City this week. ICML is widely known as the leading machine learning conference, and our researchers will be interacting with their academic peers and presenting their latest work. In addition to presenting four research papers, the team is leading four workshops and delivering two tutorials: Memory Networks for Language Understanding by Jason Weston, AI research scientist at Facebook, and Stochastic Gradient Methods for Large-Scale Machine Learning by Leon Bottou, AI research director at Facebook.
The diverse use of machine learning, the explosive growth in data, and the complexity of large-scale learning systems have fueled an interesting intersection of ML and large-scale system design. Aparna Lakshmi Ratan, technical project manager, and engineering directors Joaquin Quiñonero Candela and Hussein Mehanna (all from Facebook), along with Joseph Gonzalez from UC Berkeley, will lead the Machine Learning Systems workshop to address this complex intersection of technologies. Their goal is to bring together experts from across these fields to share best practices, facilitate the flow of new ideas, and identify the tools and design principles that will help the community address real-world, large-scale machine learning problems.
Two papers will also be presented within the Machine Learning Systems workshop: Productionizing Machine Learning Pipelines at Scale, by Pierre Andrews, engineer, Aditya Kalro, engineering manager, and Alexander Sidorov, software engineer, all on the Facebook Applied Machine Learning team, and Torchnet: An Open-Source Platform for (Deep) Learning Research, by Ronan Collobert, Armand Joulin, and Laurens van der Maaten, all AI research scientists at Facebook.
The Neural Networks Back to the Future workshop takes a step back to examine the foundations of neural networks in order to take an even bigger step forward. Leon Bottou, together with David Grangier and Tomas Mikolov, both AI research scientists at Facebook, and John Platt from Google, has organized a workshop designed to take a critical look at previous work on neural networks. The workshop includes a not-to-miss panel discussion between Yann LeCun, director of AI research at Facebook, Patrice Simard of Microsoft, and John Platt. The goal of the workshop is to rediscover forgotten “gems” from the earlier literature and to revisit old assumptions in order to better understand how they differ from today’s work.
Deep learning is a fast-growing field of ML concerned with the study and design of computer algorithms for learning good representations of data at multiple levels of abstraction. While there has been rapid progress in recent years, many challenges remain. The Deep Learning workshop brings together researchers from several organizations to collaborate on two major challenges: unsupervised learning in the regime of small data, and simulation-based learning and its transferability to the real world. The workshop will deliver focused discussions in these important areas. Hosts include Antoine Bordes, AI research scientist at Facebook, Kyunghyun Cho and Emily Denton from New York University, along with Nando de Freitas from Google DeepMind and the University of Oxford, and Rob Fergus, AI research scientist manager at Facebook and New York University.
Lars Backstrom, engineering director at Facebook, is participating in a fourth workshop, Computational Frameworks for Personalization, organized by Facebook research scientist Khalid El-Arini. This workshop explores how to extract actionable knowledge and make informed decisions from today’s torrent of information, from online news to shopping and scholarly research. It brings together researchers from industry and academia to share recent advances and discuss future research directions, in particular emerging applications such as those in education and medicine, where personalization has shown great promise.
Facebook papers being presented at ICML 2016 include:
Learning Simple Algorithms from Examples, authored by Wojciech Zaremba, Tomas Mikolov, Armand Joulin, and Rob Fergus, all from Facebook AI Research, presents an approach for learning simple algorithms directly from examples, based on a new framework of interfaces accessed by a controller.
Learning Physical Intuition of Block Towers by Example, by software engineer Adam Lerer and Rob Fergus, Facebook AI research scientist manager, explores the ability of deep feed-forward models to learn intuitive physics. By using a 3D game engine to train large convolutional network models, they can accurately predict the stability outcome of randomized wooden block towers, as well as estimate the trajectories of the blocks. Their models deliver performance comparable to that of human subjects, who develop physical intuition over time, beginning in infancy.
Unsupervised Deep Embedding for Clustering Analysis, by Junyuan Xie of the University of Washington, Ross Girshick, AI research scientist at Facebook, and Ali Farhadi of the University of Washington, explores learning representations for clustering, which is central to many data-driven application domains. They present Deep Embedded Clustering (DEC), a method that simultaneously learns feature representations and cluster assignments using deep neural networks, delivering significant improvements over current state-of-the-art methods on image and text corpora.
Recurrent Orthogonal Networks and Long-Memory Tasks, by Mikael Henaff of NYU and Facebook, Arthur Szlam, AI research scientist at Facebook, and Yann LeCun, director of AI research at Facebook, analyzes two standard synthetic long-term memory problems and provides explicit RNN solutions for them.
“Engaging with the broader machine learning community at forums like ICML is fundamental to moving the field forward,” says Joaquin Quiñonero Candela, engineering director at Facebook. “At Facebook, we believe the most interesting research questions are derived from real-world problems. By collaborating with our peers and sharing our findings, we aim to push new boundaries every day, not only within Facebook, but across the research community.”