Publication

The Decoupled Extended Kalman Filter for Dynamic Exponential-Family Factorization Models

Journal of Machine Learning Research (JMLR)


Abstract

Motivated by the needs of online large-scale recommender systems, we specialize the decoupled extended Kalman filter (DEKF) to factorization models, including factorization machines and matrix and tensor factorization, and illustrate the effectiveness of the approach through numerical experiments on synthetic and real-world data. Online learning of model parameters through the DEKF makes factorization models more broadly useful by (i) allowing for more flexible observations through the entire exponential family, (ii) modeling parameter drift, and (iii) producing parameter uncertainty estimates that can enable explore/exploit and other applications. We use parameter dynamics different from those of the standard DEKF, allowing parameter drift while encouraging reasonable parameter values. We also present an alternate derivation of the extended Kalman filter and DEKF that highlights the role of the Fisher information matrix in the EKF.
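To make the abstract's ingredients concrete (an exponential-family observation, one decoupled block per factor vector, drift dynamics that keep parameters reasonable, and a Fisher-information-weighted update), here is a minimal illustrative sketch. It is not the paper's algorithm: it assumes a Bernoulli (logistic) observation on the dot product of a user factor and an item factor, mean-reverting drift controlled by an illustrative constant gamma, and a rank-one EKF-style update per block; all function names and noise scales are hypothetical.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def predict_block(mean, cov, gamma=0.99, drift_var=1e-4):
    """Parameter-drift step for one decoupled block (illustrative assumption):
    shrink the mean toward zero, which keeps parameter values reasonable,
    and inflate the covariance to model drift."""
    mean = gamma * mean
    cov = (gamma ** 2) * cov + drift_var * np.eye(len(mean))
    return mean, cov

def update_block(mean, cov, feature, y, y_hat, fisher):
    """EKF-style measurement update for one block. For a Bernoulli observation
    with logit link, the scalar Fisher information is y_hat * (1 - y_hat); its
    inverse plays the role of the effective observation noise."""
    s = feature @ cov @ feature + 1.0 / fisher      # innovation variance
    k = (cov @ feature) / s                         # Kalman gain for this block
    mean = mean + k * (y - y_hat) / fisher          # pseudo-observation residual
    cov = cov - np.outer(k, feature @ cov)
    return mean, cov

# Toy usage on a single (user, item) interaction with rank-5 factors.
d = 5
rng = np.random.default_rng(0)
u_mean, u_cov = 0.1 * rng.normal(size=d), np.eye(d)
v_mean, v_cov = 0.1 * rng.normal(size=d), np.eye(d)

u_mean, u_cov = predict_block(u_mean, u_cov)
v_mean, v_cov = predict_block(v_mean, v_cov)

y = 1.0                                    # observed binary feedback (e.g., a click)
y_hat = sigmoid(u_mean @ v_mean)           # predicted mean of the Bernoulli observation
fisher = max(y_hat * (1.0 - y_hat), 1e-6)  # Fisher information of the logit

# Decoupled updates: each block is linearized using the other block's current mean.
u_new, u_cov = update_block(u_mean, u_cov, v_mean, y, y_hat, fisher)
v_new, v_cov = update_block(v_mean, v_cov, u_mean, y, y_hat, fisher)
u_mean, v_mean = u_new, v_new

The decoupling is what makes this cheap at recommender scale: each factor vector keeps its own small covariance matrix and is updated independently, instead of maintaining one joint covariance over all parameters.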

Related Publications

NAACL - June 6, 2021

Deep Learning on Graphs for Natural Language Processing

Lingfei Wu, Yu Chen, Heng Ji, Yunyao Li

ICASSP - June 6, 2021

On the Predictability of HRTFs from Ear Shapes Using Deep Networks

Yaxuan Zhou, Hao Jiang, Vamsi Krishna Ithapu

L4DC - June 7, 2021

On the model-based stochastic value gradient for continuous reinforcement learning

Brandon Amos, Samuel Stanton, Denis Yarats, Andrew Gordon Wilson

MLSys - May 19, 2021

TT-Rec: Tensor Train Compression For Deep Learning Recommendation Model Embeddings

Chunxing Yin, Bilge Acun, Xing Liu, Carole-Jean Wu
