May 6, 2019
Selfless Sequential Learning
International Conference on Learning Representations (ICLR)
Sequential learning, also called lifelong learning, studies the problem of learning a sequence of tasks with access restricted to the data of the current task only. In this paper, we consider a scenario with fixed model capacity and postulate that the learning process should not be selfish: it should account for future tasks that will be added and leave enough capacity for them.
By: Rahaf Aljundi, Marcus Rohrbach, Tinne Tuytelaars
Facebook AI Research