14 June 2020

Sequential Mastery of Multiple Visual Tasks: Networks Naturally Learn to Learn and Forget to Forget

Guy Davidson, Michael C. Mozer

Keywords: continual learning, multitask learning, curriculum, catastrophic interference, facilitation, transfer learning.

Abstract: We explore the behavior of a standard convolutional neural net in a continual-learning setting that introduces visual classification tasks sequentially and requires the net to master new tasks while preserving mastery of previously learned tasks. This setting corresponds to the one human learners face as they acquire domain expertise serially, for example, as an individual studies a textbook. Through simulations involving sequences of ten related visual tasks, we find reason for optimism that nets will scale well as they advance from having a single skill to becoming multi-skill domain experts. We observe two key phenomena. First, forward facilitation, the accelerated learning of task n+1 having learned n previous tasks, grows with n. Second, backward interference, the forgetting of the n previous tasks when learning task n+1, diminishes with n. Amplifying forward facilitation is the goal of research on metalearning, and attenuating backward interference is the goal of research on catastrophic forgetting. We find that both of these goals are attained simply through broader exposure to a domain.
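To make the experimental protocol concrete, below is a minimal sketch of sequential training with interleaved evaluation, written in the style of PyTorch. This is not the authors' released code: the names `sequential_training` and `evaluate`, the `(train_loader, test_loader)` task structure, and the fixed per-task epoch budget are illustrative assumptions. The paper trains each task to a mastery criterion, so forward facilitation would be read off the number of updates needed to reach criterion on task n+1, and backward interference off the accuracy drop on the n earlier tasks.

```python
import torch
import torch.nn as nn
import torch.optim as optim


def sequential_training(model, tasks, epochs_per_task=10, lr=0.01, device="cpu"):
    """Train one model on tasks in sequence (hypothetical sketch).

    `tasks` is assumed to be a list of (train_loader, test_loader) pairs.
    Returns history[n] = accuracies on tasks 0..n after learning task n,
    from which backward interference can be measured.
    """
    model.to(device)
    criterion = nn.CrossEntropyLoss()
    history = []

    for n, (train_loader, _) in enumerate(tasks):
        optimizer = optim.SGD(model.parameters(), lr=lr)
        model.train()
        # The paper trains to a mastery criterion; a fixed epoch budget
        # is used here only to keep the sketch short.
        for _ in range(epochs_per_task):
            for x, y in train_loader:
                x, y = x.to(device), y.to(device)
                optimizer.zero_grad()
                loss = criterion(model(x), y)
                loss.backward()
                optimizer.step()
        # After mastering task n, evaluate every task seen so far:
        # drops relative to earlier entries reflect backward interference.
        accs = [evaluate(model, test_loader, device)
                for _, test_loader in tasks[: n + 1]]
        history.append(accs)
    return history


@torch.no_grad()
def evaluate(model, loader, device):
    """Classification accuracy of `model` on `loader`."""
    model.eval()
    correct = total = 0
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        correct += (model(x).argmax(dim=1) == y).sum().item()
        total += y.numel()
    return correct / total
```

Under this setup, forward facilitation would additionally require logging how quickly each new task reaches criterion (e.g., updates until a target accuracy), which is straightforward to add inside the training loop.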

The talk and paper were published at the CVPR 2020 virtual conference.
