18/07/2021

SketchEmbedNet: Learning Novel Concepts by Imitating Drawings

Alexander Wang, Mengye Ren, Richard Zemel

Keywords: Neuroscience and Cognitive Science, Human or Animal Learning; Neuroscience and Cognitive Science, Memory; Optimization, Combinatorial Optimization; Optimization, Submodular Optimization; Deep Learning, Embedding and Representation Learning

Abstract: Sketch drawings capture the salient information of visual concepts. Previous work has shown that neural networks are capable of producing sketches of natural objects drawn from a small number of classes. While earlier approaches focus on generation quality or retrieval, we explore properties of image representations learned by training a model to produce sketches of images. We show that this generative, class-agnostic model produces informative embeddings of images from novel examples, classes, and even novel datasets in a few-shot setting. Additionally, we find that these learned representations exhibit interesting structure and compositionality.
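
As a rough illustration of how class-agnostic embeddings of this kind can be evaluated in a few-shot setting, the sketch below builds a nearest-centroid (prototypical-style) classifier on top of a frozen encoder. The `encode` function here is an assumption for self-containment (a fixed random projection), not the paper's actual sketch-generation encoder; in practice one would substitute the trained model's image encoder.

```python
# Minimal sketch: few-shot classification with frozen image embeddings.
# Assumption: `encode` stands in for a trained encoder (e.g. the encoder of a
# sketch-generation model); here it is a fixed random projection so the
# example runs on its own.
import numpy as np

rng = np.random.default_rng(0)
EMBED_DIM = 64
IMAGE_DIM = 28 * 28
PROJECTION = rng.normal(size=(IMAGE_DIM, EMBED_DIM)) / np.sqrt(IMAGE_DIM)


def encode(images: np.ndarray) -> np.ndarray:
    """Map flattened images (N, IMAGE_DIM) to L2-normalized embeddings (N, EMBED_DIM)."""
    z = images.reshape(len(images), -1) @ PROJECTION
    return z / np.linalg.norm(z, axis=1, keepdims=True)


def few_shot_predict(support_images, support_labels, query_images):
    """Nearest-centroid classification: one prototype per class in embedding space."""
    z_support = encode(support_images)
    z_query = encode(query_images)
    classes = np.unique(support_labels)
    prototypes = np.stack(
        [z_support[support_labels == c].mean(axis=0) for c in classes]
    )
    # Assign each query to the class whose prototype is closest (Euclidean distance).
    dists = np.linalg.norm(z_query[:, None, :] - prototypes[None, :, :], axis=-1)
    return classes[dists.argmin(axis=1)]


if __name__ == "__main__":
    # Toy 5-way 1-shot episode; random "images" stand in for novel classes.
    n_way, n_shot, n_query = 5, 1, 3
    support = rng.normal(size=(n_way * n_shot, IMAGE_DIM))
    support_labels = np.repeat(np.arange(n_way), n_shot)
    query = support[np.repeat(np.arange(n_way * n_shot), n_query)] + 0.1 * rng.normal(
        size=(n_way * n_shot * n_query, IMAGE_DIM)
    )
    preds = few_shot_predict(support, support_labels, query)
    print("query predictions:", preds)
```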

The talk and paper were presented at the ICML 2021 virtual conference.
