15/06/2020

Daydream: Accurately Estimating the Efficacy of Optimizations for DNN Training

Hongyu Zhu, Amar Phanishayee, Gennady Pekhimenko


Abstract: The software/hardware stacks used by machine learning (ML) practitioners for modern deep neural network (DNN) training are complex and often heterogeneous. The efficacy of software-level optimizations can vary significantly when they are applied to different configurations. It is onerous and error-prone for ML practitioners and system developers to implement each optimization separately and determine which ones will improve performance in their own configurations. Unfortunately, existing profiling tools do not aim to answer predictive questions such as "How will optimization X affect the performance of my model?". This paper addresses this critical limitation and proposes a new profiling tool, Daydream, to help programmers efficiently explore the efficacy of DNN optimizations. Daydream models DNN execution with a fine-grained dependency graph based on low-level traces collected by CUPTI, and predicts runtime by simulating execution over this dependency graph. Daydream maps the low-level traces to DNN operations using domain-specific knowledge, and introduces a set of graph-transformation primitives that can easily model a wide variety of optimizations. We show that Daydream is able to model most mainstream DNN optimization techniques, and to accurately predict the efficacy of optimizations that will result in significant performance improvements.
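To make the mechanism concrete, below is a minimal sketch in Python of the two ideas the abstract names: simulating execution over a dependency graph of traced tasks, and a graph-transformation primitive that estimates an optimization's effect by rewriting that graph. All names here (Task, simulate, scale_tasks) and the toy three-task trace are illustrative assumptions, not Daydream's actual implementation, which operates on kernel-level CUPTI traces.

from dataclasses import dataclass, field

@dataclass
class Task:
    """One traced task, e.g., a CUDA kernel or a CPU op (hypothetical schema)."""
    name: str
    duration_us: float                          # measured duration, microseconds
    deps: list = field(default_factory=list)    # predecessor tasks

def simulate(tasks):
    """Predict runtime as the critical-path length of the dependency graph.

    Assumes `tasks` is topologically sorted, so every dependency's finish
    time is known before the dependent task is scheduled.
    """
    finish = {}
    for t in tasks:
        start = max((finish[d.name] for d in t.deps), default=0.0)
        finish[t.name] = start + t.duration_us
    return max(finish.values(), default=0.0)

def scale_tasks(tasks, predicate, factor):
    """Graph-transformation primitive (assumed): scale selected task durations.

    Modeling, say, a 2x-faster fused kernel amounts to scaling the matching
    tasks by 0.5 and re-simulating; the optimization itself never has to be
    implemented.
    """
    return [Task(t.name, t.duration_us * (factor if predicate(t) else 1.0), t.deps)
            for t in tasks]

# Toy trace: forward kernel -> backward kernel -> gradient allreduce.
fwd  = Task("conv_fwd", 120.0)
bwd  = Task("conv_bwd", 200.0, [fwd])
comm = Task("allreduce", 150.0, [bwd])
trace = [fwd, bwd, comm]

baseline  = simulate(trace)                                                # 470.0
optimized = simulate(scale_tasks(trace, lambda t: "conv" in t.name, 0.5))  # 310.0
print(f"predicted speedup: {baseline / optimized:.2f}x")                   # 1.52x

On this toy trace, halving the convolution kernels' durations predicts a 1.52x end-to-end speedup, and the same replay could be repeated for each candidate optimization without implementing any of them.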

The talk and the respective paper were published at the USENIX ATC 2020 virtual conference.
