11/10/2020

Dance Beat Tracking from Visual Information Alone

Fabrizio Pedersoli, Masataka Goto

Keywords: Musical features and properties, Rhythm, beat, tempo, MIR fundamentals and methodology, Multimodality

Abstract: We propose and explore the novel task of dance beat tracking, which can be regarded as a fundamental topic in the Dance Information Retrieval (DIR) research field. Dance beat tracking aims to detect musical beats from a dance video using only its visual information, without its audio (i.e., the dance music). Visual analysis of dance is important for achieving general machine understanding of dance, not limited to dance music. As a sub-area of Music Information Retrieval (MIR) research, DIR shares similar goals with MIR and needs to extract various high-level semantics from dance videos. While audio-based beat tracking has been thoroughly studied in MIR, visual beat tracking for dance videos has not yet been explored. We approach dance beat tracking as a time series classification problem and conduct several experiments using a Temporal Convolutional Neural Network (TCN) on the AIST Dance Video Database. We evaluate the proposed solution under different data splits, based on either "dancer" or "music". Moreover, we propose a periodicity-based loss that considerably improves the overall beat tracking performance according to several evaluation metrics.
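The abstract describes the approach only at a high level. Below is a minimal, hypothetical sketch (in PyTorch) of how dance beat tracking could be framed as frame-wise time-series classification with a dilated temporal convolutional network plus a periodicity-style regularizer. All layer sizes, the pose-feature input dimension, the `periodicity_loss` definition, and the loss weighting `lam` are illustrative assumptions and are not taken from the paper.

```python
# Illustrative sketch only -- not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TCNBlock(nn.Module):
    """One dilated 1-D convolution block with a residual connection."""

    def __init__(self, channels: int, kernel_size: int, dilation: int):
        super().__init__()
        padding = (kernel_size - 1) // 2 * dilation  # keep sequence length unchanged
        self.conv = nn.Conv1d(channels, channels, kernel_size,
                              padding=padding, dilation=dilation)
        self.norm = nn.BatchNorm1d(channels)

    def forward(self, x):
        return x + F.relu(self.norm(self.conv(x)))


class DanceBeatTCN(nn.Module):
    """Maps a sequence of per-frame visual features (e.g. flattened 2-D pose
    keypoints of assumed dimension `feat_dim`) to a per-frame beat activation."""

    def __init__(self, feat_dim: int = 34, channels: int = 64, n_blocks: int = 6):
        super().__init__()
        self.input_proj = nn.Conv1d(feat_dim, channels, kernel_size=1)
        self.blocks = nn.Sequential(
            *[TCNBlock(channels, kernel_size=3, dilation=2 ** i)
              for i in range(n_blocks)]
        )
        self.head = nn.Conv1d(channels, 1, kernel_size=1)

    def forward(self, features):               # features: (batch, feat_dim, frames)
        h = self.blocks(self.input_proj(features))
        return torch.sigmoid(self.head(h)).squeeze(1)   # (batch, frames) in [0, 1]


def periodicity_loss(activation: torch.Tensor, period: torch.Tensor) -> torch.Tensor:
    """Toy periodicity term: encourage the activation curve to agree with itself
    when shifted by the beat period (in frames). The paper's actual
    periodicity-based loss may be defined differently."""
    total = 0.0
    for b in range(activation.shape[0]):
        p = max(1, min(int(period[b]), activation.shape[1] - 1))
        total = total + F.mse_loss(activation[b, :-p], activation[b, p:])
    return total / activation.shape[0]


def training_step(model, features, beat_labels, beat_period, lam=0.1):
    """Hypothetical training objective: frame-wise binary cross-entropy against
    beat/non-beat labels plus the periodicity term, with an assumed weight."""
    activation = model(features)
    bce = F.binary_cross_entropy(activation, beat_labels)
    return bce + lam * periodicity_loss(activation, beat_period)
```

Doubling the dilation in each block grows the receptive field exponentially, so a small stack of convolutions can cover several seconds of video at the recording frame rate; a complete system would additionally need a peak-picking step to convert the frame-wise activation into discrete beat times.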

The talk and the corresponding paper were published at the ISMIR 2020 virtual conference.
