Representation Learning of Music Using Artist, Album, and Track Information
(1906.11783)
Published Jun 27, 2019 in cs.IR, cs.MM, cs.SD, and eess.AS
Abstract
Supervised music representation learning has been performed mainly using semantic labels such as music genres. However, annotating music with semantic labels is time-consuming and costly. In this work, we investigate the use of factual metadata such as artist, album, and track information, which is naturally attached to songs, for supervised music representation learning. The results show that each type of metadata captures distinct concept characteristics, and using them jointly improves overall performance.