Musical Word Embedding: Bridging the Gap between Listening Contexts and Music (2008.01190v1)
Published 23 Jul 2020 in cs.IR, cs.LG, cs.MM, and stat.ML
Abstract: Word embedding, pioneered by Mikolov et al., is a staple technique for word representation in NLP research and has also found popularity in music information retrieval tasks. Depending on the type of text data used for word embedding, however, vocabulary size and the degree of musical pertinence can vary significantly. In this work, we (1) train distributed representations of words using combinations of both general text data and music-specific data and (2) evaluate the resulting embeddings in terms of how well they associate listening contexts with musical compositions.
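The core idea, training word2vec-style embeddings on a mix of general and music-specific text and then probing context-to-music associations, can be illustrated with a minimal sketch. This is not the authors' code: it uses gensim's Word2Vec as a stand-in for the skip-gram model of Mikolov et al., and the two corpora and query words below are toy placeholders, not the datasets used in the paper.

```python
# Minimal sketch (not the paper's implementation): train skip-gram embeddings
# on a combined general-text + music-specific corpus, then query how a
# listening-context word relates to a musical descriptor.
from gensim.models import Word2Vec

# Hypothetical sentences standing in for a general text corpus
general_corpus = [
    ["the", "weather", "is", "sunny", "today"],
    ["she", "drove", "to", "work", "in", "the", "rain"],
]

# Hypothetical sentences standing in for a music-specific corpus (e.g. tags, reviews)
music_corpus = [
    ["relaxing", "acoustic", "guitar", "for", "studying"],
    ["upbeat", "dance", "track", "for", "a", "workout"],
]

# Mix both sources before training, mirroring the paper's combination of
# general and music-specific data
combined = general_corpus + music_corpus

model = Word2Vec(
    sentences=combined,
    vector_size=100,   # embedding dimensionality
    window=5,
    min_count=1,
    sg=1,              # skip-gram, as in Mikolov et al.
    epochs=50,
)

# Cosine similarity between a listening-context word and a musical descriptor
print(model.wv.similarity("studying", "acoustic"))
```

In practice the evaluation in the paper concerns how well such embeddings link listening contexts (e.g. activities or moods) to music; the similarity query above is only a schematic stand-in for that association task.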