Modeling Musical Attributes

Overview


In this body of work, we seek to model musical attributes from music audio signals. These attributes span instrumentation, rhythm, and sonority, as well as elements of performer expression.

Projects


Modeling Genre with Musical Attributes


Genre provides one of the most convenient groupings of music, but it is often regarded as poorly defined and largely subjective. In this work, we seek to answer whether musical genres can be modeled objectively via a combination of musical attributes, and whether audio features can mimic the behavior of these attributes. This work is done in collaboration with Pandora, and evaluation is performed using Pandora’s Music Genome Project® (MGP).
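The MGP annotations themselves are proprietary, but the shape of the comparison is easy to illustrate. The following is a minimal sketch, not the published pipeline: randomly generated matrices stand in for the human-labeled attribute vectors and the aggregated audio features, and the same classifier is cross-validated on each feature space.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical stand-in data: MGP annotations are proprietary, so random
# matrices take the place of real attribute vectors and audio descriptors.
rng = np.random.default_rng(0)
n_tracks = 500
attribute_vectors = rng.random((n_tracks, 48))   # human-labeled rhythm/instrumentation/sonority attributes
audio_features = rng.random((n_tracks, 120))     # aggregated audio descriptors (e.g., timbral/rhythmic statistics)
genres = rng.integers(0, 5, size=n_tracks)       # integer-coded genre labels

# Same classifier, two feature spaces: compare cross-validated genre accuracy.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
for name, X in [("human-labeled attributes", attribute_vectors), ("audio features", audio_features)]:
    scores = cross_val_score(clf, X, genres, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```

With real annotations and audio descriptors in place of the random matrices, the same comparison quantifies how much of the attribute-based genre structure the audio features recover.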

Modeling Rhythmic Attributes


Musical meter and attributes of rhythmic feel, such as swing, syncopation, and danceability, are crucial in defining musical style. In this work, we propose a number of tempo-invariant audio features for modeling meter and rhythmic feel. This work is done in collaboration with Pandora, and evaluation is performed using Pandora’s Music Genome Project® (MGP).
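As a simplified illustration of tempo invariance (a stand-in, not the features proposed in the papers), the sketch below autocorrelates an onset-strength envelope and samples it at fixed fractions and multiples of the estimated beat period, so the resulting descriptor is indexed in beats rather than in seconds.

```python
import numpy as np
import librosa

# Self-contained input: a synthetic 120-BPM click pattern stands in for a real track.
sr = 22050
click_times = np.arange(0, 8, 0.5)
y = librosa.clicks(times=click_times, sr=sr, length=8 * sr)

# Onset-strength envelope and a rough tempo estimate.
hop_length = 512
onset_env = librosa.onset.onset_strength(y=y, sr=sr, hop_length=hop_length)
tempo, _ = librosa.beat.beat_track(onset_envelope=onset_env, sr=sr, hop_length=hop_length)
frames_per_beat = 60.0 / tempo * sr / hop_length  # beat period, in envelope frames

# Autocorrelation captures rhythmic periodicities; sampling it at fixed
# fractions/multiples of the beat period removes the dependence on absolute tempo.
ac = librosa.autocorrelate(onset_env)
ac = ac / (ac.max() + 1e-9)
beat_fractions = np.array([0.25, 0.5, 0.75, 1.0, 1.5, 2.0, 3.0, 4.0])
lags = np.round(beat_fractions * frames_per_beat).astype(int)
lags = lags[lags < len(ac)]
rhythm_descriptor = ac[lags]
print(np.round(rhythm_descriptor, 3))
```

Because the lags are expressed relative to the beat period, two performances of the same pattern at different tempi map to (approximately) the same descriptor.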

Percussion Excitation


In this work, we present a system that seeks to classify expressive articulation techniques independently of the percussion instrument on which they are performed.
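A minimal sketch of the evaluation setup follows, using randomly generated stand-in data rather than the recorded dataset or the published system: articulation labels are predicted from per-stroke timbral features, and a leave-one-instrument-out split probes whether the classifier generalizes across instruments.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

# Hypothetical stand-in data: random values take the place of per-stroke
# descriptors and labels extracted from real drum recordings.
rng = np.random.default_rng(0)
n_strokes = 400
features = rng.random((n_strokes, 20))              # e.g., MFCC statistics per stroke
articulation = rng.integers(0, 4, size=n_strokes)   # e.g., normal hit, rim shot, buzz, flam
instrument = rng.integers(0, 4, size=n_strokes)     # e.g., snare, kick, hi-hat, tom

# Leave-one-instrument-out cross-validation: each fold tests on strokes from an
# instrument the classifier never saw in training, probing instrument independence.
scores = cross_val_score(SVC(kernel="rbf"), features, articulation,
                         groups=instrument, cv=LeaveOneGroupOut())
print("held-out accuracy per instrument:", np.round(scores, 3))
```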

Dataset of Expressive Percussion Techniques


The presented work makes use of a newly recorded dataset that encompasses a wide range of expressive percussion techniques performed on a standard four-piece drum kit.

Published Work:


  • Prockup, M., Ehmann, A., Gouyon, F., Schmidt, E., Celma, O., and Kim, Y., "Modeling Genre with the Music Genome Project: Comparing Human-Labeled Attributes and Audio Features." International Society for Music Information Retrieval Conference, Malaga, Spain, 2015. [PDF]

  • Prockup, M., Asman, A., Ehmann, A., Gouyon, F., Schmidt, E., and Kim, Y., "Modeling Rhythm Using Tree Ensembles and the Music Genome Project." Machine Learning for Music Discovery Workshop at the 32nd International Conference on Machine Learning, Lille, France, 2015. [PDF]

  • Prockup, M., Ehmann, A., Gouyon, F., Schmidt, E., and Kim, Y., "Modeling Rhythm at Scale with the Music Genome Project." IEEE Workshop on Applications of Signal Processing to Audio and Acoustics, New Paltz, New York, 2015. [PDF]

  • Prockup, M., Scott, J., and Kim, Y., "Representing Musical Patterns via the Rhythmic Style Histogram Feature." Proceedings of the ACM International Conference on Multimedia, Orlando, Florida, 2014. [PDF]

  • Prockup, M., Schmidt, E., Scott, J., and Kim, Y., "Toward Understanding Expressive Percussion through Content Based Analysis." Proceedings of the 14th International Society for Music Information Retrieval Conference, Curitiba, Brazil, 2013. [PDF]

  • Scott, J., Dolhansky, B., Prockup, M., McPherson, A., and Kim, Y. E., "New Physical and Digital Interfaces for Music Creation and Expression." Proceedings of the 2012 Music, Mind and Invention Workshop, Ewing, New Jersey, 2012. [PDF]