Modeling musical attributes: When searching, sorting, and recommending music, humans rely on a variety of attributes to judge similarity and make distinctions. By designing audio features that capture these attributes, we can develop models that automatically generate descriptions of music that are grounded and intuitive.
Expression in music performance: Expression is the creative variation a human imparts on a piece of music to make it their own. By combining signal processing techniques, machine learning algorithms, and music performance practices, we attempt to quantify what makes a musical performance expressive and creative.
Development of interactive live performance systems: A large subset of musical performance depends on the relationship between performers and their audience. By creating interactive media technologies for musical performance, musicians can provide contextually relevant information and build more intimate relationships with their audiences.
Developing interactive music and sound-based activities: Through the lens of the music technology we use every day, we can motivate and illustrate science, technology, engineering, arts, and math (STEAM) concepts in K-12 curricula.
Modeling Genre with Musical Attributes: Genre provides one of the most convenient groupings of music, but it is often regarded as poorly defined and largely subjective. In this work we seek to answer whether musical genres can be modeled objectively via a combination of musical attributes, and whether audio features can mimic the behavior of these attributes. This work is done in collaboration with Pandora, and evaluation is performed using Pandora’s Music Genome Project® (MGP).
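As a toy illustration of the attribute-based view of genre, songs can be treated as points in an attribute space and genres as regions of that space. The sketch below uses a simple nearest-centroid rule; the attribute names, vectors, and genre examples are entirely made up for illustration and are not MGP data or the method used in this work.

```python
import numpy as np

# Hypothetical attribute vectors: each song is described by rated musical
# attributes (here, invented ones: swing, syncopation, distortion).
songs = {
    "song_a": np.array([0.9, 0.8, 0.1]),  # strong swing/syncopation
    "song_b": np.array([0.1, 0.2, 0.9]),  # heavy distortion
}
# A few labeled example songs per genre, as attribute vectors.
genre_examples = {
    "jazz": np.array([[0.8, 0.9, 0.0], [1.0, 0.7, 0.2]]),
    "rock": np.array([[0.2, 0.3, 0.8], [0.0, 0.1, 1.0]]),
}

def classify(attrs, examples):
    """Assign the genre whose attribute centroid is nearest (Euclidean)."""
    centroids = {g: v.mean(axis=0) for g, v in examples.items()}
    return min(centroids, key=lambda g: np.linalg.norm(attrs - centroids[g]))

print(classify(songs["song_a"], genre_examples))  # jazz
print(classify(songs["song_b"], genre_examples))  # rock
```

The point of the sketch is only that, once attributes are quantified, genre membership becomes an objective geometric question rather than a purely subjective label.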
Modeling Rhythmic Attributes: Musical meter and attributes of rhythmic feel such as swing, syncopation, and danceability are crucial when defining musical style. In this work, we propose a number of tempo-invariant audio features for modeling meter and rhythmic feel. This work is done in collaboration with Pandora, and evaluation is performed using Pandora’s Music Genome Project® (MGP).
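To give a flavor of tempo invariance: one common trick is to resample an onset-strength envelope onto a beat-relative grid (a fixed number of bins per beat) before autocorrelating, so the resulting rhythm feature no longer depends on absolute tempo. The numpy sketch below is a simplified illustration of that idea, not the actual features proposed in this work; the envelope is a synthetic impulse train rather than real onset strengths.

```python
import numpy as np

def tempo_invariant_autocorr(onsets, sr, tempo_bpm, bins_per_beat=4, n_beats=4):
    """Autocorrelate an onset-strength envelope after resampling it to a
    fixed number of bins per beat, making the feature tempo-invariant."""
    step = (sr * 60.0 / tempo_bpm) / bins_per_beat   # samples per bin
    n_bins = int(len(onsets) / step)
    grid = np.arange(n_bins) * step                  # beat-relative sample grid
    env = np.interp(grid, np.arange(len(onsets)), onsets)
    env = env - env.mean()
    ac = np.correlate(env, env, mode="full")[len(env) - 1:]
    ac = ac / (ac[0] + 1e-9)                         # normalize by lag-0 energy
    return ac[: bins_per_beat * n_beats]             # lags up to n_beats beats

sr = 100  # envelope sample rate (Hz), synthetic

def impulse_train(tempo_bpm, seconds=8):
    """One synthetic onset per beat at the given tempo."""
    env = np.zeros(sr * seconds)
    env[:: int(sr * 60 / tempo_bpm)] = 1.0
    return env

# The same one-onset-per-beat rhythm at two tempos yields features whose
# strongest non-zero lag is one beat (bins_per_beat bins) in both cases.
f1 = tempo_invariant_autocorr(impulse_train(120), sr, 120)
f2 = tempo_invariant_autocorr(impulse_train(75), sr, 75)
```

In beat-relative units the two envelopes look alike, so their autocorrelations peak at the same lag even though the tempos differ.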
Expressive Percussion: Percussionists alter tempo, dynamics, timbre, and excitation techniques in order to perform expressively. In collaboration with the Music Industry department at Drexel, we developed a sample library containing a wide range of percussion instrument articulations as well as a novel mobile tablet interface to easily navigate this expressive library. In addition to being a stand-alone sample library for performance, it acts as a large dataset of sound samples that can be used for experimentation.
LiveNote: Orchestral Performance Companion: I am continuing development of the iNotes Project in collaboration with the Philadelphia Orchestra. We have developed a system that guides users through the performance in real time via a handheld application (iPhone app). Using audio features, we attempt to align the live performance audio with that of a previously annotated reference recording. The aligned position is transmitted to users’ handheld devices, and pre-annotated information about the piece is displayed synchronously. [video] [official PhilOrch page]
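The alignment step can be illustrated with dynamic time warping between two feature sequences. The sketch below is a minimal offline DTW in numpy over made-up one-dimensional features; the actual LiveNote system aligns audio features online against the annotated reference, and this is not its code.

```python
import numpy as np

def dtw_path(ref, live):
    """Align two feature sequences (rows = frames) by dynamic time warping.
    Returns a list of (ref_frame, live_frame) correspondences."""
    n, m = len(ref), len(live)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(ref[i - 1] - live[j - 1])  # local frame distance
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1],
                                 cost[i - 1, j - 1])
    # Backtrack from (n, m) to recover the frame-to-frame path.
    path, i, j = [], n, m
    while i > 0 or j > 0:
        path.append((i - 1, j - 1))
        i, j = min([(i - 1, j - 1), (i - 1, j), (i, j - 1)],
                   key=lambda ij: cost[ij])
    return path[::-1]

# Toy example: the "live" rendition lingers on the first and last notes.
ref = np.array([[0.0], [1.0], [2.0]])
live = np.array([[0.0], [0.0], [1.0], [2.0], [2.0]])
print(dtw_path(ref, live))  # [(0, 0), (0, 1), (1, 2), (2, 3), (2, 4)]
```

Given such a path, the current live frame maps to a position in the reference, which is where the pre-annotated program notes are looked up before being pushed to the handheld devices.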
The Science of Jazz: This unique performance allows attendees to experience the science behind the music, augmented with large-screen visuals and an interactive iPhone app illustrating the principles of frequency, harmony, and acoustics. In 2012, acclaimed keyboardist Marc Cary led his Focus Trio, along with featured soloist and Grammy Award-winning percussionist Will Calhoun and other special guests, in a tour de force of virtuoso jazz. [video]
The AppLab @ ExCITe: In collaboration with the Pennoni Honors College, and with founding support from Bentley Systems, the ExCITe Center welcomes the APP Lab. Open to all students across the university, the APP Lab will serve as a resource for those interested in mobile app development, whether novice or advanced. The APP Lab will also sponsor periodic app-related public events, University panels and demonstrations, and student research and development projects throughout the year.
Multi-touch Technology: I worked with a self-built multi-touch display, focusing on creating applications and SDKs to enhance and streamline the user experience. I worked with pathologists at the University of Pennsylvania and the Children's Hospital of Philadelphia to create a multi-touch medical image slide viewer. Additionally, we worked toward creating a simple SDK that allows non-technical developers to incorporate multi-touch into Adobe Flash applications.