Philly Tech Week 2011: Music Technology Research @ Drexel


Tuesday, April 26th, 2-5PM

At MET-lab, we are building new tools for creating, understanding, and interacting with music. At this event we will share some of our recent music technology research projects. Bringing together engineering and music not only helps develop a deeper understanding of creative musical expression but also offers new educational opportunities for promoting science and technology to students of all ages.

The event will feature a series of short presentations by current MET-lab members followed by an open house with hands-on demos of our current projects. Featured projects include dancing and piano-playing robots, an electronically-transformed acoustic piano, an interactive handheld guide for audience members at orchestra concerts, and new iPhone and iPad musical instruments.

An RSVP to Andrew McPherson (apm@drexel.edu) is requested to help us plan for the number of attendees. However, please feel free to attend whether or not you have sent an RSVP!

Schedule


2-3:30PM: Presentations in Drexel Main Auditorium (3141 Chestnut St.) (Please note location change!)

  • 2:00: Overview of MET-lab research
  • 2:15: Musically-aware humanoid robots
  • 2:25: DrexelCast: Interactive orchestral performance companion (in collaboration with the Philadelphia Orchestra)
  • 2:35: Music video games based on your own music library
  • 2:40: Predicting emotion in music through audio analysis
  • 2:45: New iPhone and iPad musical instruments
  • 2:50: The magnetic resonator piano: electromagnetically augmenting the acoustic grand piano
  • 3:10: Questions and wrap-up


3:30-5:00PM: Open house, MET-lab, Bossone room 405 (3120 Market St.): Live, hands-on demos of current MET-lab projects!

Featured Projects


Magnetic Resonator Piano
On a traditional piano, the performer has no way to alter the sound of a note once it begins. The magnetic resonator piano lets the pianist continuously shape the sound of every note. Electromagnets set the strings into vibration without the hammers striking them, and optical sensors on the keyboard record the detailed, multidimensional shape of every key press. The instrument is capable of infinite sustain, crescendos from silence, harmonics, pitch bends, and new timbres, all without the use of external loudspeakers. It has been used in concert performances across the country and will be set up for hands-on demos during this event.
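
To give a flavor of the idea of continuous control (and not the instrument's actual control mapping, sensor format, or hardware interface, which this page does not describe), here is a minimal Python sketch that maps a hypothetical continuous key-position value to the amplitude of a sinusoidal drive signal at a string's fundamental frequency:

    import numpy as np

    SR = 44100

    def actuator_signal(f0, key_position, duration, sr=SR):
        """Hypothetical drive signal for one string: a sinusoid at the string's
        fundamental, with amplitude following the continuous key position in [0, 1]."""
        t = np.arange(int(sr * duration)) / sr
        # Illustrative mapping only: deeper key press -> stronger electromagnet drive
        amplitude = np.clip(key_position, 0.0, 1.0) ** 2
        return amplitude * np.sin(2 * np.pi * f0 * t)

    # e.g., sustain middle C (~261.6 Hz) at half key depth for two seconds
    drive = actuator_signal(261.63, key_position=0.5, duration=2.0)

In the real instrument the key position varies continuously in time, so the drive amplitude would be updated sample by sample rather than held fixed as in this toy example.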

Music Emotion Recognition
Music is sometimes referred to as a "language of emotion," and it is natural for us to categorize music in terms of its emotional associations. In developing automated systems to organize music by emotional content, we face a problem that often lacks a well-defined answer: there may be considerable disagreement about the perception and interpretation of a song's emotions, or ambiguity within the piece itself. Furthermore, myriad features, such as harmony, timbre, interpretation, and lyrics, affect emotion, and the mood of a piece may change over its duration. Using data collected through MoodSwings, an online collaborative activity for music emotion annotation, we present a system that links acoustic models with human annotation data to estimate the emotional content of music.
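
As a rough sketch of how such a system might be assembled (the file names, labels, and feature choices below are placeholders, not the MoodSwings data or the lab's actual model), one could summarize each clip with a few acoustic descriptors and fit a regression to listener-supplied valence/arousal ratings, for example in Python with librosa and scikit-learn:

    import numpy as np
    import librosa
    from sklearn.linear_model import Ridge

    def emotion_features(path):
        """Summarize a clip with a few coarse acoustic descriptors."""
        y, sr = librosa.load(path, duration=30.0)
        mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
        centroid = librosa.feature.spectral_centroid(y=y, sr=sr)
        rms = librosa.feature.rms(y=y)
        return np.hstack([mfcc.mean(axis=1), centroid.mean(), rms.mean()])

    # Placeholder annotations: labels[i] = (valence, arousal) averaged over listeners
    clips = ["clip_001.mp3", "clip_002.mp3", "clip_003.mp3"]
    labels = np.array([[0.6, 0.7], [-0.3, 0.2], [0.1, -0.4]])

    X = np.vstack([emotion_features(p) for p in clips])
    model = Ridge(alpha=1.0).fit(X, labels)
    valence, arousal = model.predict(emotion_features("new_song.mp3")[None, :])[0]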

iNotes for Orchestral Performances
Many people enjoy live orchestral performances, but those without musical training may find it hard to relate to the music. We have developed a system that guides audience members through the performance in real time via a handheld application. Using chroma features and dynamic time warping, we attempt to align the live performance audio with that of a previously annotated reference recording. The aligned position is transmitted to users' handheld devices, and pre-annotated information about the piece is displayed synchronously.
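
The alignment step can be sketched offline in a few lines of Python with librosa (the real system must track the live feed incrementally, and the file names here are hypothetical):

    import librosa

    # Hypothetical recordings; the reference carries the program-note annotations
    ref, sr = librosa.load("reference_recording.wav")
    live, _ = librosa.load("live_excerpt.wav", sr=sr)

    # Chroma features summarize harmonic content largely independent of timbre
    ref_chroma = librosa.feature.chroma_cqt(y=ref, sr=sr)
    live_chroma = librosa.feature.chroma_cqt(y=live, sr=sr)

    # Dynamic time warping finds a frame-to-frame alignment path between the two
    D, wp = librosa.sequence.dtw(X=ref_chroma, Y=live_chroma, metric="cosine")

    # Each (ref_frame, live_frame) pair maps a live position onto the reference
    # timeline, where the pre-annotated commentary can be looked up and displayed
    ref_times = librosa.frames_to_time(wp[:, 0], sr=sr)
    live_times = librosa.frames_to_time(wp[:, 1], sr=sr)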

Musically-Aware Humanoids
Human musicians make use of substantial auditory and visual information throughout a performance. Our research focuses on providing humanoid robots with such capabilities (e.g., audio and visual beat detection, note onset and pitch detection, and basic control for musical keyboard performance), with the long-term goal of enabling a large humanoid to be an interactive participant in a live music ensemble.
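
For a flavor of the audio side of those capabilities, here is a minimal Python sketch using librosa's standard beat, onset, and pitch estimators (the file name is hypothetical, and this is not the robots' onboard pipeline, which also relies on visual cues):

    import librosa

    y, sr = librosa.load("ensemble_take.wav")   # hypothetical rehearsal recording

    # Tempo and beat locations the robot could synchronize its movements to
    tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
    beat_times = librosa.frames_to_time(beat_frames, sr=sr)

    # Note onsets, plus a rough fundamental-frequency track for following a melody
    onset_times = librosa.onset.onset_detect(y=y, sr=sr, units="time")
    f0 = librosa.yin(y, fmin=librosa.note_to_hz("C2"),
                     fmax=librosa.note_to_hz("C7"), sr=sr)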

Analysis-Synthesis of Expressive Guitar Performance
During performance, a guitarist employs many techniques to convey expressive intention, including pluck strength, pluck location, and choice of pick. These techniques often correspond to simultaneous variation of the source (guitarist-string interaction) and filter (resonant string) parameters. We propose modeling expression at the signal level via joint estimation of the source and filter parameters of plucked guitar sounds, in order to achieve realistic re-synthesis while incorporating the performer's expressive intentions. This research has several applications, including analyzing a particular performer's style and parameterizing an expressive synthesis engine for new musical interfaces.
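
As a small, self-contained illustration of the source-filter view of a plucked string (using the classic Karplus-Strong model rather than the estimation method described above), the Python sketch below separates an excitation whose energy reflects pluck strength from a damped delay-line "string" that filters it:

    import numpy as np

    def pluck(freq, duration, pluck_strength=1.0, damping=0.996, sr=44100):
        """Karplus-Strong plucked string: a noise burst (source) fed through a
        damped delay-line averaging filter (the resonant string)."""
        n = int(sr * duration)
        delay = int(round(sr / freq))
        buf = pluck_strength * (2 * np.random.rand(delay) - 1)   # excitation
        out = np.empty(n)
        for i in range(n):
            out[i] = buf[i % delay]
            # Averaging with damping models the string's frequency-dependent losses
            buf[i % delay] = damping * 0.5 * (buf[i % delay] + buf[(i + 1) % delay])
        return out

    note = pluck(196.0, 2.0, pluck_strength=0.8)   # a G3 with a moderate pluck

Varying pluck_strength and damping changes the source and filter parameters independently, which is the kind of separation the joint estimation above aims to recover from recorded performances.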

Music on Mobile Platforms
The increasing power and decreasing cost of mobile devices are driving the creation of novel and sophisticated virtual musical instruments. We have developed several mobile applications for music creation and manipulation. MusiCube is a 3D music sequencer with an interface well suited to a touchscreen device. SimpleDrum is a drum simulation app that lets users play the drums expressively by swinging the device like a drumstick. MET-amp allows a listener to dynamically remix a song to bring out different emotional qualities.
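
To illustrate the kind of gesture detection an app like SimpleDrum relies on (a hypothetical sketch in Python rather than device code, and not the app's actual algorithm), a drum hit can be treated as a sharp peak in the accelerometer magnitude:

    import numpy as np

    def detect_hits(accel_magnitude, sample_rate=100.0, threshold=2.5):
        """Flag local peaks above a threshold (in g) in an accelerometer
        magnitude trace as drum hits; returns hit times in seconds."""
        hits = []
        for i in range(1, len(accel_magnitude) - 1):
            a = accel_magnitude[i]
            if a > threshold and a >= accel_magnitude[i - 1] and a > accel_magnitude[i + 1]:
                hits.append(i / sample_rate)
        return hits

    # Synthetic trace: mostly ~1 g, with two sharp swings of the device
    trace = np.ones(300)
    trace[[80, 81, 200]] = [3.0, 2.0, 3.4]
    print(detect_hits(trace))   # -> [0.8, 2.0]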

ALF: Audio processing Library for Flash
We have developed tools that provide sophisticated signal processing routines to web developers using the Flash platform. We hope this library encourages the growth of audio-centric Flash applications for research and entertainment. Our work led to a collaboration with the Drexel RePlay Lab to develop a unique music-centric game: Pulse 2, a side-scrolling platform game that lets players select any song from their personal music library to use with the game. The game environment and gameplay are created through computational analysis of the music.
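
ALF itself is a Flash/ActionScript library, but the kind of analysis that drives Pulse 2's level generation can be sketched in a few lines of Python (the file name and event mapping are hypothetical, not the game's actual logic):

    import numpy as np
    import librosa

    y, sr = librosa.load("players_song.mp3")   # song chosen by the player

    # Beats set the basic rhythm of the level; onset strength marks busier moments
    tempo, beats = librosa.beat.beat_track(y=y, sr=sr)
    onset_env = librosa.onset.onset_strength(y=y, sr=sr)
    beat_times = librosa.frames_to_time(beats, sr=sr)

    # Toy mapping: an obstacle on every beat, a bigger hazard when the onset
    # strength at that beat is in the top quartile
    strength = onset_env[beats]
    threshold = np.percentile(strength, 75)
    level_events = [("hazard" if s > threshold else "obstacle", t)
                    for t, s in zip(beat_times, strength)]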

For more information on our current work, check out our research page.


If you have any questions about this event, please contact Andrew McPherson (apm@drexel.edu).