Please visit our DrexelCast page for the most recent updates regarding this project.
Humans with even a small amount of musical training have little trouble aligning a performance to a musical score. In other words, they can listen to a recording and follow along in the printed music. For a computer, however, this task is very difficult. This project aims to develop an algorithm that enables a computer to do exactly this.
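The page does not name a specific alignment algorithm, but a common starting point for score-to-audio alignment is dynamic time warping (DTW) over feature sequences extracted from the score and the recording. The sketch below is only an illustration of that general idea, using toy scalar features and a hypothetical `dtw_path` helper rather than the project's actual method:

```python
def dtw_path(ref, perf, dist=lambda a, b: abs(a - b)):
    """Align a performance feature sequence to a reference via dynamic time warping.

    ref, perf: sequences of features (scalars here; chroma vectors in practice).
    Returns (total alignment cost, warping path as (ref_index, perf_index) pairs).
    """
    n, m = len(ref), len(perf)
    INF = float("inf")
    # cost[i][j] = minimal cumulative cost of aligning ref[:i] with perf[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = dist(ref[i - 1], perf[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # reference advances
                                 cost[i][j - 1],      # performance advances (tempo stretch)
                                 cost[i - 1][j - 1])  # both advance together
    # Backtrack from the end to recover the warping path.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        _, (i, j) = min((cost[i - 1][j - 1], (i - 1, j - 1)),
                        (cost[i - 1][j], (i - 1, j)),
                        (cost[i][j - 1], (i, j - 1)))
        
    return cost[n][m], path[::-1]


# A performance that lingers on some notes still aligns to the reference at zero cost.
total, path = dtw_path([1, 2, 3, 4], [1, 1, 2, 3, 3, 4])
```

The warping path maps each moment of the performance back to a position in the reference, which is exactly the correspondence a score follower needs. A real-time system would use an online variant of this idea, since plain DTW requires the whole recording in advance.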
One application of this work is automatic accompaniment. Currently, computer programs that make music in real time with humans require the human to put effort into aligning themselves with the computer. A better system would align itself with the player, following, for example, the variations the musician makes in tempo (speed).
At the MET-lab, we hope to share this technology with the Philadelphia Orchestra for use in their concert multicasts. First, a reference (a musical score or a previous recording of the piece being performed) is tagged with program notes, commentary, and text translations at the appropriate points in the music. Then, by following along in the score (or other reference) in real time during the performance, the computer can display the relevant tags at the corresponding moments. We hope to deploy this technology on the iPhone/iPod Touch so that users can read the commentary in the palm of their hand while watching the multicast on the projection screens. The touchscreen will also let the user choose among several available feeds. For instance, one feed could contain translations of the lyrics as they are sung, another could contain musical commentary aimed at non-musicians, and a third could contain musical commentary aimed at trained musicians.
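As a rough illustration of the tag-display step described above, here is a hypothetical sketch (the `TagFeed` class and the use of beat numbers as score positions are our assumptions, not the project's actual design): once the tracker reports how far into the piece the orchestra is, each feed only needs to release the tags whose positions have been reached.

```python
import bisect

class TagFeed:
    """One commentary feed: tags pinned to score positions (here, beat numbers)."""

    def __init__(self, tags):
        # tags: iterable of (score_position, text) pairs
        self.tags = sorted(tags)
        self.positions = [pos for pos, _ in self.tags]
        self._shown = 0  # index of the next tag not yet displayed

    def update(self, score_position):
        """Return the newly reached tags, given the tracker's current score position."""
        idx = bisect.bisect_right(self.positions, score_position)
        new = [text for _, text in self.tags[self._shown:idx]]
        self._shown = max(self._shown, idx)
        return new


# One feed per audience: translations, lay commentary, musician commentary.
lyrics = TagFeed([(0, "Opening chorus (translation)"), (16, "Soloist enters (translation)")])
```

Each of the feeds mentioned above (translations, commentary for non-musicians, commentary for trained musicians) would be its own `TagFeed`, all driven by the same tracker position, so the user's choice on the touchscreen only changes which feed's output is rendered.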
Real-Time Score Tracking for Personalization of Live Orchestral Performances - ISMIR 2008