Humanoid Instrumental Performance

The Hubos cover 'Come Together' by The Beatles.


    Playing and enjoying music are integral to many people's daily lives. As humans and robots interact more frequently, developing systems that understand music and the other performing arts becomes increasingly important. Several robots have been built to play musical instruments, but most are specialized (non-humanoid) designs that do not attempt to model the specific gestures and movements of a human performer. Musical instruments are designed to be played by humans, and developing humanoid performers offers the potential to reveal new insights into controlling instruments in a musically expressive manner. Simply playing the notes as printed on a sheet of music, however, is not enough to make a performance musical. Without expression and interaction between musicians, music becomes flat, boring, and emotionless. For a robot to be considered a musical participant, it must be able to interpret environmental feedback (both audio and visual) and respond to it.


      Our overall goal is to allow for richer human-robot interactions by enabling humanoids not only to play instruments but also to respond to cues and feedback from their environment. The final system should do the following:

    • Run in real time in order to respond to feedback from the environment
    • Use audio feedback to self-calibrate and detect errors while playing
    • Run on an adult-sized humanoid capable of playing human instruments

    Current Abilities

    • Play PVC pitched pipe instruments, snare drum, and cymbals
    • Perform in a multi-robot group using UDP signals to coordinate the robots
    • Use audio and force/torque feedback to detect whether a note on the PVC instrument was played correctly, with up to 80% accuracy
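The multi-robot coordination above can be sketched as a simple UDP cue broadcast: one robot sends a timing cue and the others block until it arrives. The message format, port number, and function names below are assumptions for illustration only; the actual protocol used by the robots is not documented here.

```python
import json
import socket
import time

CUE_PORT = 9000  # hypothetical port, chosen for illustration


def broadcast_cue(measure, beat, addr="255.255.255.255"):
    """Broadcast a timing cue so every robot starts the next phrase together."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    msg = json.dumps({"measure": measure, "beat": beat, "sent_at": time.time()})
    sock.sendto(msg.encode(), (addr, CUE_PORT))
    sock.close()


def wait_for_cue(timeout=5.0):
    """Block until a cue arrives (or the timeout expires); return it as a dict."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", CUE_PORT))
    sock.settimeout(timeout)
    try:
        data, _ = sock.recvfrom(1024)
    finally:
        sock.close()
    return json.loads(data.decode())
```

A connectionless, broadcast-style protocol like this suits musical timing: a late cue is better dropped than replayed, so the delivery guarantees of TCP would add latency without helping.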
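The audio half of that error check can be illustrated with a basic FFT pitch estimate: record the strike, find the dominant frequency, and compare it against the expected pitch. This is only a sketch under assumed names and an assumed ~3% tolerance (roughly half a semitone); the published system also fuses force/torque readings, which are omitted here.

```python
import numpy as np


def detected_pitch(signal, sample_rate):
    """Estimate the dominant frequency (Hz) of a recorded note via the FFT."""
    windowed = signal * np.hanning(len(signal))  # taper to reduce leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]


def note_is_correct(signal, sample_rate, expected_hz, tolerance=0.03):
    """Flag a strike as wrong if its pitch is off by more than ~3 percent
    (about half a semitone) -- an assumed threshold, not the published one."""
    f = detected_pitch(signal, sample_rate)
    return abs(f - expected_hz) / expected_hz <= tolerance
```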

    More Videos

    RoboNova plays 'Twinkle, Twinkle, Little Star'.

    RoboNova self-calibrates for position and keypress distance using audio feedback. It is able to adapt when the keyboard location changes.
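The keypress-distance part of this calibration can be thought of as a search over press depths: press, listen, and keep the shallowest depth that reliably sounds a note. The sketch below assumes a hypothetical robot primitive `press_and_listen`; the actual RoboNova routine is not shown here.

```python
def calibrate_press_depth(press_and_listen, depths, trials=3):
    """Return the shallowest press depth that sounds the note on every trial,
    or None if no tested depth works.

    press_and_listen(depth) -> bool is a hypothetical robot primitive:
    press the key to `depth` and report whether a note was heard in the
    microphone signal.
    """
    for depth in sorted(depths):
        # Require several consecutive successes so a lucky strike at a
        # marginal depth is not accepted as the calibrated value.
        if all(press_and_listen(depth) for _ in range(trials)):
            return depth
    return None
```

Rerunning this search whenever the expected note fails to sound is one way a robot could adapt after the keyboard is moved.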

    RoboNova plays 'Chopsticks'.

    Hubo's first attempt at the piano.


    Publications

  • A. M. Batula, M. V. Colacot, D. K. Grunberg, and Y. E. Kim, "Using Audio and Haptic Feedback to Detect Errors in Humanoid Musical Performances," in Proceedings of the International Conference on New Interfaces for Musical Expression, 2013. [PDF]

  • D. K. Grunberg, A. M. Batula, and Y. E. Kim, "Towards the Development of Robot Musical Audition," in Proceedings of the 2012 Music, Mind, and Invention Workshop (MMI), 2012. [PDF]
  • Y. E. Kim, D. K. Grunberg, A. M. Batula, D. M. Lofaro, J.-H. Oh, and P. Y. Oh, "Enabling Humanoid Musical Interaction and Performance," in Proceedings of the 2011 International Conference on Collaboration Technologies and Systems, 2011. [PDF]
  • A. M. Batula and Y. E. Kim, "Development of a Mini-Humanoid Pianist," in Proceedings of the 10th IEEE-RAS International Conference on Humanoid Robots, December 2010. [PDF]
  • Y. E. Kim, A. M. Batula, D. Grunberg, D. Lofaro, J. Oh, and P. Oh, "Developing Humanoids for Musical Interaction," in Proceedings of the International Conference on Intelligent Robots and Systems, October 2010. [PDF]