Event Details

Free-space Gesture Mappings for Music and Sound

Presenter: Gabrielle Odowichuk
Supervisors: Dr. Peter Driessen and Dr. George Tzanetakis

Date: Mon, July 30, 2012
Time: 13:30
Place: ECS 660

ABSTRACT

This seminar will present a set of software applications for real-time, gesturally controlled interactions with music and sound. The applications for each system are varied but related, addressing several open problems in the field of audio and music technology. The three systems presented in this work capture 3D human motion with spatial sensors and map position data from the sensors onto sonic parameters. Two different spatial sensors are used interchangeably to perform motion capture: the Radiodrum and the Xbox Kinect. The first two systems are aimed at creating immersive, virtually augmented environments. The first application uses human gesture to move sounds spatially within a 3D surround-sound environment by physically modeling the movement of sound through a space. The second application is a gesturally controlled, self-organized music browser in which songs are clustered into groups based on auditory similarity. The third application is specifically aimed at extending musical performance through the development of a digitally augmented vibraphone.
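
For readers unfamiliar with this kind of gesture-to-sound mapping, the sketch below illustrates the general idea in Python. It is not taken from the presented systems; the function name, axis ranges, and choice of parameters (stereo pan and gain) are illustrative assumptions about how a 3D hand position reported by a sensor such as the Kinect might be scaled onto sonic parameters.

    # Illustrative sketch only: map a sensed 3D hand position (metres)
    # onto two sonic parameters. Ranges and names are hypothetical.
    def map_position_to_sound(x, y, z,
                              x_range=(-1.0, 1.0),  # left/right extent of capture volume
                              z_range=(0.5, 3.0)):  # near/far extent of capture volume
        def normalize(value, lo, hi):
            return min(max((value - lo) / (hi - lo), 0.0), 1.0)

        pan = normalize(x, *x_range) * 2.0 - 1.0   # -1 (hard left) .. +1 (hard right)
        gain = 1.0 - normalize(z, *z_range)        # louder as the hand moves closer
        return pan, gain

    # Example: hand slightly right of centre, about 1.5 m from the sensor
    pan, gain = map_position_to_sound(x=0.3, y=1.1, z=1.5)
    print(f"pan={pan:+.2f}, gain={gain:.2f}")

In an actual performance system, values like these would typically be smoothed over time and sent to an audio engine, but that machinery is omitted here.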