Researchers Examine Augmenting the Body With Musical Prosthetics

Friday, July 19, 2013

Researchers at McGill University's Input Devices and Music Interaction Lab have fused digital technology, 3D printing, and modern dance in a project exploring how augmented prosthetics might shape future performance.

Researchers at the Input Devices and Music Interaction Lab (IDMIL) at McGill University recently released a video documentary on the design and fabrication of prosthetic digital instruments for music and dance performance.

The instruments are the culmination of a three-year project in which the designers worked closely with dancers, musicians, composers and a choreographer. The goal was to develop instruments that were visually striking, used advanced sensing technologies, and were rugged enough for extensive use in performance.

Digital music prosthetic

The result is a set of transparent instruments with complex forms. Many are lit from within, and include articulated spines, curved visors and ribcages.

Unlike most computer-music control interfaces, the instruments work both as hand-held, manipulable controllers and as wearable, movement-tracking extensions of the body. The researchers sought out transparent conductive materials to combine touch controls with the desired futuristic aesthetic.

The prosthetic instruments were designed and developed by Ph.D. researchers Joseph Malloch and Ian Hattwick under the supervision of IDMIL director Marcelo Wanderley. Following conceptualization and prototyping, the researchers used digital fabrication technologies such as laser-cutters and 3D printers to produce the instruments.

3D-printed digital music prosthetic

Each of the nearly thirty working instruments produced for the project has embedded sensors, power supplies and wireless data transceivers, allowing a performer to control the parameters of music synthesis and processing in real time through touch, movement, and orientation.
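The control chain described above can be sketched in a few lines: a wireless packet from an instrument is decoded, and its touch and orientation readings are rescaled into synthesis parameters. The packet layout, field names, and parameter ranges below are invented for illustration and are not taken from the IDMIL designs.

```python
import struct

# Hypothetical 12-byte packet from an instrument's wireless transceiver:
# three little-endian floats: touch pressure (0-1), pitch and roll (degrees).
PACKET_FORMAT = "<fff"

def parse_packet(data: bytes) -> dict:
    touch, pitch, roll = struct.unpack(PACKET_FORMAT, data)
    return {"touch": touch, "pitch": pitch, "roll": roll}

def to_synth_params(sensors: dict) -> dict:
    """Rescale raw sensor values into synthesis parameters (example ranges)."""
    return {
        # Clamp touch pressure to a 0-1 amplitude.
        "amplitude": max(0.0, min(1.0, sensors["touch"])),
        # Map -90..90 degrees of pitch onto a 200-2000 Hz filter cutoff.
        "cutoff_hz": 200 + (sensors["pitch"] + 90) / 180 * 1800,
    }

packet = struct.pack(PACKET_FORMAT, 0.5, 0.0, 10.0)
print(to_synth_params(parse_packet(packet)))
```

In a real instrument this decode-and-rescale step would run continuously on each incoming packet, updating the synthesizer in real time.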

The signals produced by the instruments are routed through an open-source peer-to-peer software system the IDMIL team has developed for designing the connections between sensor signals and sound synthesis parameters.
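A mapping layer of this kind can be illustrated with a toy router that connects named sensor signals to synthesis parameters through user-defined transforms. This is only a sketch of the general idea, not the IDMIL team's actual open-source software, and all signal names here are hypothetical.

```python
class MappingSession:
    """Toy router linking named sensor signals to synthesis parameters."""

    def __init__(self):
        # source signal name -> list of (destination name, transform function)
        self.connections = {}

    def connect(self, src, dest, transform=lambda x: x):
        """Declare a mapping from a sensor signal to a synthesis parameter."""
        self.connections.setdefault(src, []).append((dest, transform))

    def update(self, src, value):
        """Propagate a new sensor value to every connected destination."""
        return {dest: fn(value) for dest, fn in self.connections.get(src, [])}

session = MappingSession()
# Route a spine-bend sensor to a reverb mix, rescaled from 0-1023 to 0-1.
session.connect("/spine/bend", "/synth/reverb_mix", lambda v: v / 1023)
print(session.update("/spine/bend", 512))
```

Keeping the mappings as data, rather than hard-coding them, is what lets designers and composers reroute sensors to different sound parameters between rehearsals without touching the instruments themselves.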

The instruments were featured in recent productions of the piece “Les Gestes” for two dancers and two musicians. The piece was developed in collaboration with the IDMIL researchers, and toured parts of Canada and Europe this past spring.

The full 15-minute documentary video below explains the development process and shows the instruments in action:


By 33rd Square