
ISOLATE // 2019

Live-Electronics Performance in Mixed-Order Ambisonics (MH3.1)

Duration // 9'

Written in 2019, ISOLATE is the first piece composed exclusively for performance with my handmade electronics performance interface, MH3.1, also known as “Franky”.

In this work I explore compositional concepts such as performer agency in live electronic music, the harsh juxtaposition of sonic elements, complex nested gestural materials, and density and texture in higher-order ambisonics. As for the sounds themselves, the materials used to create this composition are widely varied, ranging from closely recorded vocal samples (breath, vocal fry, and so on) to more intense methods of digital synthesis such as granular synthesis and Tom Mudd’s gutter synthesis. The result of all of these factors is a work that inhabits both periods of near stasis and the highly chaotic, in which musical materials and gestures continuously fracture and constellate. ISOLATE is presented in mixed-order ambisonics: many elements are generated in real time and are diffused and output natively in fifth-order ambisonics, while fixed-media cues are further manipulated using Master Hand and diffused in first-order ambisonics.
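To make the mixed-order scheme concrete: an ambisonic stream of order N carries (N+1)² spherical-harmonic channels, so the fifth-order material occupies 36 channels while the first-order fixed-media layer occupies only 4. Below is a minimal Python sketch of first-order (B-format, FuMa-weighted) encoding of a mono source. It illustrates the general principle only, not the piece’s actual Max/MSP signal path; the source signal, azimuth, and elevation values are hypothetical.

```python
# Minimal sketch: encode a mono signal to first-order B-format (FuMa W, X, Y, Z).
# Illustration of the principle only -- not the actual patch used in ISOLATE.
import numpy as np

def encode_foa(signal, azimuth, elevation):
    """Encode a mono signal to first-order B-format.

    azimuth and elevation are in radians; azimuth 0 points straight ahead,
    with positive azimuth to the left (standard ambisonic convention).
    """
    w = signal * (1.0 / np.sqrt(2.0))                 # omnidirectional component
    x = signal * np.cos(azimuth) * np.cos(elevation)  # front-back figure-eight
    y = signal * np.sin(azimuth) * np.cos(elevation)  # left-right figure-eight
    z = signal * np.sin(elevation)                    # up-down figure-eight
    return np.stack([w, x, y, z])

# Hypothetical example: one second of noise placed 45 degrees left, slightly raised.
sr = 48000
mono = np.random.uniform(-1.0, 1.0, sr)
bformat = encode_foa(mono, azimuth=np.pi / 4, elevation=np.pi / 12)
print(bformat.shape)  # (4, 48000): the four first-order channels
```

A higher-order encoder extends the same idea, adding the remaining spherical-harmonic components up to the chosen order before the combined streams are decoded to the loudspeaker array.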

As for the interface itself, “Franky” is a real-time electronics performance interface developed for use with Max/MSP and Wekinator. Originally designed as a spin on the core design concepts of video game controllers, such as the ill-fated Nintendo Power Glove, the interface consists of a specially made glove and exoskeleton that places five small two-axis control sticks at the user’s fingertips, alongside a ribbon sensor and a three-axis gyroscope. With some practice, this type of interface allows nuanced control over many parameters of a performance system. Beyond the sensors themselves, the system is further augmented both by carefully tuned mappings and by the use of Wekinator as a platform for machine-learning gesture recognition. This build of the interface is the second iteration in what is currently planned as an ongoing project.
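As a sketch of how such an interface typically feeds Wekinator: by default Wekinator listens for OSC messages on port 6448 at the address /wek/inputs, expecting one flat list of floats per frame. The Python snippet below (using the python-osc library) shows that pattern under the assumption of 14 continuous inputs (ten stick axes, one ribbon value, three gyro axes); read_sensors() is a hypothetical stand-in for however the glove’s values actually arrive, and none of this is MH3.1’s actual firmware or patch.

```python
# Minimal sketch: stream controller data to Wekinator over OSC.
# Assumes Wekinator's default input port (6448) and address (/wek/inputs).
import time
import random
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 6448)  # Wekinator's default listening port

def read_sensors():
    # Hypothetical placeholder for the glove's hardware readout:
    # 10 stick axes + 1 ribbon value + 3 gyro axes = 14 inputs.
    return [random.random() for _ in range(14)]

while True:
    # Wekinator expects one flat list of floats per frame on /wek/inputs.
    client.send_message("/wek/inputs", read_sensors())
    time.sleep(0.01)  # roughly 100 Hz update rate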
