MyoSpat is an interactive system that allows musicians to manipulate timbral and spatial properties of sound, as well as light projections, through hand gestures. The system aims to facilitate the creative use of audio processing during live musical performance through easily learnable gestures. MyoSpat is built from the Myo armband and Myo Mapper; machine learning models designed with Wekinator and implemented with ml.lib; and an audio-visual engine developed in Pure Data.
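The pipeline described above (gesture classification feeding an audio engine) can be sketched in miniature as follows. This is an illustrative example only, not MyoSpat's actual code: the gesture labels, the 0..1 `intensity` control value, and the parameter names are hypothetical stand-ins for whatever the Wekinator/ml.lib models and the Pure Data engine actually exchange (typically via OSC messages).

```python
# Illustrative sketch only: map a classified hand gesture plus a
# continuous control value to audio-processing and spatialisation
# parameters. Gesture names and parameters are hypothetical.
from typing import Dict

# Hypothetical gesture classes, as a trained classifier might output them.
GESTURE_PARAMS = {
    "open_hand":  {"reverb_mix": 0.8, "pan": 0.0},   # diffuse, centred sound
    "fist":       {"reverb_mix": 0.1, "pan": 0.0},   # dry, centred sound
    "point_left": {"reverb_mix": 0.4, "pan": -1.0},  # sound panned left
}

def gesture_to_params(label: str, intensity: float) -> Dict[str, float]:
    """Scale a gesture's base reverb mix by a 0..1 intensity value."""
    base = GESTURE_PARAMS[label]
    intensity = max(0.0, min(1.0, intensity))  # clamp to the valid range
    # Scale the wet/dry mix by intensity; keep panning fixed per gesture.
    return {"reverb_mix": base["reverb_mix"] * intensity,
            "pan": base["pan"]}

print(gesture_to_params("open_hand", 0.5))  # half-strength reverb, centred
```

In the real system this mapping is learned rather than hand-coded: Wekinator is trained by example, so the performer demonstrates gestures and the model generalises the gesture-to-parameter relationship.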

In addition to being used in performances of The Wood and The Water, MyoSpat has been used to deliver the HarpCI workshop at the University of Southampton and the MiXD workshop at Berklee College of Music - Valencia.