MyoSpat is an interactive system that allows musicians to manipulate timbral and spatial properties of sound, as well as light projections, through hand gestures. The system aims to facilitate the creative use of audio processing during live musical performance through easily learnable hand gestures. MyoSpat is built using the Myo armband and Myo Mapper, machine learning models implemented in Wekinator, and an audio-visual engine developed in Pure Data.