gSpat

Posted in Performance, Research, Software Development

gSPAT is an interactive system that maps gestural data from a commercial gestural controller to spatialisation parameters, such that the results are musically meaningful to both performer and audience.

The system is currently under development. To date, it has used the Myo armband as its input device, which sends OSC and MIDI data through the MyoMusic application. The Myo data is classified with a Support Vector Machine (via ml-lib) and the result is then used to drive a sound spatialiser.
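As a rough illustration of this pipeline only, the sketch below uses Python with python-osc and scikit-learn in place of the actual Max/Pd ml-lib setup; the OSC addresses, ports, training data and gesture-to-azimuth mapping are all assumptions for the example, not details of the gSPAT implementation.

# Minimal sketch of a gesture-to-spatialisation pipeline (not the gSPAT patch itself):
# receive Myo frames over OSC, classify them with an SVM, forward a spatial position.
import numpy as np
from sklearn.svm import SVC
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

# Placeholder training data; in practice, labelled gesture examples would be
# recorded from the armband first (here: 8-channel frames, 3 gesture classes).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(60, 8))
y_train = np.repeat([0, 1, 2], 20)
clf = SVC(kernel="rbf").fit(X_train, y_train)

# Each recognised gesture class drives a spatialisation parameter
# (assumed mapping: class -> azimuth in degrees).
AZIMUTH_FOR_CLASS = {0: 0.0, 1: 120.0, 2: 240.0}

spat = SimpleUDPClient("127.0.0.1", 9000)  # assumed spatialiser OSC port

def on_emg(address, *channels):
    """Classify one incoming frame and send the mapped azimuth to the spatialiser."""
    frame = np.array(channels).reshape(1, -1)
    gesture = int(clf.predict(frame)[0])
    spat.send_message("/spat/azimuth", AZIMUTH_FOR_CLASS.get(gesture, 0.0))

dispatcher = Dispatcher()
dispatcher.map("/myo/emg", on_emg)  # assumed address for incoming Myo data
BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher).serve_forever()

Classifying discrete gestures before mapping, rather than wiring raw sensor values straight to parameters, is what lets the movement-to-sound relationship stay legible to both performer and audience.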

Recently, gSPAT was used in the realisation of VoicErutseG v0.1, an interactive performance of Sequenza III by L. Berio and Stripsody by C. Berberian, performed at Frontiers Festival and electroAQustica.

It has also been used to develop prototypes of touch-less Virtual Instruments and Mixed Reality interaction systems.

Experiments and applications of gSPAT have shown that it can be used to perform different pieces without adding any choreographic indications for the performer. This results in a ‘natural’ experience for the performer and makes the system applicable to different contexts such as Mixed Reality and touch-free music interaction.

gSPAT has been presented at RESCON14, MTFScandi, K-Array labs and ICAD15 (paper, poster). It was also employed during the Moog Sound Lab Residency at Birmingham City University.

 
