This is an example of bio-data from a dancer (Alyssa) driving both real-time sound generation and image generation. The sound is generated in Kyma and the image in Unity3D. I am working with collaborators at ASU (Julie, Alyssia, Andreea) to develop a new dance work using these techniques; this is early material from that work.
InTensions & Bodytext
Collaborative works by: Simon Biggs, Sue Hawksley, Garth Paine
Bodytext is a performance work that uses speech, movement and the body to question and offer insight into the relations between movement, agency, representation and language.
Using real-time motion capture, voice recognition and interpretive language systems, the dancer’s movement and speech are acquired and remediated within the performance space. The dancer’s speech is re-written within a projected digital display, the text animated by, and re-presenting, the performer’s gestures. Through their movements, the performer can cause texts to interact and recombine with one another, changing their grammatical composition in the projection.
Visual Artist: Simon Biggs
Choreographer/performer: Sue Hawksley
Sound Artist: Garth Paine
Bodytext was completed through an artists’ residency at the Bundanon Trust (Australia) and a subsequent collaborative residency with sound artist Garth Paine at the Virtual Interactive Performance Research Environment (VIPRe) Lab at the University of Western Sydney.
Date: Saturday 16th October
Venue: Woodend Barn, Banchory
Cost: £9 / £7