Darker Edge of Night – Interactive Dance work

Over the last two years, I have been working with dancer Hellen Sky to develop an interactive dance work that uses bio-sensors on her body to drive realtime sonification/synthesis of sound and generation of images. Hellen has placed some video on YouTube, which includes elements of the first performance season in 2008 in Melbourne, Australia, and a few video clips from a recent development period at the Perth Institute of Contemporary Arts in August 2009. Here is one of the clips from this year – all the sound is generated from Hellen’s voice (she is wearing a radio mic) and manipulated using the bio-sensors in realtime; the images are also controlled by bio-sensing. Some background info is available on the website of the Australia Council for the Arts, the funding agency that supported the research.


Oct. 21 lecture/performance at Concordia University

I am giving a lecture/performance at Concordia University on Oct 21 from 12:30–3:30. The event is free and open to the public.


The Thinking Head Project – PhD Scholarship on offer

The Thinking Head Project

FYI – applications close Oct 30

Thinking Head Postgraduate Research Award
Performance Scholarship




A Slice of Light – this week at PICA

PICA is delighted to be assisting Hellen Sky with the continued development of her work ‘A Darker Edge of Night’. Garth Paine (UWS), Brandon Hur (Melbourne), Paul Bourke (UWA) and Hellen Sky will be adapting audiovisual environments to the architecture of PICA next week. Part installation, part performance and part lecture, Sky and her collaborators invite you to take a peek into the processes involved in the design and performance of their ongoing interactive hybrid work. This series of presentations called ‘A Slice of Light’ reveals the challenges and possibilities that digital interactive performance offers to the performer, composer and designers. It also explores the audience’s reception of non-linear interactive digital performance.

PICA would like to offer you complimentary tickets to ‘A Slice of Light’. To reserve your place, please RSVP by emailing invite@pica.org.au or phoning David on 9228 6300. There are three different performance times; please feel free to confirm your place at the time and date that most suits you. Hurry, places are limited.

Season: Saturday 8 August, 2 & 4pm
Sunday 9 August, 2pm

Duration: 40 mins no interval

Venue: PICA Central Gallery
Perth Institute of Contemporary Arts
Perth Cultural Centre, James Street
Northbridge

RSVP: On 9228 6300 or invite@pica.org.au

PICA - a slice of light invite


SynC performing Sonic Alchemies on Vimeo

Garth Paine and Michael Atherton – SynC – performing Sonic Alchemies at the opening of the Virtual, Interactive, Performance, Research Environment (VIPRE) at the University of Western Sydney. VIPRE is a new research lab established by Garth Paine over the last three years that houses first-class 20-channel 3D audio spatialisation facilities, large animation screens with DLP projectors, MoCAP and other equipment for research into interactive systems in the arts.

See vipre.uws.edu.au/ and vipre.uws.edu.au/tiem/

SynC performing Sonic Alchemies from Garth Paine on Vimeo.

Thanks to Jon Drummond for editing this together


OSC for MoCAP – the Data Port project

Over the last year, I have been leading a project with two Melbourne-based programmers (Olaf Meyer and AJ of Visual Synth fame) to develop an Open Sound Control data space for Motion Capture data, in order to facilitate the selection of data points for realtime sonification and visualisation applications. The project has a particular focus on generating realtime environments for the performing arts, using the motion of dancers as realtime controllers within dynamic multimedia works. The following video is a short excerpt of some of the development and experimentation that has been going on. We hope to have a drag-and-drop interface finished in the next month, which will allow the selection of MoCAP points and the construction of OSC messages that can be sent out to sound and vision applications. We will then be adding a plugin structure and starting to implement Laban Motion Space analysis plugins to get higher-order data from the gesture than the many raw data points associated with a typical MoCAP data stream. More to come in the next few months.
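To make the idea concrete, here is a minimal sketch of what one of those outgoing messages might look like, packed by hand according to the OSC 1.0 encoding (a NUL-padded address pattern, a type-tag string, then big-endian 32-bit float arguments). The address pattern /mocap/marker/12/position and the x/y/z values are hypothetical – the actual address scheme will be whatever the drag-and-drop interface produces.

```python
import struct

def osc_string(s: str) -> bytes:
    # OSC strings are ASCII, NUL-terminated, padded to a 4-byte boundary.
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * ((4 - len(b) % 4) % 4)

def osc_message(address: str, *args: float) -> bytes:
    # Type-tag string: a comma followed by one 'f' per float argument.
    typetags = "," + "f" * len(args)
    # Arguments are big-endian IEEE 754 single-precision floats.
    payload = b"".join(struct.pack(">f", v) for v in args)
    return osc_string(address) + osc_string(typetags) + payload

# One marker position per message (address is a hypothetical example).
msg = osc_message("/mocap/marker/12/position", 0.42, 1.03, -0.27)
```

A message built this way could be handed to any UDP socket (e.g. sock.sendto(msg, (host, port))) and received by sound or vision applications such as Max/MSP or Pure Data that speak OSC.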
