You can see an example above. Cameras in the ceiling do blob tracking; in this case the blobs are people walking on the floor. The floor responds by drawing colored circles underneath the feet of anyone standing on it, and the circles then follow that person around.
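The "circle follows person" behavior above can be sketched roughly as follows. This is a hypothetical illustration, not the actual Interactive Spaces code: each tracked blob gets a circle whose position eases toward the blob each frame via simple exponential smoothing (the `ease` factor and all names are assumptions).

```python
def follow(circle_pos, blob_pos, ease=0.3):
    """Move a circle a fraction of the way toward its blob each frame."""
    cx, cy = circle_pos
    bx, by = blob_pos
    return (cx + ease * (bx - cx), cy + ease * (by - cy))

def update_circles(circles, blobs, ease=0.3):
    """One frame of update: circles keyed by blob id."""
    updated = {}
    for blob_id, blob_pos in blobs.items():
        # A newly appeared blob gets a circle right at its position.
        prev = circles.get(blob_id, blob_pos)
        updated[blob_id] = follow(prev, blob_pos, ease)
    return updated  # circles for vanished blobs are dropped
```

Running `update_circles` once per camera frame makes each circle glide smoothly after its person instead of jumping.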
Interactive Spaces works by having consumers of events, like the floor, connect to producers of events, like those cameras in the ceiling. Any number of producers and consumers can be connected to each other, making it possible to create quite complex behavior in the physical space.
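The producer/consumer wiring described above can be sketched as a small event router. This is an illustrative sketch, not the actual Interactive Spaces API: any number of producers (e.g. ceiling cameras) publish events, and any number of consumers (e.g. the floor) subscribe to them.

```python
from collections import defaultdict

class EventRouter:
    """Connects event producers to any number of event consumers."""

    def __init__(self):
        self._consumers = defaultdict(list)

    def connect(self, event_type, consumer):
        """Subscribe a consumer callable to an event type."""
        self._consumers[event_type].append(consumer)

    def produce(self, event_type, payload):
        """Deliver an event from a producer to every connected consumer."""
        for consumer in self._consumers[event_type]:
            consumer(payload)

# Usage: a "camera" produces blob positions, the "floor" consumes them.
router = EventRouter()
seen = []
router.connect("blobs", seen.append)               # the floor subscribes
router.produce("blobs", {"id": 1, "pos": (2, 3)})  # a camera publishes
```

Because producers and consumers only share the router, either side can be added or swapped without touching the other, which is what makes complex behaviors composable.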
You can subscribe to the discussion group.
The initial releases of Interactive Spaces were written in collaboration with the folks at The Rockwell Group’s LAB. There were many conversations about how to do this kind of software, and Interactive Spaces would not be what it is without their expert knowledge of interactive design, their experience and patience, and the many scary nights spent trying to get the project done.
This is an initial source release. Once the final licenses are finished, there will be a binary release. Please see Downloads for a PDF of the documentation, which is in progress.
NI mate gives an excellent full-body skeleton, including multiple-user support for its MIDI & OSC controller modes.
They have an ambitious 2012 roadmap, much of which is already in place in v1.02.
NI mate roadmap 2012
v1.1 Multi-user support for exact & controller data
v1.1 Multiple simultaneous feeds through Syphon and the FreeFrameGL plugin
v1.2 Recording and playback of captured data
v1.2 Controlling NI mate parameters & changing profiles through MIDI & OSC
v1.3 More MIDI & OSC controller options
– Head/wrist orientation tracking
– Support for multiple sensors
– Improving tracking by using multiple NI mates
– Face detection & other means of identifying users
– Integrating other input devices to improve the tracking (for example, a smartphone with an accelerometer)
– Speed increases from multithreading and from using the GPU
Here are a couple of videos about its use:
Together with Jonas Jongejan and Ziv Hendel from PrimeSense, http://www.roxlu.com/ created a working version of OpenNI and the complete driver (SensorKinect) for Mac OS X! Ziv is working on a Mac binary for NITE! As I’m working towards a build of OpenNI for openFrameworks that works on Linux and Windows as well, I went through the code to get a good understanding of OpenNI’s internals.

If it’s not clear what OpenNI is, here is a short introduction. OpenNI isn’t much more than a set of coding “rules” for systems that work with natural human interaction, and the framework makes use of “modules”. Basically it works like this: in your code you create an object called the context, which holds all the information the framework needs. When you create the context, the modules are loaded into it; once they have been loaded and initialized, you can use them in your application.
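The context-and-modules pattern described above can be sketched like this. This is a Python illustration of the pattern only, not OpenNI’s real C++ API: a context loads a set of registered modules when it is created, initializes them, and then hands them to the application on request (all class and method names here are assumptions).

```python
class Module:
    """Base class for a capability the framework can load."""
    name = "base"

    def init(self):
        pass

class DepthModule(Module):
    """A hypothetical module, standing in for e.g. a depth generator."""
    name = "depth"

    def init(self):
        self.ready = True

class Context:
    """Loads and initializes modules; the app's entry point to the framework."""

    def __init__(self, module_classes):
        # Loading: instantiate every registered module into the context.
        self.modules = {cls.name: cls() for cls in module_classes}
        # Initialization: each module sets itself up.
        for module in self.modules.values():
            module.init()

    def get(self, name):
        """Fetch an initialized module by name, as the application would."""
        return self.modules[name]

# The application only ever talks to the context.
context = Context([DepthModule])
depth = context.get("depth")
```

The point of the pattern is that the application never constructs modules directly; it asks the context for capabilities, so implementations can be swapped behind the same interface.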