Mike, Gareth and Joe have been working on a 3D scene which will be projected onto the wall and respond in real time to the Kinect data. The environment is built and runs in VVVV.

A first dry run

Mike has created the particle stream (the spheres mentioned in the design, which flow to and from the threshold). He found the CiantParticles patch early on, because it does pretty much what we want. CiantParticles move according to forces you specify with parameters, and they respond to other objects in the scene - in this case our Kinect skeletons.

We are still experimenting with exactly how the particles should respond to people - should they gravitate towards them and then fall away? Bounce off them, or avoid them altogether? Mike has created a threshold (the vertical line in the image), and how the particles react will depend on which side of the threshold the person is on.
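Under the hood this is all VVVV patching rather than code, but the rule we're playing with is easy to sketch. Here's a rough, hypothetical Python version - the threshold position, force law and numbers are all made up for illustration, not what the CiantParticles patch actually does:

```python
import numpy as np

THRESHOLD_X = 0.0  # x position of the vertical threshold line (illustrative)

def update_particles(positions, velocities, person, dt=1.0 / 60.0):
    """One integration step: particles are pulled towards the tracked person
    on one side of the threshold and pushed away on the other."""
    offset = person - positions                               # particle -> person vectors
    dist = np.linalg.norm(offset, axis=1, keepdims=True) + 1e-6

    # Flip the response depending on which side of the threshold the person stands
    sign = 1.0 if person[0] > THRESHOLD_X else -1.0
    force = sign * offset / dist**2                            # simple inverse-square pull/push

    velocities += force * dt
    positions += velocities * dt
    return positions, velocities

# Toy usage: 500 particles and one tracked person
positions = np.random.rand(500, 3)
velocities = np.zeros_like(positions)
positions, velocities = update_particles(positions, velocities,
                                         person=np.array([0.4, 1.0, 2.0]))
```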

Joe's broken laptop screen

Joe has created a floor that our Kinect people will walk on. The floor is a 2D grid distorted by waves; the wave system is supplied by the Fluid plugin. You feed the plugin the size of the grid and the coordinates of people's footfalls, and it sends back a new grid containing the wave values.

As the footfall interaction changes, the internal state of the Fluid plugin updates to reflect the wave patterns and outputs the changes in real time. We then use these output values to displace the 2D floor grid by supplying them as arguments to a vertex shader operating on the grid.
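In the patch the heavy lifting is done by the Fluid plugin and an HLSL vertex shader; the sketch below is only to show the data flow, with a very naive damped-wave update standing in for the plugin and invented names throughout:

```python
import numpy as np

GRID = 64
height = np.zeros((GRID, GRID))   # current wave heights
prev = np.zeros((GRID, GRID))     # previous frame, needed for the wave update
DAMPING = 0.98

def add_footfall(x, y, strength=1.0):
    """Inject energy where the Kinect says a foot has landed (x, y in 0..1)."""
    height[int(y * (GRID - 1)), int(x * (GRID - 1))] -= strength

def step():
    """Crude damped-wave update, standing in for the Fluid plugin."""
    global height, prev
    neighbours = (np.roll(height, 1, 0) + np.roll(height, -1, 0) +
                  np.roll(height, 1, 1) + np.roll(height, -1, 1)) / 2.0
    new = (neighbours - prev) * DAMPING
    prev, height = height, new
    return height   # in the real patch, these values displace the floor vertices
```

In the real thing the grid that comes back is what the vertex shader uses as a per-vertex displacement, so the floor ripples out from wherever people step.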

A render of the watery floor

Gareth has been working with the Bullet physics system to allow Kinect skeletons to interact with floating objects inside Mike's particle stream. The idea is that when you are inside the stream and looking at yourself in the projection, you will notice floating objects near your Kinect alter-ego.

You can then reach out and interact with them - we are still experimenting with how. But this gives people inside the stream a more immediate objective, because the other interactions with the sound environment will probably take a little more time to get used to.
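The Bullet integration itself lives inside VVVV, so take the following only as an illustration of the approach, written against pybullet with invented names: each tracked Kinect joint becomes a kinematic proxy that is snapped to the joint's position every frame, and the floating objects are ordinary dynamic bodies that get nudged when a proxy moves into them.

```python
import pybullet as p

p.connect(p.DIRECT)
p.setGravity(0, -9.8, 0)

# A floating object the visitor can nudge
ball_shape = p.createCollisionShape(p.GEOM_SPHERE, radius=0.1)
ball = p.createMultiBody(baseMass=0.2, baseCollisionShapeIndex=ball_shape,
                         basePosition=[0, 1.2, 1.5])

# Kinematic proxy for one Kinect joint (mass 0 = unaffected by forces,
# but it still collides with the dynamic bodies around it)
hand_shape = p.createCollisionShape(p.GEOM_SPHERE, radius=0.08)
hand = p.createMultiBody(baseMass=0, baseCollisionShapeIndex=hand_shape)

def step(hand_position):
    # Each frame, move the proxy to wherever the Kinect says the hand is
    p.resetBasePositionAndOrientation(hand, hand_position, [0, 0, 0, 1])
    p.stepSimulation()
    pos, _ = p.getBasePositionAndOrientation(ball)
    return pos   # feed back to the renderer (and, later, the sound environment)
```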

Mike's screen rendering particles

Data from this interaction with objects inside the stream will be sent out to the sound environment so that the interaction has a sonic element to it too, as discussed in this post.
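We haven't pinned down the transport here; if it ends up being OSC over UDP (an assumption on my part, as are the address and values in the snippet), the sending side would look something like this:

```python
from pythonosc.udp_client import SimpleUDPClient

sound = SimpleUDPClient("127.0.0.1", 9000)   # placeholder address/port for the sound machine

def on_object_touched(object_id, velocity):
    # Tell the sound environment which object was touched and how hard,
    # so it can trigger or modulate something audible
    sound.send_message("/stream/object", [object_id, float(velocity)])
```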

We got as far as a dry run by the end of tonight - not bad for three days' work. Tomorrow we'll be looking to tighten everything up, and to experiment with how the interaction feels now that we have a running environment.