An interesting project was posted on CAN today: The Space Beyond Me (by Julius von Bismarck and Andreas Schmelas). In a few words, it's an installation which reinterprets video into a still image. Only it's a bit more than that.
When shooting video, a cameraman or camerawoman tries to shape a scene for their viewers by framing shots of specific things to include (or exclude). They are translating the rich 3D spatial environment they happen to be experiencing at that moment into a captured 2D image sequence.
You, as a viewer of the video, can accurately gauge the passing of time, but you are constrained to the cameraman or camerawoman's choice of framing at any moment.
What this installation does is reverse that trade-off. Take a look at the video below:
The installation's projector pans and tilts as the cameraman or camerawoman pans and tilts. The projection lands on photosensitive paper, and once the video has run its course, what is left behind is a photograph-style representation of the entire scene.
Now as a viewer, you can no longer accurately gauge the passing of time, but you are also no longer constrained by the cameraman or camerawoman's choice of framing at each moment. You can view the scene in its entirety at once, to the extent that the scene was captured on film. It doesn't provide you with more information; it provides the same information in a different way.
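The accumulation onto the photosensitive paper can be sketched in a few lines of code. The snippet below is a toy model, not the project's actual code: each video frame is added to a larger canvas at its pan offset, and the simulated "paper" saturates where the projection lingers or where successive frames overlap.

```python
import numpy as np

def accumulate_exposure(frames, offsets, canvas_size):
    """Toy model of the photosensitive paper: each projected frame
    adds light at its pan/tilt offset; the paper keeps the sum and
    saturates at full exposure."""
    canvas = np.zeros(canvas_size)
    h, w = frames[0].shape
    for frame, (dy, dx) in zip(frames, offsets):
        canvas[dy:dy + h, dx:dx + w] += frame
    return np.clip(canvas, 0.0, 1.0)  # paper can't exceed full exposure

# Toy example: a 4x4 "video frame" panned across a 4x10 scene.
frame = np.full((4, 4), 0.5)
frames = [frame] * 4
offsets = [(0, 0), (0, 2), (0, 4), (0, 6)]  # pan right in steps of 2
paper = accumulate_exposure(frames, offsets, (4, 10))
```

Regions the camera passed over once keep a half exposure here, while overlapping regions saturate, which is the trade-off described above: spatial coverage is recovered, but the timing of each exposure is lost.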
This is one of the things that digital technology has always been really good at: transforming information from one form into another (and, of course, combining information from different sources). People like Julius von Bismarck and Andreas Schmelas are using physical computing practices to extend this information-transforming power into physical spaces.
The project uses openFrameworks for the video processing, Arduino for the camera movement, and Processing to relay messages between the two. The codebase has been open-sourced and is available here.
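The post doesn't detail how the video-processing stage recovers the camera's pan and tilt from the footage. One common technique for estimating global frame-to-frame motion is phase correlation; the sketch below is a hypothetical Python illustration of that idea, not the project's openFrameworks code.

```python
import numpy as np

def estimate_shift(a, b):
    """Estimate the (dy, dx) translation of frame b relative to frame a
    using phase correlation: the normalized cross-power spectrum of two
    shifted images has an inverse FFT that peaks at the shift."""
    A = np.fft.fft2(a)
    B = np.fft.fft2(b)
    cross = np.conj(A) * B
    cross /= np.abs(cross) + 1e-9       # keep only the phase difference
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past the midpoint correspond to negative (wrapped) shifts.
    if dy > a.shape[0] // 2:
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return int(dy), int(dx)

# Toy check: a frame "panned" by (2, 3) pixels.
rng = np.random.default_rng(0)
scene = rng.random((16, 16))
panned = np.roll(scene, (2, 3), axis=(0, 1))
shift = estimate_shift(scene, panned)
```

Summed over the whole video, per-frame shifts like this would give the running pan/tilt position that the projector needs to follow.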
The installation was featured at Transmediale 2010.