It’s remarkable what a difference accurate multi-touch can make for interface design, especially when the surface is scaled relative to the human body (sorry, iPhone). Jeff Han’s work, widely circulated around the blogosphere, is significant because his team has really rethought the whole interface: gestures for moving things around in 3D space just make perfect sense. The only bad news is that large-scale back-projected screens take up space and rely on implementation techniques that wouldn’t (for the time being) work on smaller displays. The good news is that this kind of work could soon find its way into performances. Right now, live visualists still treat the DJ mixer as their primary performance metaphor, a surprisingly deep resource, to be sure, but one that likely only scratches the surface of what could be possible.
Via Mac Rumors
I think half of you readers are at NYU right now, so, ahem, feel free to fill us in. (Or join in a chorus of “Our Dear Old NYU”, if you like. Darnit, CUNY needs a song.)