Suddenly, multitouch is everywhere. Multiple touch points are available in MacBook touchpads, in Synaptics touchpads for PCs, in Apple’s new Magic Mouse, on mobile devices (not just the iPhone), and in a handful of new laptops from Lenovo, HP, and others. Multitouch screens are still expensive, but if you’re willing to settle for input from your mouse or touchpad, various solutions are ready for you. And heck, some systems will even let you plug in two mice and work that way.

Kineme MultitouchPatch from George Toledo on Vimeo.

First up, for Mac users, there’s the new Kineme MultitouchPatch, which uses trackpads and the new Apple Magic Mouse for up to eleven simultaneous points with position, velocity, and angle. Unfortunately, those of us with earlier MacBooks are out of luck, but anyone with a newer Apple laptop or the new mouse can go play. (Thanks to Lee Grosbauer and others for the tip!)

Yep, that’s right: got a Mac, $69, and a nearby Apple Store? You + Magic Mouse = Cheap Multitouch.

There’s no question this will be useful, especially with the ability to drop Quartz Composer compositions into something like VDMX and go play.

For broader applications, though, you need a way to handle all the possible multitouch input sources, from mice and trackpads to displays and camera tracking mechanisms and whatever else you happen to imagine. And there’s no reason to be stuck with one platform (cough, Mac) when you can use them all at once.
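
The core idea is simple enough to sketch: every backend (trackpad, TUIO camera tracker, a second mouse) translates its raw events into one normalized touch format, and the application only ever sees that. Everything below is a hypothetical illustration; the names `TouchPoint` and `TouchHub` are made up for this sketch, not any real framework's API.

```python
# Hypothetical sketch: normalizing touch points from different sources
# into one stream, so application logic doesn't care where they came from.
from dataclasses import dataclass


@dataclass
class TouchPoint:
    source: str   # e.g. "trackpad", "tuio", "mouse2"
    id: int       # stable per-finger id while the touch is down
    x: float      # normalized 0.0 - 1.0
    y: float


class TouchHub:
    """Collects currently active touches from any backend."""

    def __init__(self):
        self.active = {}

    def update(self, touch: TouchPoint):
        # Key by (source, id) so two devices can't collide on finger ids.
        self.active[(touch.source, touch.id)] = touch

    def release(self, source: str, touch_id: int):
        self.active.pop((source, touch_id), None)


hub = TouchHub()
hub.update(TouchPoint("trackpad", 0, 0.25, 0.5))
hub.update(TouchPoint("mouse2", 0, 0.75, 0.5))
print(len(hub.active))  # → 2: simultaneous points from different devices
```

The payoff is that swapping a trackpad for a camera rig, or adding that second mouse, means writing one small translator rather than touching the application.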

Python and Java users have two very powerful options, each of which has been under heavy development recently.

PyMT – A post-WIMP Multi-Touch UI Toolkit from Thomas Hansen on Vimeo.

For Python, there’s the awesome PyMT, a multitouch framework that also includes easy tools for building gestures, widgets, and interactions, so you can really produce things rapidly. Recent releases have added native support for Windows touch input, which covers many of the affordable Windows-based multitouch devices out there.
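
The widget-plus-touch-handler pattern that toolkits like PyMT encourage is easy to picture in plain Python. This is a self-contained sketch of the concept only; `Widget` and `DragWidget` here are hypothetical names, not PyMT's actual API, whose details have shifted between releases.

```python
# Illustrative sketch of the widget/touch-handler pattern: subclass a
# widget, override per-touch handlers, and let the framework dispatch.
# Names are made up for illustration; this is not PyMT's real API.

class Widget:
    def on_touch_down(self, touch_id, x, y): pass
    def on_touch_move(self, touch_id, x, y): pass
    def on_touch_up(self, touch_id, x, y): pass


class DragWidget(Widget):
    """Follows whichever finger touched it first, ignoring the rest."""

    def __init__(self, x=0.0, y=0.0):
        self.x, self.y = x, y
        self.grabbed = None  # touch id that currently owns the widget

    def on_touch_down(self, touch_id, x, y):
        if self.grabbed is None:
            self.grabbed = touch_id

    def on_touch_move(self, touch_id, x, y):
        if touch_id == self.grabbed:
            self.x, self.y = x, y

    def on_touch_up(self, touch_id, x, y):
        if touch_id == self.grabbed:
            self.grabbed = None


w = DragWidget()
w.on_touch_down(7, 0.1, 0.1)   # first finger grabs the widget
w.on_touch_move(7, 0.4, 0.6)   # ...and drags it
w.on_touch_move(8, 0.9, 0.9)   # a second finger is ignored
print(w.x, w.y)                # → 0.4 0.6
```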

For Java, built on everyone’s favorite friendly coding-for-artists environment Processing, there’s MT4j. It’s been tested under Windows and Linux, but with some interested parties on the Mac, it shouldn’t have any trouble working there, too. You get a scene graph, flexible input handling (including camera tracking of touches and physical objects), an easy widget library, OpenGL rendering… and the list goes on. The important thing is the ability to use whatever you like for input and graphical output, with lots of goodies so you don’t have to reinvent the wheel just to bang out a prototype.
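
The scene graph is the part worth pausing on: each touch gets routed to the deepest node whose bounds contain it, so widgets nest naturally. Here's a minimal, hypothetical sketch of that idea (in Python for consistency with the rest of this post; it is not MT4j's actual API).

```python
# Illustrative sketch (not MT4j's real API): a scene graph routes each
# touch to the deepest node whose bounds contain the point.

class Node:
    def __init__(self, name, bounds, children=()):
        self.name = name
        self.bounds = bounds           # (x, y, width, height)
        self.children = list(children)

    def contains(self, px, py):
        x, y, w, h = self.bounds
        return x <= px <= x + w and y <= py <= y + h

    def pick(self, px, py):
        """Depth-first hit test; later children draw on top, so check them first."""
        if not self.contains(px, py):
            return None
        for child in reversed(self.children):
            hit = child.pick(px, py)
            if hit is not None:
                return hit
        return self


button = Node("button", (10, 10, 30, 20))
panel = Node("panel", (0, 0, 100, 100), [button])
scene = Node("scene", (0, 0, 200, 200), [panel])

print(scene.pick(20, 15).name)    # → button (touch lands on the button)
print(scene.pick(150, 150).name)  # → scene (outside the panel entirely)
```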

And yes, it has the most ghetto-fabulous system for multitouch I’ve seen yet: grab a cheap Windows PC, plug in two mice, and rock the hell out. Use two wireless mice on multiple channels, and you’ve got a recipe for awesomeness, complete with… oh, crap. I’ve run out of hands to play my keytar. Maybe I can have someone stand behind me operating the mice while I blaze through my keytar solo.

There’s really no reason this has to be limited to Microsoft Surface / reacTable-style tables with objects on top of them. Multiple touch points are obviously ideal for any graphical or expressive application. They could be on the display of your laptop or tablet, or a projection, or whatever. I’ll be looking at some specific applications of MT4j soon.

More resources:

Sparsh is a framework, also built in Java, for providing multitouch input from a variety of hardware. It uses a gesture server + gesture adapter + device driver model, works with any OS, and can be interfaced from both C/C++ and Java. That gives you easy, hardware-agnostic access to both basic and custom gestures.
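
The gesture-adapter idea is straightforward in miniature: drivers feed normalized touch frames to a server, and adapters watch the stream and emit higher-level events. As a hypothetical example (nothing here is Sparsh's real API), a pinch-zoom adapter only needs to compare finger spread between two frames:

```python
# Hypothetical gesture adapter in the spirit of a gesture server:
# it compares the finger spread of two two-touch frames and reports
# a zoom ratio, independent of which hardware produced the touches.
import math


def pinch_scale(frame_a, frame_b):
    """Ratio of finger spread between two two-touch frames.

    Each frame is [(x1, y1), (x2, y2)] in normalized coordinates;
    a result > 1.0 means the fingers moved apart (zoom in).
    """
    def spread(frame):
        (x1, y1), (x2, y2) = frame
        return math.hypot(x2 - x1, y2 - y1)

    return spread(frame_b) / spread(frame_a)


before = [(0.5, 0.5), (0.75, 0.5)]   # fingers 0.25 apart
after = [(0.25, 0.5), (0.75, 0.5)]   # fingers 0.5 apart
print(pinch_scale(before, after))    # → 2.0 (spread doubled: zoom in)
```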

On the NUI Group’s superb forum, there’s a discussion of accessing the new touch APIs in Windows 7.