I’m Superman! Uh… yeah, so there is some appeal to gestural interfaces for 3D navigation, I meant to say. Photo (CC-BY-SA) Open Exhibits; see note below.

The absence of an official SDK has hardly discouraged experimentation and innovation with Microsoft’s depth-sensing camera for the Xbox 360. But Microsoft, via its chief research and strategy officer Craig Mundie, announced today that it will make a non-commercial, research SDK available this spring for Windows, with a commercial version to follow later.

Microsoft acknowledges that the research and experimentation communities are already all over their technology, describing “an already vibrant ecosystem of enthusiasts.” So, will an official Kinect for Windows SDK offer something the hacked toolkits don’t already?

Odds are that it will. The academic version, says Microsoft Research, will offer deep support for the capabilities of the hardware and its APIs:

While Microsoft plans to release a commercial version at a later date, this SDK will be a starter kit to make it simpler for the academic research and enthusiast communities to create rich natural user interfaces using Kinect technology. The SDK will give users access to deep Kinect system information such as audio, system application-programming interfaces, and direct control of the Kinect sensor.

Not all of those features are supported by the open source toolchain, and certainly not across all platforms and language bindings. For Windows programmers, in particular, it should make for a convenient platform, and might be enough to convince Kinect enthusiasts on platforms like the Mac to dual-boot into Windows.

On the other hand, I don’t imagine the open source toolchains are going anywhere fast, either. Beyond covering other platforms like Mac and Linux and a diverse set of language bindings, these tools can push into experimental features and build community around what’s already there.

My guess is both options should coexist nicely and continue building momentum for what Kinect can do.

Kinect isn’t the only area Microsoft Research is touching – literally – as Mundie also recently explored commercially available haptic technology for the medical field. Amidst all the (deserved, perhaps) hype that Kinect has gotten, I think it’d be a mistake to overlook other natural interface and haptic innovations, too. If Kinect demonstrates anything, it’s that the rapid spread of communication and community around these platforms can finally help people really explore not just how the technology works, but how to apply it.

Who might use this? I came across one such example entirely by accident while looking for a photo for this story: Open Exhibits:

Open Exhibits is a new open source initiative for informal science education.

Open Exhibits is a National Science Foundation-funded initiative to develop a library of free and open multitouch-enabled software modules for exhibit development. Built using the popular Adobe Flash and Flex authoring tools, the modules will let museum professionals create innovative floor and web-based exhibits easily and inexpensively.

Beginning in November, 2010, we will release Open Exhibits core software, which includes support for multitouch gestures within Adobe Flash and Flex. This software will be free for museums, universities, students, and other educational organizations.

Ideum Inc. operates Open Exhibits with three museum partners: the Don Harrington Discovery Center, the Maxwell Museum of Anthropology, and the New Mexico Museum of Natural History and Science. Evaluation will be conducted by Rockman et al.

And who else? You, very likely, knowing readers of this site.

For instance, there’s always DJing in Ableton Live using big gestures and Kinect: