Blender, the free and open source 3D modeling tool that doubles as a real-time game engine and even a video editor, promises real possibilities for live visual performance, and it continues its march toward the long-promised, insanely powerful 2.5 milestone. (“Point five” doesn’t really begin to cover it.)
2.49 is now stable. And boy does it have a heck of a lot going on. There’s nodal texture editing, multiple streams of video playback in the Game Engine (making this especially appealing to visualists), 3D painting, real-time dome rendering in case you’ve got a planetarium gig, faster Game Engine performance, Bullet physics improvements for lots of physics-y goodness, real-time shape modification, better game logic and Python control, and bundled Python script extensions. And that’s just the start.
Basically, Blender has become a full-blown, real-time OpenGL video and graphics powerhouse inside an existing modeling tool. I’m still intrigued by dedicated game engines, but this means your modeling workflow and real-time workflow are one and the same.
And it’s capable, as a result, of some stunning visuals. The video above is from Martins Upitis, who describes it thusly on YouTube:
A few weeks of exploring the magic world of GLSL coding, and a few days of getting it all into this demo. Here is the result.
The thread in the BlenderArtists forums, which also contains download links and updates, is here:
blenderartists.org/forum/showthread.php?t=152343

Made for the company Twilight 22, where I take part in creating the adventure action game Fire Wire District 22 as concept artist and modeler, and am now also learning graphics coding.
Seen here is the final composite of the GLSL scene plus SSAO, Depth of Field, Light Scattering, and Chromatic Aberration filters, captured at 30 fps in 1680×1050 resolution, with 8x anisotropic filtering and 16xQ antialiasing.
For live visuals, of course, modeling tools do way more than we might actually want or need. But if you can dive into Blender and find a way to pare the work down to what you’d want for a visual performance, I think it could be an immensely powerful tool.
And then there’s hardware control. Marco Rapino aka Akta has been controlling Blender with the accelerometer in his Nokia N95 phone, as in the video seen here. (Oh yes, I do need to port this to Android, especially as I already have the sensors working.)
N95 accelerometer in Blender from aktathelegend on Vimeo.
Full details:
N95 accelerometer with Blender [ Akta’s Way Blog, via BlenderNation ]
Of course, I’d like to see standardized OpenSoundControl for this sort of application. (Accordingly, OSC may soon lose the “Sound” officially in its title, given its more generalized purpose. Open Systems Control, perhaps? Open Stuff Control? Open Smurf Control?)
There’s been at least one paper on the topic of combining Blender with Pd for sound (“Blendnik”):
http://porcaro.org/blendnik.html
I’m not sure of the preferred way to implement OSC inside Python inside Blender, but I’ll have to give this a try myself.
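For what it’s worth, OSC messages are simple enough that a minimal receiver can be written in plain Python with only the standard library, which sidesteps the question of bundling a third-party module with Blender. The sketch below is a rough, generic starting point, not the preferred Blender approach: the `/rotate` address, the port 9000, and the idea of calling `poll_osc()` once per frame from a script are all illustrative assumptions on my part.

```python
import socket
import struct

def parse_osc_message(data):
    """Parse a single binary OSC message into (address, arguments).
    Handles the common 'f' (float32), 'i' (int32), and 's' (string) types."""
    def read_padded_string(buf, offset):
        # OSC strings are null-terminated and padded to 4-byte boundaries
        end = buf.index(b'\x00', offset)
        s = buf[offset:end].decode('ascii')
        return s, (end + 4) & ~3

    address, offset = read_padded_string(data, 0)
    tags, offset = read_padded_string(data, offset)
    args = []
    for tag in tags.lstrip(','):
        if tag == 'f':
            args.append(struct.unpack('>f', data[offset:offset + 4])[0])
            offset += 4
        elif tag == 'i':
            args.append(struct.unpack('>i', data[offset:offset + 4])[0])
            offset += 4
        elif tag == 's':
            s, offset = read_padded_string(data, offset)
            args.append(s)
    return address, args

def poll_osc(sock):
    """Non-blocking poll; returns one parsed message or None.
    Suitable for calling once per frame from a script loop."""
    try:
        data, _ = sock.recvfrom(4096)
    except socket.error:
        return None
    return parse_osc_message(data)

# Hypothetical setup: listen on UDP port 9000 (the port is arbitrary)
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.bind(('127.0.0.1', 9000))
# sock.setblocking(False)
```

From there, mapping something like `('/rotate', [0.5])` onto an object transform is just a dictionary lookup away, and the same receiver would work with Pd sending OSC out over UDP.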
A huge thanks to Giorgio Martini aka Tweaking Knobs for these links. Giorgio is working on his own live project. Here’s a glimpse of what that looks like, in progress:
Untitled from TweakingKnobs on Vimeo.
You can go grab Blender for basically any operating system you can imagine.