Blender, the free and open source 3D modeling tool that doubles as a real-time game engine, a real-time visual performance environment, and even a video editor, continues its march toward the long-promised, insanely powerful 2.5 milestone. (“Point five” doesn’t really begin to cover it.)

2.49 is now stable. And boy does it have a heck of a lot going on. There’s nodal texture editing, multiple streams of video playback in the Game Engine (making this especially appealing to visualists), 3D painting, real-time dome rendering in case you’ve got a planetarium gig, faster Game Engine performance, Bullet physics improvements for lots of physics-y goodness, real-time shape modification, better game logic and Python control, and bundled Python script extensions. And that’s just the start.

Blender 2.49

Basically, Blender has become a full-blown, real-time OpenGL video and graphics powerhouse inside an existing modeling tool. I’m still intrigued by dedicated game engines, but this means your modeling workflow and real-time workflow are one and the same.

And it’s capable, as a result, of some stunning visuals. The video above is from Martins Upitis, who describes it thusly on YouTube:

A few weeks of exploring the magic world of GLSL coding, and a few days of getting it all into this demo. Here is the result.
The thread on the BlenderArtists forums, which also contains download links and updates, is here: t=152343

Made for the company Twilight 22, where I take part in creating the action-adventure game Fire Wire District 22 as a concept artist and modeler, and am now also learning graphics coding.

Seen here is the final composite of the GLSL scene plus SSAO, depth of field, light scattering, and chromatic aberration filters, captured at 30fps at 1680×1050 resolution, with 8x anisotropic filtering and 16xQ antialiasing.

For live visuals, of course, modeling tools do way more than we might actually want or need. But if you can dive into Blender and find a way to pare the workflow down to the point you’d want for a visual performance, I think it could be an immensely powerful tool.

And then there’s hardware control. Marco Rapino aka Akta has been controlling Blender with the accelerometer in his Nokia N95 phone, as in the video seen here. (Oh yes, I do need to port this to Android, especially as I already have the sensors working.)

N95 accelerometer in Blender from aktathelegend on Vimeo.

Full details:
N95 accelerometer with Blender [ Akta’s Way Blog, via BlenderNation ]

Of course, I’d like to see standardized OpenSoundControl for this sort of application. (Accordingly, OSC may soon lose the “Sound” officially in its title, given its more generalized purpose. Open Systems Control, perhaps? Open Stuff Control? Open Smurf Control?)

There’s been at least one paper on the topic of combining Blender with Pd for sound (“Blendnik”):

I’m not sure of the preferred way to implement OSC inside Python inside Blender, but I’ll have to give this a try myself.
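Absent a canonical recipe, one minimal approach is to parse raw OSC packets over UDP in pure Python, which needs no third-party libraries inside Blender’s embedded interpreter. This is only a sketch under assumptions: the port number and the idea of polling the socket once per frame from a game engine Python controller are mine, not an established Blender API, and it handles only basic int/float messages.

```python
import socket
import struct

def _read_padded_string(data, offset):
    """Read a null-terminated OSC string, padded to a 4-byte boundary."""
    end = data.index(b'\x00', offset)
    text = data[offset:end].decode('ascii')
    offset = end + 1
    offset += (-offset) % 4  # skip the padding bytes
    return text, offset

def parse_osc_message(data):
    """Decode a basic OSC message: address, type tags, then big-endian args."""
    address, offset = _read_padded_string(data, 0)
    tags, offset = _read_padded_string(data, offset)
    args = []
    for tag in tags.lstrip(','):
        if tag == 'f':  # 32-bit float
            args.append(struct.unpack('>f', data[offset:offset + 4])[0])
            offset += 4
        elif tag == 'i':  # 32-bit int
            args.append(struct.unpack('>i', data[offset:offset + 4])[0])
            offset += 4
    return address, args

def make_receiver(port=9000):
    """Non-blocking UDP socket, suitable for polling once per game frame."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(('127.0.0.1', port))
    sock.setblocking(False)
    return sock

def poll(sock):
    """Return (address, args) for the next waiting message, or None."""
    try:
        data, _ = sock.recvfrom(1024)
    except BlockingIOError:
        return None
    return parse_osc_message(data)
```

In a game engine logic setup, you could call `poll()` from a Python controller each frame and map, say, a float arriving on an `/accel` address to an object property. (Note that 2.49 embeds Python 2.x, where you’d catch `socket.error` rather than `BlockingIOError`.)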

A huge thanks to Giorgio Martini aka Tweaking Knobs for these links. Giorgio is working on his own live project. Here’s a glimpse of what that looks like, in progress:

Untitled from TweakingKnobs on Vimeo.

You can go grab Blender for basically any operating system you can imagine.

  • That’s interesting, but there is one strikingly absent piece of the puzzle for realtime performance here: the UI.

    A tool as powerful and multi-purpose as Blender has so many switches, buttons, dials, and so on that using it live strikes me as kind of a kludgy UI nightmare. You’d have to implement all of the custom presets, controls, and mapping logic yourself, plus video deck selectors and the like.

    It’s certainly intriguing though, I’ll hand you that. I spent some time with Unity3D and have pondered using it for realtime video performance, but existing purpose-built tools are really just as capable.

    One of the only things I can see these sorts of game engines excelling at as realtime visual/performance systems is multi-user-driven worlds. That could be much fun 🙂

  • One other thing I think is missing from Blender to use it as a live tool: multiple monitors management. I don't mind building myself a custom UI with max/msp or pd for controlling objects in the game engine, but if I can't control where I want the game window to be displayed (or span it on multiple monitors or computers via ethernet), I find it difficult to use.

  • Well, presumably one might want to eliminate the UI altogether and have everything happen in the game space. But I'm not sure about multiple displays — that, I agree, is totally key and is absolutely missing.

    Interestingly, the dome stuff comes out of SAT in Montreal.

    Of course, it's open source. So *someone* just needs to do multiple monitors.

    I'm still investigating jReality:

    It has an obscene number of output options from domes to multiple projectors to CAVEs. I'd like to see it support Blender projects natively, but at the very least, via an interchange format you could still build scenes in Blender.

    I had also looked at jMonkeyEngine, but I’m concerned again about output options, and that project isn’t as well supported, development-wise, as jReality.

    Unity is brilliant, but the open source projects are a little more open to the sort of stuff we want to do.

  • Cheers for posting my N95 hack on your blog 🙂
    However, OSC sounds interesting and is certainly doable, in my opinion. Blender is really open to everything. Stay tuned, because really soon I’m going to release something that will really make you understand that it’s not just Microsoft that can do cool stuff (see E3, L.A.). Well, maybe I said too much 😀 You will hear from me soon, hopefully! 🙂

  • Blender is also very interesting from a community-development point of view, but that is another story.

  • I just found out that version 2.5 will include multiple-window management, and the last development phases are due in August.

  • Johnny Woods

    I'm currently working on a Unity project which will accept OSC. I'll be using junXion to get wiimotes, midi controllers, audio input, and arduino to run the engine.

    I'm hoping the experiment goes well, and possibly will lead to a robust OSC implementation within Unity. I'll be testing it out live next week… I'll let you know how it holds up.

  • At the moment (<2.5), Blender follows the "do everything in one window" paradigm.
    As jeff clermont stated above, Blender 2.5 will be the first Blender release to allow multiple windows. Hopefully, one will be able to show the output of the game engine full screen on one monitor while manipulating it on the other screen.
    The new event system in 2.5 might open up some nice possibilities for visualists. Pablo Vazquez has done a nice demo of manipulating a running animation (in the 3D view), which isn't possible in Blender <2.5.

    For using video textures in the game engine, check out this.

  • Joe

    My vote is definitely for "Open Stuff Control" 🙂 Much love to Blender, its devs and its users x

  • Pingback: Create Digital Motion » Blender: 2.5 Gets Real-Time, Slick Interface; Video Texture Tutorial

  • Hey guys, there was a misunderstanding: my work-in-progress clip is not realtime, it's rendered stuff. I'm sorry if I expressed myself wrongly.

    Anyway, it is possible to get something like this with baked textures and GLSL filters.

    In any case, Blender is getting very, very cool, and 2.5 is going to be awesome.

    Again, sorry for the misunderstanding ;D