To start our Monday morning right, here’s some nice, rhythmic glitch in the Mac tool Quartz Composer, using a combination of custom and off-the-shelf plug-ins — bless you, modularity. It’s a lovely demonstration of how having an ample set of pluggable tools can help you to produce the results you want.

PXN_richterizz from pixel noizz on Vimeo.

The work is just a small sampling of the prolific output of David Szauder, aka PIXEL NOIZZ, a Budapest-born, Berlin-based audiovisual artist. Don’t miss his blog for lots of other tidbits and Quartz Composer how-to info.

Description:

Richter’s Rhythmus 21 meets ‘PXN’. I developed this plugin to produce Richter-style video glitches from any input. All QC-based, with the great support of Vade’s plugins and one from NI (Noise Industries).
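If you’re wondering what a Richter-style slice glitch boils down to, here’s a rough Python sketch of the basic idea — displacing horizontal bands of a frame — using Pillow and NumPy. To be clear, this is an illustrative assumption on my part, not PXN’s actual plugin, which is built from Quartz Composer patches; the filenames and parameter values are placeholders.

```python
# Illustrative sketch only: shift random horizontal bands of a frame sideways,
# roughly the kind of slice-displacement glitch described above.
import numpy as np
from PIL import Image

def band_glitch(frame, band_height=16, max_shift=40, seed=None):
    """Return a copy of the frame with random horizontal bands rolled left/right."""
    rng = np.random.default_rng(seed)
    out = frame.copy()
    height = frame.shape[0]
    for y in range(0, height, band_height):
        if rng.random() < 0.5:  # glitch roughly half of the bands
            shift = int(rng.integers(-max_shift, max_shift + 1))
            out[y:y + band_height] = np.roll(frame[y:y + band_height], shift, axis=1)
    return out

if __name__ == "__main__":
    img = Image.open("frame.png").convert("RGB")   # placeholder input frame
    glitched = band_glitch(np.array(img), seed=42)
    Image.fromarray(glitched).save("frame_glitched.png")
```

In Quartz Composer you would wire up something comparable with crop/translate patches (or a GLSL shader) driven by noise, which is where the pluggable tools mentioned above come in.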

Artist pixel noizz also shares an evocative performance called “Generative Cities,” quite unlike the work above. Full description after the jump. It’s a different aesthetic, but it shares some of the Quartz Composer goodness in its toolset.

Generative Cities (alpha version) from pixel noizz on Vimeo.

first test of this performance, presented at the b-seite festival mannheim. 2010.

40 min.

music tracks: astrowind, vicsek and moog conspiracy
special thanks to auderoselavy for the video recording.

chapter 1
arrivals: structure-state
chapter 2
transport: virtual-state
chapter 3
reconstruction: organic-state
chapter 4
time: discontinuation

The performance contains four chapters and four different aspects of my generated city. Nothing is fixed at the beginning; just small video fragments from Gabor Body’s film are shown. People are arriving into the future. I don’t consider this state as ‘future’, but as a borderline between the space of reality (where the audience is sitting during my performance) and my own structure, i.e. my generated city.

Entering the city, I try to establish some basic rules: space, aspects, structure. It is the structure that will construct the next level, the virtual state. The virtual state is nothing else but a modelled world with a virtual surface. During the performance I try to walk around in the space of this city, drawing the audience further and further into it. My perspective is more like a ‘human viewpoint’; the movements are not predefined. I try to walk from object to object, observing them and collecting my own experiences, a different one every time! This is not a fixed part of the performance, as I have a real three-dimensional controller for improvisations in between the fixed objects. At the end I let the light into this virtual space and try to fill it with more and more realistic sceneries. As the light becomes increasingly abstract, the impression of the entire situation gradually becomes more and more organic.

This organic-like visualization of my virtual city then becomes more and more like a self-analysed scene. At this point, with noise in the background, I try to fly back to reality, showing some video fragments with very trivial symbols (a plant, a clock, a dead man in bubbling water, a woman walking along a corridor). With this last step I try to finalize the journey, so I discontinue it. This is the end point.

tech: TripleHead2Go, Quartz Composer, iPhone, Wii
credits: Vade (v002, many plugins), George Toledo, Kineme as always, and the folks on Kineme.net.
Gabor Body and Mr. IM for the additional footage.

Thanks for the inspiration, and viva TripleHead2Go.