Syphon Teaser from vade on Vimeo.
In the audio realm, piping audio and MIDI between apps is commonplace (see ReWire, JACK, Soundflower, IAC MIDI, etc.). But imagine if you could take textures and frames from one app and share them, live and in real time, with another app. That’s the vision of Syphon, a Mac-only, open-source framework that promises to share graphics and live video between visual apps on the platform, from 3D tools to live VJ/video software.
Syphon is in development, and not everything is public-facing yet, but it’s moving incredibly fast, and we’re able to take a first look. (The software is currently in a quite-stable private beta.) First, its creators sum up what it’s about more eloquently than I can:
Syphon is an open source Mac OS X technology that allows applications to share frames – full frame rate video or stills – with one another in realtime. This means you can leverage the expressive power of a plethora of tools to mix, mash, edit, sample, texture-map, synthesize, and present your imagery using the best tool for each part of the job. Syphon gives you flexibility to break out of existing graphics pipelines and mix creative applications to suit your needs.
Sharing frames happens behind the scenes so you can minimize or hide your apps and frames still flow. Frames shared via Syphon support an alpha channel, so rendered 3D content, masks, keys and transparency all work as expected. Wherever possible, published frames stay on the graphics card, so Syphon is fully hardware accelerated, and does not duplicate resources unnecessarily. This means Syphon is fast. You can share HD video and larger in realtime with little overhead between applications.
Lastly, Syphon is designed for and by new media technologists, realtime video performance artists and visualists.
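To ground that transparency point: once a client app has a shared frame sitting in an OpenGL texture, compositing it over the client’s own output with its alpha intact is ordinary GL blending. Here’s a rough, 10.6-era fixed-function sketch of my own, not Syphon code, assuming premultiplied alpha, a rectangle texture, and a simple -1..1 orthographic setup; sharedTex, width, and height are placeholders.

```c
#include <OpenGL/gl.h>
#include <OpenGL/glext.h>

// Draw a received frame (rectangle texture, premultiplied alpha assumed)
// over whatever the client app has already rendered.
static void composite_shared_frame(GLuint sharedTex, GLfloat width, GLfloat height)
{
    glEnable(GL_BLEND);
    glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);    // premultiplied-alpha "over"
    glEnable(GL_TEXTURE_RECTANGLE_ARB);
    glBindTexture(GL_TEXTURE_RECTANGLE_ARB, sharedTex);

    glBegin(GL_QUADS);                              // fixed-function, 10.6-era GL
    glTexCoord2f(0.0f,  0.0f);   glVertex2f(-1.0f, -1.0f);
    glTexCoord2f(width, 0.0f);   glVertex2f( 1.0f, -1.0f);
    glTexCoord2f(width, height); glVertex2f( 1.0f,  1.0f);
    glTexCoord2f(0.0f,  height); glVertex2f(-1.0f,  1.0f);
    glEnd();

    glBindTexture(GL_TEXTURE_RECTANGLE_ARB, 0);
    glDisable(GL_TEXTURE_RECTANGLE_ARB);
    glDisable(GL_BLEND);
}
```

Rectangle textures use pixel coordinates rather than normalized ones, which is why the texture coordinates run to width and height here.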
Syphon Introduction from vade on Vimeo.
Let’s put this in simpler terms: Syphon is all about collaboration, whether that means working seamlessly across tools on your own or, as described below, opening up a more fluid interchange of ideas with another artist.
Anton Marini, who co-created the first implementation with Tom Butterworth, tells us more.
The genesis of Syphon for me is essentially two-fold. I had been working with Mary Ann [Benedetto] (my better half, aka outpt) and wanted to perform together in some way. Neither of us had a video mixer at the time, and investing $1K in a standard-definition analog mixer (V4 and friends) seemed completely insane by every measure, even though there was no talk of the Spark DFuser at the time. [Ed.: That’s the awesome community DVI mixer initiative; more on that soon, as the world that has both the DFuser and Syphon in it is looking very happy, indeed.]
She also used Processing, and I was moving between Max/Jitter, Quartz Composer/VDMX, and my own custom Cocoa VJ app. The solution I came up with to let Mary Ann and me work together was a hardware-accelerated screen capture plugin for Quartz Composer, which let VDMX capture the rendering of a Processing sketch running locally on my system, controlled remotely by Mary Ann from her machine. She built a front-end controller in Max/MSP and used that to control the Processing sketch, which I then had on screen and brought into my mix in VDMX.
This was fragile, but worst of all, it meant capturing on-screen pixels directly, so anything overlapping or occluding the window in the capture area would show up in the captured output. Suffice it to say, it was inelegant, clumsy, and took up extra monitor real estate.
I asked around at the time if there were ways to share actual texture resources between apps, and the universal answer was simply “not possible”.
The other part of wanting to do this comes from my background in video engineering; using Max and Jitter, routing video around (think video patch bay) was second nature. Why couldn’t the same thing happen between pieces of software, in a highly performant way? It seemed obvious.
Now it can, at least in OS X 10.6, thanks to a new API called IOSurface. [Ed.: For more on IOSurface, see Hidden Gems of Snow Leopard: IOSurface on the CocoaAdHoc blog.]
My theory as to why IOSurface exists, and why it’s specific to Mac OS X, is that it’s the result of the convergence of QuickTime’s age, the move to QTKit and Cocoa, and the gaps in QTKit’s 64-bit support. This is all conjecture (I have no idea if it’s true), but I know enough about the state of the APIs to think it’s likely one cause of IOSurface’s creation. [Ed.: Just to reaffirm that point: the ways of OS framework development are mysterious, but I agree with Anton that it may at least have been a significant contributing factor.]
QuickTime is old and funky enough that porting it to 64-bit is non-trivial, yet devs needed QTKit to function in Cocoa land. Cue the move to 64-bit-capable OSes (10.4 initially, then 10.5, and now fully 64-bit 10.6), and APIs had to be updated at every level of the stack. QuickTime was, and still is, the odd man out with regard to full 64-bit support: many things in QTKit won’t work or can’t be done unless you fall back to the 32-bit-only APIs in the older QuickTime library. So a cross-app mechanism was presumably needed to pass frames from a 32-bit process (with access to those 32-bit-only QuickTime APIs) to a 64-bit front-end host app, with very little to no overhead. I believe this is why IOSurface was created. That need, and the functionality still missing from QTKit to this day, is also what made it a public API rather than internal-only.
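[Ed.: To make that a little more concrete, here’s a minimal sketch, mine rather than anything from Syphon, using only the public 10.6 IOSurface and CGL calls: one process creates a shareable GPU surface, and another wraps it in an OpenGL texture without copying pixels. The IPC that carries the integer surface ID between the two apps is left as a comment, and error handling is mostly omitted. -PK]

```c
// Not Syphon itself, just the public 10.6 pieces it builds on.
#include <IOSurface/IOSurface.h>
#include <OpenGL/OpenGL.h>
#include <OpenGL/CGLIOSurface.h>
#include <OpenGL/gl.h>
#include <OpenGL/glext.h>

// Process A ("server"): create a GPU surface that other processes can find by ID.
static IOSurfaceRef create_shared_surface(int32_t width, int32_t height)
{
    int32_t bytesPerElement = 4; // BGRA, 8 bits per channel
    CFNumberRef w   = CFNumberCreate(NULL, kCFNumberSInt32Type, &width);
    CFNumberRef h   = CFNumberCreate(NULL, kCFNumberSInt32Type, &height);
    CFNumberRef bpe = CFNumberCreate(NULL, kCFNumberSInt32Type, &bytesPerElement);

    // kIOSurfaceIsGlobal is what makes IOSurfaceLookup() work from another process.
    const void *keys[]   = { kIOSurfaceWidth, kIOSurfaceHeight,
                             kIOSurfaceBytesPerElement, kIOSurfaceIsGlobal };
    const void *values[] = { w, h, bpe, kCFBooleanTrue };
    CFDictionaryRef props = CFDictionaryCreate(NULL, keys, values, 4,
                                               &kCFTypeDictionaryKeyCallBacks,
                                               &kCFTypeDictionaryValueCallBacks);

    IOSurfaceRef surface = IOSurfaceCreate(props);
    CFRelease(props); CFRelease(w); CFRelease(h); CFRelease(bpe);

    // IOSurfaceGetID(surface) yields a plain 32-bit ID; hand that to the other
    // app via whatever IPC you like (mach ports, distributed notifications, ...).
    return surface;
}

// Process B ("client"): wrap the shared surface in a GL texture -- no pixel copy.
static GLuint texture_from_surface_id(CGLContextObj cgl_ctx, IOSurfaceID surfaceID)
{
    IOSurfaceRef surface = IOSurfaceLookup(surfaceID);
    if (surface == NULL)
        return 0; // the publishing app has gone away; nothing to bind

    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_RECTANGLE_ARB, tex);
    CGLTexImageIOSurface2D(cgl_ctx, GL_TEXTURE_RECTANGLE_ARB, GL_RGBA8,
                           (GLsizei)IOSurfaceGetWidth(surface),
                           (GLsizei)IOSurfaceGetHeight(surface),
                           GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, surface, 0);
    glBindTexture(GL_TEXTURE_RECTANGLE_ARB, 0);
    // (The IOSurfaceRef is intentionally kept alive here; release it once the
    // texture is no longer needed.)
    return tex;
}
```

Marking the surface global is what lets a second process find it by ID alone; a real implementation would also negotiate pixel format, synchronization, and lifetime rather than assuming BGRA.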
Once 10.6 came out and IOSurface became a public API, some folks took notice. However, there was no documentation and no announcement on its intended use.
Once it was apparent what IOSurface could do, I emailed a group of commercial VJ and media developers and tried to get a joint development effort going on an ‘Open Video Tap’ project. Folks were interested, but no one knew if it would work.
Months later, some small examples started popping up from experienced devs and random mailing list postings. At that point, I was working extensively with Tom Butterworth on QC plugins and other projects. We ended up doing some basic IOSurface work for a 64-bit QuickTime plugin for QC. Once we figured out how to get that working, we went back to prototype a basic framework to allow frame sharing to happen easily.
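[Ed.: And to round out the earlier sketch, again as an illustration of the underlying idea rather than Syphon’s actual implementation: on the publishing side, writing a frame can be as simple as rendering into the texture that wraps the shared surface, via a framebuffer object, so pixels never leave the GPU. This assumes the EXT_framebuffer_object extension; in real code you’d create the FBO once and reuse it rather than rebuilding it per frame. -PK]

```c
#include <OpenGL/gl.h>
#include <OpenGL/glext.h>

// "Publish" a frame by rendering into the texture that wraps the shared
// IOSurface (created as in the previous sketch). Any other process that has
// wrapped the same surface sees the new frame with no copy in between.
static void publish_frame(GLuint sharedTex, GLsizei width, GLsizei height)
{
    GLuint fbo = 0;
    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                              GL_TEXTURE_RECTANGLE_ARB, sharedTex, 0);

    glViewport(0, 0, width, height);
    // ... draw this frame's output here, exactly as you would to the screen ...

    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);    // back to on-screen rendering
    glDeleteFramebuffersEXT(1, &fbo);
}
```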
Once Tom got hold of the framework, he put in an amazing amount of work to polish and optimize it, and really make it shine. He re-engineered it so that clients and servers are very tolerant of interruptions and crashes: a client can’t bring down a server, and vice versa. In other words, he solved the hard problems. 🙂
As it stands, we have fully working and debugged implementations of Syphon for Quartz Composer, FreeFrameGL for Mac, and Max/MSP/Jitter. We have a partial implementation for [game engine] Unity 3D Pro 3.0, and are working on a few more. All of these implementations are Mac-only, because they rely on the 10.6-specific IOSurface framework. [Ed.: Check out Processing – that’s a high priority, if anyone wants to claim the bounty and has some JNI and OpenGL experience. -PK]
Hopefully that fills in the picture. Now I can use AUVI with VDMX via Syphon – and many other combinations. There’s lots of fun to be had, and no more compromising on features. 🙂
One side note: why shouldn’t this be possible on other platforms, given that the action is all happening on the GPU? Odds are, the answer goes back to Anton’s theory about how IOSurface came about in the first place. Whatever the reason, the facilities currently exist only on the Mac. More fundamentally, the reason this wasn’t a feature of graphics driver architectures to begin with may simply be that no one saw an obvious need. Once you see the results in action, the power of such a facility is apparent, so perhaps further development on other platforms will become feasible. Elsewhere, though, this would require the support of the graphics driver vendors (AMD, NVIDIA, Intel), without a single vendor like Apple to demand it – so for now, expect this to remain Mac-only. (Graphics driver writers, if you’re out there and need us to make a case to your bosses…)
Back to its original purpose: what’s incredible about being able to pass frames between applications in real time is that it transforms the role of a visual application. Instead of being an island, a single tool is interconnected with everything else you run. It’s a fantastic feeling, the sense that the computer really is an open environment.
As Anton puts it, the hope is to build an “ecosystem” around shared visual tools.
Stay tuned for more as Syphon becomes public and more client support is made available.
Video Demos
Quartz Composer + VDMX
Syphon for QC & VDMX from vade on Vimeo.
FreeFrameGL (Mac) + Resolume Avenue
Syphon for FFGL & Resolume from vade on Vimeo.
Max/MSP/Jitter
Syphon 2K & 4K Jitter Demo from vade on Vimeo.