ActionScript programmer Peter Kaptein has done some brilliantly creative work to mimic the infamous gestural interface in the film Minority Report using only Flash, a webcam, a printer, and your fingers. (Okay, you may want to pick up some Scotch tape, too.)

The tricks to make it all work (as I see it, anyway):
1. Treat distance as “pressure” for gestures.
2. Use two markers, allowing for multi-finger manipulation of the interface.
3. Create combined actions – and provide lots of visual feedback.
4. Avoid transformations perpendicular to the screen – by sidestepping the nuances of depth motion, he navigates around the problem of losing marker tracking when a marker tilts away from the camera.
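The first two tricks lend themselves to a quick sketch. This is illustrative only, and in Python rather than the project's ActionScript; the function names, the 120-pixel "touch" size, and the marker coordinates are hypothetical stand-ins for whatever your tracker actually reports.

```python
import math

def pressure(marker_size_px, touch_size_px=120.0):
    """Trick 1: treat apparent marker size (a proxy for distance to
    the camera) as a 0..1 'pressure' value.  Bigger on screen means
    closer to the lens, which reads as a harder press."""
    return max(0.0, min(1.0, marker_size_px / touch_size_px))

def two_marker_transform(a, b, a0, b0):
    """Trick 2: given current positions (a, b) and previous positions
    (a0, b0) of two finger markers, derive pan / scale / rotation
    deltas, exactly as in a two-finger pinch gesture."""
    def mid(p, q):
        return ((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0)
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    def ang(p, q):
        return math.atan2(q[1] - p[1], q[0] - p[0])
    m, m0 = mid(a, b), mid(a0, b0)
    pan = (m[0] - m0[0], m[1] - m0[1])          # midpoint drag
    scale = dist(a, b) / max(dist(a0, b0), 1e-6)  # pinch in/out
    rot = ang(a, b) - ang(a0, b0)               # twist, in radians
    return pan, scale, rot
```

Feeding it a pinch – say, `two_marker_transform((0, 0), (2, 0), (0, 0), (1, 0))` – yields a doubled scale, no rotation, and a slight pan of the midpoint, which is the kind of combined action trick 3 describes.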

You’ll also notice he’s got a lot of light.


But the results are simply stunning. And a lot of it isn’t technical trickery, but smart interface design, controlled use of the gestures, and the use of extensive visual feedback to keep the user connected to the experience.

In other words, in a world that may seem all about engineering prowess, good design is still essential. (And that, to me, was part of what made the movie’s vision of the future so compelling in the first place.)

He has code excerpts for each of the individual tracking techniques. It should be possible to port those code snippets to your language of choice – or simply to be influenced by some of the ideas – and go on to do something very different from this project. In other words, augmented reality is finally evolving beyond the Hello, World stage.

“Minority Report” interface using Flash and FLARToolkit

If you are interested in these techniques, I taught a short workshop on doing this stuff in Processing last week; once I’ve refined my code a bit, I’ll share it here.

Thanks to Miguel Isaza via Twitter for the tip!

  • Congratulations – your augmented reality exercise is great. I have done some experiments with color tracking to create some interactions, but I haven’t played with the ARToolKit library for Processing. It would be great if you could share your work with Processing.

  • 😉

    Excellent job!

  • I feel like all the hardware and sensing techniques we need are here already. Especially so long as we're comfy putting on some gloves or thimbles.

    The next trick is going to be the long, long process of developing this hands-in-the-air interaction vocabulary.

    I'm thinking of the mouse. Technically, an awesome little piece of hardware. And there were a lot of neat demos at first. But the full mouse vocabulary we have today took some time to develop.

  • There is a colour tracker example in Pd, I seem to remember. I guess if you had some bright colours on your fingers (say, a dark room and a very local light, say from a USB LED) you could make it easy for the webcam on the front of your laptop to track them.

    Then I guess it would just be a matter of detecting certain gestures. Hook this up to SuperCollider or whatever via LAN messages and you're away!

    There are also some other videos like this which are worth looking at:
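Following up on the comment above: SuperCollider's language (sclang) listens for OSC messages over UDP on port 57120 by default, so "hooking it up via LAN messages" really is a few lines of code. Here's a hedged sketch that hand-encodes an OSC 1.0 message in Python – the `/gesture/pinch` address is invented for illustration, and a real project would probably reach for an OSC library instead.

```python
import socket
import struct

def osc_message(address, *floats):
    """Encode a minimal OSC 1.0 message with float32 arguments.
    OSC strings are null-terminated and padded to 4-byte boundaries;
    floats are big-endian."""
    def pad(b):
        return b + b"\x00" * (4 - len(b) % 4)  # always appends 1-4 nulls
    msg = pad(address.encode("ascii"))
    msg += pad(("," + "f" * len(floats)).encode("ascii"))  # type tags
    for f in floats:
        msg += struct.pack(">f", f)
    return msg

def send_gesture(value, host="127.0.0.1", port=57120):
    """Fire a gesture value at sclang's default UDP port."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(osc_message("/gesture/pinch", value), (host, port))
    sock.close()
```

On the SuperCollider side, an `OSCdef` (or `OSCresponder` in older versions) listening on the same address path would pick these up and map them to synth parameters.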

  • Some design challenges and concerns to consider – can anyone address these?

    Do I have to get 2D barcodes tattooed to my finger tips or will my fingers be born that way? Or is that just a side issue?

    Can it come with some hand sanitizer so I won't get dataviz germs if I interact with someone else's augmented reality? You know, in case they picked their earballs, toegaps, or nose before visualizing stuff?

    I prefer to wear my nails long, but not effeminately so — it's a cultural thing. Will this interfere with my data sets in any way, perhaps by scuffing or scratching the data in some fashion?

    Will the data leave any stains or cause any skin sores or something if it's of a particular variety or representative of specific kinds of things? Pollution for example? Or toxic spills, crime, solar flares — that sort of thing?

  • @ Julian Bleecker

    Um, the AR markers don't have to be tattooed on your fingertips.

    I'm envisioning a future where you could actually use a custom micro projector (with its origin perhaps near your pelvic region) to project the codes on your fingertips instead. Of course this would be a rear projection, or might need some kind of bounce card to accurately track the position of your fingertips. Either way, I think this kind of solution will be totally feasible in a couple of years.

    But think, you could easily use some kind of gaze tracker to detect the kind of gesture you wanted to make (maybe by blinking a few times rapidly or something), and the codes you were projecting on your fingers would change too. This would enable you to most likely interact better with your physical environment, and probably has numerous social connectivity applications, let alone enterprise solutions.

  • Actually, I do agree with others that IR emitters could be a good way to go. You know, it does start to suggest that it's even better if we can try to write these libraries in a more generic and modular way… I think that'll come with time, but it's something I'll be thinking about with my own code. 😉

  • Pingback: Touch User Interface

  • naus3a

    Nice framework, and I like the idea that it can fit inside a browser. I wrote similar software that works markerless using optical flow. It does the same stuff: moving, rotating, resizing. I think the next step will be to integrate these interface experiments with standard mouse events.

  • I recently did something similar using optical flow in combination with colour tracking for Squeaky Wheel Media Arts Center in Buffalo.

  • @Julian: HAH! Beautiful.

  • Pingback: Create Digital Motion » An Abstract Visual Interface, Made of Light and Gestures, and Full-Body Virtual Exercise

  • kyle mcdonald, great point. yeah this shit is poppin' off gnarly ill, but we got a long way before the techies could design standard protocol software to fit this new gestural vocab. that is unless ur talkin more along the lines of VDMX, cuz i've got at least baker's dozen worth of sliders that i could do a lil' multidimensional grind to with this… :]

  • Pingback: minority report in flash / CODEISPOETRY

  • Pingback: Augmented Reality example – Minority Report Interface « Ideas for Teaching Computer Technology to Kids

  • Pingback: Comment on Minority Report Interface, Implemented in Flash + … | The Liquid Engine

  • Pingback: » Blog Archive » “Minority Report” interface in Flash

  • Rphillips

    I am looking for someone to build a Flash project with a Minority Report theme that I can embed in a Crestron demonstration for upcoming tradeshows. Anyone interested in talking?

