Two Kinect talks: Open Source Bridge and ITP Camp

Over the last couple of weeks, I’ve given two public presentations about the Kinect. This post collects the relevant links, media, and follow-up for those talks. The first talk, last week, was at Open Source Bridge in Portland, Oregon. It was a collaboration with Devin Chalmers, my longtime co-conspirator. We designed our talk to be as much like a circus as possible. We titled it Control Emacs with Your Beard: the All-Singing All-Dancing Intro to Hacking the Kinect.

Devin demonstrates controlling Emacs with his “beard”.

Our first demo was, as promised in the talk title, an app that let you control Emacs with your “beard”: it launched Emacs when you put on a fake beard, generated all kinds of impressive-looking C code as you waved your hands in front of you (demonstrated above), and quit Emacs when you took the beard off. Our second app sent your browser tabs to the gladiator arena. It let you spare or execute (close) each one with a Caesar-esque thumbs-up or thumbs-down gesture. To get you in the mood for killing, it also played a clip from Gladiator each time you executed a tab.

Both of these apps used Java’s Robot class to issue keystrokes and fire off terminal commands. It’s an incredibly helpful tool for controlling any GUI app on your machine. All our code (and Keynote) is available here: github/osb-kinect. Anyone building gestural interfaces for assistive tech (or other kinds of alternative input to the computer) should get to know Robot well.
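To give a sense of how little glue is involved, here’s a minimal Java sketch of that pattern (not our actual demo code): Robot types a key chord into whichever app has focus, and Runtime.exec fires off a shell command. The specific chord and the Mac-only “open -a Emacs” command are just placeholders.

    import java.awt.Robot;
    import java.awt.event.KeyEvent;

    public class GestureKeystrokes {
        public static void main(String[] args) throws Exception {
            Robot robot = new Robot();

            // An Emacs-style chord: hold Control, tap B, release.
            // (These keys are placeholders, not our demo's actual bindings.)
            robot.keyPress(KeyEvent.VK_CONTROL);
            robot.keyPress(KeyEvent.VK_B);
            robot.keyRelease(KeyEvent.VK_B);
            robot.keyRelease(KeyEvent.VK_CONTROL);

            // Fire off a terminal command -- here the Mac-only "open" launcher.
            Runtime.getRuntime().exec(new String[] { "open", "-a", "Emacs" });
        }
    }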

In addition to these live demos, we also covered other things you can do with the Kinect, like 3D printing. I passed around the MakerBot-printed head of Kevin Kelly that I made at Foo Camp:

Kevin Kelly with a tiny 3D printed version of his own head.

We also showed Nicolas Burrus’s Kinect RGBDemo app, which does all kinds of neat things like scene reconstruction:

Me making absurd gestures in front of a reconstructed image of the room.

Tonight I taught a class at ITP Camp about building gestural interfaces with the Kinect in Processing. It overlapped somewhat with the Open Source Bridge talk. In addition to telling the story of the Kinect’s evolution, I walked through some of the details of working with Simple OpenNI’s skeleton API. I wrote two apps based on measuring the distance between the user’s hands. The first simply displayed that distance, in pixels, on the screen. The second used the distance to scale an image up and down, and the position of one hand to place it on screen: a typical Minority Report-style interaction.
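The heart of that first app is just a handful of Simple OpenNI calls: ask for the two hand joints, project them from real-world coordinates into screen coordinates, and measure the distance between them. The sketch below is a rough reconstruction of the idea rather than the exact code from the repo; it assumes the Simple OpenNI calibration callbacks (the “Psi” pose dance) and a single tracked user with id 1.

    import SimpleOpenNI.*;

    SimpleOpenNI kinect;

    void setup() {
      size(640, 480);
      kinect = new SimpleOpenNI(this);
      kinect.enableDepth();
      kinect.enableUser(SimpleOpenNI.SKEL_PROFILE_ALL);
    }

    void draw() {
      kinect.update();
      image(kinect.depthImage(), 0, 0);

      if (kinect.isTrackingSkeleton(1)) {  // assumes the first calibrated user got id 1
        PVector leftHand = new PVector();
        PVector rightHand = new PVector();
        kinect.getJointPositionSkeleton(1, SimpleOpenNI.SKEL_LEFT_HAND, leftHand);
        kinect.getJointPositionSkeleton(1, SimpleOpenNI.SKEL_RIGHT_HAND, rightHand);

        // Joint positions come back in real-world millimeters;
        // project them into screen coordinates before measuring in pixels.
        PVector leftOnScreen = new PVector();
        PVector rightOnScreen = new PVector();
        kinect.convertRealWorldToProjective(leftHand, leftOnScreen);
        kinect.convertRealWorldToProjective(rightHand, rightOnScreen);

        float handDistance = leftOnScreen.dist(rightOnScreen);

        fill(255, 0, 0);
        ellipse(leftOnScreen.x, leftOnScreen.y, 20, 20);
        ellipse(rightOnScreen.x, rightOnScreen.y, 20, 20);
        fill(255);
        text("hands are " + int(handDistance) + " pixels apart", 10, 20);
      }
    }

    // The pose-detection/calibration dance OpenNI requires before skeleton tracking.
    void onNewUser(int userId) {
      kinect.startPoseDetection("Psi", userId);
    }

    void onStartPose(String pose, int userId) {
      kinect.stopPoseDetection(userId);
      kinect.requestCalibrationSkeleton(userId, true);
    }

    void onEndCalibration(int userId, boolean successful) {
      if (successful) kinect.startTrackingSkeleton(userId);
    }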

The key point: to make something interactive in a way the user can viscerally understand, all you really need is a single number that tightly corresponds to what the user is doing. With just that, ITP types can make all kinds of cool interactive apps.
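In the image-scaling app, that one number is the hand distance: map() turns it into an image width, and one hand’s screen position places the image. Again, this is a sketch of the idea rather than the repo’s exact code; the image filename and the map() ranges are placeholders you’d swap in and tune by eye, and the calibration callbacks are the same as in the previous sketch.

    import SimpleOpenNI.*;

    SimpleOpenNI kinect;
    PImage img;

    void setup() {
      size(640, 480);
      kinect = new SimpleOpenNI(this);
      kinect.enableDepth();
      kinect.enableUser(SimpleOpenNI.SKEL_PROFILE_ALL);
      img = loadImage("photo.jpg");  // placeholder: any image in the sketch's data folder
    }

    void draw() {
      kinect.update();
      background(0);

      if (kinect.isTrackingSkeleton(1)) {
        PVector leftHand = new PVector();
        PVector rightHand = new PVector();
        kinect.getJointPositionSkeleton(1, SimpleOpenNI.SKEL_LEFT_HAND, leftHand);
        kinect.getJointPositionSkeleton(1, SimpleOpenNI.SKEL_RIGHT_HAND, rightHand);

        PVector leftOnScreen = new PVector();
        PVector rightOnScreen = new PVector();
        kinect.convertRealWorldToProjective(leftHand, leftOnScreen);
        kinect.convertRealWorldToProjective(rightHand, rightOnScreen);

        // The single number driving the whole interaction:
        float handDistance = leftOnScreen.dist(rightOnScreen);

        // Map it onto an image width (the input range is a guess you'd tune by eye),
        // and let one hand's position place the image on screen.
        float w = map(handDistance, 50, 600, 50, width);
        float h = w * ((float) img.height / img.width);
        imageMode(CENTER);
        image(img, rightOnScreen.x, rightOnScreen.y, w, h);
      }
    }

    // Same pose-detection/calibration callbacks as the previous sketch.
    void onNewUser(int userId) {
      kinect.startPoseDetection("Psi", userId);
    }

    void onStartPose(String pose, int userId) {
      kinect.stopPoseDetection(userId);
      kinect.requestCalibrationSkeleton(userId, true);
    }

    void onEndCalibration(int userId, boolean successful) {
      if (successful) kinect.startTrackingSkeleton(userId);
    }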

The class was full of clever people who asked all kinds of interesting questions and suggested ways to apply this stuff. I came away with a bunch of ideas for the book, which is helpful because I’m going to be starting the skeleton tracking chapter soon.

Of course, all of the code for this class is online in the ITP Camp Kinect repo on GitHub. That repo includes everything I showed as well as a copy of my Keynote presentation.
