Kinect and XML
Wednesday March 30th 2011, 7:55 am
Filed under: Kinect

(Check out the rest of the Kinect section for more tutorials.)

Update: Our XML-based Kinect mocap tool now has its own website. For general Kinect setup help, look here.

To record joint data (the OSC output of the OSCeleton Kinect mocap utility) to an XML file, to play the XML back, or to convert it for use in After Effects, get this KinectToPin Processing sketch from GitHub.

…the practical upshot of this? You can send all your motion capture information to your XML file in realtime, without worrying about capturing your video image. Then you can read the file back later and do all the complex rendering you want in full HD. By the way, the sketches will work with anything that uses OSC, not just a Kinect; all you have to do is modify the XML tags to suit your needs.
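
To make the record-to-XML idea concrete, here's a minimal Python sketch of buffering joint frames and serializing them to XML. The tag names ("mocap", "frame", "joint") are illustrative only, not the actual KinectToPin schema:

```python
# Hypothetical sketch: turn captured joint frames into an XML document.
# The element names used here are assumptions, not KinectToPin's format.
import xml.etree.ElementTree as ET

def frames_to_xml(frames):
    """frames: list of dicts mapping joint name -> (x, y, z) tuples."""
    root = ET.Element("mocap")
    for i, joints in enumerate(frames):
        frame = ET.SubElement(root, "frame", number=str(i))
        for name, (x, y, z) in joints.items():
            ET.SubElement(frame, "joint", name=name,
                          x=str(x), y=str(y), z=str(z))
    return ET.tostring(root, encoding="unicode")

# One captured frame: the head joint at a made-up position.
xml_text = frames_to_xml([{"head": (0.5, 0.25, 1.8)}])
print(xml_text)
```

Because the file is just text, playback is the same loop in reverse: parse the XML, step through the frames at your capture rate, and hand each joint position to your renderer.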

Installation Checklist
Wednesday March 23rd 2011, 6:22 am
Filed under: Concepts

1. Computers:

  • How many computers does this run on?
  • What are their specs (RAM, CPU, storage, graphics card, etc.)?
  • Do they need to communicate?
  • What types of network connections do they have (WiFi, gigabit ethernet, etc.)?
  • Do they need any peripherals (mouse, keyboard, Wiimote, Kinect, etc.)?
  • What types of peripheral connections do they have (USB, Firewire, Bluetooth, etc.)?

2. Media

  • What’s your storage medium (hard drive, SSD, optical disc, etc.)?
  • What’s your backup strategy (RAID, Time Machine, disk image, spare drive or optical disc, etc.)?
  • Will there be an attendant present in case the computers crash?

3. Video

  • How is the video being presented (projector, LCD, CRT, LED wall, etc.)?
  • Is there live video input?
  • Does it need to be recorded?
  • What kind of cameras are you using (webcam, industrial, DV, DSLR, etc.)?
  • Do they have a usable “live view” mode?
  • How are they connected to the computer (Firewire, USB, analog, etc.)?
  • Do they have manual focus/iris/white balance?

4. Audio

  • How is the sound being presented (built-in speakers, external amplifier, stereo, 5.1, etc.)?
  • Is there live sound input?
  • Does it need to be recorded?
  • What kind of microphones are you using (shotgun, contact, cardioid, lavalier, etc.)?
  • How are they connected to the computer (Firewire, USB, analog)?
  • Do they have manual gain control?

5. Sensors

  • What other types of live input do you need (light, temperature, vibration, tilt, acceleration, etc.)?
  • How is the data being presented?
  • Does it need to be recorded?
  • Can you find commercial products that fit your budget, or will you need to make your own sensors?
  • How are they connected to the computer (Arduino, MakingThings, serial port, etc.)?
  • What external hardware controls do you need (on/off switch, gain, threshold, status lights, etc.)?

Types of Animation
Wednesday March 23rd 2011, 6:20 am
Filed under: Concepts

1. Time-lapse: Photographing a scene only at selected intervals, creating an illusion of spontaneous change when played back. The earliest form of animation, predating live-action cinema; various playback methods for time-lapse photos had been invented by the 1830s.

2. Stop-motion: Photographing a single object and moving it while the camera is stopped. Can be done with paper cut-outs, characters with poseable armatures, or even cooperative human actors (“pixilation,” which with this spelling has nothing to do with computers; it originally meant “possession by evil spirits”). Quickly adopted in the 1900s for visual effects in early silent films.

3. Replacement: Replacing the object being photographed with a different object while the camera is stopped. “Classical animation” is replacement animation using pencil drawings on paper or ink on plastic cels; this technique dominated animated feature production until the late 1980s. Less common variations use photo collage or sculpture. First came into wide use with “lightning artist” vaudeville acts in the 1910s, where audiences would watch an animated film being made.

4. Rotoscoping: Using live action as a frame-by-frame reference for animation. Traditionally done by projecting film footage and tracing it, an established technique by the 1940s. “Motion capture” is a modern variant of rotoscoping, in which the analysis of movement is done with a computer instead of by hand.

5. Computer graphics: Breaking an image down into mathematical elements and manipulating the values of those elements. Usually done by representing an image as a grid of colored dots (“pixels”). Widely adopted by the 1990s; the most common form of animation in use today.

Kinect Setup Links
Wednesday March 23rd 2011, 6:02 am
Filed under: Kinect

(Check out the rest of the Kinect section for more tutorials.)

The Kinect has two basic tricks: first, it grabs a depth map, and second, it figures out the joint coordinates of your skeleton from the depth map. There are lots of easy-to-use options for the first trick, but not so many for the second. Still, even using the depth maps alone, you can track motion more effectively than with most RGB methods.
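
Why are depth maps alone enough for tracking? Because unlike an RGB image, a depth map lets you separate "near" from "far" with a single threshold and then track the centroid of whatever's close to the camera. A toy Python sketch of the idea (the 4x4 grid and the 1000 mm cutoff are made up; a real Kinect frame is 640x480 with depth roughly in millimeters):

```python
# Toy depth-map tracking: threshold out the background, then take the
# centroid of the remaining "near" pixels. Grid values are illustrative.
def near_centroid(depth, threshold):
    """depth: 2D list of distances; returns (row, col) centroid or None."""
    hits = [(r, c)
            for r, row in enumerate(depth)
            for c, d in enumerate(row)
            if d < threshold]
    if not hits:
        return None
    return (sum(r for r, _ in hits) / len(hits),
            sum(c for _, c in hits) / len(hits))

frame = [
    [2000, 2000, 2000, 2000],
    [2000,  800,  900, 2000],
    [2000,  850,  950, 2000],
    [2000, 2000, 2000, 2000],
]
print(near_centroid(frame, 1000))  # the near blob centers on (1.5, 1.5)
```

Doing the same thing with an RGB camera means fighting lighting, shadows, and background color; with a depth map it's one comparison per pixel.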

0. Easiest Installer Options (current—start here!)
There are now some easier installer options for OpenNI/NITE; I recommend starting with these!

1. Depth Maps (outdated)
On a Mac, you can get started quickly. For an initial test, run CocoaKinect, a small app that just displays the depth map. Then there’s a Processing library and a Max/MSP/Jitter external that work out of the box. You can also get a Quartz Composer plugin, which runs on the latest version of Isadora as well, though its installation is slightly more complicated. Which brings me to:

2. My Bundles (outdated)
With few exceptions, installing software for the Kinect requires getting a program from site A, a couple of drivers from sites B and C, and an installation tutorial from site D. To get you started quickly, I’ve collected all the bits you need to install.

I made these bundles for my own convenience, and they’re all almost certainly out of date as you read this, so once you have them up and running you should get updated versions of the programs and drivers from the original sources listed here.

3. Skeleton Tracking (outdated)
On Windows, there’s BrekelKinect, a slick-looking all-in-one utility that can capture depth maps and do skeleton tracking, recording joint coordinates to BVH files (usable with Maya or other 3D programs). (It might be able to communicate live with other programs, but I haven’t tested that.)

On Mac, Windows, and Linux, there’s OSCeleton, which sends joint coordinates out as OSC data. It’s quite a bit harder to set up than BrekelKinect, but it can do live skeleton tracking and pass the information on to other programs. Here’s an example receiver patch for Isadora. (A warning: you’ll need to use Terminal commands to install OSCeleton. If you’re not comfortable with that, I’d start with one of the ready-to-use alternatives and dive into OSCeleton when you have a full day to spend poking at it until it works. Here’s a Mac setup tutorial.)
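
If you're curious what's actually inside those OSC packets, here's a rough Python sketch of decoding one message of the kind OSCeleton emits: a "/joint" address followed by a joint name, a user id, and x/y/z coordinates. The ",sifff" type signature is my assumption about OSCeleton's layout, but the wire format itself (null-padded strings, big-endian 32-bit ints and floats) is standard OSC:

```python
# Hedged sketch of hand-decoding one OSC message. OSC strings are
# null-terminated and padded to 4-byte boundaries; numbers are
# big-endian 32-bit. The "/joint" ",sifff" shape is assumed here.
import struct

def read_osc_string(data, pos):
    end = data.index(b"\x00", pos)
    s = data[pos:end].decode("ascii")
    pos = (end + 4) & ~3          # skip padding to the next 4-byte boundary
    return s, pos

def parse_osc_message(data):
    address, pos = read_osc_string(data, 0)
    typetags, pos = read_osc_string(data, pos)
    args = []
    for tag in typetags.lstrip(","):
        if tag == "s":
            s, pos = read_osc_string(data, pos)
            args.append(s)
        elif tag == "i":
            args.append(struct.unpack_from(">i", data, pos)[0])
            pos += 4
        elif tag == "f":
            args.append(struct.unpack_from(">f", data, pos)[0])
            pos += 4
    return address, args

def pad(b):
    return b + b"\x00" * (4 - len(b) % 4)  # OSC padding, at least one null

# Build a sample packet: head joint of user 1 at a made-up position.
packet = (pad(b"/joint") + pad(b",sifff") + pad(b"head")
          + struct.pack(">ifff", 1, 0.5, 0.25, 1.8))
address, args = parse_osc_message(packet)
print(address, args)
```

In practice you'd let an OSC library (or a host app like Isadora or Max) do this decoding for you; the point is just that the data is simple enough to route anywhere.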

Synapse (Mac/Win) is another powerful skeleton-tracking OSC app to experiment with, although it works somewhat differently than OSCeleton. It’s an all-in-one download that’s very easy to set up.

OSCulator (Mac) is an OSC routing app that can help you manage all the information that these OSC apps pump out; use this config file to get started.

4. Flash
To talk to Flash, you can use another app to get the Kinect data, analyze it, and send OSC using Flosc (Mac/Win/Linux) or Oscar (Mac). Other Flash options are AS3Kinect (Mac/Win/Linux) and Beckon (Win).