3D Depth Cues
Sunday October 30th 2011, 11:31 am
Filed under: Concepts

Monocular
1. Motion parallax: as you move, distant things appear to shift more slowly than close things (see the sketch at the end of this post).
2. Depth in motion: something coming closer to you gets bigger.
3. Perspective: parallel lines converge as they move away from you.
4. Familiar size: evaluating the position of an object whose size is known.
5. Relative size: comparing the position of an unknown object to one whose size is known.
6. Color and contrast: all else being equal, reds appear closer than blues, and high-contrast colors appear closer than low-contrast colors.
7. Accommodation: the focusing of the eye’s lens on a near or far object, as reported by the muscles that reshape it.
8. Occlusion: an object blocking something else is assumed to be in front of it.
9. Depth of field: blurrier objects are assumed to be further from the plane of focus.

Binocular
10. Stereopsis: the difference between the views of the left and right eye.
11. Convergence: the angle at which the lines of sight of our two eyes cross on the object viewed, as reported by our eye muscles (useful for objects closer than ~10m).
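
Most of these cues boil down to simple geometry. As a worked example of cue #1, here’s a toy Python sketch of motion parallax using pinhole-camera math; the focal length, camera move, and depths are all made-up numbers for illustration:

  # Motion parallax, pinhole-camera style: when the camera slides
  # sideways by dx, a point at depth z shifts on the image plane by
  # f * dx / z -- so nearer points appear to move faster.
  # (All numbers are made up for illustration.)

  f = 0.035   # focal length in meters (a 35mm lens)
  dx = 0.10   # camera slides 10cm to the right

  for z in [1.0, 5.0, 50.0]:           # object distances in meters
      shift_mm = 1000 * f * dx / z     # image-plane shift in millimeters
      print("depth %4.0fm -> image shift %.2fmm" % (z, shift_mm))

  # depth    1m -> image shift 3.50mm
  # depth    5m -> image shift 0.70mm
  # depth   50m -> image shift 0.07mm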



OBJ Batch Export
Saturday October 22nd 2011, 7:34 am
Filed under: Maya

Here’s how to export a sequence of frames from Maya as individual OBJ files (a scripted alternative follows the steps):

Step 1. Download this OBJ Exporter MEL script.

Step 2. In Maya, go to Window / Settings/Preferences / Shelf Editor to add the script as a button.

Step 3. Click the New Item icon to create a new blank button.

Step 4. Click the Icon Name icon to add a custom image for your button (included with the download).

Step 5. Go to the Command tab and copy-paste the script in. Make sure the radio button is set to MEL, the language the script is written in.

Step 6. Click Save All Shelves.

Step 7. The new button should now appear in your shelf. Click it…

Step 8. …to open the OBJ batch export panel.
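
If you’d rather script the export yourself than set up the shelf button, here’s a minimal sketch of the same idea using Maya’s Python commands instead of MEL. This isn’t the downloadable script, just the core loop; the frame range and output path are placeholders:

  # Minimal batch OBJ export, sketched in Maya Python rather than MEL.
  # Frame range and output path are placeholders -- adjust to taste.
  import maya.cmds as cmds

  cmds.loadPlugin("objExport", quiet=True)    # the OBJ exporter that ships with Maya

  start, end = 1, 24                          # frames to export
  for f in range(start, end + 1):
      cmds.currentTime(f)                     # step the timeline to this frame
      cmds.file("/tmp/frame_%04d.obj" % f,
                force=True,
                exportAll=True,
                type="OBJexport",
                options="groups=1;ptgroups=1;materials=1;smoothing=1;normals=1")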

An OBJ sequence is a great way to do replacement animation with 3D printouts.



Format Suggestions
Tuesday October 04th 2011, 7:06 pm
Filed under: Concepts

1. Archival master: This is the copy you file away forever, in the highest quality available, and in an “open” format that you can be reasonably sure will still be available many years into the future (that is, it’s open-source, or the patent’s expired). Usually a format meeting these requirements can’t be played back easily; it’s just for storage.

Recommendations:

  • picture:  1920×1080 @ 12fps, 23.976fps, or 24fps.
  • sound:   48kHz 24-bit, stereo or discrete 5.1.
  • codecs:  QuickTime PNG (24-bit color) or PNG image sequence (any bit depth); uncompressed AIFF or WAV files.

2. Submaster: This is the copy you make from the master and work with day-to-day. It’s slightly smaller and lower-quality than the master, and usually in a proprietary format that might be gone in five years, but works great for now. (High-quality HD video that still plays back adequately on an older computer is a relatively recent invention.) You can also use the submaster to make DVD and Blu-ray discs if you need to.

Recommendations:

  • picture:  1920×1080 @ 23.976fps or 24fps.
  • sound:   48kHz 16-bit, stereo or discrete 5.1.
  • codecs:  QuickTime Apple ProRes, Avid DNxHD, or Photo JPEG; PCM audio.
  • note: QuickTime DV is still a popular choice for SD material, but be aware that, unlike DVDs, the DV tape standard only supported NTSC and PAL frame rates. 24p DV tape hardware used hacky workarounds specific to each manufacturer, so if you’re working with legacy material I would transcode it all into a modern codec (see the sketch below).
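
If you do end up migrating legacy DV captures, a command-line tool like FFmpeg can batch the job. A minimal sketch, assuming FFmpeg is installed; the file names are placeholders:

  # Transcode a legacy DV capture to ProRes with uncompressed audio.
  # Assumes FFmpeg is on your PATH; file names are placeholders.
  import subprocess

  subprocess.run([
      "ffmpeg",
      "-i", "legacy_capture.dv",    # source DV file
      "-c:v", "prores",             # Apple ProRes video
      "-c:a", "pcm_s16le",          # 16-bit PCM audio
      "legacy_capture_prores.mov",
  ], check=True)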

3. Distribution copies: These are the copies you make from the submaster and hand out to your audience. They’re much smaller and lower-quality than the submaster, but can be easily passed around and viewed on many devices.

Recommendations:

  • picture:  1280×720 or 1920×1080 @ 23.976fps or 24fps.
  • sound:   48kHz 16-bit, stereo or discrete 5.1.
  • codecs:  H.264 MP4 video; AAC audio.
  • note: If you’re concerned about the rights issues surrounding H.264 and Blu-ray discs (which use H.264), look into the similar WebM format. Bizarrely, it’s a bit of a legal gray area whether H.264’s license terms allow an independent artist to exhibit movies commercially. It’s very unlikely that the owners of the H.264 patents would ever target individuals, since doing so would generate a lot of bad press for little profit. But until the issue’s resolved, it might be worth offering customers a WebM option as a hedge (see the sketch below).
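
If you already have FFmpeg around, adding a WebM version alongside your MP4 is painless. Another minimal sketch; the file names and bitrate are placeholders to tune for your material:

  # Encode a WebM (VP8 video, Vorbis audio) distribution copy.
  # Assumes FFmpeg is on your PATH; names and bitrate are placeholders.
  import subprocess

  subprocess.run([
      "ffmpeg",
      "-i", "submaster.mov",    # always encode from the submaster
      "-c:v", "libvpx",         # VP8 video
      "-b:v", "4M",             # target video bitrate -- tune to taste
      "-c:a", "libvorbis",      # Vorbis audio
      "distribution.webm",
  ], check=True)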


Log and Transfer
Wednesday September 07th 2011, 7:55 am
Filed under: Final Cut

Up through the mid-’00s, we used to capture footage in realtime from tape. If you think about it, “capture” is an exciting and dangerous sort of word which implies you’re getting a hunting party together and it might come back empty-handed. Now we “ingest” footage by copying it in non-realtime (that is, faster than realtime) from memory cards and hard drives…which to me suggests a much more relaxing process, one that happens after a nice meal. However, it’s a bit of an arcane process in Final Cut:

Step 1. Choose File / Log and Transfer.

Step 2. Click this obscure icon to import files.

Step 3. Click this other obscure icon to view your Preferences.

Step 4. Check to see if your camera shoots “24pa” or “24psf” video. If so, you’ll need these extra steps. (Traditional camcorders are more likely than DSLRs to have this issue.) ProRes is a good default choice for your video codec.

Step 5. Select the clips you want to transfer, or else choose Edit / Select All.

Step 6. When you’ve made your selections, choose Add Selection to Queue to begin transferring.

Step 7. Once you’ve transferred your clips, click on your Sequence.

Step 8. Go to Sequence / Settings.

Step 9. The default settings probably won’t be what you want, so click Load Sequence Preset.

Step 10. Most modern cameras in North America will work nicely with Apple ProRes 422 1920×1080 24p 48kHz as your preset.

Step 11. After loading the preset, your settings should look like this.

…and you should be ready to edit. Be sure to back up your original video files!



App Compatibility in OS X
Tuesday August 16th 2011, 12:36 pm
Filed under: OS X

Guessing which of your applications is broken following a major OS upgrade is a familiar ritual for Mac owners. Use these tools to learn what will need to be replaced:

  • If you’re upgrading to 10.6 Snow Leopard, download and run the SnowChecker utility to check whether you have any incompatible software.
  • For 10.7 Lion, use the RoaringApps wiki.


Reverse telecine 24p
Tuesday April 19th 2011, 2:36 pm
Filed under: Final Cut

In 2012, interlaced video will be 90 years old, so it’s little wonder the standard is proving hard to get rid of completely. Many great cameras capable of shooting true 24p video still have to save the images in interlaced formats (confusingly labeled “24pa” or “24psf”, two slightly different ways of packing progressive frames into an interlaced stream). But this is only a temporary inconvenience; with a bit of extra effort it’s possible to perfectly reconstruct the 24p original:

Step 1. Choose File / Log and Transfer.

Step 2. Click this obscure icon to import files.

Step 3. Click this other obscure icon to view your Preferences.

Step 4. Make sure Remove Advanced Pulldown and Duplicate Frames is checked. Use ProRes as your video codec.
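
For the curious, here’s what that checkbox is undoing in the 24pa case: 2:3:3:2 “advanced” pulldown packs four progressive frames into five interlaced ones, and only the middle frame of each group mixes fields from two different source frames, so reconstruction just means dropping that hybrid. A toy Python sketch of the idea, with labels standing in for real frames:

  # Toy model of 2:3:3:2 "advanced pulldown" removal. Five interlaced
  # frames carry four progressive source frames A-D; only the middle
  # frame of each group (B2C1) mixes fields from two source frames.
  # Labels stand in for real image data.

  def remove_advanced_pulldown(frames):
      # Drop the hybrid frame -- the third of every group of five.
      return [f for i, f in enumerate(frames) if i % 5 != 2]

  group = ["A1A2", "B1B2", "B2C1", "C1C2", "D1D2"]   # one pulldown cycle
  stream = group * 6                                  # one second at 30fps
  clean = remove_advanced_pulldown(stream)
  print(len(stream), "->", len(clean))                # 30 -> 24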

Step 5. Select the clips you want to transfer, or else choose Edit / Select All.

Step 6. When you’ve made your selections, choose Add Selection to Queue to begin transferring.

Step 7. Once you’ve transferred your clips, click on your Sequence.

Step 8. Go to Sequence / Settings.

Step 9. The default settings probably won’t be what you want, so click Load Sequence Preset.

Step 10. Choose Apple ProRes 422 1920×1080 24p 48kHz as your preset.

Step 11. After loading the preset, your settings should look like this.

Step 12. When you add clips to your sequence, you’ll be presented with this warning. Be sure to click No.

Step 13. The thin green bar indicates that a realtime reverse telecine effect has been applied. You should now be able to view and export clean deinterlaced frames.



Kinect and XML
Wednesday March 30th 2011, 7:55 am
Filed under: Kinect

(Check out the rest of the Kinect section for more tutorials.)

Update: Our XML-based Kinect mocap tool now has its own website, kinecttopin.com. For general Kinect setup help, look here.

To record the joint data sent out over OSC by the OSCeleton Kinect mocap utility to an XML file, to play the XML back, or to convert it for use in After Effects, get this KinectToPin Processing sketch from GitHub.

…the practical upshot of this? You can send all your motion capture information to your XML file in realtime, without worrying about capturing your video image. Then, you can read the file back later and do all the complex rendering you want in full HD. By the way, the sketches will work with anything that uses OSC, not just a Kinect; all you have to do is modify the XML tags to suit your needs.
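
If you’re curious how small the core of such a recorder is, here’s a minimal sketch of the same idea in Python with the python-osc library instead of Processing. The /joint address and port 7110 follow OSCeleton’s defaults, but double-check them against your setup; the XML tag names here are just placeholders:

  # Minimal OSC-to-XML joint recorder, sketched with python-osc.
  # The /joint address and port 7110 follow OSCeleton's defaults;
  # the XML tag names are placeholders -- change them to suit.
  from pythonosc.dispatcher import Dispatcher
  from pythonosc.osc_server import BlockingOSCUDPServer

  log = open("mocap.xml", "w")
  log.write("<mocap>\n")

  def on_joint(address, name, user, x, y, z):
      # One element per incoming joint message.
      log.write('  <joint name="%s" user="%d" x="%f" y="%f" z="%f"/>\n'
                % (name, user, x, y, z))

  dispatcher = Dispatcher()
  dispatcher.map("/joint", on_joint)

  server = BlockingOSCUDPServer(("127.0.0.1", 7110), dispatcher)
  try:
      server.serve_forever()    # Ctrl-C to stop recording
  finally:
      log.write("</mocap>\n")
      log.close()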



Installation Checklist
Wednesday March 23rd 2011, 6:22 am
Filed under: Concepts

1. Computers:

  • How many computers does this run on?
  • What are their specs (RAM, CPU, storage, graphics card, etc.)?
  • Do they need to communicate?
  • What types of network connections do they have (WiFi, gigabit ethernet, etc.)?
  • Do they need any peripherals (mouse, keyboard, Wiimote, Kinect, etc.)?
  • What types of peripheral connections do they have (USB, Firewire, Bluetooth, etc.)?

2. Media:

  • What’s your storage medium (hard drive, SSD, optical disc, etc.)?
  • What’s your backup strategy (RAID, Time Machine, disk image, spare drive or optical disc, etc.)?
  • Will there be an attendant present in case the computers crash?

3. Video:

  • How is the video being presented (projector, LCD, CRT, LED wall, etc.)?
  • Is there live video input?
  • Does it need to be recorded?
  • What kind of cameras are you using (webcam, industrial, DV, DSLR, etc.)?
  • Do they have a usable “live view” mode?
  • How are they connected to the computer (Firewire, USB, analog, etc.)?
  • Do they have manual focus/iris/white balance?

4. Audio:

  • How is the sound being presented (built-in speakers, external amplifier, stereo, 5.1, etc.)?
  • Is there live sound input?
  • Does it need to be recorded?
  • What kind of microphones are you using (shotgun, contact, cardioid, lavalier, etc.)?
  • How are they connected to the computer (Firewire, USB, analog)?
  • Do they have manual gain control?

5. Sensors:

  • What other types of live input do you need (light, temperature, vibration, tilt, acceleration, etc.)?
  • How is the data being presented?
  • Does it need to be recorded?
  • Can you find commercial products that fit your budget, or will you need to make your own sensors?
  • How are they connected to the computer (Arduino, MakingThings, serial port, etc.)?
  • What external hardware controls do you need (on/off switch, gain, threshold, status lights, etc.)?


Types of Animation
Wednesday March 23rd 2011, 6:20 am
Filed under: Concepts

1. Time-lapse: Photographing a scene only at selected intervals, creating an illusion of spontaneous change when played back. The earliest form of animation, predating live-action cinema; devices for playing back sequences of images had been invented by the 1830s.

2. Stop-motion: Photographing a single object and moving it while the camera is stopped. Can be done with paper cut-outs, characters with poseable armatures, or even cooperative human actors (“pixilation,” which with this spelling has nothing to do with computers; it’s an old word for being led astray by pixies). Quickly adopted in the 1900s for visual effects in early silent films.

3. Replacement: Replacing the object being photographed with a different object while the camera is stopped. “Classical animation” is replacement animation using pencil drawings on paper or ink on plastic cels; this technique dominated animated feature production until the late 1980s. Less common variations use photo collage or sculpture. First came into wide use with “lightning artist” vaudeville acts in the 1910s, where audiences would watch an animated film being made.

4. Rotoscoping: Using live action as a frame-by-frame reference for animation. Traditionally done by projecting film footage and tracing it, an established technique by the 1940s. “Motion capture” is a modern variant of rotoscoping, in which the analysis of movement is done with a computer instead of by hand.

5. Computer graphics: Breaking an image down into mathematical elements and manipulating the values of those elements. Usually done by representing an image as a grid of colored dots (“pixels”). Widely adopted by the 1990s; the most common form of animation in use today.



Kinect Setup Links
Wednesday March 23rd 2011, 6:02 am
Filed under: Kinect

(Check out the rest of the Kinect section for more tutorials.)

The Kinect has two basic tricks: first, it grabs a depth map, and second, it figures out the joint coordinates of your skeleton from the depth map. There are lots of easy-to-use options for the first trick, but not so many for the second. Still, even using the depth maps alone, you can track motion more effectively than with most RGB methods.
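
To make that last point concrete: the simplest depth-map trick is to threshold away everything beyond a certain distance and track the centroid of what’s left. A toy Python sketch with NumPy, using a random array as a stand-in for a real Kinect frame (the 1.2m cutoff is arbitrary):

  # Toy depth-map motion tracking: keep pixels nearer than a cutoff,
  # then take the centroid of the remaining blob. The depth map here
  # is random fake data standing in for a real Kinect frame.
  import numpy as np

  depth = np.random.uniform(500, 4000, size=(480, 640))   # fake depths in mm
  near = depth < 1200                                      # anything closer than 1.2m

  ys, xs = np.nonzero(near)
  if len(xs):
      print("blob centroid: (%.0f, %.0f)" % (xs.mean(), ys.mean()))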


0. Easiest Installer Options (current—start here!)
There are now some easier installer options for OpenNI/NITE; I recommend starting with these!


1. Depth Maps (outdated)
On a Mac, you can get started quickly—for an initial test, you can run CocoaKinect, a small app that just displays the depth map. Then there’s a Processing library and a Max/MSP/Jitter external that work out of the box. You can also get a Quartz Composer plugin that runs on the latest version of Isadora, though its installation is slightly more complicated. Which brings me to:

2. My Bundles (outdated)
With few exceptions, installing software for the Kinect requires getting a program from site A, a couple of drivers from sites B and C, and an installation tutorial from site D. To get you started quickly, I’ve collected everything you need into a few bundles.

I made these bundles for my own convenience, and they’re all almost certainly out of date as you read this, so once you have them up and running you should get updated versions of the programs and drivers from the original sources listed here.

3. Skeleton Tracking (outdated)
On Windows, there’s BrekelKinect, a slick-looking all-in-one utility that can capture depth maps and do skeleton tracking, recording joint coordinates to BVH files (usable with Maya or other 3D programs). (It might be able to communicate live with other programs, but I haven’t tested that.)

On Mac, Windows, and Linux, there’s OSCeleton, which sends joint coordinates out as OSC data. It’s quite a bit harder to set up than BrekelKinect, but it can do live skeleton tracking and pass the information on to other programs. Here’s an example receiver patch for Isadora. (A warning: you’ll need to use Terminal commands to install OSCeleton. If you’re not comfortable with that, I’d start with one of the ready-to-use alternatives and dive into OSCeleton when you have a full day to spend poking at it until it works. Here’s a Mac setup tutorial.)

Synapse (Mac/Win) is another powerful skeleton-tracking OSC app to experiment with, although it works somewhat differently than OSCeleton. It’s an all-in-one download that’s very easy to set up.

OSCulator (Mac) is an OSC routing app that can help you manage all the information that these OSC apps pump out; use this config file to get started.

4. Flash
To talk to Flash, you can use another app to get the Kinect data, analyze it, and send OSC using Flosc (Mac/Win/Linux) or Oscar (Mac). Other Flash options are AS3Kinect (Mac/Win/Linux) and Beckon (Win).