I was recently asked to do a presentation on the topic of the Digital Arts for Penn State's Take Our Daughters And Sons To Work Day. Part of it was a PowerPoint presentation and a talk, but I didn't think it would be right to lecture about three-dimensional modeling and animation without an accompanying demonstration. So I resolved to create a short animated sequence to illustrate the Digital Arts in action.
In it, I decided to have the Tin Man of Oz (a fellow who's very amiably in the public domain) give part of my talk for me, accompanied by some digital special effects. Since his segment of my presentation needed to be only a few minutes, I imagined, in my woeful naïveté, that I could whip out my little animated masterpiece in a couple of weeks, with time to spare.
I got my animated presentation done in time, but it wasn't quite the trivial walk in the park I'd envisioned. Learn from my mistakes, O overconfident sailor into the stormy seas of Maya, AfterEffects and Final Cut! Press on, and gain wisdom!
What made me so sure that this project would be an easy one was that I already had an animated Maya model of the Tin Man on hand. I'd created him almost a year ago as an exercise in modeling and animation of a humanoid figure, and there he was, already articulated, skeletonized, modeled and fitted with render parameters (Figure 1):
Since the Tin Man was to actually deliver his portion of the talk, I had to give him a voice, and synchronize it with his mouth. First step, of course, was to write a script, so I'd know exactly what His Tinship was going to say.
In the past, I've made extensive use of the freeware Audacity sound editor. It's definitely a good piece of software, but alas, it doesn't (yet) run on Intel-based Macs, so I used Apple's Soundtrack Pro sound editor for the Tin Man's voice. Reading from the script, I recorded my own voice (putting on my best attempt at a fussy English butler's voice) speaking the lines, then used Soundtrack Pro (Figure 2) to raise the pitch to a twittering tenor. The result was saved as a .wav audio file.
Now came the hard part: synchronizing the Tin Man's movable jaw to the spoken words. I did this by moving the Soundtrack Pro timeline to the beginning of each word, noting the time count at which it occurred (in seconds), and then multiplying it by 24. Why 24? Because the standard frame rate for film is 24 frames per second, which allowed me to use the Maya timeline to see at exactly what frame each mouth movement should begin and end. This was a tedious process, but by making sure the Tin Man always used short sentences I was able to break each segment of synchronized speech into small enough units to keep things manageable in my already-shrinking preparation time.
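The timecode arithmetic involved is simple enough to sketch in a few lines of Python (the function name here is my own invention, not anything from Maya or Soundtrack Pro):

```python
# Convert a Soundtrack Pro time reading (in seconds) into a Maya
# frame number, assuming the standard film rate of 24 frames per second.
FILM_FPS = 24

def seconds_to_frame(seconds, fps=FILM_FPS):
    """Return the whole frame on which a sound event falls."""
    return round(seconds * fps)

# A word that begins 3.5 seconds into the clip starts on frame 84:
print(seconds_to_frame(3.5))  # -> 84
```

Doing this by hand for every word is exactly as tedious as it sounds, which is why short sentences mattered so much.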
After synchronizing each Maya segment with the audio recording, I then had to render each one. This is always the bottleneck for any animation, and when you're doing sequential renders (or overlapping ones, as I was), you can't start on the next one until the present one is finished. Even the shortest sequences required several hours to render, and the longer ones I let run overnight.
Once a sequence is rendered, it emerges from Maya (or any other rendering engine, such as RenderMan) as a series of .iff files, one for each frame. IFF (Interchange File Format) is the image format commonly used in the Maya world for rendered output. Each .iff file is a single animation frame with an alpha channel (transparent wherever nothing was rendered), and the full set of sequentially numbered files for an animation segment is an iff sequence.
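If you ever need to gather up such a sequence programmatically (to hand it off to another tool, say), the sequential numbering makes it easy. Here's a quick Python sketch, assuming a hypothetical naming scheme like tinman.0001.iff:

```python
import re
from pathlib import Path

# Matches a trailing frame number, e.g. "tinman.0042.iff" -> 42.
FRAME_NUM = re.compile(r'\.(\d+)\.iff$', re.IGNORECASE)

def collect_iff_sequence(folder):
    """Return the frames of an iff sequence sorted by frame number."""
    frames = [p for p in Path(folder).iterdir() if FRAME_NUM.search(p.name)]
    return sorted(frames, key=lambda p: int(FRAME_NUM.search(p.name).group(1)))
```

Sorting numerically (rather than alphabetically) matters once a sequence runs past the zero-padding, but for typical four-digit frame numbers either would do.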
What do you do with all those iff sequences? You pull them into a compositor, such as Adobe AfterEffects, which is what I used (Figure 3).
This example is from the final sequence of the animation, a fairly complex one: the Tin Man finds himself transported to the deck of an enchanted flying cloud-ship, with a figure imported from Poser as pilot. The Tin Man reacts, special lighting effects play around him, and the camera pulls back to watch the ship disappear into the distance in the night sky.
Below the AfterEffects viewer window, you'll see a timeline with multiple colored bars. Each one is an image object of some sort, and each object forms a layer, like a Photoshop layer, only animated. In this case, the animated iff sequence was layered on top of a starfield background and a NASA image of the moon. The lightning effects surrounding the Tin Man on his arrival only appear for a few seconds, so their timelines are short.
In other sequences, I had the Tin Man speaking in front of a series of PowerPoint slides, and these were also brought into AfterEffects as background layers. Each layer has a set of characteristics such as position, scale and transparency, all of which can be animated to make even still images do eye-catching animated things.
AfterEffects will import Photoshop files, either as a single merged image or as individual Photoshop layers. In this way, I was able to make a set of end credits that enumerated each piece of software that had been used to create the animated sequence. In Photoshop, I created an image the same size as the animation frame, completely black. Then I added a new layer for each line of yellow text, until I had a multilayered Photoshop .psd file. Each layer was then imported into AfterEffects individually, and the Opacity aspect of the layers was animated, so that each line gradually faded in, one after the other. A very simple technique, but one that looks good on-screen.
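Under the hood, that fade-in effect is just linear interpolation of each layer's opacity, staggered in time. A rough sketch of the arithmetic in Python (the frame numbers and fade length here are made up for illustration, not taken from my actual AfterEffects project):

```python
def layer_opacity(frame, fade_start, fade_frames=24):
    """Opacity (0-100%) of a credit line that fades in linearly
    over fade_frames frames, beginning at frame fade_start."""
    if frame <= fade_start:
        return 0.0
    if frame >= fade_start + fade_frames:
        return 100.0
    return 100.0 * (frame - fade_start) / fade_frames

# Stagger each line's fade one second (24 frames) after the previous one,
# then sample all four lines at frame 36:
for line in range(4):
    print(line, layer_opacity(36, fade_start=line * 24))
```

In AfterEffects this is just two Opacity keyframes per layer; the program does the in-between interpolation for you.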
Each one of these composited scenes then had to be rendered (not the same thing as image rendering, à la Maya) after creation, in this case into QuickTime clips. Fortunately for me, these renders are a matter of minutes, not hours.
Okay, now here I was with a pile of compiled QuickTime scenes, a dozen or so audio clips, and still no movie. And time was running low. I had to get this sequence completed, pronto.
All of these disparate components of what would become a completed short film are spliced together with an editor. The editor I used was Apple's Final Cut Pro, software that gets used (as do Maya and AfterEffects) even at the highest levels of the movie industry. Here's how it looks (Figure 4):
Again, we have the timeline paradigm, in which every component is represented by a colored line along a time scale. But this time we're not compositing visual layers; we're stitching individual clips (QuickTime, in this case) together. Between clips we can add video transitions where needed (cross-fades, clock dissolves, etc.). We can also add the audio tracks I recorded, visible as the green stripes in the timeline. Here was where I moved them about to synchronize them with the movements of the Tin Man's hinged jaw.
Once I set the output parameters for the final QuickTime movie that would be made from all these components, I had to render them. Again, this is not to be confused with image rendering, such as is done in Maya. A Final Cut render takes all the edited and positioned scenes of your movie and merges them, together with the audio track, transitions and any added video effects into a single QuickTime movie that is the final product. And although it isn't as lengthy a process as a Maya animated image render, it isn't quick, either. My demonstration movie is a little over two minutes long, and the render time approached one hour. This will, of course, vary with the size and complexity of the movie being rendered.
I'd like to be able to report that this project went as smoothly as this brief article makes it sound, but the truth is far less tidy. Sound clips had to be recorded and re-recorded to remove one flaw or another; Maya animations that had already been rendered had to be re-done to fix obscure rendering problems (one of the Tin Man's legs inexplicably rendered itself sticking up beside his shoulder!); minute errors in pivot points in the Tin Man's joints suddenly magnified themselves and had to be corrected; AfterEffects layers had to be delicately adjusted for animated characteristics; sound had to be painstakingly synched to video in Final Cut; render jobs had to be run over and over in both AfterEffects and Final Cut. And, at the very last minute, I discovered to my very great annoyance that I'd rendered my demo QuickTime movie at a resolution and size that was far too large for my laptop to display smoothly. After a couple of hours of frantic late-night tweaking that only created more problems, I finally threw up my hands and said "Aw, heck!" Or something like that, anyway. And, with the sands in the hourglass rapidly dribbling away, I took the brute-force fix and dragged my G5 workstation to the presentation room so I could run the movie on it next morning, piping the output to an overhead projector.
Total time for this project from start to end: two weeks, more or less. Also known as "Just under the wire."
The moral of this story is that an animator should always assume that everything is going to take at least twice as long as it seems it should, and budget the available time accordingly. There are a lot of hidden and unexpected things that can go wrong, and even if you have a good working knowledge of all the software (I did), there's always some new quirk that pops up when you can least afford it. Well, if this sort of thing were easy, Hollywood animators wouldn't get paid such big bucks, would they?
Hmmm...big bucks. Now, where did I put George Lucas' phone number...?