Last time around, this series took a peek behind closed doors at Hollywood's dirty little secrets (sounds juicy that way, doesn't it?) by investigating the process by which movie animations are created. But as we all know, there's more to generating a believable virtual object than simply making it move; it has to look right, too. And rendering is a major part of that process.

1. What are we talking about?
"Rendering" is one of those terms that gets tossed around a lot, without being generally well understood. For the purposes of this article, we're going to define Rendering as the simulation, within a virtual environment, of the effects of real light falling upon a real object. Or, to put it another way, making computer-generated objects look as if they were real objects.
Consider your own hand, for example. It looks very different when held up to a window, when backlit by firelight or when held under water. It's still the same hand, of course, but the lighting effects are drastically different. And it would look more different still if it were made of aluminum or sawdust or plastic. Having seen these effects every day of our lives, we take them for granted, so much so that when we don't see them in a computer-created object, we know instantly that something's wrong.
Simulating these various, constantly changing shadows and reflections is an extremely complex operation that requires a lot of CPU cycles from an animation workstation, which is why CGI (Computer-Generated Imagery) is such a recent phenomenon.
Faithful readers will remember this beast (Figure 1) as the Martian War Machine from part one of this series. It's standing over the edge of a crater here, and ought to be bathed in firelight from the burning Martian projectile ship beneath. Yet obviously it isn't.
Notice, by the way, in this Maya screen shot, that there is already a bit of pseudo-rendering in the image. The upper surfaces of the metal War Machine are light, while the lower ones are darker, as if there were a weak light shining from above. That's because Maya puts a dim default lighting into its workspace, just to help the user get a feel for the three-dimensional relationships among the onscreen objects. It has nothing to do with the lighting on the final rendered output image. (The greenish blob in the lower center of the frame, by the way, is how flames are represented on-screen in the Maya modeling phase).
Step one is to decide upon the characteristics of the animated object. Is it made of plastic? Metal? Glass? What color is it? Transparent? Translucent? Does it have a smooth and shiny surface, or is it dull? (Of course many of the movies produced with CGI are very dull indeed, but that's outside the scope of this article).
In Figure 2, we can see how this is done in Maya. This is for the control cab, or "hood" of the War Machine, and the idea is to give it the look of shiny plastic or painted metal. So first, we choose a material type of Phong (named for its inventor, Bui Tuong Phong, for you CGI trivia buffs). Phong and the similar Phong E are material types with high specularity, or reflectiveness, and will suit the Martian quite well. I chose a white coloring for the surface, to give it a glossy, businesslike look. Note that there are many characteristics of our Phong surface that are adjustable with slider bars. Since this is a fairly simple surface, most of these are left at the defaults, except for the specular color (white) and the reflectivity, which I boosted to a fairly high 0.802 value.
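The Phong material's characteristic glossy look comes from its specular highlight term, which rewards tight alignment between the reflected light direction and the viewer's eye. Here is a rough sketch of that idea in plain Python; the function and parameter names are my own for illustration, not Maya's API:

```python
def phong_specular(normal, light_dir, view_dir, specular_color=1.0, shininess=20.0):
    """Phong specular term: mirror the light direction about the surface
    normal, then compare the result with the view direction. The closer
    the alignment, the brighter the highlight; shininess controls how
    tight (and therefore how glossy) that highlight appears."""
    # all vectors assumed to be unit-length 3-tuples
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    d = dot(normal, light_dir)
    # reflect the light direction about the normal: R = 2(N.L)N - L
    reflect = tuple(2 * d * n - l for n, l in zip(normal, light_dir))
    return specular_color * max(dot(reflect, view_dir), 0.0) ** shininess
```

Looking straight down the reflection gives full brightness; viewing from a grazing angle gives none, which is exactly the hot-spot behavior you see on shiny plastic.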
Step two is to define some lights for our scene. In Maya (as in many other 3D modeling packages), a light is just another object, with characteristics such as direction, diffusion, intensity, etc.
Those who know anything about portrait photography will already be aware that the most common lighting setup is to use a relatively bright key light above and a bit to one side of the subject, and a less bright fill light low and on the opposite side. And in many instances, some variation on this basic scheme will work well in a virtual environment. But nature is seldom so neatly arranged, and since we're trying to simulate nature here, we're forced to be a bit more creative. For the crater from which the Martian emerges, I needed the wavering glow of a fire, plus some flickering bright-violet lights that would simulate the light of welding torches, all to give the War Machine a suitably sinister entrance.
The fire itself gave some glow, but I found that it wasn't enough to bring out the detail of the towering alien device, so I created a point-source orange light deep in the crater and out of sight (Figure 3).
It took a bit of fiddling to get the intensity just right, but after setting the color and positioning it for maximum effect, this was a pretty simple job.
Somewhat more difficult were the arc-welding lights I wanted. To get the appropriate flicker effect, I created two more point lights in the crater, this time a harsh, bright purple. Then I created two small opaque half-cylinder objects and animated them to revolve rapidly around the two lights like shutters, only at an irregular rate. Perfect. Now only one part remained: The crater wall over which the Martian would step to begin its career of havoc upon unsuspecting humanity. This was created from a NURBS (Non-Uniform Rational B-Spline, remember?) plane that I sculpted into a rough crater shape. Obviously, a mound of earth isn't going to be shiny and white, so it had to be a dull, flat surface that would resemble heaped clay.
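The revolving-shutter trick can be reduced to a simple timing function: the light is visible whenever the open half of the cylinder faces the scene, and an uneven rotation rate makes the on/off pattern irregular. A minimal sketch, with made-up rates rather than the actual animation curves I used:

```python
import math

def shutter_visible(t, base_rate=6.0, wobble=2.0):
    """Is the point light unoccluded at time t (seconds)?
    A half-cylinder shutter revolves around the light, so the light
    shows whenever the open half faces the camera (rotation angle in
    the first half of each revolution). Adding a sinusoidal wobble to
    the base rate (both in revolutions per second) makes the flicker
    irregular, like real arc-welding."""
    angle = 360.0 * (base_rate * t + wobble * math.sin(t))  # degrees
    return (angle % 360.0) < 180.0
```

Sampling this at 24 frames per second yields an uneven on/off pattern rather than a metronomic strobe.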
This was done by assigning a Lambert material to the crater; the Lambert type is suited to dull, minimally reflective surfaces (Figure 4). I colored it a dark clay-brown, then applied a bump mapping texture map to it. "Bump mapping" simply means applying an image file of grayscale pixels (the "texture map") to an object, treating each pixel's gray value as a height, to give the surface apparent texture without actually altering the underlying geometry. Human skin is usually given texture with an appropriate texture map, as is wood, brick or any other non-smooth object.
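Both ideas are small pieces of math under the hood. Lambert shading is just a cosine falloff with no highlight at all, and bump mapping tilts the surface normal according to the local slope of the grayscale map so the lighting reacts as if the bumps were really there. A sketch of both, with illustrative function names of my own:

```python
def lambert(normal, light_dir, diffuse_color=0.5):
    """Lambert diffuse term: brightness falls off with the cosine of
    the angle between surface normal and light direction. No specular
    component at all -- hence the dull, clay-like look."""
    d = sum(n * l for n, l in zip(normal, light_dir))
    return diffuse_color * max(d, 0.0)

def bump_normal(height_map, x, y):
    """Perturb a flat surface normal using the local slope of a
    grayscale height map (the 'texture map'), so the lighting picks
    up apparent bumps without changing the geometry itself."""
    # finite-difference gradient of the height field
    dx = height_map[y][x + 1] - height_map[y][x - 1]
    dy = height_map[y + 1][x] - height_map[y - 1][x]
    n = (-dx, -dy, 1.0)
    length = sum(c * c for c in n) ** 0.5
    return tuple(c / length for c in n)
```

On a perfectly flat map the normal stays pointing straight up; wherever the map's gray values change, the normal tips over and the Lambert shading darkens or brightens accordingly.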
All of these "materials" and maps, along with their respective characteristics, are properly called shaders, and the combination of them into daisy-chains of surface-modifying properties are called shader networks. Here's a sample (this one from Curious Labs' Poser software) of a very simple shader network, used to define the characteristics of a character's green eyes (Figure 5). It graphically shows how each shader node (set of properties) links to another, to provide the final effect.
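Conceptually, a shader network is just a chain of property nodes, each feeding its output into the next. The following toy sketch (my own class and node names, not Poser's or Maya's actual node types) shows the daisy-chain idea for the green-eye example:

```python
class ShaderNode:
    """One node in a shader network: a named set of properties plus an
    optional upstream node whose output feeds into this one."""
    def __init__(self, name, transform, upstream=None):
        self.name = name
        self.transform = transform  # function: value -> value
        self.upstream = upstream

    def evaluate(self, value=None):
        # pull the upstream node's result first, then apply our own effect
        if self.upstream is not None:
            value = self.upstream.evaluate(value)
        return self.transform(value)

# a toy daisy-chain for a green iris: base color node -> darkening node
base = ShaderNode("iris_color", lambda _: (0.1, 0.6, 0.1))
dark = ShaderNode("darken", lambda c: tuple(x * 0.8 for x in c), upstream=base)
```

Evaluating the last node in the chain walks back through every upstream node, which is exactly what the graphical network view in Figure 5 depicts.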
And what about the hellish flames licking cruelly up from between the alien invader's all-destroying feet? (A Martian War Machine should always be described with as many lurid pulp-magazine adjectives as possible). They were created with Maya's Fire Effect, which is actually a type of particle effect that glows and whose motion characteristics can be manipulated to simulate the appearance of real-world flames. It's not easy to work with, and takes some getting used to, but the results can be quite effective. Figure 6 shows the settings I used for the flames beneath the War Machine.
Okay, so now we've got our animated Martian War Machine (from Part 1), our lighting and our surface textures and colors. Since Maya's modeling window doesn't show any shading or lighting effects the way they'll appear in the final render (for which CPUs the world over are deeply grateful), it's now time to hit the Render button and see what our scene looks like (Figure 7).
Not too bad for a first attempt. Of course, the flickering lights from the busy Martian arc-welders don't show up in a still photo, and the flames are pretty dim against that ugly blank background, but there's a definite improvement already.

4. Time Out
So much for rendering a single frame. But animations are normally done at 24 frames per second, which means that a mere 30-second animation is going to require 720 separate images to be rendered. Just how long this takes will depend upon the speed of your machine, but even a relatively simple animated object like the Martian War Machine is going to keep Mr. CPU mighty busy for a long time to render an animation of any size, using Maya's built-in rendering engine. And for really complex scenes, the problem multiplies exponentially. As an example, I have created a fairly complicated Maya scene that requires a little over four minutes to render a single preview frame at production-level definition, using the Maya native renderer. At four minutes per frame and 24 frames per second, that's 96 minutes (a.k.a. 1.6 hours) to generate a single second of on-screen animation! With both processors puffing and straining, my workstation is clearly not going to be much good for anything else during a render of any real length. How does Hollywood-where time is big money-handle this situation?
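The arithmetic above is worth having at your fingertips before committing to any render. A one-line budget calculator (illustrative, using the article's own numbers):

```python
def render_budget(seconds_of_animation, fps=24, minutes_per_frame=4.0):
    """Total frame count and wall-clock hours for a single-machine render."""
    frames = seconds_of_animation * fps
    hours = frames * minutes_per_frame / 60.0
    return frames, hours

# the article's example: at 4 minutes per frame, one second of
# animation is 24 frames, or 1.6 hours of rendering
frames, hours = render_budget(1)
```

A 30-second sequence at the same per-frame cost is 720 frames, or two full days of nonstop rendering, which is precisely why the paragraph below turns to render farms.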
Well, the industrial-strength high-end renderer of choice for most major animation studios is RenderMan, a product of Pixar (whose CEO just happens to be the same nice fellow who runs Apple Computer). Using RenderMan, an animation job can be submitted to remote render servers on a network, so that processing all those frames doesn't tie up your local machine. That way, animators can move on to something else while their most current sequences are being rendered, instead of falling asleep at their consoles over half-eaten bags of corn chips and stale soft drinks. A collection of such networked remote render servers is called a render farm (or as some of the folks at WETA Digital like to say, a "render wall"). Without render farms, big-budget productions like Finding Nemo or Lord of the Rings would be stupendously time-consuming and difficult-instead of just horribly time-consuming and difficult.
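The core idea of a render farm is embarrassingly simple: frames are independent, so they can be parceled out across machines. A toy round-robin scheduler sketches it (function and server names are my own illustration, not RenderMan's actual job-submission interface):

```python
def farm_schedule(frame_range, servers):
    """Round-robin assignment of frames to render servers: each server
    renders its own share of the sequence in parallel, while the
    animator's workstation stays free for other work."""
    jobs = {s: [] for s in servers}
    for i, frame in enumerate(frame_range):
        jobs[servers[i % len(servers)]].append(frame)
    return jobs

# split a 6-frame sequence across two hypothetical render nodes
jobs = farm_schedule(range(1, 7), ["node_a", "node_b"])
```

With N equally fast servers, wall-clock render time divides by roughly N, which is how a farm turns "stupendously time-consuming" into merely "horribly time-consuming."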
Since Maya is the most popular animator/modeler and RenderMan is the most popular renderer, Pixar produces a simple plug-in (called MTOR) for Maya that allows a single frame or an entire animated sequence to be directly batch-submitted to a render farm, freeing up the local workstation for other tasks. The plug-in is currently available only for Maya, but the RenderMan Artist Tools toolkit is capable of processing a RIB (RenderMan Interface Bytestream) file generated by any imaging software.
Note, by the way, that, respected and popular as it is, RenderMan isn't the only product that can be used to do remote rendering or to construct an animation render farm. There are certainly others, most of which adhere to the RenderMan standard ("RenderMan" is both a suite of Pixar software products and a set of technical specifications defining the interaction of modeling and rendering programs). But insofar as there can be said to be anything approaching an "industry standard" renderer in the high-end digital effects world, RenderMan is it.

Coming Attractions
Well, here we are with our rendered, illuminated and textured Martian War Machine, standing embarrassingly unimpressive against a stark and featureless white sky. Obviously we need more before this scene is suitable for showing to even the most favorably-disposed audience! So what do we do now? Tune in again next issue for our exciting conclusion, Part Three: Compositing and Editing.
Definitions of "rendering":
An animator's resume (for those considering doing this professionally):
The RenderMan Interface Specification:
The Alias (Maya) home page (Alias was formerly a division of SGI, but is now an independently-owned software company):
Some interesting sites for 3D artists, both animated and still:
All the Maya screenshots for this series were taken from Maya Unlimited, V5 and V6. Maya Unlimited runs on Irix, Windows XP, Red Hat Linux and Apple OS X. For details, check the Alias website above.