For the seventh time, ILM creates innovative CG effects for a Star Trek film, but this time the director causes a break in tradition.
images copyright 2009 Paramount Pictures Corp. Courtesy Industrial Light & Magic
Six television series, including the 1966 original, spread across 30 seasons. Ten feature films. Dozens of games, hundreds of novels, and a themed attraction in Las Vegas. The Starship Enterprise. Warp drive. Vulcans. Romulans. The United Federation of Planets. Starfleet. Spock. Captain Kirk. Scotty. Uhura. “Bones” McCoy. Sulu. Holodecks. Phaser Guns. Transporters. Gene Roddenberry’s fictional Star Trek universe embedded itself deeply into the pop culture long ago.
So, when director/producer J.J. Abrams, writer and director for Mission: Impossible III and executive producer for the TV series Lost and Alias, considered taking on the 11th Star Trek feature film for Paramount, he didn’t follow the Star Trek tradition of going where no man has gone before. Instead, the 11th Star Trek takes audiences to a frontier the franchise characters have visited before, but we haven’t—to the beginning, the time before the first episode in 1966.
The only actor from the earlier series and films to star in the Abrams prequel is Leonard Nimoy as “old Spock.” Twenty-nine-year-old Chris Pine plays James T. Kirk, 32-year-old Zachary Quinto is Spock, and the remaining members of the
Enterprise crew are actors in a similar, under-40 age range.
“J.J. didn’t want to make a movie for just the fan base,” says Industrial Light & Magic’s Roger Guyett, overall visual effects supervisor and second unit director. Guyett had worked with Abrams as visual effects supervisor for Mission: Impossible III, so he was the obvious go-to person for Abrams when the director/producer became concerned about creating blockbuster visual effects on a tight budget.
“Paramount had a perception about what the box office would do based on previous Star Trek films,” Guyett says. (Star Trek: Nemesis, the most recent film in the franchise until now, generated only $18.5 million on opening weekend in December 2002, and limped to a less-than-stellar final box office total of $43 million.) “But, when I read the script,” Guyett continues, “I thought, this is a very different kind of movie than the traditional Star Trek movie. And, as Paramount became convinced that J.J. would make an energetic, exciting film, the budget grew.” The studio also optimistically moved the release date from Christmas to the leading edge of the summer blockbuster season.
ILM, which had created visual effects for six of the previous 10 Star Trek features, produced 850 of the approximately 1000 shots in this film, with Digital Domain, Evil Eye Pictures, Lola Visual Effects, and Svengali supplying additional digital effects.
Live Long and Prosper
The first Star Trek film for which ILM created visual effects was Star Trek II: The Wrath of Khan in 1982, and to bring a dead planet to life for that feature, Lucasfilm’s computer division, which would spin off to become Pixar Animation Studios, created the now-famous “Genesis Effect.” It marked the first use of fractals, particle effects, and a 32-bit RGBA paint system in a feature film.
(Top) ILM's viewpainters produced shot-specific textures for close-ups of the six-mile-long Narada, but one well-lit map created the nightmarish vision when the entire ship was visible. (Bottom) Modelers sculpted a streamlined version of the Enterprise with great care and respect for tradition.
For the latest trek into the final frontier, Lucasfilm’s ILM also used particles, but in combination with state-of-the-art simulation systems, to destroy two planets for this film. In addition to building and demolishing the planets, ILM created the Starship Enterprise and the other space vehicles, which were always CG, a mining platform and other objects, two creatures, digital matte paintings, and animatics and previs for many of the shots.
“Working on this film is my proudest moment here at the company, guaranteed,” says Bruce Holcomb, model supervisor, who has built CG models at ILM for nine years. “We put a lot of love and a lot of effort into this film. We wanted to salute 30 years of history.”
The two biggest challenges for Holcomb and the modeling department were the two biggest ships, the 2000-foot Enterprise and the villainous Narada, a six-mile-long Romulan mining vessel. “We started with concept drawings from Ryan Church that didn’t have a lot of technical information,” Holcomb says. “They let us [the modelers] add that.” And they certainly did: Some parts of the final models ILM created for the Enterprise and the Narada were so detailed the crew didn’t need to subdivide the polymesh before rendering.
The goal for modelers working on the Enterprise was to pilot the ship into the 21st century without changing the look too much. “It was the best exercise I’ve been involved in,” Holcomb says. “The Enterprise is a sacred thing for this company, so we wanted to be sure we thought of everything. I have never spent so much time and love on anything.”
Working from the concept art and from reference footage, the modelers began fleshing out the parts starting with the saucer section, which Rene Garcia built to match that in previous films. Then, the team began the subtle process of streamlining the ship—stretching it from the original 1300 feet to 2000, making the neck a bit thinner in areas, and otherwise slightly changing the proportions to make it look grander. Although the modelers built the ship using polymeshes in Autodesk’s Maya and ILM’s Zeno software, Holcomb helped visualize these changes using NURBS in Autodesk’s AliasStudio.
“I created design curves to give a directional sense,” Holcomb says. “AliasStudio lets you streamline stuff you can really see, like the beautiful moment when the Enterprise comes by the camera and you can’t tell what it is, but it feels graceful.”
The design process for the Enterprise continued for four months. “It wasn’t one of those things where you build the model and that’s it,” Holcomb says. “We were looking at it as days went on, looking at it and caring about it. John Goodson, one of the viewpainters [texture painters], was a model maker in the practical model shop for three of the earlier films. We’d gather around his desk and say, ‘This could be thinner. Let’s try that.’ The look changed depending on the camera lens and how close it was to camera, and every time we saw it, we would tug on something.”
The other main ship, the Romulan mining vessel, is asymmetrical, mysterious, and a big contrast to the Enterprise. “The general sense of this ship is that it’s a nightmare,” Holcomb says. At six miles long, a huge nightmare.
“Selling the scale of that ship was challenging,” Holcomb says. “We always had to do something to make it look that big in CG—details on the windows, vent work. Any time you see the ship in its entirety, it has one paint job. We let lighting speak for it. But, when the camera is close, we painted it on a shot-specific basis.”
Modelers also created Spock’s little ship, which has interdependent flanges that rotate clockwise and counterclockwise, and an early version of a Starfleet vessel named the Kelvin, which appears early in the film. Although the Kelvin has only a saucer and engineering pod, it was a particularly intense project for the modelers because the ship breaks apart over a span of six minutes. “We designed six different stages of damage,” Holcomb says. “And the damage included a lot of inner bodywork to give it scale. We didn’t go down to chairs and bathroom stalls, but we did beams and girders.”
Also, at one point in the film, the Enterprise enters a debris field made of parts of damaged ships from a small fleet. To create the Armstrong, Newton, Defiant, Excelsior, and Mayflower, modelers used five different designs, but borrowed paint jobs, that is, texture maps, that they had scaled down from the Enterprise and the Kelvin.
Modelers did not create the damaged planets, however. To demolish these CG objects, the crew used simulation software. One planet implodes because of black-hole forces. The second succumbs to a pressure wave when the sun goes supernova.
ILM matched a small wedge of this drilling platform, built in the parking lot of Los Angeles' Dodger Stadium, with computer graphics, completed the pie-shaped platform, and then surrounded it with a completely digital environment.
The Trouble with Rubble
The big issue in destroying something as huge as a planet is generating broken bits in many scales—from the entire globe, through continents, boulders, and, finally, grains of sand.
To break apart the planet itself, ILM developed a new tool called Fracture, a simulation system built on the studio’s Physbam libraries, which more typically create fluid simulations. “It’s a one-frame simulation,” explains CG supervisor Joakim Arnesson of Fracture. “You run a simulation on one frame. It breaks something apart and calculates the stress and fracture patterns. And then you have geometry you can run a sim on.”
Here’s how it works: First, the technical directors convert a sphere into a volume; that is, create a voxel grid. Then they scatter seed points—Voronoi noise—inside the volume, or some part of it, either randomly or driven by a field. Each seed point defines the center of a cell.
“The seed points become a cell structure based on forces,” Arnesson says. “When you run a stress test on the volume, you get a pattern that looks sort of like broken rocks. The structure fractures along these cell noise lines.”
The result is 3D cells, randomly shaped and sized depending on how many seeds the TDs put into the volume and how they distributed the seeds. “We could stretch the cell noise and control where we had more seeds,” Arnesson says. “If we wanted small pieces, we used more seeds.”
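The Voronoi-seeding idea the TDs describe can be sketched in a few lines: label each voxel with its nearest seed point, and the label boundaries become the fracture lines. This is an illustrative toy only—ILM's Fracture tool is proprietary, and the function and parameter names here are invented:

```python
import random

def voronoi_fracture(grid_size, num_seeds, seed=42):
    """Partition a cubic voxel grid into Voronoi cells around random seeds.

    Each voxel is labeled with the index of its nearest seed point; the
    boundaries between labels are the 'cell noise' lines the volume
    fractures along. More seeds yield smaller cells, matching the
    control Arnesson describes.
    """
    rng = random.Random(seed)
    seeds = [(rng.uniform(0, grid_size),
              rng.uniform(0, grid_size),
              rng.uniform(0, grid_size)) for _ in range(num_seeds)]
    labels = {}
    for x in range(grid_size):
        for y in range(grid_size):
            for z in range(grid_size):
                # nearest seed wins the voxel
                labels[(x, y, z)] = min(
                    range(num_seeds),
                    key=lambda i: (seeds[i][0] - x) ** 2
                                + (seeds[i][1] - y) ** 2
                                + (seeds[i][2] - z) ** 2)
    return labels

cells = voronoi_fracture(grid_size=8, num_seeds=5)
# count the voxels that ended up in each fracture piece
pieces = {lbl: sum(1 for v in cells.values() if v == lbl)
          for lbl in set(cells.values())}
```

In production the seeds would be biased by a stress field rather than placed uniformly, so the shatter pattern concentrates where forces are applied.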
Then, to move these pieces, the TDs pushed the cells along the pipeline to the creature dev department, where a crew applied forces in a rigid-body simulation system. “The cells moved based on weight, mass, and forces, and collided with each other,” Arnesson says.
As these chunks of the planet collided, they needed to break apart, and smaller pieces needed to crumble off the larger ones. “We can simulate between 500 and 10,000 pieces with the rigid-body simulator,” Arnesson says. “So, to simulate hundreds of millions of rigid bodies, we used particle systems procedurally tied into the rigid-body simulation.”
To create the dust, sand, and other small pieces, the creature dev artists sent key timing data provided by the rigid-body simulation, and the planet chunks, back to the technical directors for the particle simulation. “The rigid-body simulator can record the stress factor on a per-CV or per-face basis,” Arnesson says. “The particle engine can look at that data and know a particular face was triggered by a break-off, so it’s time to emit particles from that face.”
Arnesson also singles out two other computer graphics innovations at the studio for this show. One is a volume shader for Pixar’s RenderMan that the TDs can drive from a field, and which they used primarily for explosions. “We’ve had similar shaders before,” Arnesson says. “But this is a step better and definitely faster; RenderMan opens Zeno and evaluates a field to render a volume.”
The second is a new, procedural RenderMan plug-in based on a data-flow graph. Created by Dan Piponi, the new node-based interactive system gives the technical directors more flexibility than before. “We can input particles, do operations on them, and, at the end, render them as points, curves, spheres, or blobs,” Arnesson says. “Or, for example, we can do something different for some of the particles based on distance. And, we can see and adjust the results very easily.”
The TDs used the system to generate a black hole by having it render curves that follow a particle simulation. Similarly, by using the particle renderer, they procedurally built a plasma drill that stretches from a CG mining platform in space to the surface of the Vulcan planet 10,000 feet below.
The actors playing Kirk, Sulu, and Olson space-jumped onto the digital drill platform by skydiving on wires in a greenscreen set, and then ILM sent the digital doubles down the digital umbilical cord toward the planet.
Paul Kavanagh led the team of animators who performed the digital doubles for that shot, brought creatures, spaceships, and other objects to life, and, in an unusual move, also created previs and animatics for many of the shots.
As often happens in Star Trek movies, Captain Kirk finds himself on a planet with a strange and threatening alien life-form, and for this film, the animators performed two such creatures. One is a cross between a polar bear and a gorilla that they dubbed “Polarilla,” and the other, “Big Red,” looks like a giant lobster with huge jaws and 120 eyes on the back of its neck. Big Red tumbles down an icy slope while chasing after Captain Kirk. “We looked at the way the stuntman fell down the slope and had the creature fall with the same difficulty,” Kavanagh says.
Most of the animators’ work, however, takes place in outer space, and many of those shots are all-CG. Usually, ILM receives plates—the filmed footage—from which the layout department derives camera moves and sends matching 3D scenes to the animation department. Animators then create the shots based on animatics—previsualizations provided by the studio. For all-CG shots, the layout department creates the camera moves based on the previs.
But that changed for this film. “J.J. [Abrams] changed so many sequences after the edit that the previs was pretty much thrown out,” Kavanagh says. “The previs guys were gone. The editors just inserted black cards with J.J.’s description of what he wanted the shot to be when he wanted to tell the story in a slightly different way.” Thus, to create shots that matched Abrams’ new ideas, ILM needed to create both the previs and the final shots.
(Top) ILM worked closely with director J.J. Abrams to update the transporter effect and retain its previous appeal. (Bottom) Production designer Scott Chambliss moved the Enterprise bridge into the future. There, ILM floated data on the windows rather than having it display on viewscreens, as in the past.
Fortunately, the team had decided early on to make the production more efficient by eliminating the usual back and forth between animation and layout. Instead, they had created one department and filled it with people who could do both animation and layout. Thus, when they discovered that black cards had replaced many of the shots they had already animated using the previs, the team was better able to adjust to an empty space with no plate, no camera, and no previs than they might have been before.
For many of the sequences, the new combined animation/layout team, which grew to 18 people, had to redo about half the shots, and they created 70 shots in the end sequence entirely from scratch. “We’d assign people a few shots in a row so they could take on a little mini-sequence,” Kavanagh says. “They would pump out an animatic with a new camera and a new location that the editors could cut into the footage.” When the animators received feedback from Abrams, they could quickly finish the shots.
“Because we had done the previs work in our pipeline, we could add the nuances and hand the shots to a TD to start rendering,” Kavanagh explains. “It took a lot of effort to do the shot from scratch, but the payoff was that the shots were almost ready to go.”
Beam Me Up
The biggest payoff, however, was in giving Abrams the flexibility to exercise his imagination after postproduction was under way.
ILM followed its tradition of inventing new CG technology to create the effects. And, in changing the studio’s normal processes by creating a streamlined animation/layout department, the visual effects crew made it possible for the director to use CG in the best way to tell a tradition-breaking story that’s set to break box-office records.
As Scotty says in the film, “I like this ship! It’s exciting!”
Barbara Robertson is an award-winning writer and a contributing editor for Computer Graphics World. She can be reached at