Large As Life
Volume 31, Issue 1 (Jan. 2008)

Gorgosaurus. Pteranodon. Tylosaur. Styxosaurus. Dolichorynchops. These monikers are as large as the creatures they belong to. And these are but a few of the massive beasts that ruled the land and seas more than 82 million years ago. During the late Cretaceous period, size did matter. But then again, nearly all the creatures at the time were supersized. So, it should come as no surprise that when National Geographic decided to bring more than 30 species of these creatures to life for the documentary Sea Monsters: A Prehistoric Adventure, it did so in a big way, via 3D IMAX.

The large-format stereoscopic film takes viewers on a journey back in time, to when the middle of North America was submerged beneath a sea inhabited by these otherworldly creatures. Directed by Sean McLeod Phillips and produced by Jini Durr, Lisa Truitt, and Jack Geist, the story follows a curious dolichorynchops and its family as they traverse the most dangerous waters to ever exist.

These 10- to 25-foot-long beasts cruised through the shallow Cretaceous oceans with long paddle-flippers that propelled them through the water. Their long snouts and 30 to 40 thin, sharp teeth enabled them to grab their dinner—fish, squid, and small animals—and swallow their catch whole. No matter how frightening the dolichorynchops may appear today, in its own time this marine reptile lived at the bottom of the prehistoric food chain.

The dolichorynchops truly was part of a lost world. Until 2005, only one species, D. osborni, belonged to the genus Dolichorynchops. But a happenstance discovery in Saskatchewan, Canada, led to the addition of D. herschelensis to this genus of polycotylid plesiosaurs. And it is this chance discovery that helps set the film’s story in motion.


CineVision Visual Effects, a small VFX facility in London, created big CG models, including these prehistoric fish, for the stereo IMAX film Sea Monsters.

How do you bring to life a creature that has been extinct for millions of years and whose life scientists can only piece together from fossil evidence? For National Geographic, the only choice was “nearly larger than life” (these were giants, after all) photorealistic stereoscopic 3D combining live-action elements and stunning CGI. Assisting National Geographic on this quest were CineVision Visual Effects and DamnFX, both of which provided a substantial portion of the CGI, with assists from FloqFX and Sassoon Film Design.

Under the supervision of Robin Aristorenas, the small London facility CineVision handled a number of compelling sequences totaling 75 shots, most of which take place underwater, with some set on land. The aboveground shots featured a 30-foot-tall gorgosaurus (a relative of tyrannosaurus) and a family of pteranodons, featherless flying reptiles with a 25-foot wingspan. But, as the title of the film suggests, the majority of the action takes place in the water, where CineVision produced most of its shots, ranging from deep-water environments for a big tylosaur fight sequence to clearer, warmer, shallow-water sequences featuring the dolichorynchops, styxosaurus, and a variety of prehistoric fish.


A modest-size facility, DamnFX in Montreal was tasked with crafting a number of ancient species, including the stars of the 3D movie, the dolichorynchops.

The larger DamnFX in Montreal, meanwhile, delivered 80-plus shots comprising character animation and visual effects. These included 60 different creatures, as well as the necessary water interactions, bubbles, floating plankton and particulates, and interactive lighting and light rays in the shots. Most of DamnFX’s shots, like those of CineVision, were ocean-based.

Theory Comes to Life
Sea Monsters weaves together various paleontological digs from around the globe into a compelling story about scientists trying to answer questions about this ancient ocean world using information obtained from excavations. Because this was a show for National Geographic, the content had to be entertaining as well as scientifically accurate—no easy feat considering the artists had no living counterparts to use as reference. “When we first started, we did a lot of our own research for the creatures, but we found that often led us in the wrong direction,” says Aristorenas. “The experts provided us with lots of information, and it turned out for the best that we stayed with their references and information.”

The experts—a panel of paleontologists—made themselves available to answer the artists’ questions about these creatures during the initial modeling and animation stages. They provided data on the theorized joint limitations and limb motion deduced from the fossils. Then, the models and animations underwent a lengthy approval process at each stage, which could slow the pipeline, as the scientists sometimes had different opinions about various aspects of the creatures, Aristorenas notes.

Also complicating this process was the need for the groups to tell an interesting, dramatic story. At times, making the dinosaurs appear interesting on the screen meant taking the joints a little past their scientific limits. “Getting ‘character’ to show through the creatures without them becoming too cartoony was a constant tightrope walk for us,” says Derek Wentworth, VFX supervisor at DamnFX. “You kind of want to make them ‘cute’ and expressive at times, but you have to keep your thumb on the whole anthropomorphic drive. The director zeroed in on this aspect often, so we never strayed too far.”

Of all the creatures that CineVision created, the most difficult was the tylosaur. It was the first the group tackled, and it took the longest time to complete due to the number of iterations requested by the scientists.

For modeling and detailing the beasts, CineVision used an Autodesk Maya and Pixologic ZBrush pipeline. “From the concept drawings, the low-res basic shape would take form, and once that was approved, we UV-mapped it and then imported it into ZBrush, where the details were added and sculpted in,” explains Julian Johnson Mortimer, character concept and creature modeling supervisor at CineVision. He notes that ZBrush proved especially useful during the approvals for the high-res sculpted versions, as the tool maintains all the levels of detail for the creature. So during reviews, it became much easier to make the necessary changes to the basic shape of the creature while maintaining the finest level of detail, thereby saving valuable time.

And detail was extremely important in the IMAX environment. Once the high-res model was approved, it was broken up into smaller pieces and taken back into ZBrush so another level of detail could be added. Displacement maps were then generated from the high-res model parts and applied to a complete base mesh in Maya for rendering in Pixar’s RenderMan.
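In essence, a displacement map records how far the sculpted surface sits above or below the base mesh along its normals. A minimal sketch of that idea, assuming the sculpt shares the base topology (as a ZBrush subdivision level does); the names and per-vertex simplification are illustrative, not CineVision's actual baking setup, which writes the values per texel into a texture:

```python
import numpy as np

def scalar_displacements(base_verts, base_normals, sculpted_verts):
    """Signed per-vertex displacement from a base mesh to its sculpted version.

    Assumes the sculpt shares the base topology, so vertices correspond
    one to one. Illustrative only.
    """
    offsets = sculpted_verts - base_verts
    normals = base_normals / np.linalg.norm(base_normals, axis=1, keepdims=True)
    # Project each offset onto its base normal to get the single scalar value
    # a displacement shader pushes the surface along at render time.
    return np.einsum("ij,ij->i", offsets, normals)

# Example: one vertex pushed 0.2 units along its +Y normal.
base = np.array([[0.0, 0.0, 0.0]])
normals = np.array([[0.0, 1.0, 0.0]])
sculpt = np.array([[0.0, 0.2, 0.0]])
print(scalar_displacements(base, normals, sculpt))  # -> [0.2]
```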


CineVision used an Autodesk Maya- and Pixologic ZBrush-based pipeline for its work. The water was captured in camera, with the artists adding CG particulates and more.

The CineVision group also used Maya for the creature animations. A physics-based skeleton developed by technical director Julian Mann was then added to simulate the secondary motions in real time, saving time and making the creature movement more realistic. With just a few path keyframes applied, the creatures exhibited realistic behaviors, allowing the artists to turn their attention to the overall choreography.
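The details of Mann's rig weren't published, but the general idea (let a damped-spring chain trail behind the keyframed motion so follow-through comes for free) can be sketched as follows. Everything here, from the function name to the spring constants, is an illustrative assumption, not CineVision's code:

```python
# Minimal damped-spring chain: each joint lags its parent's motion,
# producing follow-through ("secondary motion") from a few path keys.
import math

def simulate_tail(head_positions, num_joints=5, rest_len=1.0,
                  stiffness=40.0, damping=6.0, dt=1.0 / 24.0):
    """head_positions: list of (x, y) head positions, one per frame."""
    # Initialise joints trailing straight behind the first head position.
    hx, hy = head_positions[0]
    joints = [[hx - rest_len * (i + 1), hy] for i in range(num_joints)]
    velocities = [[0.0, 0.0] for _ in range(num_joints)]
    frames = []

    for hx, hy in head_positions:
        prev = (hx, hy)
        for p, v in zip(joints, velocities):
            # Spring pulls this joint toward a point rest_len behind its parent.
            dx, dy = p[0] - prev[0], p[1] - prev[1]
            dist = math.hypot(dx, dy) or 1e-6
            tx = prev[0] + dx / dist * rest_len
            ty = prev[1] + dy / dist * rest_len
            ax = stiffness * (tx - p[0]) - damping * v[0]
            ay = stiffness * (ty - p[1]) - damping * v[1]
            v[0] += ax * dt
            v[1] += ay * dt
            p[0] += v[0] * dt
            p[1] += v[1] * dt
            prev = (p[0], p[1])
        frames.append([tuple(j) for j in joints])
    return frames

# Example: a head swimming along a sine path; the chain lags and whips behind it.
path = [(t * 0.5, math.sin(t * 0.3)) for t in range(120)]
tail_frames = simulate_tail(path)
```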

DamnFX, meanwhile, created the stars, the lovable dolichorynchops, as well as the giant killer fish xiphactinus, a giant squid, some shelled creatures, the platecarpus, all the sharks, the swordfish-like protosphyraena, a giant turtle, several smaller fish, and more. In all, DamnFX created approximately 16 different species and their iterations using Maya, ZBrush, and Maxon’s BodyPaint. “Each creature was challenging; we had to be not only mindful of their paleontological accuracy, but also ensure that they fulfilled their dramatic niche,” notes Wentworth.

All of DamnFX’s animations were done in Maya and then rendered in Mental Images’ Mental Ray, using a custom shader that split the renders into layers so they could be handled during compositing in Eyeon’s Digital Fusion.

Seascape
While the creatures obviously were not real, the water was captured in camera. As a result, the VFX facilities had to create environmental effects, rather than full environments, for the movie. Filming occurred mostly in the Caribbean, whose clear water made it easier for the compositors to integrate the CG elements into the shots. Nevertheless, the live-action water surface did prove problematic for all the artists.

“There were sweeping camera moves in a few shots, and tracking the choppy surface was nearly impossible,” Mann points out. “When the creatures were close to the camera, it didn’t matter much if the move was interpreted as a track or a nodal pan. But when we added CG elements—bubbles, plankton, interactive light rays—that filled the space right up to the surface, it became obvious where the camera really was. Even though the water was moving, any slight slip in the track became obvious.” As a result, in one particular shot, CineVision had to replace the plate with a CG-generated water surface.

At DamnFX, dealing with stereo underwater likewise proved challenging. “Depth of field and environmental murk are easy to handle and cheat when you are working in mono footage, but once you step into stereo, you can no longer hide where something is in terms of depth because the viewer knows exactly where it is,” says Wentworth. “So, all your settings for depth of field and murk must be spot on.”

Furthermore, the groups had to account for the refractive index of water when matching the lens values of the CG cameras to those of the live cameras, as objects appeared to have been shot with a longer lens. Yet the underwater environment also could be forgiving. For instance, the animators did not have to contend with the effects of gravity in their underwater animations. They did, however, have to consider drag, particularly with the tylosaur animations. And if the animation had not matched the rules of nature, the audience would have noticed, concludes Mann.
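The lens effect the artists describe comes from refraction at a flat port: the scene is magnified by roughly the refractive index of water, about 1.33, so the CG camera's focal length has to be scaled to match. A back-of-the-envelope sketch, with the helper name and values purely illustrative:

```python
# A flat port magnifies by roughly the refractive index of water, so a
# matching CG camera needs a correspondingly longer focal length.
WATER_IOR = 1.33  # approximate refractive index of seawater

def apparent_focal_length(focal_length_mm, ior=WATER_IOR):
    """Effective focal length of a lens behind a flat port underwater."""
    return focal_length_mm * ior

print(apparent_focal_length(30.0))  # a 30mm lens behaves like ~40mm underwater
```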

A utility created by CineVision TD Zhang Jian generated caustic maps to simulate underwater lighting. He also created a GPU-based fluid-simulation plug-in for Maya that could drive particles moving like squid ink in the water, as well as multipass output shaders that CineVision used on various creatures. “The fluid generator could handle high grid resolutions, such as 128x128x64, working several times faster than the Maya built-in 3D fluid solver because it took advantage of graphics hardware,” he explains. The resulting fluid was used as a dynamic field to push hundreds of thousands of particles in 3D, creating the motion of blood and bubbles in the water.
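That last step, treating the simulated velocity grid as a field that pushes particles, can be sketched very simply: each particle samples the velocity of the cell it sits in and moves accordingly. The grid contents and names below are illustrative, not Jian's plug-in:

```python
import numpy as np

def advect_particles(positions, velocity_grid, cell_size, dt):
    """Push particles through a 3D velocity grid (nearest-cell sampling).

    positions: (N, 3) array of particle positions in world units.
    velocity_grid: (X, Y, Z, 3) array of per-cell velocities.
    """
    # Convert world positions to grid indices, clamped to the grid bounds.
    idx = np.clip((positions / cell_size).astype(int), 0,
                  np.array(velocity_grid.shape[:3]) - 1)
    vel = velocity_grid[idx[:, 0], idx[:, 1], idx[:, 2]]
    return positions + vel * dt

# Example: a 128x128x64 grid (the resolution quoted above) with a gentle
# upward drift, pushing "bubble" particles for one frame at 24 fps.
grid = np.zeros((128, 128, 64, 3))
grid[..., 1] = 0.2
bubbles = np.random.rand(100_000, 3) * np.array([12.8, 12.8, 6.4])
bubbles = advect_particles(bubbles, grid, cell_size=0.1, dt=1.0 / 24.0)
```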

DamnFX also found it tricky to light its CG elements against the live-action water, particularly when matching caustic speeds with the practical plates. There were some shadow-casting issues as well. To combat a number of these problems, DamnFX created an image-based custom light rig that became the group’s starting base for all its shots. “It enabled us to get a quick result that was consistent with the plate by taking on color and light levels extracted from the plate itself,” Wentworth explains.
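How the rig extracts those values wasn't detailed, but the basic move, pulling a color and intensity out of the plate and feeding them to a CG light, might look something like this; the region split and names are assumptions, not DamnFX's setup:

```python
import numpy as np

def plate_light_values(plate_rgb):
    """Derive a light color and intensity from a live-action plate frame.

    plate_rgb: (H, W, 3) float array in [0, 1]. Sampling the upper third of an
    underwater plate (where the light comes from) is an assumption.
    """
    upper = plate_rgb[: plate_rgb.shape[0] // 3]
    mean_color = upper.reshape(-1, 3).mean(axis=0)
    intensity = float(mean_color.max()) or 1.0
    # The normalized color carries the hue; the intensity carries the level.
    return (mean_color / intensity).tolist(), intensity

# Example: a flat blue-green stand-in for a Caribbean plate.
plate = np.tile([0.1, 0.4, 0.5], (300, 400, 1))
print(plate_light_values(plate))
```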

Baiting the Scene
In addition to creating extremely large creatures, the artists had a smaller task, at least in terms of creature size: creating the schools of smaller prehistoric fish that appear in nearly every underwater shot. At CineVision, this was done using an artificial-life simulation, created by Mann, whereby each fish was outfitted with a preset rig and made aware of certain environmental factors, the location of the predators, and the location of the other fish in the school. In the end, their movements were driven by their individual fears and desires.


DamnFX TDs created flocking tools to generate schools of fish.

As in a typical flocking simulation, the fish received signals that told them how to align their speed with that of their neighbors, how to avoid collisions, and so forth, and they acted on the most important cues. Also factoring into the equation were constraints on how fast a fish could turn and accelerate, and the angle at which it could swim. As a result, unpredictable behavior emerged. “[The end result] is also difficult to tweak for large scenes, as the behavior can change qualitatively if the number of fish changes, just as it would in a real bait ball,” Mann says. “So in a shot with 50,000 fish, testing and tweaking with 5,000 only gets you so far. At some point you have to increase the numbers.”
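These rules are essentially the classic boids recipe: alignment, cohesion, and separation, plus a predator response and a cap on how sharply a fish can turn. A minimal 2D sketch of that recipe, with made-up parameters rather than CineVision's:

```python
import numpy as np

def step_school(pos, vel, predator, dt=1.0 / 24.0,
                neighbor_r=2.0, avoid_r=0.6, max_speed=3.0, max_turn=0.3):
    """One step of a simple fish-school simulation (2D for brevity).

    pos, vel: (N, 2) arrays; predator: (2,) position the fish flee from.
    """
    new_vel = vel.copy()
    for i in range(len(pos)):
        offsets = pos - pos[i]
        dists = np.linalg.norm(offsets, axis=1)
        neighbors = (dists < neighbor_r) & (dists > 0)
        steer = np.zeros(2)
        if neighbors.any():
            steer += vel[neighbors].mean(axis=0) - vel[i]              # align
            steer += (pos[neighbors].mean(axis=0) - pos[i]) * 0.5      # cohere
            close = neighbors & (dists < avoid_r)
            if close.any():
                steer -= offsets[close].sum(axis=0)                    # separate
        flee = pos[i] - predator
        if np.linalg.norm(flee) < 5.0:
            steer += flee                                              # fear wins
        # Constrain how sharply a fish can turn and accelerate each step.
        new_vel[i] = vel[i] + np.clip(steer, -max_turn, max_turn)
        speed = np.linalg.norm(new_vel[i])
        if speed > max_speed:
            new_vel[i] *= max_speed / speed
    return pos + new_vel * dt, new_vel

# Example: 500 fish reacting to a predator in their midst.
pos = np.random.rand(500, 2) * 20
vel = np.random.randn(500, 2) * 0.5
pos, vel = step_school(pos, vel, predator=np.array([10.0, 10.0]))
```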

The resulting data from the AI was stored in a list of Maya particle files, wherein each fish was represented as a particle with attributes defining the fish’s motion—position, velocity, facing vector, tail flapping, and so forth. A proprietary script reconstructed the particle system and instructed it to instance a fish model in place of each particle.


Working with paleontologists, the artists designed, modeled, and animated more than 60 monstrous ocean creatures, which had to be scientifically accurate.

“We previewed the render in Maya and checked out the motion quickly in OpenGL, while a custom RIB generator attached to the particle system inserted the parameters needed during the RIB translation,” says Mann. A custom file-format exporter/importer—similar to the OBJ format, only binary—was used to store the fish models.
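A "binary OBJ" of this kind typically just packs counts followed by raw vertex and face data. A minimal sketch of such a layout, assuming a two-count header; this is an illustration, not CineVision's actual format:

```python
import struct

def write_binary_mesh(path, vertices, faces):
    """Write a mesh as a tiny binary file: two counts, then packed data.

    vertices: list of (x, y, z) floats; faces: list of (i, j, k) vertex indices.
    """
    with open(path, "wb") as f:
        f.write(struct.pack("<II", len(vertices), len(faces)))
        for v in vertices:
            f.write(struct.pack("<3f", *v))
        for face in faces:
            f.write(struct.pack("<3I", *face))

def read_binary_mesh(path):
    """Read the layout written by write_binary_mesh."""
    with open(path, "rb") as f:
        nverts, nfaces = struct.unpack("<II", f.read(8))
        vertices = [struct.unpack("<3f", f.read(12)) for _ in range(nverts)]
        faces = [struct.unpack("<3I", f.read(12)) for _ in range(nfaces)]
    return vertices, faces

# Example: a single triangle round-tripped through the format.
write_binary_mesh("fish.bin", [(0, 0, 0), (1, 0, 0), (0, 1, 0)], [(0, 1, 2)])
print(read_binary_mesh("fish.bin"))
```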

The simulation produced schools of hundreds of thousands of swimming fish, each with more than 10,000 vertices. According to Jian, the size of the RIB file defining one of these scenes could reach several gigabytes; rendering RIBs at that size was not possible for CineVision. So Jian created a RenderMan procedural primitive DSO that could handle it. “It was basically a plug-in for PRMan that loaded at rendering,” he explains. “I used the DSO to load the model of the fish at rest pose, to move it, to face it in the correct direction, and to deform it to the pose as needed.” The resulting fish model was injected into the RIB stream and rendered out.

Only the parameters were added to the RIB files, not the model, so the RIB size was dramatically reduced. Also, the DSO proved useful in calculating how big the fish would appear in the image and deciding which level of model to load (there were four LOD versions, ranging from more than 10,000 vertices to just a few hundred). Most fish were far enough from the camera so that only the lowest LOD model was needed, and if a fish swam out of the field of view, no model was loaded.
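The heart of that DSO is deciding which model to emit for each fish. The real plug-in was compiled against PRMan, but the selection logic (estimate the fish's on-screen size, cull anything outside the view, and pick one of the four LODs) can be sketched in Python; the camera model and thresholds here are illustrative assumptions:

```python
import math

# Four LOD meshes, highest to lowest detail (vertex counts per the article).
LOD_VERTEX_COUNTS = [10000, 3000, 800, 200]

def pick_lod(fish_pos, cam_pos, fish_radius, focal_mm, image_width_px,
             filmback_mm=70.0, fov_cos=0.5):
    """Return the LOD index to load for one fish, or None to skip it entirely.

    Assumes a camera looking down -Z; projected size is estimated from
    distance and focal length with a simple pinhole model.
    """
    dx = [f - c for f, c in zip(fish_pos, cam_pos)]
    dist = math.sqrt(sum(d * d for d in dx)) or 1e-6
    # Crude view cull: skip fish behind or far outside the view direction.
    view_dir = (0.0, 0.0, -1.0)
    if sum(d * v for d, v in zip(dx, view_dir)) / dist < fov_cos:
        return None
    # Approximate on-screen diameter in pixels via the pinhole projection.
    pixels = (2 * fish_radius / dist) * focal_mm / filmback_mm * image_width_px
    if pixels > 400:
        return 0          # close-up: full 10k-vertex model
    if pixels > 100:
        return 1
    if pixels > 20:
        return 2
    return 3              # distant speck: a few hundred vertices

print(pick_lod((0, 0, -50), (0, 0, 0), fish_radius=0.3,
               focal_mm=30, image_width_px=4096))
```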
 
“In this way, we reduced memory usage and render time to make it practical to render these ‘bait-ball’ shots to 4k output,” says Jian.

DamnFX also developed in-house flocking software to generate swarms of schooling fish and crowd-duplication tools for large groups of crinoids and other creatures. For the ammonite sequence, the artists created an extremely detailed model that had to be duplicated many times over, with offset animation cycles and, in some cases, very specific hero animation. 

Moving in Stereo
For National Geographic, bigger is better, and surely audiences who see Sea Monsters will agree. Yet the IMAX stereo format placed great demands, times two, on the VFX facilities—achieving a 3D effect requires a left- and a right-eye camera, each producing its own rendered set of IMAX-style 4k imagery. As Wentworth points out, rendering such a frame can take eight times longer than a comparable standard feature-film frame (roughly four times the pixels of a 2k frame, doubled again for the second eye). As a result, the facility invested in technology upgrades for the project, including a faster network, more and faster render processors, and more than 27TB of storage space on a state-of-the-art BlueArc server (nearly all of which was used for DamnFX’s 80-odd shots). Also, the TDs developed new tools and techniques to help the artists achieve the director’s vision.

“Sea Monsters really did require us to draw on almost every aspect of VFX to realize photoreal CG creatures, both as individuals and with thousands in frame at once, particularly to the level of detail required for a stereoscopic large-format production,” says Wentworth. “IMAX really tests the technical limits of any facility. 4096x3072 is huge; it really bogs down networks, especially when you have 12 passes per creature, per camera.”

When working in mono, artists can employ a number of cheats. “But once you step into stereo, you are stripped naked technically, and you cannot cheat anything,” adds Wentworth. “Every object has a position in x, y, z space, and you have to know where something is in space at all times. So everything must be perfect from the start. A few pixels off here or there, and your creature is going through an object, as opposed to in front of it.”


CineVision devised an artificial-life simulation for schools of smaller fish. The movement of each of these fish, which had a preset rig, was driven by unique factors.

Similarly, the CineVision team, led by co-VFX supervisor Chris Panton, had to make adjustments to its pipeline to handle the demands of the supersized IMAX data. Apart from the obvious burden on storage, the network, and the renderfarm, it also meant the compositors were working with a different set of constraints, since depth passes were required for nearly everything. “Anything done in comp for the left eye had to be done for the right and matched in stereo, so hand tweaks were not an option,” says Aristorenas, echoing Wentworth’s words. “You can’t do patches, and you can’t rotoscope. If it’s not keyable or rendered separately, you can’t isolate it.”

In the end, a group of 70 artists and technical staff at DamnFX delivered 12 minutes of 4k stereoscopic imagery, while a team of fewer than 20 people at CineVision produced approximately 10 minutes for the large-format project.

According to the director, DamnFX’s animation strengths enabled the facility to pull off a wide range of scenes requiring believable animal behavior. “The entire team showed a high level of creativity and technical innovation in every aspect of VFX production, from design through compositing,” says Phillips.

For CineVision, it was the group’s stealth and small size that worked in its favor. “Our budget wasn’t huge, and we were scouring the world for VFX companies that could deliver a high-quality product for what we had to spend,” says Phillips. “The conventional wisdom would say that London wouldn’t be the most competitive market, but [CineVision] had the ability to compete with more traditional markets, like Canada (where DamnFX is located). Robin [Aristorenas] and his team are seasoned effects artists working together without the cost burdens of larger companies, and they have structured themselves around the efficiencies of the Internet age.”

Indeed, the work by DamnFX and CineVision truly reflects the general underlying concept in Sea Monsters: survival of the fittest. 

Karen Moltenbrey is the chief editor for Computer Graphics World.