Character Splash System
Volume 29, Issue 10 (October 2006)


By Scott Cegielski

Effects involving water are generally regarded as some of the most difficult to achieve. The DreamWorks/Aardman animated feature film Flushed Away (see “Flushed with Success,” pg. 22) contains a wide variety of water effects, each of which posed specific challenges. Waterfalls, boat wakes, giant waves, water slides, and splashing are just some examples of water-based effects featured in the film.


Due to the large number of shots that required splashes resulting from a character’s interaction with water, Cegielski’s team developed a system to handle all the related scenarios.
Some of the most common effects in many of the movie’s sequences are the splashes that result from a character’s interaction with a body of water. This posed a specific challenge because of the number of shots and the variety of characters that were creating the splashes.
The quantity of shots that required this effect warranted the development of a character splash system that could accommodate all the scenarios quickly and efficiently. The system had to automate the process of generating splashes from a character’s interaction with a water surface while still providing a large degree of control for the artists.

In its completed form, the system contains various tools for generating splash emissions, localizing a fluid simulation, and generating a surface from the resulting particles. It does not include a fluid dynamics simulator, but instead uses a previously developed in-house simulation tool.

In order for an animator to implement this effect efficiently, splashes from a character’s water interaction were generated semi-automatically. This was done in several stages. First, the character’s surfaces were converted into an evenly spaced distribution of particles using a proprietary tool. The spacing of the particles could be controlled based on how much detail was required from the character’s motion. Each particle also carried a local velocity vector representing the character’s motion at that position.
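The proprietary sampling tool is not described in detail here, but the idea can be sketched as follows: scatter points over the character’s triangles at a user-chosen spacing, then finite-difference the same surface locations on two frames of the mesh to obtain each particle’s local velocity. The function and parameter names below are illustrative, not the studio’s.

import numpy as np

def sample_surface(verts, faces, spacing):
    """Scatter roughly evenly spaced points over a triangle mesh.
    verts: (V, 3) vertex positions; faces: (F, 3) vertex indices;
    spacing: target distance between particles (smaller = more detail)."""
    tris = verts[faces]                                    # (F, 3, 3)
    # Triangle areas decide how many samples each face receives.
    areas = 0.5 * np.linalg.norm(
        np.cross(tris[:, 1] - tris[:, 0], tris[:, 2] - tris[:, 0]), axis=1)
    counts = np.maximum(1, np.round(areas / spacing**2)).astype(int)

    face_ids, bary = [], []
    for f, n in enumerate(counts):
        # Uniform random barycentric coordinates on triangle f.
        r1, r2 = np.random.rand(n), np.random.rand(n)
        flip = r1 + r2 > 1.0
        r1[flip], r2[flip] = 1.0 - r1[flip], 1.0 - r2[flip]
        face_ids.append(np.full(n, f))
        bary.append(np.stack([1.0 - r1 - r2, r1, r2], axis=1))
    face_ids = np.concatenate(face_ids)
    bary = np.concatenate(bary)

    points = np.einsum('pi,pij->pj', bary, verts[faces[face_ids]])
    return points, face_ids, bary

def particle_velocities(verts_prev, verts_curr, faces, face_ids, bary, dt):
    """Local velocity of the character at each sampled point, by finite
    differencing the same barycentric locations on two frames of the mesh."""
    p_prev = np.einsum('pi,pij->pj', bary, verts_prev[faces[face_ids]])
    p_curr = np.einsum('pi,pij->pj', bary, verts_curr[faces[face_ids]])
    return (p_curr - p_prev) / dt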



Next, a closest point calculation was performed between each particle and the water surface with which the character was interacting. Using each particle’s closest point distance, velocity, and user-controlled parameters, further calculations were performed. The user-controlled parameters could be animated over time and could also vary over the surface of the character using painted texture maps. The particles were first culled based on user-defined criteria such as a maximum distance from the water surface and a minimum velocity threshold. The positions and velocities of the remaining particles were then modified based on several computed vectors that gave the user a high degree of control over the resulting splash direction and amplitude. The resulting particles were then input into the fluid-simulation program.
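A simplified version of this culling-and-shaping step might look like the sketch below. It treats the water as a flat plane, whereas the production code queried an arbitrary, animated surface, and it exposes the distance, speed, amplitude, and direction controls as plain scalars rather than animatable, paintable parameters.

import numpy as np

def emit_splash_particles(points, velocities, water_height,
                          max_distance=0.5, min_speed=1.0,
                          amplitude=1.0, direction_bias=np.array([0.0, 1.0, 0.0]),
                          bias_weight=0.3):
    """Cull character particles that are too far from the water or too slow,
    then shape the survivors' velocities into splash emission velocities."""
    dist = np.abs(points[:, 1] - water_height)     # distance to a flat water plane
    speed = np.linalg.norm(velocities, axis=1)

    # Cull on a maximum distance and a minimum speed threshold.
    keep = (dist < max_distance) & (speed > min_speed)
    pts, vel = points[keep], velocities[keep]

    # Blend the character's local velocity with a user-supplied direction,
    # then scale by the amplitude control.
    emit_vel = (1.0 - bias_weight) * vel + bias_weight * \
        np.linalg.norm(vel, axis=1, keepdims=True) * direction_bias
    return pts, emit_vel * amplitude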

Local Fluid Simulation

In some shots, the character was moving at high speed because of the current in the water. This meant that over the course of a shot, the character might be required to travel quite a long distance. Moreover, the fluid dynamics simulator required a grid covering the area of desired simulation, and these shots would have required a very large box to cover the character’s range of motion. Since detail was necessary in the simulation, the grid also would have had to be subdivided into many cells. These two factors could have resulted in very long simulation times.

Therefore, an alternative method was implemented to localize the simulation around the character. This entailed stabilizing the character, a process that involved tracking the center of its bounding box at every frame, and then inverting the tracked data. It resulted in the removal of the large-scale motion of the character but the retention of the local motion, such as limb movement. The character’s general position ended up at the origin, where a much smaller fluid grid was placed for the simulation.

In order to retain the sense of the large-scale motion in the fluid simulation, a vector force was included in the simulation. The direction of this force was derived from the frame-to-frame motion of the character and was recorded during the tracking phase. Therefore, if the large-scale motion was along the positive Z axis, then the vector force applied to the simulation was in the opposite direction, along the negative Z axis.

After the simulation, the particles were transformed so that, on each frame, they tracked the character’s original, pre-stabilized motion.
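The localization workflow (track the bounding-box center, subtract it, add a compensating force, and restore the original motion after the simulation) can be summarized in a few short routines. The fluid solver itself was an in-house tool and is treated as a black box here; the helper names are hypothetical.

import numpy as np

def track_centers(character_points_per_frame):
    """Center of the character's bounding box on every frame."""
    return np.array([(p.min(axis=0) + p.max(axis=0)) * 0.5
                     for p in character_points_per_frame])

def stabilize(points, center):
    """Remove the large-scale motion: bring the character back to the origin."""
    return points - center

def compensating_force(centers, frame, dt):
    """A force opposite to the character's frame-to-frame motion; if the
    character travels along +Z, the force on the fluid points along -Z."""
    return -(centers[frame] - centers[frame - 1]) / dt

def unstabilize(sim_points, center):
    """After the simulation, return particles to world space so they track
    the character's original, pre-stabilized motion."""
    return sim_points + center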

Surface Generation and Integration

Due to the miniature scale of this film, it was decided that the character splashes should be rendered as surfaces rather than as particles. To create the surfaces, the particles from the fluid simulation were first converted to a density grid with an octree structure, using a previously developed tool. The conversion from the density grid into a polygon file was then handled by a program based on the marching cubes algorithm. Marching cubes takes a density grid and generates a configuration of polygons for each grid cell based on its density in relation to the density of neighboring cells. The user provides a density threshold value, which helps determine where the surface should be generated in the density grid.
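The octree density tool and the meshing program were both proprietary; the sketch below reproduces the idea with a plain regular grid and the marching cubes implementation from scikit-image, which stands in for the studio’s code. The density threshold plays the role described above.

import numpy as np
from skimage import measure   # marching cubes stand-in for the in-house mesher

def particles_to_density(points, grid_min, cell_size, resolution):
    """Splat simulated particles into a regular density grid (nearest cell)."""
    density = np.zeros((resolution, resolution, resolution), dtype=np.float32)
    idx = np.floor((points - grid_min) / cell_size).astype(int)
    valid = np.all((idx >= 0) & (idx < resolution), axis=1)
    np.add.at(density, (idx[valid, 0], idx[valid, 1], idx[valid, 2]), 1.0)
    return density

def density_to_mesh(density, threshold, cell_size, grid_min):
    """Extract an iso-surface at the user-supplied density threshold."""
    verts, faces, normals, _ = measure.marching_cubes(density, level=threshold)
    return verts * cell_size + grid_min, faces, normals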


The first image above shows the meniscus used to blend the splashes into the main water surface. The second image above is that of a splash polygon used to smooth the water surface.


The surface was then smoothed using a surface-relaxing program. Since the topology of the mesh changed each frame, special handling was required to get accurate-looking motion blur. For each frame’s polygon file, a second file was generated in which the vertices were deformed. The direction and amount of deformation on the vertices were determined by finding and using the velocity of the closest particle to the vertex from the simulated particle set. This second file was then used at each frame as the previous frame’s position for the vertices, and, thus, correct motion blur was achieved.
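In code, the trick amounts to building a second set of vertex positions by stepping each vertex backward along the velocity of its nearest simulated particle. The sketch below uses SciPy’s KD-tree as a stand-in for whatever closest-point search the production tools provided.

import numpy as np
from scipy.spatial import cKDTree

def previous_positions(mesh_verts, particle_positions, particle_velocities, dt):
    """For each vertex of this frame's splash mesh, step backwards along the
    velocity of the closest simulated particle to recover last frame's position."""
    tree = cKDTree(particle_positions)
    _, nearest = tree.query(mesh_verts)          # index of the closest particle
    return mesh_verts - particle_velocities[nearest] * dt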

Another requirement was that the splash surface blend into the water surface. This was done by iterating over the vertices in the splash’s polygon file. For each vertex, a closest point calculation was performed to the water’s surface. As the vertices approached the water’s surface, they were deformed along a vector that was both tangential to the water surface and in the direction of the splash surface’s normal. This created what appeared to be a meniscus between the two surfaces, and when rendered, the splash surface and the water surface appeared continuous.
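The following sketch shows the blending idea under simplifying assumptions: the water is a flat, Y-up plane, the splash mesh supplies per-vertex normals, and the falloff and strength controls are invented for illustration. The production version performed the closest-point query against the actual animated water surface.

import numpy as np

def blend_meniscus(verts, normals, water_height, falloff=0.2, strength=0.1):
    """Pull splash vertices near the waterline sideways, along the water plane
    and in the direction of the splash normal, to form a meniscus."""
    dist = verts[:, 1] - water_height                 # signed height above water
    # Blend weight: 1 at the waterline, fading to 0 at the falloff distance.
    w = np.clip(1.0 - np.abs(dist) / falloff, 0.0, 1.0)

    # Direction tangential to the water plane, following each splash normal.
    tangent = normals.copy()
    tangent[:, 1] = 0.0
    length = np.linalg.norm(tangent, axis=1, keepdims=True)
    tangent = np.where(length > 1e-6, tangent / np.maximum(length, 1e-6), 0.0)

    return verts + tangent * (w * strength)[:, None]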

Conclusion

The system started with a relatively small set of user-controlled parameters, which grew as needed throughout the production. One of the initial parameters was a threshold value that controlled how fast a character had to be moving when intersecting the water in order for a splash to be generated. This prevented a character that was idle in the water from generating splashes at its intersection with the water. 

Some of the other initial parameters included the number of particles to generate from a splash, the amplitude of the splash, and a few controls for the direction of the splash. Later, controls were added to enhance the shape of the splash. These included a noise multiplier for the splash amplitude and the ability to modulate some of the parameters using texture maps that the user would paint on the character using a 3D paint program. For instance, if more splashing was needed on a character’s hands, maps could be painted in which the hands had a higher luminance than the rest of the body.
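As a hypothetical example of that modulation, a greyscale map painted on the character could scale the splash amplitude per particle, so that brighter regions (the hands, say) splash more. The UV coordinates and the map itself are assumed inputs here.

import numpy as np

def modulate_amplitude(base_amplitude, particle_uvs, painted_map):
    """Scale the splash-amplitude parameter by the painted map's luminance at
    each particle's UV location (nearest-texel lookup for simplicity)."""
    h, w = painted_map.shape
    u = np.clip((particle_uvs[:, 0] * (w - 1)).round().astype(int), 0, w - 1)
    v = np.clip((particle_uvs[:, 1] * (h - 1)).round().astype(int), 0, h - 1)
    return base_amplitude * painted_map[v, u]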

Moreover, it was important to the directors of the film that the look of the effects corresponded to the style of the film as closely as possible. Since the characters are reflective of the style used in the previous Aardman films (claymation), we tried to match that look for the effects. This was especially important in the character splash system since these splashes were being rendered around the characters. One of the key decisions in achieving that look was to use a metaball technique for the surface generation. This closely mimicked the claymation process of using plasticine for such splashing effects.

At the conclusion of the production, the splash system had been refined to a high degree. It went through revisions as needed in the many shots in which it was used in the film. It will, most likely, continue to be used in future productions at DreamWorks where splashing effects are required.



Scott Cegielski, effects lead at DreamWorks Animation, recently completed work on the DreamWorks Animation/Aardman co-production Flushed Away. For this film, he wrote many of the tools used for the film’s effects, including the system for computing splashes for characters interacting in CG water. In 2001, he worked on the animated feature film Shark Tale at DreamWorks Animation, developing a system for growing the film’s numerous underwater plants, and was nominated for an Annie Award for his accomplishments.