Motion-capture systems allow the acquisition of actual human performances, with all the timing, body mechanics, and gravity that make for compelling, natural motion in animation. However, there are certain types of captures that only the hardiest of actors would be willing to tackle: falling off a building while being hit with high-velocity projectiles and landing on oil drums, for example. NaturalMotion's Endorphin 2.5 motion-synthesis software was designed for exactly these types of animation challenges.
The Endorphin software includes a specialized set of character animation tools for creating scenes like the one mentioned above, with the sort of physical behaviors you would expect from live actors interacting in real-world situations. It gives you a simple but powerful way to synthesize interactive animation between characters and dynamic environments.
Endorphin's behavior-based animation pipeline has a straightforward interface that can be integrated with other animation packages. Behaviors in the program are configurable units of animation that can be applied to a character for varying lengths of time. A model driven by a Catch Fall behavior, for example, will do its best to turn and break a fall with its hands after being hit by an object. Precede the Catch Fall behavior with a Stumble behavior, and the animation pattern changes: the character will try to keep its balance before falling, doing its best to land in a protected position.
Combining behaviors and making subtle changes to the parameters in Endorphin can deliver different results from the same source material. Objects can be constrained to scene elements so the characters interact appropriately within a variety of environments. These behavioral tools make it easy to guide the animation's results instead of having to keyframe the motion.
Figure: Using NaturalMotion's Endorphin 2.5 motion-synthesis software can deliver unpredictable results.
By simply tweaking the behavior parameters in Endorphin, you can re-simulate and review your changes. The simulation engine is efficient and quick, so rapid iteration is possible even on a modestly powered machine.
The workflow in Endorphin begins with the program's standard biped character. Custom characters created in other programs, such as Softimage's XSI or Autodesk Media and Entertainment's Maya, that match the skeletal structure and proportions of your destination character can also be imported to replace the standard biped.
We tested the dynamics functionality in the program by setting up a scene with a digital actor positioned on a roof and peering down over the edge. We programmed an object to hit the character from behind and, as the actor fell, arms flailing, it grabbed a light pole and dropped to a crouching position on the ground.
Setting up the initial scene in Endorphin required creating a plane for the roof, posing the character, and creating a cube to represent the projectile. We then created a cylinder primitive to represent the arm of the streetlight and added a force to the cube to hurl it at the actor.
Up to this point, we had the makings of a standard dynamics simulation, but this is where Endorphin's behaviors take control. When the actor was hit by the projectile, we added a Writhe behavior to make the character's arms thrash about. Midway through the fall, we scheduled a "hands reach and look at" command to have the character reach out for the arm of the streetlamp, constraining the hands to the streetlight so the digital actor could catch the arm. Releasing the lamp's arm and having the character land on its feet was accomplished by applying a Land and Crouch behavior to the model.
We could also transition the character into a motion-captured clip of an actor standing and looking over its shoulder. Blending from Endorphin’s behavior-driven crouch to the motion-captured data is accomplished using a transition event, which blends the simulated environment with the recorded motion data.
All the scheduling mentioned above simply requires moving and scaling events on the timeline, a process familiar to anyone who has used the track-based interface of a nonlinear video editor or a multitrack audio-editing application. Notably, the simulation above was set up without any keyframing, and entirely new versions of the animation can be generated with subtle tweaks to the behaviors and environmental components of the simulation.
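As an illustration only (this is not Endorphin's scripting interface; the event representation, behavior timings, and function names below are hypothetical), the track-based scheduling described above can be sketched as a list of timed behavior events, where overlapping events correspond to blended behaviors:

```python
from dataclasses import dataclass

@dataclass
class BehaviorEvent:
    # One configurable unit of animation, active over a span of the timeline.
    name: str
    start: float  # seconds
    end: float

def active_behaviors(timeline, t):
    """Return the names of behaviors driving the character at time t."""
    return [e.name for e in timeline if e.start <= t < e.end]

# Schedule the rooftop-fall example: thrash, grab the lamp, then land.
timeline = [
    BehaviorEvent("Writhe", 0.5, 1.5),
    BehaviorEvent("Hands Reach and Look At", 1.2, 2.0),
    BehaviorEvent("Land and Crouch", 2.0, 3.0),
]

# At t=1.3 the Writhe and reach events overlap, so both influence the character.
print(active_behaviors(timeline, 1.3))
```

Moving or scaling an event on the timeline is then just a matter of changing its start and end values and re-running the simulation.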
The program has an uncomplicated interface, and arranging events on the simple track-based timeline is straightforward. The properties windows expose just enough information to make tweaks without the process becoming unmanageable. Overall, the interface provides what's needed, and nothing more.
Using Endorphin within an existing animation pipeline is accomplished with the import/export support. Existing scene elements in OBJ, XSI, and FBX formats can be brought into the program for spatial reference, and existing animation in the major formats can also be added to the scene.
Dynamic Blending allows you to blend between the simulation environment and imported animation sources for more realistic transitions. And, the tools for creating custom characters in Endorphin 2.5 make the setup and proportion matching of destination characters in other packages more fluid. A rig remapping tool lets data that is applied to one character be remapped to another, even if the skeleton setup differs.
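Conceptually, rig remapping of this sort can be pictured as re-keying per-joint animation data through a name map. The sketch below is purely illustrative; the joint names, data layout, and variable names are assumptions for the example, not Endorphin's actual format:

```python
# Hypothetical rig remap: per-joint animation curves keyed by joint name are
# re-keyed onto a target skeleton via a name map. Joints with no counterpart
# on the target rig are simply dropped.
source_anim = {
    "Bip01_Spine": [(0.0, 5.0), (1.0, 12.0)],  # (time, rotation) pairs
    "Bip01_L_Arm": [(0.0, 0.0), (1.0, 45.0)],
    "Bip01_Tail":  [(0.0, 0.0)],               # no match on the target rig
}
joint_map = {"Bip01_Spine": "spine_01", "Bip01_L_Arm": "arm_l"}

target_anim = {
    joint_map[joint]: curve
    for joint, curve in source_anim.items()
    if joint in joint_map
}
print(sorted(target_anim))
```

A real remapping tool must also retarget the motion itself (accounting for differing bone lengths and joint orientations), but the name-mapping step above is the part that lets differing skeleton setups share data at all.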
It would be good to see NaturalMotion add a more comprehensive set of behaviors in future versions of Endorphin. Almost all the existing ones focus on impact- or fall-related reactions. And, while it's a safe bet that most people using this type of software will have access to a library of motion-captured source material, adding more ambient behaviors would make the software more versatile. Examples we would like to see include turning to react to a sound and ducking or dodging to avoid incoming projectiles.
Also, character parameters that govern overall coordination would be a welcome addition. Settings that increase or decrease a character's dexterity would allow for a greater range of output, even with the existing behaviors.
Adding constant forces, such as wind, to complement the single impulse force would definitely broaden the scope of possible simulations. Finally, it would be nice to see a batch option that could run a range of simulations while varying their parameters within specified ranges.
With a clean interface, a strong and growing set of built-in behaviors, and good support for standard animation file interchange formats, Endorphin is a powerful addition to any character animation pipeline.
is the founder of Pixel Corps, a guild for content creators of all skill levels.
is a motion-capture specialist and research division leader at Pixel Corps.
Minimum System Requirements:
Windows 2000/XP; Intel Pentium or AMD Athlon 1.7 GHz processor; RAM; Nvidia GeForce 2 or ATI Radeon 7000 or higher.