Volume 22, Issue 12 (December 1999)

Innovative simulation technology breathes life into digital human models



Each year, thousands of business, industry, community, and education professionals from the world over flock to NASA's Johnson Space Center in Houston to attend Inspection, a three-day exhibit designed to introduce attendees to NASA-developed space technologies that can be utilized for unique applications on Earth. Usually, attendees find technologies that will help them solve problems in their businesses, or that they can turn into commercial products. But sometimes, NASA finds a company or person with whom to partner on research projects.

Such was the case when Robert Rice attended Inspection96. At that time, Rice was an adjunct professor at the University of Houston and a professor at Montgomery College in Conroe, Texas. He visited Inspection96 hoping to find technology he could use with the recently completed Visible Human Project to develop an interactive, customizable, virtual human body with which to teach his anatomy students. Instead, he found in NASA a partner to help him create such a tool.

That technology, which Rice has since developed with funding from NASA, is called the Virtual Interactive Anatomy (VIA) Project. Available through Rice's company, Dynoverse Corp. (The Woodlands, TX), VIA begins with data from the National Library of Medicine's Visible Human Project. Begun in 1989, the Visible Human Project uses CT, MR, and cryosection images of two cadavers, one male and one female. With these image slices, which were taken at 1mm intervals for the male model and 0.33mm intervals for the female, one can distinguish the internal details of the human body and reconstruct them in 3D with volume rendering. Today, Visible Human data is available either in original form from the government, or as static polygonal or voxel files that define segmented structures (for example, the biceps muscle in a person's arm) from companies such as Visible Productions (Ft. Collins, CO) and Gold Standard Multimedia Inc. (GSM; Tampa, FL).

Rice's virtual human enhances the basic anatomy of the Visible Human Project by adding a high degree of interactivity and individualization to the model. Perhaps more significant, it replicates physiology, including blood circulation, muscle fatigue, and joint mobility.
This Virtual Interactive Anatomy model not only looks realistic, it also replicates human physiology, including blood circulation and muscle fatigue.




As such, it can be used in any application that requires analyzing the human body, or a part of the body, as it is involved in a process: interacting with an object or an environment, wearing a product, even interacting with a real human.

"I am developing my virtual human at the tissue level," he says, "so that it not only looks and moves like a real human, but can also be used in what-if simulations to analyze how the body reacts when put in positions or situations that can cause stress or harm." Thus, while a Visible Human biceps muscle, for instance, exists only as a static image, a biceps on the virtual human's arm contracts, changes shape when it contracts, responds when a nerve impulse is delivered to it, and becomes inactive if its nerve supply is lost, as in stroke patients. It also fatigues after a period of use, loses mass and force if it's taken into outer space, and ruptures or tears if its tendon is overworked. "All that, and lots more, is what tissue-level properties impart to an otherwise inert object in a computer graphic image of the arm," Rice explains. Dynoverse applies these motions, attributes, responses, and functional qualities to the tissues of the Visible Human model, turning that model into what is intended to be the most realistic simulation of the entire human body ever created.
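Rice does not publish VIA's actual data model, but a toy sketch can illustrate what "tissue-level properties" add to an otherwise static structure. Every class name, field, and value below is hypothetical, chosen only to make the contrast concrete:

```python
from dataclasses import dataclass
from typing import List, Tuple

Vertex = Tuple[float, float, float]

@dataclass
class StaticStructure:
    """A segmented Visible Human structure: geometry only."""
    name: str
    vertices: List[Vertex]
    faces: List[Tuple[int, int, int]]

@dataclass
class MuscleTissue(StaticStructure):
    """Same geometry, plus illustrative (hypothetical) tissue-level state."""
    max_isometric_force_n: float = 300.0   # assumed peak force, in newtons
    activation: float = 0.0                # 0 = relaxed, 1 = maximal effort
    innervated: bool = True                # False models lost nerve supply (e.g., stroke)
    fatigue: float = 0.0                   # 0 = fresh, 1 = fully fatigued

    def available_force(self) -> float:
        """Force the muscle can currently produce, given its state."""
        if not self.innervated:            # a denervated muscle cannot contract
            return 0.0
        return self.max_isometric_force_n * self.activation * (1.0 - self.fatigue)

# Example: a half-activated, slightly fatigued biceps.
biceps = MuscleTissue(name="biceps brachii", vertices=[], faces=[],
                      activation=0.5, fatigue=0.2)
print(biceps.available_force())   # 300 * 0.5 * 0.8 = 120.0
```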



To accomplish this remarkable feat, Dynoverse is combining several tools. At its base, the technology uses Visible Human Male data, which Dynoverse purchased from Visible Productions and GSM (Rice plans to purchase Visible Human Female data soon). It also uses PolyRed and Open Inventor from MultiGen/Paradigm (San Jose, CA), and SIMM, from MusculoGraphics, Inc. (Evanston, IL). In addition, it can be used with haptic interfaces from SensAble Technologies (Cambridge, MA) for applications requiring tactile interaction. "With these tools, we're taking the Visible Human data and making it come alive," he says.

To simulate a biceps contracting, for instance, Rice first enters variables reflecting a change in the muscle into the system. He manipulates the data using the polygon-reduction tool PolyRed, derives force values and vectors from SIMM, then uses proprietary software to create a model that is fully interactive not just on the surface, but down through the layers of human tissue, from skin to skeleton. The resulting muscle not only looks as it should, it also acts that way. He then renders and adds realistic textures to the imagery using Open Inventor and proprietary software.
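The article names the tools but not the equations. As a rough illustration of the kind of muscle mechanics that packages such as SIMM are built around, here is a generic Hill-type force sketch; the curve shapes and constants are illustrative assumptions, not Dynoverse's proprietary code:

```python
import math

def active_force_length(l_norm: float) -> float:
    """Active force-length curve: peaks when the fiber is at optimal length (l_norm = 1)."""
    return math.exp(-((l_norm - 1.0) ** 2) / 0.45)

def force_velocity(v_norm: float) -> float:
    """Force-velocity scaling: shortening (v_norm > 0) reduces force,
    lengthening (v_norm < 0) raises it slightly; clamped to a plausible range."""
    return max(0.0, min(1.5, 1.0 - 0.8 * v_norm))

def passive_force_length(l_norm: float) -> float:
    """Passive elastic force once the fiber is stretched past optimal length."""
    return 0.0 if l_norm <= 1.0 else 2.0 * (l_norm - 1.0) ** 2

def muscle_force(f_max: float, activation: float, l_norm: float, v_norm: float) -> float:
    """Total muscle force at one instant: active plus passive components."""
    active = activation * active_force_length(l_norm) * force_velocity(v_norm)
    return f_max * (active + passive_force_length(l_norm))

# Example: a biceps-like muscle at optimal length, shortening slowly at half effort.
print(muscle_force(f_max=450.0, activation=0.5, l_norm=1.0, v_norm=0.1))  # ~207 N
```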
The VIA models can be viewed from all angles in semitransparent layers, so that, for example, internal anatomy appears in the context of skin and body contours.




Currently, the VIA technology is available in two forms: as a product and as a project. In the former case, customers can purchase the human-model source code, along with the rights to further develop and enhance the model and to run their own simulations. "If the user is changing just a few parameters, is not using a highly complex model, or is not working on the entire body at once, we can get this down to a Pentium II PC," Rice claims. Generally, though, the technology requires at least a dual-processor workstation with a recommended 256MB of RAM.

However, most customers will license the VIA technology as a project, Rice believes. When customers choose this option, they can feed Dynoverse the data and operational/functional variables relevant to the type of human model (size, shape, age) and the application they want to simulate. Running the technology on SGI workstations, Dynoverse creates, delivers, and implements, in whatever environment the customer selects, a working simulation that has all the appropriate variables and interfaces, haptic or otherwise, that the customer has specified. "We can do this at their site, or we can support it on our servers, and they can access it remotely," he says. "We think most customers will license the technology this way because it gives them exactly what they want, without having to create their own virtual reality simulation infrastructure or buy rights to some of the software subsets we have built into our technology."

According to Rice, the VIA Project is unique in that it is the only project that will offer such detailed interactive biophysics-based replicas of the entire body. "As a result, we can determine the range of forces that can be generated in a particular muscle, and we can set that range so that it reflects a highly physically trained individual, for instance. Or, we can look at the tissues in terms of hydration level and simulate edema and other vascular changes. Capabilities like these are extremely important in terms of creating accurate simulations of how the body functions." Rice is working with several biomedical labs and research groups to ensure that the tissue-related data being used in the VIA Project is validated.
Each Virtual Interactive Anatomy functional element (bone, muscle, joint) can be seen individually or as a composite from any axis.




Another important characteristic of the VIA Project's virtual human model is that it will incorporate data being compiled through CAESAR (Civilian American and European Surface Anthropometry Resource). CAESAR is a research project that is generating 3D data on the size and shape of the modern human body, based on measurements of male and female subjects, ages 18 to 65, in North America and Europe. Administered under the direction of the Society of Automotive Engineers' Cooperative Research Program and scheduled for completion by the middle of next year, the CAESAR survey is the first international anthropometric program to use a whole-body scanner (Cyberware's WB-4) as a data collection tool.

"The Visible Human Project provides images of structures characteristic of only one male and one female cadaver. With the CAESAR data, we can scale our virtual human model so that it is very short and stout, very tall and thin, or anything in be tween," says Rice.

The capabilities of the VIA technology have far-reaching benefits. Already, NASA is using it to build a better-fitting space suit (see "NASA's New Clothes"). However, the technology could be used in any application that involves human safety and comfort.

One of these applications may be vehicle safety testing. In fact, Rice is already investigating the merits of a virtual crash dummy and has written a white paper on it (see www.somaticsciences.org). The crash dummies that are used today, however sophisticated, remain inert. Seated in the test vehicle, they speed down the crash run and smash into the wall, after which they are photographed, and the data their sensors recorded is analyzed. But they will have done nothing in anticipation of the crash.
Portions of the body can be isolated for viewing, as in the volumetrically rendered neck vertebrae (left) and the internal blood vessels (center), or highlighted within the overall body (right).




A virtual model, on the other hand, can replicate avoidance moves prior to the crash. "We can simulate the contraction of the muscles, the raising of the arm to cover the face, whatever someone might really do," says Rice. "That completely changes the dynamics of the crash, how the person is thrown about the vehicle, and how he injures himself. All that can be simulated with a virtual car and a virtual human on a computer screen."

Another area concerns the evaluation of medical therapies and procedures. "Anybody who works with patients needs training, and the most common way they get it today is by performing the procedures on real patients," says Rice. "That's sometimes tough on the patients, and it's not cost-effective." Practicing medicine on a virtual patient first, he says, would alleviate some unnecessary discomfort for patients and would likely help contain costs. In fact, Dynoverse is currently working to develop a haptic simulation for physical diagnosis training.
Conventional crash simulations use dummies without biomechanical response capabilities. But a VIA simulation might, for example, raise an arm to shield itself.




A third possible application is in the clothing and footwear industry. "We can readily imagine point-of-sale kiosks in high-end merchandising facilities for footwear, clothing, outerwear, and sports equipment," Rice says. "In seconds we could get a complete scan of the customer, including uniquely personal features such as his foot arches or arm length-to-shoulder breadth ratio. Then we could place his virtual counterpart in an appropriate environment and outfit him with virtual clothing, equipment, and shoes, let him try on samples virtually, and make a buying decision without requiring a lot of inventory in the store."

According to Rice, these are not bleeding-edge applications, but rather, applications for which a strong market now exists. All that's needed is the money to fund them. To that end, Rice has been working closely with PricewaterhouseCoopers' (PWC) Intellectual Asset Management Practice to identify companies that might be interested in his technology as well as markets in which the technology may be applicable. Already, according to PWC partner Bryan Benoit, PWC has identified several companies that are interested in the technology, and the firm is "in various stages of negotiations" with them.

The VIA Project is an innovative technology, with far-reaching applications. "It was only a matter of time before all the tools to make this happen were developed," Rice concludes. "It's an idea whose time has come."

A Computer Graphics World contributing editor, freelancer Audrey Doyle is based in Boston. She can be reached at audreyd@mediaone.net.

The qualities and capabilities of the Virtual Interactive Anatomy (VIA) technology make it useful in numerous applications-which Anthony C. Bruins, a systems engineer at the NASA Johnson Space Center, immediately recognized when he first met with Robert Rice back at Inspection96. "We liked his idea, so we asked him to create a proof of concept for us," says Bruins.

Thus, in April 1997, Rice formed the Institute of Somatic Sciences, a non-profit organization at the Houston Advanced Research Center (The Woodlands, TX). That year, he developed a prototype of his virtual model, and at Inspection97 he displayed a virtual shoulder area. The folks at NASA were impressed and approved funding for Rice's work. Shortly afterward, NASA became Dynoverse's first customer, when the agency asked Rice to build an interactive virtual human that NASA could use to help design a new space suit.

Today's space suit was designed primarily to deliver optimal engineering performance, not necessarily to provide comfort for the wearer. "We now want to take a human-centered design approach," says Bruins. "Dr. Rice's technology will enable us to model humans and their movements, then build the suit around them."
VIA technology is being used to help NASA develop lighter, more flexible space suits that will offer greater freedom of movement to astronauts.




NASA is also looking toward future space missions and how a new suit can better accommodate the astronauts. "When we someday explore Mars, for example, we'll need a different suit," says Bruins. Astronauts visiting Mars will be exploring the planet from a geological perspective, which requires a lot of bending, reaching, and kneeling. The existing suit weighs about 275 pounds; not only is it heavy, it also restricts activity and, therefore, can't support the required movements.

"We'll be able to put the virtual human into the virtual suit and evaluate right there onscreen issues such as safety, performance, comfort, and mobility," Bruins says. If certain areas of the body demonstrate re stricted range of mo tion during simulated tasks, for in stance, such results will show up as highlighted areas on the surface of the virtual models. "And be cause we'll be using virtual-reality technology, we can go through the conceptualization, design, development, and testing phases before we cut any hardware, thus saving time and money," he says.

According to Bruins, after analyzing data compiled by companies that already use virtual prototyping to develop products, NASA anticipates it could cut in half the time and money involved in creating its new space suit. This is significant, especially considering that it typically takes years to bring a suit from concept to certification, a process that in the past has cost millions of dollars.

In addition to a new space suit, NASA also plans to design and build a new glove, one that will enable astronauts to grasp objects and hold them tightly with less fatigue and for a longer period of time. With today's glove, astronauts must learn to work with serious limitations on their hand/wrist motions and comfort. "The glove is an important element of the space suit because that's what an astronaut uses to do work," says Bruins. "With Dr. Rice's model, we'll be able to simulate not just how long it will take the astronaut to burst internal components of the glove, but also how long it will take for the hand, wrist, forearm, and shoulder muscles to become fatigued if he performs a certain task, say, turning a wrench for several hours straight, in a certain way."
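The fatigue model inside VIA isn't described in the article, so the following sketch shows one common way such a simulation can track declining force capacity during a prolonged task. The rate constants and time step are chosen purely for illustration:

```python
def simulate_fatigue(activation_per_minute, fatigue_rate=0.03, recovery_rate=0.01):
    """Track remaining force capacity (1.0 = fresh) over a task, minute by minute:
    capacity decays while the muscle works and partially recovers toward fresh."""
    capacity = 1.0
    history = []
    for a in activation_per_minute:                   # a in [0, 1]: effort that minute
        capacity -= fatigue_rate * a * capacity       # fatigue while working
        capacity += recovery_rate * (1.0 - capacity)  # partial recovery
        capacity = max(0.0, min(1.0, capacity))
        history.append(capacity)
    return history

# Example: two hours of steady wrench turning at 60% effort in a stiff glove.
trace = simulate_fatigue([0.6] * 120)
print(f"remaining capacity after 2 hours: {trace[-1]:.2f}")
```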
Object models such as a glove can be displayed as a polygon surface mesh or in a texture-mapped format.




But perhaps the most intriguing application of the VIA technology for NASA, apart from the design of a new space suit and glove, is the simulation of the abnormalities that astronauts suffer due to microgravity, or weightlessness. The human body goes through numerous changes when a person leaves Earth's atmosphere. For instance, the heart expands, body fluids pool in the upper part of the body, muscles begin to lose mass and strength, and the bones suffer demineralization. By applying abnormal variables characteristic of these changes to the interactive virtual human, NASA will be able to simulate how these abnormalities affect astronauts during space missions.

"We can apply different levels of stress on the virtual human and study what will happen," says Bruins. Based on such data, the NASA engineers can then develop countermeasures, such as special exercises and diet modifications.

So far, Rice has completed the back, arms, legs, and complete skeleton of the virtual human for the NASA project. "We have enough data to design better gloves and better boots," says Bruins, "and as soon as we get the green light for the next space mission, we'll start using the technology to create some prototypes." In fiscal year 2000, he adds, Dynoverse will begin working on the chest area. "We're slowly building the entire human, piece by piece."
-Audrey Doyle