Volume 23, Issue 11 (November 2000)

Atomic-scale visualization

Scientists use 3D imagery to enhance simulations of nuclear processes

If the US continues its ban on nuclear-weapons test explosions, defense planners will be forced to rely exclusively on simulations to evaluate the performance of America's nuclear arsenal. To ensure that the country will be ready to give up weapons testing, the US Department of Energy is turning to 3D computer visualization to design an environment for simulating nuclear processes never before accessible in a laboratory setting.

One of the highest profile efforts to develop this capability is the Department of Energy's National Ignition Facility (NIF), which will generate nuclear fusion with the world's largest experimental laser in a stadium-size building at the Lawrence Livermore National Laboratory in California. When the NIF opens later this decade, it will provide scientists with an environment for creating, on a small scale, the physical processes involved in nuclear-weapons detonations. The facility's lasers will focus 192 beams onto a BB-sized capsule filled with tritium/deuterium fuel. As the capsule implodes in the resulting 100 million-degree heat, the fuel will ignite and burn in a nuclear fusion reaction.

To understand the design requirements for the new laser system, scientists have simulated a variety of complex physical processes, augmenting their mathematical analyses with high-resolution 3D visualizations. If this research is successful, the payoff will be great because of the facility's potential impact on US national security, basic science, and energy independence. Besides offering scientists a tool to enhance their understanding of weapons processes and to conduct reliability tests, the facility also will be useful to astrophysicists seeking to understand the inner workings of stars and to energy researchers trying to develop nuclear fusion as a commercial electric-power source.
Using 3D computer graphics, scientists are trying to create a simulated laboratory test environment for visualizing the complex conditions that occur when a nuclear device implodes.

Visualization of large datasets, the kind needed to simulate fusion reactions, has come to the fore recently. Over the past five years, greater computing speed and software advances have made it practical to increase image resolution by two orders of magnitude, enhancements that have made visualization a more attractive analytical tool for scientists. The Department of Energy's Accelerated Strategic Computing Initiative (ASCI) is encouraging this trend by funding basic research to develop modeling and simulation as a means of replacing physical tests of US nuclear weapons (see "Getting a Bang Out of Simulation," Computer Graphics World, July 1998, p. 117).

A major objective of the new NIF simulation research is to determine specifically how to design the target capsule so it properly implodes, creating the extraordinarily high pressures needed to ignite fusion in the tritium/deuterium fuel. One of the key concerns in imploding the capsule is to keep its surface smooth and spherical, says Lawrence Livermore physicist Steven Langer. Langer and his colleague Marty Marinak have created visualizations of implosions to determine the kind of capsule materials and laser pulses needed to maintain the ideal capsule shape, and to understand the process variables that optimize the implosion process.

Langer and his Lawrence Livermore colleagues also have simulated the propagation of the NIF laser beam to determine the requirements for focusing its energy at the desired point on the capsule. "If we don't achieve the desired conditions, the laser beam can break up into many smaller beams headed off in other directions," he explains. "Visualizations help us understand why beam breakups occur."
This volume rendering from an "idealized" simulation illustrates the type of instability that occurs within an NIF fuel capsule.

According to Langer, some simulations of NIF processes have focused on extremely specific technical issues that are enriching broader-scale simulations of the entire capsule implosion problem. One such problem concerns the threat of fluid turbulence to successful implosion. A major challenge in building the NIF is to develop a capsule shell thin enough for easy implosion while avoiding hydrodynamic instabilities that could break it apart before fusion ignition. To counter this potential difficulty, scientists are studying the type of turbulence that can occur when the shell and fuel are heated into fluid form and begin to mix together.

One common type of fluid turbulence is triggered when fluids of different densities push against each other and mix. Such turbulence could easily occur during the NIF's fusion ignition, when laser pulses convert the target capsule and its fuel into quickly accelerating fluids. In one recent ASCI project, physicist Andrew Cook of Lawrence Livermore's Defense and Nuclear Technology Directorate developed computer-based visualizations of this kind of turbulence.

According to Cook, when laser beams hit the NIF target, the heat turns the capsule and the fuel inside into plasma. "The fuel is the light fluid, and the shell is the heavy fluid. The fuel pushes back on the shell as the capsule implodes; as it pushes back, fingers of light fluid work their way into the shell and vice versa," he explains. "The region of interpenetration may eventually become turbulent, leading to mixing between the fuel and the shell. If the mixing region expands to the outside edge of the shell, then the shell might break up, halting the implosion."
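The heavy-over-light arrangement Cook describes is the classic Rayleigh-Taylor configuration. The following sketch is hypothetical (the grid sizes, densities, and perturbation amplitudes are illustrative assumptions, not the ASCI setup); it shows how such a density field, with small ripples on the interface that seed the "fingers," might be initialized:

```python
import numpy as np

def init_rayleigh_taylor(nx=128, nz=256, rho_light=1.0, rho_heavy=3.0,
                         perturb_amp=0.01, n_modes=8, seed=0):
    """Build a 2D density field: heavy fluid on top of light fluid,
    with small random ripples on the interface that seed the instability."""
    rng = np.random.default_rng(seed)
    x = np.linspace(0.0, 1.0, nx)
    z = np.linspace(0.0, 1.0, nz)
    # Interface height: flat at z=0.5 plus a sum of small sinusoidal modes.
    interface = np.full(nx, 0.5)
    for k in range(1, n_modes + 1):
        phase = rng.uniform(0, 2 * np.pi)
        interface += perturb_amp / k * np.sin(2 * np.pi * k * x + phase)
    # Heavy fluid above the perturbed interface, light fluid below.
    rho = np.where(z[None, :] > interface[:, None], rho_heavy, rho_light)
    return rho

rho = init_rayleigh_taylor()
print(rho.shape)                 # (128, 256)
print(rho.min(), rho.max())      # 1.0 3.0
```

In the full problem this field would be evolved with the compressible flow equations; the point here is only that the seeded interface ripples are what grow into the bubbles and spikes seen in Cook's visualizations.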
This sequence shows the increasing amount of turbulence (green) as the heavy fluid (red) mixes with the light fluid (blue).

Cook and professor Paul Dimotakis at Caltech have completed a series of 256- by 256- by 1024-point simulations that offer important insight into controlling turbulence in NIF fusion reactions. The simulations confirmed that the rates of fluid mixing and mixing-zone growth are more strongly influenced by initial irregularities on the surfaces of the fluids than was previously known. The implications of these findings for the NIF project, Cook says, are that the inner surface of the target shell should be made as smooth as possible to minimize the growth of turbulence and that operators of the laser should "hit the capsule as hard and as evenly as possible."

Atomic-level turbulence is a complex process, and simulating it involves millions of data points, so numerical integration of the governing partial differential equations consumed a great deal of time during Cook's simulations: between 200 and 300 hours over a two-month period using 128 processors of ASCI's IBM SP2 (Blue Pacific) supercomputer. Once the simulation was completed, it typically took several hours to render the visualization.
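The grid Cook used works out to roughly 67 million points. A quick back-of-envelope calculation (the five flow variables and double-precision storage are my assumptions, for illustration only) shows why even a single snapshot of such a run is a heavy load:

```python
nx, ny, nz = 256, 256, 1024
points = nx * ny * nz
print(points)  # 67108864 grid points (~67 million)

# Assume 5 flow variables (density, three velocity components, energy)
# stored in double precision (8 bytes each) -- illustrative, not ASCI's layout.
bytes_per_snapshot = points * 5 * 8
print(bytes_per_snapshot / 2**30)  # 2.5 GiB per snapshot
```

Multiply that by the hundreds of time steps saved over a two-month run and the several-hour rendering times become easier to understand.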

Cook created visualizations of his simulations using 32 processors of an SGI Origin computer running EnSight Gold, a visualization and data-analysis package from CEI (Morrisville, NC). EnSight Gold was designed to meet the needs of ASCI, whose researchers wanted to use the latest parallel-processing computers to render models with millions (and soon billions) of nodes.
2D slices from a 3D dataset provide snapshots of the turbulence that occurs during the mixing of light and heavy fluids.

Cook uses EnSight Gold to make quick, simple visualizations of simulations in progress. In this way, he has been able to see indications of any obvious problems before the simulation is completed. For instance, Cook says, the presence of any point-to-point oscillations in a quickly assembled interim visualization of the process would offer evidence of a resolution problem in the simulation.
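The point-to-point oscillations Cook watches for are grid-scale sawtooth noise: values that flip up and down between neighboring cells. A simple, hypothetical check on a 1D profile (not Cook's actual diagnostic) counts how often the slope changes sign between consecutive cells:

```python
import numpy as np

def oscillation_fraction(profile):
    """Fraction of interior points where the slope flips sign;
    values near 1.0 indicate sawtooth, grid-scale noise."""
    d = np.diff(profile)
    flips = np.sum(d[:-1] * d[1:] < 0)
    return flips / max(len(d) - 1, 1)

x = np.linspace(0, 2 * np.pi, 200)
smooth = np.sin(x)
noisy = np.sin(x) + 0.5 * (-1.0) ** np.arange(200)  # injected sawtooth

print(oscillation_fraction(smooth))  # ~0.01: slope flips only at the peaks
print(oscillation_fraction(noisy))   # 1.0: slope flips at every cell
```

A high fraction in an interim visualization would be the kind of evidence of a resolution problem that lets a researcher stop a doomed run early.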

Once the simulation is completed, Cook combines all the raw data files into a 3D-formatted file. This enables him to visualize certain aspects of the processes that are not evident in the mathematical analysis. For instance, Cook says that the persistence of initial surface irregularities as the process evolved was not readily obvious from an analysis of equations alone.
This time sequence shows the "bubbles" (red) and "spikes" (blue) that result when heavy and light fluids begin mixing.

One example of this work (see images directly below) depicts heavy fluid in red and light fluid in blue. With visualizations such as these, Cook has been able to obtain quick, visual clues of the mixing that would occur under different conditions. For example, the visualizations show the effect that a "leading bubble or spike" of mixed fluids can have on the growth of turbulence between the capsule and fuel layers. "They also help answer many questions," he says, "such as when do the bubbles and spikes begin to merge, where is most of the mixing occurring, and where does the mixed fluid go?"

To develop his visualizations, Langer wanted a tool with a full programming language as well as a sophisticated graphical rendering capability. He has worked with a package from Research Systems (Boulder, CO), called Interactive Data Language (IDL), and Yorick, a freeware product created at Lawrence Livermore. Both IDL and Yorick have 2D and 3D graphics packages and are based on interpreted languages that allow Langer to handle large data files without writing and compiling code.
During nuclear implosion, the temperature of a fuel capsule is coolest near the high-density portions of the shell.

Langer has used both products to simulate NIF fusion implosion processes. For instance, in one simulation (see image at right), a plastic capsule filled with gas is immersed in X-rays that will cause the shell to implode and, if conditions are right, trigger fusion inside a confined space such as the NIF. Ideally, the X-rays will have consistent strength around the capsule, so the imploding material can remain smooth and spherical. Langer and his colleagues wanted to know how much distortion to the capsule's shape could be tolerated and how uniform the X-ray strength had to remain in order for fusion to occur.

As Langer increased the pressure density on the capsule, he discovered that the densest material was located on one side of the capsule and that high-density material was poking into the fuel. He wanted to know if this plastic "finger" was reducing the temperature of the hot gas. Langer got an answer by adding a slicing plane at the equator of the 3D model that depicted in color the variations in gas temperature. "If you look closely at the slicing plane," Langer says, "you can see that the hottest gas in the center is round, while at lower temperatures the contours follow the shape of the plastic." After further simulations, Langer and his colleagues concluded that implosion would be compromised if variations in X-ray strength were any greater than shown in this simulation.
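Langer's slicing plane amounts to pulling one 2D plane out of the 3D temperature field and color-mapping it. A minimal sketch on a synthetic field (the Gaussian hot spot is a stand-in of my own; real simulation output would be loaded instead):

```python
import numpy as np

# Synthetic 3D "temperature" field on a 64^3 grid: hot at the center,
# falling off with radius (a stand-in for real simulation output).
n = 64
ax = np.linspace(-1.0, 1.0, n)
X, Y, Z = np.meshgrid(ax, ax, ax, indexing="ij")
r2 = X**2 + Y**2 + Z**2
temperature = 100.0 * np.exp(-4.0 * r2)

# "Slicing plane at the equator": fix the middle index along z.
equator = temperature[:, :, n // 2]
print(equator.shape)  # (64, 64) -- a 2D plane ready to color-map
```

Color-mapping `equator` (with matplotlib's `imshow`, for example) reveals exactly the kind of structure Langer describes: round contours where the gas is hottest, distorted contours where cooler material intrudes.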

Cook is now creating a different simulation at double the resolution so that he can better visualize the degree of turbulence likely to be found in the NIF project as well as other phenomena such as supernovas. The 512- by 512- by 2048-point simulation was started in December and usually runs for a day on weekends and sometimes for a few hours overnight. It is now about two-thirds complete and should be finished by the end of the year after approximately 1000 hours of run-time. In the future, Cook hopes to start running different conditions simultaneously.

Scientists like Cook, though, must proceed cautiously. They must distinguish whether any seeming abnormality in the simulation represents a real physics problem or whether it is simply an artifact in the graphic rendering of an image. For example, a visualization package might show a "crease" in a boundary surface where none should exist because of inconsistencies in the computation of surfaces. Another interpretive problem can arise when a software user chooses a color table that generates false contours or suppresses important features.
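The color-table pitfall is easy to reproduce: pushing a smooth field through a table with too few entries manufactures bands that look like physical contours. A small illustration, with hypothetical 8- and 256-entry tables:

```python
import numpy as np

def apply_color_table(field, n_colors):
    """Map field values to discrete color-table indices (0..n_colors-1)."""
    lo, hi = field.min(), field.max()
    idx = np.floor((field - lo) / (hi - lo) * n_colors).astype(int)
    return np.clip(idx, 0, n_colors - 1)

# A perfectly smooth gradient: no physical contours are present.
field = np.linspace(0.0, 1.0, 1000).reshape(1, -1) * np.ones((10, 1))

coarse = apply_color_table(field, 8)    # 8-entry table: visible banding
fine = apply_color_table(field, 256)    # 256-entry table: appears smooth

print(len(np.unique(coarse)))  # 8 distinct bands -- false contours
print(len(np.unique(fine)))    # 256 levels -- gradient looks continuous
```

The eight hard edges in the coarse image are pure rendering artifacts, which is exactly the kind of feature a careless viewer could mistake for physics.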
As a laser interacts with ionized gas, the beam pushes some particles aside (blue) and ejects others outward (white).

Despite these issues, Cook maintains that the benefits of using 3D visualization far outweigh the drawbacks, especially in verifying the relevance of complex physics models. Furthermore, the graphic renderings provide clues and insights not offered by standard statistical analysis.

Langer agrees: "The physical systems we want to understand and whose behavior we need to predict are so complicated that they can't be solved with analytic methods such as paper and pencil." He also adds that it's otherwise impossible to scan through "millions of numbers" and pick out anomalies with the same speed and ease possible when looking at a 3D visualization.

In Cook's opinion, current computer visualization packages have most of the basic functions that physicists need to build graphical renderings of simulation data, but these tools need to run faster. Langer, though, believes that the other research laboratories involved in the ASCI program should develop some of their own visualization tools geared for their specific projects while encouraging software vendors to improve their products' capabilities for the large-scale visualizations needed by many physicists. Toward this end, the ASCI program recently entered into a three-year, $1.8 million research and development contract with CEI to make enhancements to the EnSight Gold package, which is now used in the ASCI program by the national laboratories.

According to CEI president Kent Misegades, one of the company's key research objectives will be to make the software scalable, so it can render graphic images of terabyte-size datasets without a significant slowdown in the processing time. Another focus of CEI will be to develop new ways for scientists to interact with their data, including partially or fully immersive virtual environments.
This 3D data sequence depicts the increasing change in fluid velocity at four different simulation times.

If CEI is successful, Misegades says, scientists will be able to create graphic renderings of extremely large datasets in near real time and work with them in a more natural way. These improvements should make visualization even more attractive to researchers in programs such as the NIF, who are already finding that many processes are too complex to understand through mathematical analysis alone. With visualization software, scientists and engineers can check the quality of a simulation while it runs, show how processes change by exploring what-if scenarios under different conditions, and communicate their research more easily. By engaging the visual sense, researchers can get a more complete view of their data and enhance their ability to grasp the subtleties of complex processes.

Mark Hodges is a Computer Graphics World contributing editor based in Atlanta. He can be reached at