by Barbara Robertson
Someday, people from Earth will fly to other parts of our galaxy as easily as Captain Kirk, Captain Picard, and Luke Skywalker. We know this as surely as we know that Columbus landed in the West Indies in 1492, and Neil Armstrong took one small step for man on the Moon in 1969. Until that "someday" comes, though, we must be content with observing space through telescopes, extrapolating from data collected by interstellar probes, and then simulating treks through the universe as we know it.
|Using proprietary software, scientists at the San Diego Supercomputer Center turned an astrophysicist's 3D model of the Orion Nebula into a gaseous volume.|
Increasingly, space-travel simulations have been created with computer graphics, and this month marks the debut of a space simulation in one of the most spectacular venues ever built: the dramatic new Hayden Planetarium, an 87-foot sphere that sits on stilts 30 feet off the ground inside a giant glass cube. It's the centerpiece of the Rose Center for Earth and Space at the venerable American Museum of Natural History (New York); you can see the glowing sphere inside the glass cube from Central Park. "It looks suspended; as if we captured an alien object," says James S. Sweitzer, director of special projects for AMNH.
Surrounding the sphere are dozens of exhibits that help us understand where we are in the universe. The simulation, however, happens inside, in a Space Theater that occupies the entire top half of the sphere. There, a digital dome system can display images on a surface 68 feet in diameter. "The entire hemisphere is a computer display screen with a 7.3 million pixel resolution," says Sweitzer. The display system from Trimension (West Sussex, UK) uses seven projectors to spread images across the dome's surface and, by blending the edges, creates one huge, apparently seamless image. Five projectors cover the sides; two cover the dome cap. An SGI (Mountain View, CA) Onyx2 computer with 28 processors and seven InfiniteReality2 graphics systems, each with four Raster Boards, can generate and send digital images at 1280x1024 resolution to the seven projectors in real time (30 frames per second) or send pre-rendered digital images that have been stored on the 14 Ciprico disk arrays.
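The quoted 7.3-million-pixel figure can be reconciled with the seven-projector configuration by simple arithmetic; the 20 percent edge-blend overlap used below is an illustrative assumption, not a published specification:

```python
# Dome pixel budget: seven 1280x1024 projectors, edge-blended into one image.
projectors = 7
raw_pixels = projectors * 1280 * 1024            # 9,175,040 pixels before blending
blend_overlap = 0.20                             # assumed fraction lost where images overlap
effective_pixels = raw_pixels * (1 - blend_overlap)
print(round(effective_pixels / 1e6, 1))          # ~7.3 million, matching the quoted figure
```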
|One of the difficulties in rendering the Orion Nebula was that the Digital Galaxy team wanted to be able to fly into the Nebula and around a proplyd as shown in the four images above. This meant creating a large Nebula volume with multiple smaller volumes nested inside it.|
"Almost anything you project onto this dome is stunning," says David Nadeau, staff scientist at the San Diego Supercomputer Center (SDSC), who helped create a 3D visualization of the Orion Nebula for the first Space Theater show. "Because the dome is filled with images to its equator, and because there are no seams, it feels like you're looking up into the sky."
One of the problems in using computer graphics to simulate space travel has been that the universe is so big. "In computer graphics, the weakest link to large-scale visualization is the display," says Donna Cox, a professor at the University of Illinois (Urbana, IL) and a visualization expert at the National Center for Supercomputing Applications (NCSA). "We're mapping 40,000 galaxies. On a video screen, with so many galaxies separated by so much space, the galaxies become smaller than one pixel. The dome addresses that issue."
For example, in the opening show, viewers will leave Earth and travel to the edges of a universe created with computer graphics, where they can explore thousands of galaxies, all interconnected like neurons, thanks in part to NCSA. It's a ride through a three-dimensional space that, because of the seamless, high-resolution imagery and the ability to see the galaxy from a viewpoint other than Earth, is unlike any other, according to Sweitzer. "The problem for me as an astronomer," he says, "has been that I want people to see and appreciate the universe, but I couldn't get them there. I think of the theater as a device. It gives us a way to go there."
First, though, a 3D model of the universe through which viewers could navigate had to be created. With funding provided in part by NASA, and with help from the museum's co-principal investigators Sweitzer and Myles Gordon, vice president for education, and Neil deGrasse Tyson, director of the planetarium and leader of the science team, the Digital Galaxy project was put into motion. For the project specifically, Dennis Davidson acted as project manager, Carter Emmart as art director for science visualization and senior engineer, and Aram Friedman as senior engineer. The team started by focusing on our "home" galaxy, the Milky Way, using NASA data for our solar system and the European Space Agency's Hipparcos database for 100,000 nearby stars. But even this wasn't enough information.
"If you look at ancient maps, you might see a detailed drawing of the Mediter ranean Sea surrounded by a lot of confusion," says Emmart. "The Hipparcos satellite measured stars using an accurate mapping scheme, but these stars are nearby. They're like the equivalent of the Med iterranean Sea in the ancient maps," he says. The location of stars farther away, however, is less certain because accurate measurements don't yet exist for distant stars. Thus, to go beyond the Hipparcos data the team had to create a statistical dataset-a best guess. For this, Ron Drimmel, an astrophysicist at AMNH, created a parametric description of the galaxy that enabled the team to create a statistical dataset of more than 2 billion stars.
|Working from satellite data and a statistical model, scientists at Aechelon Technology produced a model of the Milky Way, then created Performer-based software to render and display that galaxy in real time for interactive flights through our solar system.|
The Hipparcos and statistical datasets combined would give the museum a digital 3D model or map of the galaxy, but the galaxy had to be rendered to be useful. "Originally, we intended to pre-render the galaxy using RenderMan," says Davidson. The team also planned to create a real-time version. Emmart remembers: "I thought that if we're going to be creating shows over and over again, we'd want an interactive tool for exploration." The team considered Evans & Sutherland's interactive dome-based systems that allow people to navigate through a 3D star field.
Then, at a simulation conference in Florida, Emmart saw Aechelon's (Sunnyvale, CA) SGI-based flight-simulation software and wondered if the same technology used to create runway light points could be used to make stars. In fact, as the project developed, the pre-rendered version was dropped in favor of a real-time version, with Aechelon taking on the job of developing the real-time galaxy. (When the rendered version was canceled, Michael Wahrman, visual effects advisor for the Digital Galaxy project, took that on himself as a separate project. "It was always clear to me that a rendered version of the galaxy would still be needed to complement the real-time version," he says.)
"We did a test for the Digital Galaxy team using the Hipparcos data," says Ignacho Sanz Pastor of Aechelon. "It was interesting how much we could see and how much freedom we had to move through the data." Thus, the flight simulator company soon began writing new software, now called C-Galaxy, to visualize in real time the observed and simulated datasets provided by AMNH for our solar system and the Milky Way galaxy.
|A simulation from Princeton's Paul Bode and Jeremiah Ostriker provided the structure for this visualization by NCSA of thousands of galaxies in the universe.|
To do this, Aechelon used SGI's IRIS Performer, OpenGL, and its own proprietary code to create three types of objects: stars with colors and halos, planets, and sprites rendered as textured billboards that always face the camera for interstellar dust and nebulae. The objects are displayed using three modes for real-time lighting: planetary, intergalactic, and extragalactic. To keep the stars from flickering or scintillating, antialiasing is done with a multipass rendering technique that considers eight subsamples for each pixel. To make the dataset easier to navigate, a grid with icons, reference markers, and a scale in light years can be overlaid on the galaxy.
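The billboard sprites described above are a standard real-time trick: each dust or nebula texture is drawn on a quad that is re-oriented every frame to face the camera. A minimal sketch of the orientation math, in NumPy rather than Aechelon's C-Galaxy code, might look like this:

```python
import numpy as np

def billboard_basis(sprite_pos, cam_pos, world_up=np.array([0.0, 0.0, 1.0])):
    """Return (right, up, normal) basis vectors so a textured quad placed at
    sprite_pos faces the camera at cam_pos. Assumes the view direction is not
    parallel to world_up (the degenerate straight-up/straight-down case)."""
    normal = cam_pos - sprite_pos            # quad normal points at the camera
    normal = normal / np.linalg.norm(normal)
    right = np.cross(world_up, normal)       # horizontal edge of the quad
    right = right / np.linalg.norm(right)
    up = np.cross(normal, right)             # vertical edge, completes the frame
    return right, up, normal
```

The quad's four vertices are then `sprite_pos ± half_width*right ± half_height*up`, so however the camera moves, the texture is always seen face-on.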
"You can fly up to a particular group of stars and orbit slowly around them," says Davidson. "When you fly around the Pleiades [star cluster in the Taurus constellation], it looks like a knot of bees and then a flight of birds. You can fly toward a constellation and watch the dots stretch out in odd ways and then you can look back toward earth. It's really extraordinary."
Although C-Galaxy was created for interactive flights, the Digital Galaxy team decided to render the frames created in real time out to disk for the show; that is, to use the software to pre-render the show. "We're anxious to use it in a more interactive fashion, but for shows, driving with the joystick is too hard," says Sweitzer. "It can be pretty disorienting to be in a place without walls. The solar system is so huge; trying to hit a bunch of planets in succession gracefully is like trying to make a 3-cushion billiard shot."
Instead, the team used Maya from Alias|Wavefront (Toronto) to create a flight path and to control the show production. For the opening sky show, the flight takes the viewer out of the solar system and aims the virtual starship directly toward the Orion Nebula. Built into C-Galaxy is the ability to switch between interactive graphics and pre-rendered "movies" for "beauty shots," and the flight through the cavern of gas that makes up the Orion Nebula is definitely a beauty shot.
Located 1500 light years away, along a spiral arm of our Milky Way, the Orion Nebula sits in the middle of the Hunter's sword in the Orion constellation. It is a birthplace of stars, which are formed from collapsing clouds of interstellar gas, and it has been photographed by the Hubble telescope. The Hubble images show a stellar cavern with glowing gases forming peaks, valleys, and walls. These gases are illuminated by a "torrent" of ultraviolet light from four massive stars, called the Trapezium, and hundreds of younger stars. In addition, writes astrophysicist C. Robert O'Dell of Rice University, the images reveal hundreds of "glowing protoplanetary disks (...dubbed 'proplyds')...believed to be embryonic solar systems that will eventually form planets." Because some proplyds are shedding gas and dust, which is being pushed away by the light from the hottest stars, they look like they have tails.
To visualize the Orion Nebula, the Digital Galaxy team asked O'Dell to create a geometric model, a mesh, based on the Hubble images, and asked scientists at the San Diego Supercomputer Center to create and render a two-and-one-half-minute movie. "Because the nebula is a soft, fuzzy, space-filling structure, and not points of light or a faceted solid shape, we decided to do volume imaging," says Nadeau. "We could probably have written a RenderMan shader, but we have volume-rendering software that runs on a supercomputer, and the project had to be done quickly."
|Starting with scientific data for Orion's Trapezium, AMNH created a polygonal model in Alias|Wavefront's Maya that SDSC turned into a volume for rendering. |
With the SDSC software running on a teraflop supercomputer, the IBM SP2, which has nearly 1000 processors, the team was able to render 28,000 images (seven projectors, two-plus minutes, 30 frames per second) at 1280x1024 resolution in 13 hours. To turn O'Dell's mesh into a volume, Nadeau filled the skeletal surface with fog. "I squirted fog inside, punctured the surface with tiny holes, and took a 'snapshot' of the resulting cloud just as tendrils began to squirt out the holes," he says. Then he added turbulence to ripple the surface. To give the resulting white blob some color, he projected Hubble images onto the cloud and modeled how deep the colors would penetrate it. The result is a glowing, turbulent, fuzzy layer that matches the original data.
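The frame count follows directly from the show parameters quoted above; a quick check of the arithmetic (rounded figures, not from SDSC):

```python
# Render budget for the Orion Nebula sequence.
projectors = 7
fps = 30
total_frames = 28_000
frames_per_projector = total_frames // projectors   # 4,000 frames per channel
duration_s = frames_per_projector / fps             # ~133 s: the "two-plus minute" movie
wall_clock_s = 13 * 3600                            # the quoted 13-hour render
print(round(wall_clock_s / total_frames, 2))        # ~1.67 s of machine time per frame
```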
All told, Nadeau created 87 volume datasets: one for the main shape, one each for the 57 proplyds (including one close-up of a proplyd that is displayed on one-eighth of the dome's surface), and additional volumes for shock fronts and splashes in the gaseous cloud. These he handed to Jon Genetti, staff scientist, for rendering.
"To make a picture of the nebula, we have to render all 87 of the volume datasets simultaneously," says Nadeau. Genetti's task, therefore, was to render the multiple volumes in multiple resolutions. "The problem we have in space is that there is a lot of empty area and then a spot with highly detailed things," Genetti explains. "An analogy would be if each voxel were a Lego and each proplyd 1/100th the size of a Lego. To see the proplyds we could use tiny Legos, but we can't build a nebula out of tiny pieces."
So, using a ray-casting technique for rendering, Genetti created software in which the size of a voxel tells the ray whether to interpolate color and opacity in tiny steps or large steps as it travels through the volume, thus allowing the volumes, in effect, to be nested. With that, the team was able to interpret the space telescope data in three dimensions and fly close enough to see gas spurting from the proplyd disks and their stars.
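The idea behind this adaptive ray caster can be sketched in a few lines. The version below is a hypothetical simplification (the volume interface, step rule, and compositing details are assumptions, not SDSC's code), but it shows the key mechanism: the smallest voxel containing the current sample point dictates the marching step, so nested fine-grained volumes such as the proplyds are sampled densely while empty space is crossed in large strides.

```python
import numpy as np

def composite_ray(origin, direction, volumes, t_max, base_step=1.0):
    """Front-to-back compositing along one ray through nested volumes.
    Each entry in 'volumes' is a (contains, voxel_size, sample) tuple:
    contains(p) tests membership, voxel_size sets the local step,
    and sample(p) returns an RGBA value at point p."""
    color = np.zeros(3)
    alpha = 0.0
    t = 0.0
    while t < t_max and alpha < 0.99:        # early exit once nearly opaque
        p = origin + t * direction
        step = base_step                     # default stride in empty space
        rgba = np.zeros(4)
        for contains, voxel_size, sample in volumes:
            if contains(p):
                # finest (smallest-voxel) volume containing p wins
                step = min(step, voxel_size)
                rgba = sample(p)
        # standard front-to-back "over" accumulation
        color += (1.0 - alpha) * rgba[3] * rgba[:3]
        alpha += (1.0 - alpha) * rgba[3]
        t += step
    return color, alpha
```

A production renderer would also scale opacity by the step length and interpolate between voxels, but even this sketch shows why the nesting works: adding a tiny proplyd volume to the list slows the ray only while it is actually inside that volume.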
"I did my thesis on Orion, and people didn't go there then," says Sweitzer. "Seeing this visualization was an emotional thing for me."
In the Space Theater's first show, the virtual starship leaves our galaxy and, thanks to NCSA, travels deep into the universe. Using techniques and software created for the IMAX film Cosmic Voyage, including a star renderer written by Loren Carpenter of Pixar Animation Studios (Pt. Richmond, CA), NCSA turned the digital galaxy into a digital universe. "We're mapping 40,000 galaxies, some bigger than the Milky Way," says Cox, who used the supercomputer center's CAVE for choreography. "We can load the Hayden dome program and all the galaxies into our CAVE and use voice and gesture to do the camera choreography," she says. "After we decide what paths to take through the data, I do color and transitions, then we render each frame seven times at 1280x1024 resolution."
"This is a museum of expedition," says Sweitzer. "We've sent people to the Gobi desert, and [the planetarium] offers as much of a science journey as going to the top of a mountain. It's real science visualization."
None of this, of course, takes us any closer physically to other parts of our universe, but for the thousands of people who will sit back in the reclining seats in the Space Theater and look up at the sky inside, trips such as these offer a view of the universe they've never seen before.
Barbara Robertson is Senior Editor, West Coast, for Computer Graphics World.