Issue: Volume 33 Issue 7: (July 2010)

Scientific Visualization

By: Karen Moltenbrey

Star Power

Before the millennium, most planetariums explored the heavens using non-digital equipment, such as “star balls,” which, among other limitations, could show the night sky only from the perspective of planet Earth. Since that time, an increasing number of planetariums have embraced digital technologies, starting first with the installation of SGI supercomputers and, more recently, with commodity machines.

“Through commoditization and Moore’s Law, other solutions started to become viable when the SGI Onyx became unstable,” says Benjy Bernhardt, director of engineering at the Rose Center for Earth and Space, home to the Hayden Planetarium. “Powerful PCs, CPUs, and GPUs have brought the power and functionality that used to be the exclusive domain of really high-end, refrigerator-size computers down to small boxes, even laptops.”

At the Hayden Planetarium in New York City, as well as at the Gates Planetarium in Denver and the Morrison Planetarium in San Francisco, a digital evolution occurred, enabling those museums to deliver immersive experiences that were unheard of just a decade or two earlier. Now, visitors can fly through amazing 3D images of cosmic phenomena that are created from complex data simulations for scientific accuracy.

Unlike a feature film, the sole purpose of which is to entertain, a space show is first and foremost a scientific endeavor, requiring planetarium staff to review the scientific data from various simulation and visualization processes, and integrate information into a comprehensive presentation. As a result, the data sets created and used by planetariums are exceedingly complex, requiring space-age hardware and software solutions that will grow and evolve to support new out-of-this-world discoveries.

Space exploration has matured, just as computing and visualization have, taking planetarium visitors far beyond our own galaxy to the ends of the known universe.

Reaching for the Stars

Using VIS-SIM to explore the outer limits of the universe

Children and adults alike often can be found wishing upon a falling star. But, according to many astrophysicists, shooting stars are not really stars at all. Mostly, they are meteors—particle debris from the solar system—entering the Earth’s atmosphere. This reality seems far less interesting than the childhood myth, yet the real death, and birth, of a star is an event that is truly spectacular beyond all imagination. And it is one that is depicted in awesome fashion in the Hayden Planetarium’s latest immersive show Journey to the Stars.

A mixture of complex astrophysics research, physics-based simulations, scientific visualizations, and cutting-edge CGI, Journey launches audiences through time and space to experience the life and death of stars in our night sky. The show, like all the planetarium’s productions, was created at the American Museum of Natural History by an internal team of astrophysicists, scientific visualization experts, and a media production crew, in collaboration with NASA and dozens of scientists throughout the world.

“The clock started ticking in 1999, and since then, we have produced four full-blown, high-end space productions, in addition to other projects,” says Benjy Bernhardt, director of engineering at the museum’s Rose Center for Earth and Space, home of the Hayden Planetarium. “We are able to produce roughly one show every three years.”

Unlike a typical movie crew, the Hayden team counts creating these scientific shows as just one aspect of its job responsibilities. And while creating eye-catching CG and visual effects for entertainment-based feature films can be complicated and time-consuming, generating stunning, scientifically accurate simulations of out-of-this-world phenomena becomes an astronomical feat—and one that can only be accomplished through leading-edge digital technologies.


Members of the creative team at the Rose Center for Earth and Space guide the space show Cosmic Collisions in the Hayden Planetarium to completion.

Recent History

This year marks the 10th anniversary of the Rose Center for Earth and Space, which opened its doors at the turn of the millennium and houses the rebuilt Hayden Planetarium, originally established in 1935. Gone was the old optical star projector, and in its place was a next-generation visualization system consisting of the powerful Zeiss Mark IX optical sky projector and a digital dome system comprising an SGI Onyx 2 InfiniteReality2 and a Trimension video display with seven coordinated Barco high-resolution CRT projectors (five around the periphery and two for the center).

“If you want to experience phenomena in the night sky, explore the Orion Nebula, or see what colliding galaxies will look like 50 million years in the future, and visualize that scientific data in a meaningful way, you need a compelling set of tools,” says Bernhardt. “That is what we looked for when we started to develop the technology for the new Hayden Planetarium that opened in 2000.”

Not long after, a more cost-efficient solution—one using commercial hardware that took advantage of the revolutionary power of GPUs—replaced the Onyx as the museum’s visualization system. It was built around rackmount machines with dual AMD Opteron 250 CPUs and Nvidia Quadro FX cards sporting G-Sync option boards. The system would support multiple functions, including the presentation of real-time data visualization in the dome for lectures or the staff’s own previsualization needs, rendering the work for upcoming shows, and synchronizing the projection of the multiple images onto the 21-meter (69-foot) dome. Seven of the black-box PCs were devoted to visualization and projection, with an eighth serving as the master controller and as a backup when needed.

In essence, this genlocked, swap-group environment uses inexpensive hardware to drive a high-bandwidth playback that only a few years ago required the use of a supercomputer. By dividing large data sets across multiple storage and computational sites, and running operations on them simultaneously, these clustered computers can manipulate extremely complex data sets at phenomenal speeds.
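The divide-and-conquer pattern described above can be sketched in a few lines of Python. This is an illustration of the general technique only, not the planetarium’s actual pipeline; the chunking scheme, worker count, and per-chunk work are all stand-ins:

```python
from concurrent.futures import ThreadPoolExecutor

def shade_chunk(chunk):
    # Stand-in for the per-element work a render node would perform
    # on its portion of the data set.
    return [x * x for x in chunk]

def process_dataset(data, workers=4):
    # Divide the large data set into one contiguous chunk per worker,
    # run the chunks simultaneously, then stitch the results back together.
    size = -(-len(data) // workers)  # ceiling division
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(shade_chunk, chunks)
    return [value for part in results for value in part]
```

On a real cluster the chunks would live on separate machines and storage nodes, but the shape of the computation—partition, process in parallel, merge—is the same.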

The content for the shows, meanwhile, resides on an external SAN storage array. By utilizing that external ADIC StorNext file-sharing platform, the museum is able to play back material to the dome, while writing new content for production—without any interruption or compromise in service, says Bernhardt.


In the early universe, the first stars form from clouds of gas drawn together by the gravity of the mysterious substance called dark matter. The Hayden team re-created this event using high-performance computing and visualization techniques, and incorporated it into the show Journey to the Stars.

The dome itself is omnidirectional, with 429 seats surrounding the screen in a circular fashion. A six-channel projection system (now comprising Projection Design F30s situated around the circumference of the screen, in addition to a master, each paired to a media server) displays the imagery. In order to display the large visuals used by the planetarium, the imagery is segmented and divided into six corresponding “slices,” and then synchronized, or “linked” together, to create one seamless, edge-blended picture.
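The slicing-and-blending idea can be sketched with NumPy. The details below—vertical slices of a flat frame, a fixed overlap band, a linear blend ramp—are illustrative assumptions; the real system slices a dome-master projection and blends optically calibrated edges:

```python
import numpy as np

def slice_frame(frame, n_slices=6, overlap=32):
    # Cut a frame into n vertical slices, each extended by an overlap
    # band it shares with its neighbors for edge blending.
    h, w = frame.shape[:2]
    step = w // n_slices
    slices = []
    for i in range(n_slices):
        lo = max(0, i * step - overlap)
        hi = min(w, (i + 1) * step + overlap)
        slices.append(frame[:, lo:hi].copy())
    return slices

def blend_gain(width, overlap):
    # Per-column gain for one slice: ramps up over the left overlap and
    # down over the right, so two overlapping projectors sum to full
    # brightness instead of doubling it.
    gain = np.ones(width)
    ramp = np.linspace(0.0, 1.0, overlap)
    gain[:overlap] = ramp
    gain[-overlap:] = ramp[::-1]
    return gain
```

Because one slice’s falling ramp mirrors its neighbor’s rising ramp, the two contributions in the shared band always sum to unity, which is what makes the joined picture appear seamless.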

“The Hayden’s current setup is limited by the projectors at this point, which show a 3k x 3k image at 8 bits,” says Bernhardt. “Our medium-term goal is 4.5k x 4.5k at 10 bits.” Despite the limitation of the present projection system, the staff nevertheless began working in higher bit depth on the graphics side, allowing for a higher dynamic range and the ability to do gamma adjustments without introducing a lot of noise into the imagery. The group moved to 16-bit floating-point EXR and needed a projection system that could keep up with the graphics. Meanwhile, the museum’s recent shows, including Journey to the Stars and a previous show, Cosmic Collisions, were rendered at 4k x 4k resolution, to future-proof the content.
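The bit-depth point is easy to see numerically. In the rough NumPy illustration below (the gamma value and the dark test range are arbitrary choices, not the Hayden’s settings), an 8-bit pipeline collapses many distinct dark levels into visible bands after a gamma adjustment, while 16-bit float keeps them apart:

```python
import numpy as np

gamma = 2.2
# 256 evenly spaced scene values crowded into the dark end of the range.
signal = np.linspace(0.0, 0.1, 256)
graded = signal ** (1.0 / gamma)  # a simple gamma adjustment

# 8-bit pipeline: quantize to 256 integer levels after grading.
as_8bit = np.round(graded * 255) / 255
# 16-bit float pipeline: keep the graded values in half precision.
as_half = graded.astype(np.float16)

levels_8bit = len(np.unique(as_8bit))
levels_half = len(np.unique(as_half))
# The float pipeline preserves far more distinct levels in the darks,
# which is what keeps gamma tweaks from introducing banding "noise."
print(levels_8bit, levels_half)
```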

“We wanted a computer system that could immerse the audience in content that has a Hollywood-style rendered look, but it is most important to remember that the imagery here is driven by real science, through simulation and visualization. And we wanted it to play in a theater that supports Broadway-style theatrics,” Bernhardt explains.

While the current system is state-of-the-art, the new projection system will be bleeding-edge—so much so that audiences will be able to see finely detailed imagery that had previously been “invisible” to the general public. “We know the material is present; we can see it on our monitors [during previsualization],” says Bernhardt. “We work at the highest quality and resolution so we can be as accurate as possible, and we know that the projectors will keep getting better and will be able to reveal more and more of the information. To that end, we use higher resolution and higher bit depth from the graphics side because we know that any flaws we have will also be revealed in the future with the new projection system.”

Star-studded Animations

The latest development at the Hayden Planetarium is Journey to the Stars, which plays daily. Journey, nearly three years in the making, was completed by the facility’s astrophysics department with direction from Carter Emmart, director of astrovisualization, who, according to Bernhardt, “is a master at figuring out how to translate the science into something both accurate and compelling to look at.”

Journey follows Cosmic Collisions, which, along with other space shows produced by the museum, is licensed and featured at museums and planetariums around the world, including the Gates Planetarium in Denver (see “Sky High,” pg. 44). That show illustrates how, through explosive collisions, our solar system was shaped. The remarkable events portrayed in Journey and Collisions were generated through simulations and visualizations requiring phenomenal power, and then processed with complex tools and techniques. In nearly all of the shows’ sequences, the group had to mathematically describe and then computationally model the intricate events depicted; then, the numerical simulations (data sets) had to be turned into graphic animations.

The space-show production team is set up much like a typical film production crew, with a director, content creation group, and VFX supervisor (in this case, a visualization director). “We run through the same type of storyboard and iterative dailies viewing processes you would have on a CG film,” explains Bernhardt. “Only on a CG film, when it is time to look at dailies, you just pop over to the viewing area. Our viewing area is the dome, which is used from 10:30 am to 5 pm every day, so we work a lot of atypical hours.”

Where does this space-age imagery come from? For Journey and its other shows, the development team of scientists and visualization experts uses various digital content creation tools.


The events shown in Cosmic Collisions are based on simulations of extremely complex data sets.

Usually, the shows require ultra-specialized data from a number of scientific sources throughout the world, including NASA, which use scientific simulations to describe the complex physical events in the shows, such as the impact of an asteroid hitting Earth, as depicted in Cosmic Collisions. “Whenever possible, we use real simulation data,” says Bernhardt. “That is one of the hallmarks of our shows. It is created by collaboration, but with strong scientific involvement both internally and with our collaborators. The science rigor starts at the very top and percolates throughout the entire production process.” The production team then translates the data sets into visualizations that are accessible to learners of all ages.

To that end, the museum staff wrote multiple pieces of software that would transform the complex, dynamic data sets, such as 2D image files taken by NASA’s Hubble and Spitzer space telescopes, into 3D volumes moving through time. “When you look at a Hubble picture, you are looking at a snapshot of data. We can fly up to a 2D Hubble picture in the dome, but we cannot fly into it. It’s flat,” explains Bernhardt. “We model what we see in the picture, so it becomes 3D, and now we can move the camera through it to give the visitor the experience of flying through or around the object.”
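The article does not describe the museum’s modeling software, but the core idea—promoting a flat image to a volume a camera can fly through—can be illustrated naively: treat pixel brightness as density and give each pixel a depth profile. The sketch below is purely hypothetical and far simpler than what the Hayden team built:

```python
import numpy as np

def extrude_to_volume(image, depth=16):
    # Naive 2D -> 3D promotion: each pixel's brightness becomes a column
    # of density along the view axis, shaped by a bell-curve falloff so
    # the "object" is thickest in the middle and fades front and back.
    z = np.linspace(-1.0, 1.0, depth)
    falloff = np.exp(-4.0 * z ** 2)
    return image[np.newaxis, :, :] * falloff[:, np.newaxis, np.newaxis]
```

A renderer can then march a camera through the resulting (depth, height, width) density grid instead of panning across a flat plate.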

When the in-house staff is called upon to generate 3D models, it does so using Autodesk’s Maya or Side Effects’ Houdini for modeling and for managing the camera animation. Rendering is usually done with Pixar’s RenderMan, though at times the group’s custom volumetric tool, mPire, is needed for certain phenomena, such as stars and clouds, giving the imagery an accurate and beautiful “cosmic look,” while the in-house-developed Spot is used to generate star fields.

“That renderer (mPire) was the result of collaboration we did on our first space show with the San Diego Supercomputer Center, and we dusted it off for large sections of Journey to the Stars, creating some amazing volumetric renderings that were efficiently done on the new Intel quad-core boxes,” says Bernhardt. “For that first show, we ran it on a giant Sun cluster for some ungodly amount of time. But with the modern processors, we were able to render a test scene on just a few processors in a couple days.”

Image compositing is done using The Foundry’s Nuke, with Seriss Rush managing the renderfarm.

According to Bernhardt, the production chain is mostly Linux-based. As a result, the TDs and graphics artists use a Linux-based version of Maya running on their Dell graphics workstations. Rendering also runs under Linux on a cluster of more than 100 processors running the Fedora operating system on PSSC Labs rackmount computers.

More recently, the digital dailies function was replaced by a newer image generator, for playing content back to the dome and for reviewing the content for the dailies when making upcoming shows. This latest system, which is Windows-based and installed by Global Immersion, runs Delta software from 7th Sense.

The playback system, with its fast processing power and storage, delivers the bandwidth that enables the planetarium staff to view rendered content in real time without compression. As a result, the group can look at renders in an artifact-free state and make content decisions quickly. “This ability to do digital dailies is invaluable,” says Bernhardt.

Real-time Reality

After the planetarium went digital a decade ago, development of its real-time environment seemingly progressed at the speed of light—a perfect scenario for, well, flying around the universe. To that end, the American Museum of Natural History curated and now maintains the Digital Universe, an atlas database of all the known stars, galaxies, and objects in the observable universe—a 3D mapping of the universe, if you will, containing vital data, such as the location of the planets and stars, brightness levels, proximities, and so forth. It is constantly updated as NASA and other researchers conduct new surveys and make new discoveries.

The planetarium staff uses the information contained within the Digital Universe as the basis for a significant part of its shows, utilizing the database in collaboration with a real-time data viewer as a previs tool to produce accurate, rough content cuts—working out complex camera paths, for instance. Custom scripts pull the data into Maya, where the content-creation team works to finesse the imagery and make it screen-ready for shows.

“If we need to figure out the camera movement from Earth to Mars, and then to the edge of the solar system, we want to work with the camera animation as we would in a regular 3D package,” explains Bernhardt. “And, it is better to do it in the dome because of the orientation with so much peripheral-vision matter. We can sketch things out and get a good sense of what we want to do.”

Working alongside Digital Universe is Uniview, a Windows-based VR-type viewer that enables the group to examine the comprehensive galactic database in real time on the dome. Uniview eventually replaced an earlier VR-style viewer that ran on the Onyx. “It was a bit primitive, but we were able to navigate around the data fairly quickly and render out content,” says Bernhardt.


For Journey to the Stars, the content team had to mathematically describe and computationally model the cosmic events in the show, and then turn the numerical simulations into graphic animations. In the bottom image, the group also added a 3D model of the space probe Voyager 1.

The new viewing system actually sprang out of an internship at the American Museum of Natural History in collaboration with the University of Norrköping in Sweden, and spawned the company Sciss, which began commercially marketing the software, now named Uniview. Uniview runs on multiple platforms ranging from laptops to multichannel PC clusters and is used in a number of digital domes throughout the world. However, other viewers also utilize the Digital Universe atlas via agreements with the museum, including Sky-Skan’s DigitalSky and Evans & Sutherland’s Digistar, which are also popular in planetariums. Global Immersion and Zeiss, meanwhile, use Uniview for their real-time planetarium installations.

In addition to using Uniview for previs, the production team has relied on the setup’s real-time functionality to generate rendered content for the show Field Trip to the Moon, a tour of the universe geared for school groups. The crew also uses Uniview for other real-time operations, such as live presentations during public lectures and allowing speakers to go off script.

Uniview and the Digital Universe are also vital components in mixed-media productions and live performances held in the planetarium. Earlier this year, a rendition of Joseph Haydn’s opera “The World on the Moon” was staged at the theater, with the backdrops created by the Gotham Chamber Opera group using the planetarium’s vast image library. “This program was a hybrid—it wasn’t using real time, but it involved more than just playing back pre-rendered content,” says Bernhardt.

Recently, “The Known Universe” video, which has gone viral on YouTube with more than five million hits, was created at the museum (in partnership with the Rubin Museum of Art) utilizing the Uniview software.

A Changing World

Ever since going digital, the Hayden Planetarium has continued to evolve, staying ahead of the technological curve as much as possible. And with the 10th anniversary of the Rose Center for Earth and Space at hand, an effort is in full swing to bring the entire museum up from 2000 technology to 2010 technology. The plan includes swapping all the monitors in the Cullman Hall of the Universe for high-brightness LCDs, as well as a new Christie MicroTiles video wall to display the HD Science Bulletins’ AstroBulletin there, replacing the current rear-projection video wall. Another big renovation is in the Big Bang Theater, which will get an updated, four-channel blended display consisting of four LED-based projectors dedicated to the explanation of the Big Bang, with two channels per computer, as opposed to a single channel.

“We are also redoing more than 50 visualizations that play in the Hall of the Universe, and almost in every case, we are trying to replace pan/scan with real visualization data, which is a pretty large task, but a lot of the data is out there and available from various research institutions,” says Bernhardt. “We want to make it less of a slide show and more current by providing accurate scientific information.”

To the Outer Limits

Without question, the immersive shows produced by the American Museum of Natural History are pushing the state of the art in computer graphics, computation, and visualization. “There are few attractions out there that keep up with the speed and advancement, and engagement, of the experience,” says Bernhardt. “And the digital dome experience is one of those. It engages your peripheral vision and immerses you in ways that sitting in front of a television or flat screen does not.”

With such state-of-the-art equipment, the group is able to put together systems that are as fast as anything out there, allowing them to experiment and revitalize the planetarium dome experience, all in a much smaller physical space and at a lower cost and heat and power load, but with far better performance and results. “I’m not just talking about, ‘Gee whiz, look how fast we can fly through the data.’ Rather, it enables us to step beyond just making things work so we can focus on the conceptual stuff beyond that, whereby we can ask ourselves, ‘What if we could do this, or that?’ To do so, we need a platform that is stable, modern, and has forward growth.”

According to Bernhardt, the digital tools now enable him and his colleagues to push science visualization to the outer reaches of the universe. “You may think that [new technology] will save you time and money, but what it really ends up doing, if you do it right, is raise the quality of your projects by giving you more options. We can render more quickly and reliably, and give the production team the ability to make more informed decisions. It doesn’t really save us time, but brings up the overall quality of the work by allowing us to iterate more effectively.”

Over the years, the Hayden Planetarium has explored the outer reaches of the universe, enabling visitors to experience the cosmos like never before as a result of the efforts of scientists here on Earth. Indeed, thanks to advances in observing and computing technologies, astrophysics has entered a phase of tremendous discovery and expansion during the past 10 years alone. To that end, mathematical and visual models of the universe and its physical processes, like those featured in the space shows, now are made from enormous observational and theoretical sets of data, which can be manipulated to get results in weeks and months, rather than years or centuries.

So, the next time you look up at the night sky and see a falling star, go ahead and make a wish. But this time, you may want to wish for further advances in and commoditization of visualization, as it can make many dreams—particularly those of scientists and researchers—come true.

Galactic Gazing

Since the dawn of time, humans have been captivated by the wonders of the night sky. In the early days, people would gaze upward at the sight of twinkling dots of light, taken by the majesty but unable to comprehend the magnitude of the scene. More recently, planetariums were built to assist us in celestial navigation and comprehension.

Statistics claim there is one planetarium in the US for every 100,000 citizens. But then again, what some call a “planetarium” can hardly be compared to the state-of-the-art theaters springing up in the US and around the world. During the past few years, facilities such as the Hayden and Gates Planetariums have replaced older hardware with next-generation, supercomputer-class digital equipment and, in the process, are able to offer visitors experiences like no other.

Among the facilities that have undergone this type of digital makeover is the Morrison Planetarium in San Francisco, which closed its doors in 2003 at the California Academy of Sciences in Golden Gate Park, and reopened them two years ago following a $20 million renovation.

In fact, the Morrison was only part of a renovation project that unified the Academy’s three public attractions—the Steinhart Aquarium, Morrison Planetarium, and Kimball Natural History Museum—under a 2.5-acre undulating green roof. The new facility is one of the most environmentally friendly museums on the planet. In a unique setup, the planetarium dome is cantilevered out over the aquarium’s Philippine Coral Reef tank.



Today, the Morrison facility is one of the largest all-digital planetariums in the world. The screen is a 75-foot-diameter dome (tilted 30 degrees) within a 90-foot-diameter shell.

At one time, the Morrison Planetarium’s custom-made star projector opened the universe to area stargazers. Now, six high-resolution F30 sx+ DLP projectors from Projection Design fulfill that task. They project imagery at 1400 x 1050 resolution (though they are capable of projecting at 1920 x 1200) onto a state-of-the-art dome from Spitz, whose panels fit together without any visible seams. (Plans are under way, too, at the Morrison to eventually migrate to 4k projectors.) Custom-designed optical-blending technology melds the six streamed slices of data into one very large, uniform image.

Similar to the Hayden and Gates Planetariums, the Morrison astronomers are able to take audiences on real-time guided tours of the universe, play pre-rendered shows, and even generate their own content for unique shows that are produced in-house.

Behind the scenes, back-end server clusters are responsible for the planetarium’s stellar operations. The first is a six-channel Definiti graphics cluster from Sky-Skan that feeds content into the system. The second cluster runs the Sciss Uniview software for scripting and recording presentations, and playing them back in real time. Both work alongside a Global Immersion server rack as data visualization platforms for flying through the universe. That system comprises HP xw8600 workstations equipped with dual-core Xeons and Nvidia Quadro FX 5600 cards.

Like all new planetariums, the dome has been designed for a diverse range of uses and audiences for today and tomorrow, and will be used by scientists to broadcast live NASA feeds related to content from missions, as well as eclipses and other events. In addition, it will be used to connect visitors to Academy research expeditions around the world.

Currently, the planetarium is showing Journey to the Stars, developed by the American Museum of Natural History in New York. In addition, it is showing Fragile Planet, the Academy’s inaugural planetarium show produced in-house, which takes viewers on an adventure through space. —KM

Sky High

Cutting-edge technology enables out-of-this-world experiences

At the Denver Museum of Nature & Science (DMNS), visitors can step into the theater of the Gates Planetarium and be transported to another world. Or universe. Or galaxy. If they prefer, they can also get a unique perspective of what is happening on their home turf, the planet Earth, by examining it from space. And real “homebodies” can enjoy sights and sounds that are more down to earth: a theatrical or concert experience, for example.

These journeys—whether impromptu (real time) or planned (pre-rendered)—are made possible through the planetarium’s use of the latest digital offerings: supercomputer-level hardware, projection technology, and 3D software. Like many such facilities across the US, the Gates Planetarium underwent a major makeover of sorts about a decade ago, when the facility’s mechanical Minolta star projector and other aged equipment yielded to the digital revolution. As a result, the universe at the Denver planetarium became much larger and more stunning for audiences.

But even the brightest stars will burn out eventually—some more quickly than others. Immediately following the renovation, the planetarium boasted the most impressive high-resolution imagery powered by a 30-processor SGI Onyx 3800 visualization system with an 11-channel InfiniteReality4 graphics subsystem. At the time, that specialized supercomputer system was the bleeding edge—and the only equipment available to handle this type of complexity. That system drastically expanded the number of stars that could be seen virtually, from several thousand to several billion. And with the high-resolution CGI, those stars were no longer just points of light, but an intense visual experience.


Ka Chun Yu, curator at the Gates Planetarium, continues to expand his work in immersive virtual environments. Most recently, he worked on the museum’s Black Holes show.

But over the years, more flexible technology emerged, and more recently, those SGIs were replaced by hardware that was less expensive in terms of the cost and the manpower needed to use and maintain the specialized architecture. Today, HP workstations with Intel processors are enabling the planetarium’s missions. Because of this change, the planetarium was able to revamp its software platform; now it supports both Linux (for scientific applications) and Windows (for more basic functions). This, in turn, opened what had been, in essence, a specialized, closed architecture. It also opened the door to far more visualization possibilities and opportunities.

The planetarium’s impressive star power is particularly evident in its real-time capabilities, enabling the host to take stargazers to the ends of the known universe—or to a friend’s backyard. What’s more, the high-resolution imagery in these non-scripted (as well as scripted) galactic journeys is scientifically accurate, based on imagery from NASA and often compliments of the Hubble Space Telescope.

According to Dan Neafus, operations manager at the Gates Planetarium, the current system and setup enables the facility to offer three types of programs in the dome: pre-rendered films produced out of house and licensed by the planetarium, programs created on site, and real-time presentations, which can involve space travel or tertiary-related content.

According to Neafus, the planetarium has two diverse categories of shows running now. The first, which the facility distributes to other theaters worldwide, is Black Holes: The Other Side of Infinity. The program features high-end supercomputer renderings of galactic and black-hole phenomena; Denver’s role was to handle the compositing and editing with on-site equipment. The second program is Cosmic Journey: A Solar System Adventure, a tour of the solar system that the local crew created entirely in-house and will be releasing for distribution.

Exploring Black Holes

In the museum-produced show Black Holes, viewers embark on a virtual adventure inside one of these cosmic curiosities, “zipping through otherworldly wormholes, experiencing the creation of the Milky Way Galaxy, and witnessing the violent death of a star and subsequent birth of a black hole,” promises the planetarium on its Web site.

The experience got off the ground thanks to Andrew Hamilton, an astrophysicist at the University of Colorado, whose flight-simulation software brought the imagery to life. Most of the scenes in the show were generated using traditional computer animation techniques, visualizations of numerical simulations, model reconstructions based on observed data, and Hamilton’s real-time general relativistic flight simulator. The group at the museum played an executive management role, using a suite of Adobe software running on HP workstations to composite and edit the imagery, as well as mix the audio and video, before previewing it to be sure it would deliver well on the dome screen.

In addition to being used to generate proprietary pieces, the HP xw8400 machines with quad-core processing technology also serve in one of the museum’s playback systems. Each workstation (equipped with quad-core Intel Xeon processors and Nvidia 5500 graphics cards) is set up in a parallel array and linked to a corresponding F30 digital light processing (DLP) projector from Projection Design situated across the interior of the 125-seat domed theater. (The current array has five projectors around the perimeter and one at the apex.) The workstations operate in tandem, and the images are blended together to create one seamless image that, when projected onto the dome, simulates a nighttime sky. The dome itself is 57 feet in diameter, tilted at a 25-degree angle, and displays 3600 x 3600 imagery.

Another HP workstation acts as a seventh channel, serving as a control node for the master unit, processing the images within the 3D software and sending them to the other workstations. In fact, the planetarium has a duplicate of this system, allowing personnel to jump between the two—one configured with Linux and the other with Windows—while doing after-hours and test work, so as not to interfere with the bay system that runs the daily shows.

“The majority of the work we did on Black Holes was to help the show’s director, Tom Lucas, and production group get a good under­standing of how the final product would look on a dome screen,” says Neafus. “Lucas had extensive HDTV and film experience, but had not produced for the dome before. The supercomputer group [from the National Center for Supercomputing Applications] had a good amount of dome experience, but they had to make sure what they were sending and rendering actually looked good on our screen. Too many production [crews] don’t make the effort to check what they are doing to be sure that they create a good, immersive audience experience.”


The Gates team encourages and assists those wanting to take advantage of their local resource, in hopes of extending content development on this platform. Here, university students created animated backdrops for their production of “Joan of Arc,” held at the planetarium.

Verification by the Gates staff involved a series of setups: going from the high-resolution dome master—a kind of film snapshot of the whole dome—to slicing the imagery up for the individual projectors, and then having those slices compressed and formatted so they could play and stream back on the video players. The slicing process is typically done offline on the workstations with proprietary software. In essence, the big image is divided among the individual projectors and compiled in a proprietary format for streaming off the hard drives quickly and efficiently.
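The basic idea behind slicing can be sketched in a few lines of code. This is a minimal illustration only—a simple grid split of a square dome-master frame—since the planetarium's actual slicer is proprietary and resamples each slice through per-projector warp geometry:

```python
import numpy as np

def slice_dome_master(master, rows=2, cols=3):
    """Split a square dome-master frame into per-projector tiles.
    This toy version cuts a plain rows x cols grid; the real slicer
    would also resample each tile for its projector's geometry."""
    h, w, _ = master.shape
    th, tw = h // rows, w // cols
    tiles = []
    for r in range(rows):
        for c in range(cols):
            tiles.append(master[r*th:(r+1)*th, c*tw:(c+1)*tw])
    return tiles

# A stand-in dome master (downscaled from 3600 x 3600 to keep memory small).
master = np.zeros((360, 360, 3), dtype=np.uint8)
tiles = slice_dome_master(master)
print(len(tiles), tiles[0].shape)  # → 6 (180, 120, 3)
```

Each tile would then be compressed into the streaming format and assigned to one playback channel.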

For Black Holes, the Gates Planetarium staff worked with a range of collaborators, each with its own preferred tools. In fact, the group came together serendipitously. Hamilton was from Boulder, Colorado, and the director, Tom Lucas, from New York City; while both were in Denver, they met up and brainstormed with a third partner, Mike Bruno, from dome fabricator Spitz’s digital content group. They applied for a National Science Foundation grant, and when that was approved, they moved forward with the work, bringing in other experts to contribute to the project.

The Black Hole flight simulator was a proprietary tool that Hamilton had built from scratch using raw code written in C++ and incarnations of the SGI architecture. Another contributor was James Arthurs, who worked as a contractor using NewTek’s LightWave; he delivered the dome master to the Gates group for editing and splicing. “The NCSA group uses tools that are proprietary and primarily based on SGI Performer, which they migrated to other tools for creating the high-resolution galactic flythroughs and, more particularly, for creating the raw scientific visualization,” says Neafus. Many of the tools used by the production team were developed specifically for this project.

“We gave them storyboards, and they worked as a subcontractor crew to provide a specific visual,” says Neafus of the visualization team. Lucas provided hard-copy sketches that were scanned and manipulated, then inserted into an animatic created within Apple’s Final Cut and Adobe’s Premiere to give the subcontractors a feel for the piece and how it should look. The final scenes, says Neafus, were full 3D and rendered fisheye style in the final pass.

Journey to the Stars

The show Cosmic Journey: A Solar System Adventure also put the in-house setup to the test. The current version of the pre-rendered show Cosmic Journey is the museum’s fourth; it takes audiences on a trip through the solar system at many times the speed of light, stopping along the way for a close-up look at certain planets and their moons, which are shown in scientific detail. To update this show, the museum personnel used Uniview, a Windows-based astronomical visualization and universal data exploration platform. “Simply put, it is a real-time tool that lets you fly through the universe,” says Neafus.

Uniview contains a library of information for cosmic objects—the Earth, for instance. “You have the correct orbital information, texture maps to layer onto a sphere, the surface, and so forth,” Neafus explains. “The architecture is parallel to a typical modeling program, but in this case, you are dealing with real time, and the system moves closer to a video game architecture for fast polygon counts, fast refresh … those kinds of things.”

While Uniview can run on various platforms, for the planetarium’s usage, it is set up on both of the HP systems dedicated to the development and experimental work. The first incarnation of the program was done using the SGI equipment back in 2003. Since then, the group has implemented other tools and upgraded the visualization with new terrain data and texture maps. Now, it is adopting Unity’s game-development/visualization software for the flythroughs.


The capabilities of the Gates Planetarium setup enable cosmic flythroughs on the fly (top), as well as more “down-to-earth” applications, including generation of digital backgrounds during live concerts.

The Denver group licenses the Uniview database, which works with its multichannel systems, for demonstrating solar system phenomena and flying around the galaxy and stars. “All the imagery is accurate and to scale, and all of it is near-photorealistic,” adds Neafus. “It’s a very powerful tool.”

For Cosmic Journey, Neafus and an in-house team did frame grabs of a flight sequence they devised, utilizing the real-time Uniview navigator/viewing software as a high-end animation tool. As they conducted the flythrough in real time, they recorded it, though Neafus had to make the journey many times before getting it to time just right with the prerecorded narration. Once he had a perfect flythrough, he kicked it off to rendering. Rendering for the real-time system as well as the dome playback is a frame-capture algorithm—not a multipass, as is the case for most 3D animation tools.
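The frame-capture approach Neafus describes can be sketched simply: instead of a multipass offline render, the real-time engine is stepped at a fixed timestep and every drawn frame is grabbed. The `render_frame` callback below is a hypothetical stand-in for Uniview's per-frame draw call, which is not public:

```python
def capture_flythrough(render_frame, duration_s, fps=30):
    """Frame-capture 'rendering': step a real-time engine at a fixed
    timestep and record each frame, rather than rendering multipass.
    render_frame(t) is a stand-in for the engine's draw call at time t."""
    frames = []
    n = int(duration_s * fps)
    for i in range(n):
        t = i / fps  # fixed timestep keeps capture in sync with narration
        frames.append(render_frame(t))
    return frames
```

Because the captured frames are exactly what the navigator displayed, the output is WYSIWYG—which is why the flythrough had to be re-flown until its timing matched the prerecorded narration.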

“It is WYSIWYG, but it can handle what we are doing,” he notes.

The output from the Uniview and real-time frame captures requires a great deal of compositing and tweaking, which the staff does in Adobe After Effects. Dealing with such issues as color, contrast, antialiasing, shading, and layering can be more painstaking than the image creation itself, Neafus says. “You have to nail them perfectly.”

When model creation is necessary, the planetarium staff also uses Autodesk’s Maya and Maxon’s Cinema 4D software tools.

Real-time Productions

Lastly, the planetarium also runs real-time flythroughs using Uniview running on an HP cluster that is not part of the server setup. “Because it is real time, it plays like a video game. You can go anywhere you want to go,” says Neafus. In fact, the planetarium does a daily program that allows the public to walk in and decide where in the universe they would like to visit, “and we take them there on the fly. Nothing is pre-rendered for that experience.”

In addition to the two workstations dedicated to the experimental work, each of the six workstations in the cluster holds a complete copy of Uniview on its own hard drive, so the cluster can access the data efficiently (more than 2GB per copy). Although every machine in the cluster is running a parallel copy of Uniview, each one is looking in a different direction. “The trick is to make them all run together and call up the data simultaneously,” says Neafus. “That is the beauty of the architecture.”
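A rough sketch of how such a cluster divides up the sky: each node runs the same scene but renders its own fixed view direction, matching the projector layout described earlier (five around the perimeter, one at the apex). The names and the 30-degree perimeter elevation below are illustrative assumptions, not the planetarium's actual configuration:

```python
def channel_orientations(perimeter=5):
    """Per-node view directions for a six-channel dome cluster:
    one apex (cap) channel plus five evenly spaced perimeter channels.
    Every node loads the same data set; only its camera orientation
    differs, and a sync signal keeps all nodes on the same frame."""
    views = [{"name": "apex", "azimuth_deg": 0.0, "elevation_deg": 90.0}]
    for i in range(perimeter):
        views.append({
            "name": f"perimeter-{i}",
            "azimuth_deg": 360.0 * i / perimeter,  # evenly spaced ring
            "elevation_deg": 30.0,  # assumed tilt toward the dome rim
        })
    return views
```

With orientations fixed per node, distributing a new spherical texture (a weather map, say) is just a matter of copying the same file to all six machines, as Neafus describes.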

By running the software in this type of setup, the group, for instance, can call up a map of Earth, as well as a weather map, bring that in as a spherical texture map, and copy it to all six nodes so they all have a copy of the same texture map. “As we are flying through, the system is calling up all the data at once, and the six machines are blending the imagery together as full hemisphere projections,” Neafus explains. That imagery usually comes directly from the origination point—say, a NASA satellite. As a result, the crew has shown, for instance, the affected area of the recent oil spill in the Gulf of Mexico and the volcanic ash cloud that recently plagued Europe.

How far can the group push these visualizations? Basically, as far as the known universe, and as close as sub-meter resolution. The key is to be sure that each machine’s output rate can keep up with the robust data. While the RAM and motherboard are important, the brunt of the demand falls on the video card, which must keep up with the refresh rate and stay synchronized.

Fast Forward

“Installed in 2007, the current architecture is already a bit old,” points out Neafus. “We are looking at an update.” Currently, the output from each computer in the dome is a rectangular 1440 x 1050 image, which is then warped into the curved shape to match the dome using a proprietary black-box solution (from Global Immersion), and then passed on to each projector—basically, the first half of the geometric blend process. “Our architecture makes it simple to have multiple image generators upstream and switch among the clusters because everything is going into that black box, and it blends well up on the dome,” Neafus says. Newer architecture, however, is accomplishing that on the video cards.
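The geometry underlying that warp step can be illustrated with the standard fisheye (azimuthal) mapping used for dome masters: distance from the image center becomes zenith angle, and the angle around the center becomes azimuth. This is a generic sketch of the mapping, not Global Immersion's proprietary implementation, which also handles per-projector distortion and edge blending:

```python
import math

def fisheye_to_sky(x, y, size):
    """Map a pixel in a square fisheye dome master to sky coordinates.
    Radius from the center gives the zenith angle (0 at the apex,
    90 degrees at the dome rim); direction gives the azimuth.
    Returns None for pixels outside the dome circle."""
    cx = cy = size / 2.0
    dx, dy = x - cx, y - cy
    r = math.hypot(dx, dy) / (size / 2.0)  # normalized radius, 0..1
    if r > 1.0:
        return None  # corner pixel, outside the projected hemisphere
    zenith = r * 90.0
    azimuth = math.degrees(math.atan2(dy, dx)) % 360.0
    return azimuth, zenith
```

A warp box (or, in newer systems, the video card) effectively evaluates the inverse of such a mapping for every projector pixel to decide which part of each rectangular 1440 x 1050 frame lands where on the dome.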

As for the 2007 upgrade to HP workstations, the staff witnessed a dramatic improvement in rendering and processing speed, on average 50 times faster. Currently, the group can access and modify frames in approximately 10 seconds, compared to 60 seconds previously, though even that can hinder real-time scenarios.

In Neafus’s opinion, the biggest challenge that comes with this type of dome work is the scale of the files and managing those files. “You are dealing with six times the demand you would have for HD editing,” he says, referring to the six channels in the dome. “That requires a robust capability for throughput, storage, and network storage. For the past few years, prices for that have come down to reason, but a few years back, it was a nightmare.”


With its state-of-the-art digital setup, the Gates Planetarium is able to create its own space shows, including the museum-produced Black Holes.

To show just how far things have progressed, Neafus points out that the older SGI system allowed just 4TB of storage. “In the first production we put together, we could barely accommodate one complete show in terms of development. We didn’t have enough room to do another program or even a backup. So, it was a big bottleneck.” In contrast, today the group is using a 15TB EMC RAID for holding the daily work, and the HP workstations hold 1.5TB each in the offline systems; they also rely on more than a dozen 1TB USB portable drives for archiving, and all of that is connected over a gigabit network for sharing copies back and forth.

“Yeah, four years ago it was a real bottleneck. Today, moving 4TB is trivial by comparison,” Neafus says.

Even so, the current tool set is enabling new and exciting productions. Not long ago, local university students created moving digital backdrops for 17 scenes in their production of “Joan of Arc” that were projected up on the dome while the performers were on stage. “Students had the chance to experiment with the different visualizations and the timing using our offline system,” Neafus says. Another unique production was “Live Out There,” a musical performance given by museum curators. In these instances, the imagery was created from scratch by the users; sometimes it is done in Autodesk Maya, other times in Cinema 4D or other types of common DCC software.

“We want to increase the pool of users who can create this type of work, and make sure they get a quality experience out of it,” says Neafus. “The museum’s goal is to increase scientific literacy and to continue pushing the envelope in terms of programming that engages our audiences.”

Karen Moltenbrey is the chief editor of Computer Graphics World.