VISIT MARS AND CONQUER EVEREST IN TWO REALISTIC JOURNEYS THAT PUSH VR BOUNDARIES
In the past year, virtual-reality experiences of all kinds have been popping up, now that VR headsets are finally becoming available. Last year, at the SIGGRAPH VR Village, show-goers had the chance to traverse a cable atop the Twin Towers of the World Trade Center, canoe the waters of the Grand Canyon, explore a room inside an Egyptian pyramid, see ancient cave art, and so much more. This year, we are expecting even more engaging experiences as VR content creators keep pushing the limits of this medium.
Two projects in particular are breaking new ground in terms of their graphics realism: Mars 2030 and Everest VR. Mars 2030, available later this year, lets users experience what life would be like on the Red Planet. The science and information, as well as the imagery, are based on data provided by NASA. Meanwhile, Everest VR, available in late summer, takes users on a trek to the highest mountain on Earth, through a landscape that is an authentic representation of the terrain.
GAME ARTISTS USE SCIENTIFIC DATA TO RE-CREATE MARS FOR A COMPELLING VR EXPERIENCE
In 2015, actor Matt Damon and director Ridley Scott renewed our fascination with the Red Planet in the science-fiction film The Martian, about the crew of the fictional Ares III manned mission to Mars to explore the Acidalia Planitia. When a dust storm threatens to topple their Mars Ascent Vehicle, they make a hasty exit and leave the planet. It’s then that one of the astronauts, played by Damon, is injured and, believed to be dead, is left behind. When he comes to, he makes his way to the habitat, where he must survive until the Ares IV arrives at the Schiaparelli crater… in four years.
The movie was a box-office success, and it piqued the interest of many who hold a fascination with living on another planet. Although digital artists consulted scientists when re-creating the planet, the movie leans more toward fiction than fact. For instance, given the planet’s thin atmosphere, a dust storm would not have enough force to blow the rocket ship over in the opening sequence, says one expert commenting on the film’s science. Nevertheless, the filmmakers attempted to provide a reasonable snapshot of what life would be like on Mars, while remaining true to their main mission of entertaining audiences.
Soon, though, the public will have a new opportunity to return to Mars, this time in a scientifically accurate virtual-reality experience that is educational as well as entertaining. Called Mars 2030, the experience, created by game developers with a fascination for this faraway planet, will give earthlings a taste of what it is like to be on Mars. Heading up this venture is Fusion, with Julian Reyes at the helm, as lead VR producer.
Reyes, like most of his crew on Mars 2030, has an extensive background in game development. Most of them, in fact, have full-time positions at game companies, working on Mars 2030 after-hours. It was a combination of his fascination with science and his expertise with real-time imagery that led to this novel project.
A GROUP OF GAME DEVELOPERS AND CG EXPERTS, UNDER THE MONIKER FUSION, ARE CREATING THE VR EXPERIENCE MARS 2030, WHICH ENABLES USERS TO GET AN ACCURATE FEEL FOR WHAT IT IS LIKE LIVING ON MARS. PICTURED HERE IS THE HABITAT.
Reyes started on this journey in 2014 after reading a technical paper by some grad students from MIT’s Aero/Astro Lab. They were writing a report on the Mars One initiative – a private, one-way mission to Mars to establish a permanent human settlement – challenging a number of proposals of that mission. “I spoke to them, and they provided some scientific papers on habitation, crew health, space suits, and after I read through them, I had a clearer idea of where we stood and where the technology stood on an actual mission to Mars,” says Reyes.
After more discussions, Reyes decided he wanted to create a VR experience based on the Mars research. It would take close to two years to forge an agreement with NASA whereby both Fusion and NASA would share their work. “We signed a Space Act Agreement with them,” says Reyes.
“They agreed to share additional research with us so we could get this [experience] right.” Similarly, Fusion will provide its data to NASA for use in its virtual-reality lab, for instance, where astronauts are trained to operate, maintain, and fix systems.
“We went to the Johnson Space Center and got to drive the latest rover, talk to the scientists, see the new space suit prototypes, and visit the crew health [experts] to talk about the psychological impact of living on Mars, and then started to build this out,” says Reyes. “I have always been a giant space fan. This was a dream come true.”
RESEARCH AND DEVELOPMENT
The goal of Mars 2030 – named for the expected time frame of placing man on Mars – is to gain an understanding of why some people want to go to Mars and the implications of such a mission. “To identify with [Mars One], we are currently on a mission to Mars ourselves,” says Reyes. “Through our downloadable VR experience, everyone will have a chance to go to Mars.”
Reyes knew from the start that the VR project would be complex, and thus brought several others on board as contractors to assist the small team at Fusion. Currently, the digital crew consists of roughly 20 individuals whose roles span everything from technical development to generating actual design elements for the experience.
Authenticity for the project is paramount – after all, this is not a video game, Reyes points out. “It feels as if we are virtually rebuilding a mission to Mars. We have the highest consideration of what is actually being considered in such a mission, and we want to get all that information inside of our VR experience and have it as accurate as possible,” he says. “That is why we have been trying to push so hard on the graphic capabilities of VR, to be able to translate the research in not only a technical way, but visually as well.”
One person who joined the project is Dave Flamburis, lead CG artist and creative consultant, who brings a host of real-time experience from working on Triple-A games. “Julian asked me where I would like to take this and I said, ‘I don’t know, but I am in,’” he recalls.
Exactly how does one go about creating a realistic version of an actual planet? Flamburis was given a primitive prototype with some height map-based terrain – not much in the way of a detailed landscape, he adds. “We decided to figure out what it would be like standing there, and started by breaking down the environment into its essential pieces,” says Flamburis.
Flamburis and Reyes devised a substantial list of questions for the scientists at NASA. “They answered all my technical questions, and then I started asking a lot of common questions, like if I was on Mars and bent down and dusted off the red soil, what would I find? We started getting deeper and deeper into things, like what is the essential makeup of all the rocks that you see everywhere? I was trying to nail down the variety of rocks I was seeing, the scale of everything, and figure out what color the planet really is,” he says.
Some of what we know about Mars came from imagery sent back by Viking 1 in the 1970s, when technology was not as advanced and the processing of white light proved difficult. That is where we got the notion that Mars is bathed in red, hence the nickname “the Red Planet.” However, the majority of our current information comes from the Curiosity rover. From this recent data, scientists now know that the planet is more of a butterscotch color.
According to Reyes, Mars 2030 is a cross-discipline collaboration involving space exploration research, advancements in real-time 3D rendering for VR to achieve the most realistic visuals and physics, and multi-dimensional audio, working with a team at Source Sound to fully immerse users within the experience. “Mars 2030 combines those elements: the latest in scientific research to visualize the future, the most bleeding-edge graphics, and pushing the envelope in VR and multi-directional audio,” he says.
In terms of the visuals, Epic’s Unreal Engine 4 (UE4) became the development habitat for the project – the fact that the project team had years of experience working with the engine was definitely a plus, but more so because “the graphical output of UE4 is superior,” says Reyes. “It is a powerful engine, and we are striving for a level of realism in VR and wanted an engine that would let us get there.”
“We are pushing graphic boundaries to show what it is like to actually be on Mars and let people experience being there,” Flamburis adds.
ROCKS WERE PLACED THROUGHOUT THE LANDSCAPE USING THE FOLIAGE SYSTEM IN THE UNREAL ENGINE. ABOVE SHOWS A WIREFRAME VIEW; BELOW IS A RENDERED VIEW.
When Flamburis came aboard this past January, the terrain already had been processed using remote surface imaging data and height maps that were stitched together. The information is based on elevation data from imagery taken by the Mars Orbiter Mission space probe, resulting in an 8 by 8 kilometer swath with an accuracy of approximately 1 foot in height, generated by Technical Designer Justin Sonnekalb. Because an actual region has not been selected yet for the real Mars One venture, the Mars 2030 team homed in on an actual area on Mars with features that would make it a viable landing spot, and then re-created that as the locale for the virtual experience. It was then up to Flamburis to add realistic geology, including rocks and Martian soil.
“I spent about a week of research, thinking about things and talking to Martian geologists at NASA,” says Flamburis.
“The important thing was to ask the right questions, practical questions. What does the surface look like? What does it feel like? Why are there so many rocks? How big are the rocks?” In addition to information garnered from these discussions, Flamburis also obtained reference from image databases and from HiRISE, a camera on board the Mars Reconnaissance Orbiter.
After close to a week of study, the team articulated a vision. Stina Flodstrom, a world builder and landscape/foliage artist, created gigapixel look boards from the plethora of reference material collected. “Everyone spent a few days looking and absorbing the imagery from the planet’s surface so we knew what we were looking at, and then we started making assets,” says Flamburis.
Flamburis began the process using Pilgway’s 3D-Coat voxel sculpting software and then textured the imagery using Allegorithmic’s Substance Painter. “I am able to use a lot of smart materials for the geology I am forming with Substance Painter,” he notes. “3D-Coat is fantastic for getting meta normals, but the details and feel, the PBR, and the coloring were all done using smart materials I developed in Substance Painter.”
The artist team additionally used Nvidia’s new GameWorks toolset to push performance levels and achieve the desired level of realism for the virtual-reality experience.
THE ROCK SHADER USED FOR MARS 2030 CONTAINS A NUMBER OF PARAMETERS.
The entire digital landscape is lit with global illumination from a stationary light – a UE4 light type that is part static, part dynamic – casting a shadow map that may or may not remain in the final version of the experience. In the end, a good deal of post-processing work and some color grading will be required. “Mars gets only 40 percent of the sunlight that we get on Earth, and we are still on the fence as to how dim we want this to be,” says Flamburis.
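Flamburis’s 40 percent figure can be sanity-checked with the inverse-square law. The sketch below is illustrative arithmetic, not part of the project’s pipeline, and the orbital distance used is a standard mean value rather than a figure from the project:

```python
# Rough check of the "40 percent of Earth's sunlight" figure via the
# inverse-square law. The distance is the standard mean Sun-Mars
# distance in astronomical units (AU), not a value from the article.
def relative_irradiance(distance_au: float) -> float:
    """Sunlight intensity relative to Earth's, per the inverse-square law."""
    return 1.0 / distance_au ** 2

MARS_MEAN_DISTANCE_AU = 1.524

fraction = relative_irradiance(MARS_MEAN_DISTANCE_AU)
print(f"Mars receives ~{fraction:.0%} of Earth's sunlight")  # ~43%
```

The mean-distance result of roughly 43 percent is close to the 40 percent cited; the exact fraction varies with Mars’s fairly eccentric orbit.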
First, though, the artists needed to understand the geology of Mars. The soil is red due to an abundance of a powdery mineral called limonite, which is an iron oxide. As Flamburis explains, it’s also the cause of the sky color: The dust permeates the atmosphere and scatters blue light, giving Mars its butterscotch appearance.
“In some places you could probably wipe the dust and sand off a rock and see a true color that is more gray or grayish blue,” says Flamburis. “It’s the total coating of dust and iron oxide that give Mars its warm hue.”
Moreover, the planet’s surface contains a plethora of basalt from volcanic activity, and a good deal of scoria, a bubbly matrix of volcanic and other rock. Also plentiful is silicate (sand), found in dunes as well as sandstone deposits. Lava caves are abundant, too, and there is widespread evidence of long-ago water flow.
When creating the landscape surface, Flamburis integrated 75 different parameters that he can play with to get different looks. He also built a shader for the landscape by hand. “At first, I tried a slope-based shader that would change composition by height, but that didn’t work for Mars,” he explains. “I just started with sand, then went with soil, and broke down the sand and soil into two types. We have two styles of fractured sandstone embedded in it, as well.”
The landscape shaders comprise: two soil shaders taken from photographic reference of the Martian surface (taken by the Curiosity rover), two sand shaders from remote-sensing HiVIEW images, two fracture shaders, a dust shader, and various parameters. In a nutshell, there is a lot of iteration, back and forth, until things look and feel correct.
THIS IMAGE PROVIDES A CLOSE-UP LOOK AT THE ROCK SHADERS USED IN THE VR EXPERIENCE. THE ARTISTS INTEGRATED 75 DIFFERENT PARAMETERS THAT CAN BE ADJUSTED FOR VARIOUS LOOKS.
In addition, a database of imagery provided height maps of craters, with infinite visibility down to 17 meters. “That was 5 gigs of height data that Justin [Sonnekalb] stitched together, brought into UE4, and used as the height offset for the Unreal Engine landscape,” says Flamburis. “Then I started collecting the straight overhead shots of sand dune formations. It’s a super windy planet, but the wind does not have the same force as it does on Earth because there is very little atmospheric pressure. So, you get a lot of rippling, like on a beach, though on a massive scale.”
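The stitched height data Sonnekalb describes works like any 16-bit heightmap pipeline: integer samples are scaled onto a world-space elevation range, and the ratio of range to step count sets the vertical precision. The sketch below illustrates that arithmetic; the elevation range here is an invented number for illustration, not a figure from the project:

```python
import numpy as np

# Hypothetical stand-in for a stitched 16-bit heightmap pipeline.
# Quantizing an elevation range into 65,536 steps is what yields a
# fixed vertical precision; the range below is assumed, not sourced.
ELEV_RANGE_M = 16000.0  # assumed min-to-max elevation span, in meters

def heights_from_u16(raw: np.ndarray, elev_range_m: float) -> np.ndarray:
    """Map 16-bit samples onto a world-space elevation range in meters."""
    return raw.astype(np.float64) / 65535.0 * elev_range_m

precision_m = ELEV_RANGE_M / 65535.0  # one quantization step
print(f"vertical precision: {precision_m:.3f} m")  # ~0.244 m, under a foot
```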
The Martian landscape is stark and minimalistic, what Flamburis calls “beautiful desolation,” though it is dotted with rocks. And there is a science to creating and placing them in the VR application. Flamburis created a set of approximately 25 rocks of different geology based on the makeup of the planet. “I have folders of different geology, including scoria and basaltic rock. For the scoria, I created a rather large rock, broke that rock up into five pieces, and then developed five or six additional rocks,” he explains, noting that these assets are reusable. He retopologizes them in 3D-Coat and brings them into Autodesk’s 3ds Max where he unwraps them by hand, then later imports them into Substance Painter.
“I started making lots of smart materials and broke them down into my base solve materials,” Flamburis points out.
The artist generates extra map types inside Substance Painter, using world vector blending for paint layers, detail layers, color layers, and so forth, which he saves as smart materials, then brings in another rock and repeats. The rock imports as a white mesh with just its base normal, onto which he adds a texture to the smart material. “I will export the normal, the curvature maps, the height maps, the roughness maps, and the base color maps,” he says.
The rocks are about the size of a chicken. “It’s rare that you will see any larger than the size of a microwave,” Flamburis says. With so few geological features on Mars, it was important for the artists to nail the scale of the small rocks and the relative scale of the soil against them.
The group supplemented these materials with scanned data from the Azores, which contains a volcanic beach with sediment and a gnarly scoria landscape; they nestled this data into the Martian sand. “That scanned data had to stand up next to our handmade imagery, and the references match up nicely,” Flamburis points out.
Flamburis also crafted a rock shader with a dust feature that can be dialed in. The dust is pervasive and covers everything on the planet. “Any object sitting outside – from the rocks to the habitat – has a dust component built into the shader,” he says. “It has a clinging feel, and I use cavity maps and up-vectors to blend against the surface.”
LEAD CG ARTIST DAVE FLAMBURIS CRAFTED A ROCK SHADER WITH A DUST FEATURE THAT CAN BE DIALED IN FOR MORE COVERAGE.
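The up-vector-and-cavity blend Flamburis describes can be sketched as follows; the weighting formula is a hypothetical stand-in for the production shader, shown only to illustrate the idea that dust clings to upward-facing, recessed surfaces:

```python
import numpy as np

# Illustrative sketch (not the production shader): dust accumulates on
# upward-facing surfaces and in crevices, so a dust mask can be built
# by blending the normal's up component with a cavity map.
def dust_mask(normal: np.ndarray, cavity: np.ndarray,
              coverage: float = 0.5) -> np.ndarray:
    """normal: (..., 3) unit normals; cavity: (...) in [0, 1], 1 = recessed."""
    up_facing = np.clip(normal[..., 2], 0.0, 1.0)  # dot(normal, world up)
    return np.clip(up_facing * (0.5 + 0.5 * cavity) + coverage - 0.5, 0.0, 1.0)

flat_top = np.array([0.0, 0.0, 1.0])   # horizontal surface, facing up
vertical = np.array([1.0, 0.0, 0.0])   # sheer face
print(dust_mask(flat_top[None], np.array([1.0])))  # fully dusted
print(dust_mask(vertical[None], np.array([0.0])))  # no dust clings
```

Raising the `coverage` parameter plays the role of the “dialed in” control: it shifts the whole mask toward heavier dust before the clip.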
Once he has created enough rock specimens, Flamburis incorporates them into the UE4 foliage system, and he and Flodstrom take turns “painting” the rocks onto the landscape, distributing them according to certain geological types that made sense based on the conversations with the NASA scientists. “The rocks primarily were spewed out as a result of volcanic activity on Mars,” he explains. “It had been occurring for so long that literally the entire planet is sprinkled with them. Most of these have been smashed multiple times through successive eruptions over millions of years, which also accounts for the evenness of the scale.”
The artists literally have “painted” the surface with a million rocks so far, and are hoping to place 10 million before the VR experience is completed. “They are dynamically lit and cast dynamic shadows,” Flamburis says. “We didn’t want to bake them statically because we would have run out of lightmap memory.”
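A foliage-style scatter like the one Flamburis and Flodstrom used can be approximated as density-weighted random placement of instanced meshes. The sketch below is a simplified CPU stand-in for UE4’s foliage system, with invented density and scale values:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative instance scatter (a stand-in for UE4 foliage painting):
# generate candidate positions, keep each with probability given by a
# density map, then assign a random rock mesh, yaw, and scale.
def scatter_rocks(n_candidates, area_m, density_fn, n_rock_types=25):
    xy = rng.uniform(0.0, area_m, size=(n_candidates, 2))
    keep = rng.uniform(size=n_candidates) < density_fn(xy)
    xy = xy[keep]
    return {
        "position": xy,
        "mesh_id": rng.integers(0, n_rock_types, size=len(xy)),
        "yaw_deg": rng.uniform(0.0, 360.0, size=len(xy)),
        "scale": rng.uniform(0.8, 1.2, size=len(xy)),
    }

uniform_density = lambda xy: np.full(len(xy), 0.5)
instances = scatter_rocks(100_000, 8000.0, uniform_density)
print(len(instances["position"]))  # roughly half the candidates survive
```

In the real tool the density function is the artist’s brush stroke rather than a uniform constant, which is what lets rocks cluster by geological type.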
Flamburis points out that the group was pushing 53 million triangles in the scene due to the denseness of the terrain and the foliage (rocks). Each rock is about 2,000 to 3,000 polys, which enables the user to look down at any point in the experience and see the rocks – and be able to determine their scale and see detail. “You have to have high-resolution rocks and high-resolution details. It is the texture samples themselves that are very high resolution,” he says. “After all, this isn’t a game, but rather more like a scientific visualization. Depth and detail are necessary.”
MEN IN SPACE
The area being re-created for Mars 2030 is 22 square kilometers. The surrounding terrain is composed of static meshes developed from a valley called the Aurorae Chaos; Mars 2030 plays out somewhere in the middle of the massive valley.
According to Flamburis, they are using a UV-less workflow for the surrounding static landscape – a slope-based, three-way blend material with height division and falloff for color. He stitched together the height data for Aurorae Chaos in 3ds Max, displaced it on tessellated quads, and imported it into UE4.
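A slope-based, three-way blend with height falloff boils down to computing per-sample material weights from the terrain itself, with no UVs involved. The thresholds and weighting below are invented for illustration, not taken from the production material:

```python
import numpy as np

# Sketch of a slope-based three-way blend with height falloff:
# steep faces read as exposed rock, and higher ground shifts sand
# toward soil. All thresholds are hypothetical.
def blend_weights(slope_deg, height_m, cliff_slope=40.0, high_m=1500.0):
    """Return (sand, soil, rock) weights per sample, summing to 1."""
    rock = np.clip((slope_deg - cliff_slope) / 15.0, 0.0, 1.0)
    soil = (1.0 - rock) * np.clip(height_m / high_m, 0.0, 1.0)
    sand = 1.0 - rock - soil
    return sand, soil, rock

sand, soil, rock = blend_weights(np.array([5.0, 60.0]),
                                 np.array([100.0, 800.0]))
print(rock)  # the flat sample gets no rock; the 60-degree face is all rock
```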
As part of the Mars 2030 experience, there is a rover, designed from the vehicle being built at Johnson Space Center, which can be used to transport users around the digital landscape. All the man-made objects in the experience are NASA designs. At times, the artists were given Autodesk AutoCAD models, which were used as reference and re-created from scratch at Fusion to run in real time.
While the Mars 2030 team has provided glimpses of the project, they are taking great care not to reveal too much at this time, keeping future plans close to the vest. However, this is what we know. The group anticipates completing their mission in late October or early November, and the experience will be released shortly after and will be available initially for the Oculus Rift and the HTC Vive headsets.
Also, we are told that at every turn, the crew is committed to accuracy and pushing the technology envelope. “The level of attention to the detail of the scientific research we are using is at the granular level. We are trying to create a project that is amazing and fun, but also realistic, and we want it to be a vision of the future, for it to be a good reference to where we are at [in regard to a manned mission to Mars],” says Reyes.
For decades, developments at NASA have propelled space travel and exploration. But only in the past year or so have there been advances in VR and real-time graphics that make an application such as this possible, including reasonably priced headsets running applications at 90 frames per second, and blazing rendering speeds from GPUs that enable the production of high-quality imagery for such an experience.
Still, Reyes admits there remains a balancing act between performance and graphics quality when it comes to real-time applications. He notes the group is really pushing the quality of this experience, but is constantly conscious that it is a very difficult process to maximize image quality in real time, especially with VR.
THE TERRAIN WAS GENERATED USING REMOTE SURFACE IMAGING DATA AND HEIGHT MAPS THAT WERE STITCHED TOGETHER. THE INFORMATION IS BASED ON ELEVATION DATA FROM IMAGERY TAKEN BY THE MARS ORBITER MISSION SPACE PROBE.
“We want to create something that visualizes what the future looks like and where we are in regard to a manned trip to Mars. Accuracy is very important, as we are compiling the various research being done by groups across NASA, so they can visualize and better grasp that information,” says Reyes. “As for our experience, we want people to feel fully immersed. We are pushing graphic boundaries. We want this to be the closest thing to being on Mars, so resolution and accuracy are vital to convincing someone that they are there.”
By the looks of things, this indeed will be a journey we don’t want to miss.
DIGITAL ARTISTS CONQUER GRAPHIC CHALLENGES AND REAL-TIME PERFORMANCE ISSUES TO CREATE EVEREST VR
It’s a dream for so many: climbing Everest, the highest mountain in the world. But for most, this dream will never become a reality. The trek requires long, intensive training and is fraught with danger – over the years, the mountain has claimed 250-plus lives and counting. Thanks to virtual reality, however, those dreamers no longer have to be left out in the cold.
That’s because VR specialists Solfar Studios and VFX facility RVX have teamed up on the CG adventure Everest VR, a virtual-reality application that presents the famous mountain in realistic detail and sends users on a journey of a lifetime to experience the same view and climb for which so many have risked their lives.
Where does a CG artist begin when setting out to accurately re-create such an imposing environment? Luckily, there was an experienced adventurer on the team who had conquered the mountain before – Dadi Einarsson, creative director at RVX, who served as visual effects supervisor on last year’s feature film Everest, based on the 1996 disaster on the mountain that left eight climbers dead when a fierce blizzard rolled in. Everest Director Baltasar Kormakur wanted to give audiences “a sense of place, with Everest in all its glory, might, and danger.”
“The way we created the mountain [for Everest VR] evolved out of what we did on the film,” says Einarsson. “We had created the whole environment in low resolution, and popped in super-high-resolution images for the scenes in the film. For Everest VR, we figured out a pipeline for which we could make the whole mountain range in 3D, and make it photorealistic and beautiful.”
In fact, the idea of building an authentic VR experience centered on Mount Everest came as a result of the work Einarsson and RVX did for the film. “We had visited him at the studio and came to find out they had created the full mountain in 3D. We had no idea that was possible,” says Reynir Hardarson, cofounder/creative director at Solfar Studios, a VR games and experience company founded less than two years ago.
Seeing the 3D model sparked the idea for an Everest virtual-reality experience, and a partnership between the two companies, both located in Iceland, was formed. While Solfar is focused on virtual reality, the staff’s experience was born from the gaming world, where the artists honed their skills with real-time graphics. “VR is very different from television and games, though,” says Hardarson. “The environment is extremely important in VR. You spend your time inside the environment rather than just looking at it from the outside. So, they really need to be compelling.”
With virtual-reality experiences, it’s a lot about the look of the imagery. “Game-play is important, but in a different sense; it’s not about scoring points,” explains Hardarson. “Visual quality is incredibly important because we are comparing [the graphics] to reality for a very emotionally driven experience.”
To this end, the partnership with RVX on the project was “a match made in heaven,” says Hardarson. “They also have a strong focus on visual quality, and their background comes from the visual effects industry.” So, the work began, with RVX handling mainly the VFX, and Solfar, the real-time aspect and lighting/shaders, though each often crossed boundaries with close collaboration. In total, approximately 15 to 20 people worked on the project.
Despite RVX’s prior Everest work, the artists started from scratch when building the Mount Everest model for this project. “It’s a totally different medium with totally different needs from the film,” says Einarsson.
The groups first spent approximately a year in prototyping. “We did a lot of prototyping early on because we were pushing boundaries that hadn’t been pushed before in VR,” says Hardarson. “There was a lot of work to be done on the technical side. We had to overcome things like memory limitations on the graphics hardware. We had to figure out ways to create this so it could run on a regular computer at 90 frames per second (fps) in VR. The difference between making Everest, a real-world place, and computer games is that we cannot use any digital trickery with the landscapes. Every pixel has to be in the correct place. This creates a massive challenge when it comes to video memory.”
EVEREST VR PLACES USERS AT THE TOP OF THE WORLD. SNOW COVERS THE MAJORITY OF THE LANDSCAPE AND CAN BE SEEN BLOWING IN THE WIND. ARTISTS USED A VARIETY OF METHODS TO CREATE THE SNOW AND EFFECTS, INCLUDING NVIDIA’S TURBULENCE FLUID SIM.
The artists at RVX built the mountain using photographic modeling techniques. The group started with thousands of photographs – many taken on the mountain by Einarsson himself using a Canon 5D camera. They also collected data and photo reference from a number of other sources. Then, they turned those photos into a point cloud, a 3D representation of the photos, using a photogrammetry solver from Designing Reality.
The stereophotogrammetry process was done using 28-core PC workstations with 128GB RAM equipped with six Nvidia Titan X graphics cards.
“Using thousands of photographs, you can create a very accurate surface,” says Einarsson, who estimates the team used approximately 4,000 to 5,000 photos in the actual build. “We processed somewhere close to two trillion polygons in total. From there, it goes through a pretty traditional asset-building pipeline, where we re-model it, re-topologize it, or fix the topology.”
As Einarsson notes, there are inevitably some holes or patches that need to be fixed. “Mainly, you get areas of low resolution in terms of the photo quality or the model gets soft, so you need to go to a more traditional VFX modeling and texturing pipeline to fix it up,” he says. “We use a process whereby we project the photograph onto the 3D model and paint and fill gaps, holes, even the lighting, to make sure it is consistent throughout.”
The artists had less photographic reference for the higher-elevation locations, which required more hand-modeling. This was done using Autodesk’s Maya, with texturing in The Foundry’s Mari.
Einarsson says the model is extremely accurate, based on comparisons to low-resolution topology data of the mountain range and summit area.
USERS TAKE THE SOUTHEAST ROUTE TO THE PEAK, PASSING THE HILLARY STEP. AN IMAGE PROGRESSION (TOP TO BOTTOM) SHOWS A WIREFRAME, LIGHTING, AND FINAL RENDER.
Everest’s Kormakur set out to make the feature film as authentic as possible, wanting moviegoers to feel as if they had almost climbed the mountain themselves. But in Everest VR, that is exactly what users do, albeit virtually. Everest VR is a real-time experience, powered by Epic’s Unreal Engine 4.
As Einarsson points out, there are very intricate details in the 3D surfaces of the model, making the VR presentation as realistic as possible. Yet, Everest VR is far more than a scenic tour of this amazing environment. It attempts to provide users with the experience of an actual climb. Headphones with stereo sound help situate users in the setting; hand controllers enable them to utilize the climbing gear, such as ladders for crossing chasms. And, of course, a VR headset immerses them in the stereo imagery.
“This is not a game,” says Hardarson. “It’s an emotional experience.”
The virtual ascent starts with a traditional puja ceremony, and then users start off from base camp, through the Khumbu Icefall, on to camps I through IV, passing the Hillary Step before reaching the summit. This is the southeast route that adventurers take when making the actual climb. If users move too quickly in the higher altitude area known as “the death zone,” they start to black out – here in a virtual sense, of course.
During the treks, users walk near the ledge and climb up to the ledge above for a view of the surrounding area. “It’s about being in the moment,” Hardarson says of the experience. And between treks to the location points, hikers are led on cinematic journeys, giving them a good sense of the geography. “You get to know the route very well,” he adds.
Yet, Everest VR is more than a simulated physical experience, as educational information about the mountain, climbers, and sherpas is presented along the route.
On the virtual mountain, climbers will experience a range of snowfall and wind, as well as changing skies, all created with simulations. For the snow effects, the artists used Nvidia’s new Turbulence scalable fluid simulation, part of Nvidia’s GameWorks, which enables artists to change parameters such as fluid viscosity, turbulence, and particle mass to generate particle effects based on fluid dynamics.
“We are using millions of particles in a 3D fluid simulation in real time,” says Hardarson. The particles and turbulence fields, generated within the game engine, collide with the environment and form swirls and blow from side to side – bouncing off the hikers’ gloves, for instance. To create snow in the distance, the artists used Side Effects’ Houdini and Maya.
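The core of such a particle system is advecting positions through a wind-plus-turbulence velocity field every frame. This toy CPU version only shows the shape of the loop; the swirl term is an invented stand-in, not GameWorks’ actual fluid solver:

```python
import numpy as np

rng = np.random.default_rng(0)

# Minimal illustration of a snow particle system: each frame, blend a
# steady wind with a cheap swirl term, integrate velocities, and
# collide particles with the ground plane. A real system (e.g.,
# GameWorks Turbulence) advects through a fluid-simulated grid on GPU.
def step(positions, velocities, dt=1.0 / 90.0, wind=(2.0, 0.0, 0.0)):
    gust = 0.5 * np.sin(positions[:, [1, 2, 0]] * 0.1)  # invented swirl
    velocities = 0.98 * velocities + (np.asarray(wind) + gust) * dt
    positions = positions + velocities * dt
    positions[:, 2] = np.maximum(positions[:, 2], 0.0)  # ground collision
    return positions, velocities

pos = rng.uniform(0.0, 100.0, size=(100_000, 3))
vel = np.zeros_like(pos)
for _ in range(90):  # one simulated second at 90 fps
    pos, vel = step(pos, vel)
print(pos.shape)  # (100000, 3)
```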
USERS WILL ENCOUNTER VARIED WEATHER CONDITIONS AND CHANGES IN TIME OF DAY.
“There are over 50 different words for ‘snow’ in Icelandic,” says Hardarson, noting there is snow everywhere in Everest VR – on the ground, in the air. “There are many types in the experience, and we used multiple methods to produce the effects, from simple shader trickery on sprites and planes and video cards created in Houdini, to real-time fluid simulations using GameWorks.”
As all artists know, getting detail from the white snow is not an easy task. With such a stark, white environment throughout this application, lighting became especially difficult. “Snow is super hard; it’s a massive challenge from a visual standpoint. It’s not just the subsurface scattering, but the way it reflects light into all directions from within the subsurface that makes it quite tricky,” says Hardarson. “It is very easy to get wrong.”
The RVX crew already overcame this obstacle for the movie, and now the artists had to solve it again within the real-time environment through shaders and lighting. “Our familiarity was a big help, and we were able to iterate with confidence that we were going in the right direction until we had a representation of what we were after.” Of course, that familiarity resulted from the film as well as real-world reference outside their office windows in Iceland.
In addition to the snow, the artists used real-time cloth simulation, with wind beating against the climbers’ suits. The cloth sim was generated using Nvidia’s PhysX, part of its GameWorks toolset. To optimize the VR development, the group also used Nvidia’s VRWorks, a suite of APIs, sample code, and libraries for virtual-reality development.
Additionally, the crew used Simul Software’s TrueSky real-time cloud simulation, which provided volumetric clouds and atmospherics.
The development work at Solfar, which is still in progress, is being done on high-end PCs with Nvidia 980TI cards. The studios’ platform machines contain i5 processors with Nvidia 970GX cards. “You need lots of power to run this at 90 frames per second,” says Hardarson. “But, it’s hard to maintain that 90 frames throughout.”
DRIVEN TO NEW HEIGHTS
The VR experience comprises 108 billion pixels and 13 million polygons – the equivalent of 14,000 shots taken with an iPhone 6 camera. Creating such a realistic, real-time experience resulted in a number of production challenges. According to Hardarson, one of the biggest technical challenges was cramming the experience onto an Nvidia 970GX card with 4GB of video memory.
“A big part of the last two to three months has been getting this to run on consumer-level machines at 90 fps all the time,” Hardarson says, noting they are reducing the memory footprint through the use of Graphine’s Granite texture-streaming middleware.
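Why streaming is unavoidable is clear from a back-of-the-envelope count: the 108-gigapixel figure cited above, stored uncompressed, dwarfs a 4GB card, so only the visible tiles at the needed mip level can be kept resident:

```python
# Back-of-the-envelope arithmetic on the article's 108-gigapixel figure.
# Assumes uncompressed RGBA8; block compression helps, but not by the
# factor required, which is why tile-based texture streaming is needed.
PIXELS = 108e9
BYTES_PER_PIXEL = 4          # RGBA8
VRAM_BYTES = 4 * 1024**3     # a 4GB card

need_gb = PIXELS * BYTES_PER_PIXEL / 1024**3
print(f"uncompressed need: ~{need_gb:.0f} GB vs 4 GB available")
```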
“VR is so new, and there are so few people pushing the extreme limits like we are, that there’s not a lot of people who we can ask for advice. It’s great to have companies like Nvidia and Epic to turn to, to help us squeeze the most graphic quality from the imagery,” he continues. “After all, we are all in this together, to create the most compelling experiences for VR.”
In all, the group re-created roughly a seven-square-kilometer area for the high-resolution “experience,” and approximately 50 kilometers of the low-resolution area in the distance.
As of press time, the team was in the final stages of production, with a target completion date of this summer. Everest VR will be released initially for the HTC Vive and will be available through Valve’s Steam. In the fall, it will be available in various app stores and will be supported on the Oculus Rift.
While Everest VR has been an educational experience for both studios, each plans to apply the lessons learned on the virtual mountain to another high-end VR project. “We plan to take a similar voyage and route again, which is what happens when you push the envelope,” says Hardarson. “It’s good to push boundaries, especially in this new medium.
If it were easy, we’d probably be doing something else.”
Karen Moltenbrey is the chief editor for Computer Graphics World.