Unreal Power
By: David Cohn
Issue: Volume 33, Issue 2 (Feb. 2010)

Epic Games uses the latest technology to remain on the cutting edge.

The soldier runs through a ruined urban landscape, the hulk of a burned-out vehicle smoldering in the distance. Suddenly, an alien creature looms in front of him. Reacting quickly, he aims his Lancer assault rifle and fires at the beast, blasting away bits of its armor until, at last, the monster crumples to the ground. But as the warrior moves forward, he can hear other battles raging, and he knows that within moments he’ll be confronted yet again.

A scene from yet another sci-fi blockbuster at the local multiplex? It could be. But in this case, it’s just another encounter in the fast-paced video game Gears of War 2. This third-person shooter—in which the player assumes the role of one of the characters and views a 3D virtual world from the cinematic perspective of a virtual camera—is hugely popular, selling more than five million copies, winning numerous awards, and grossing more than many Hollywood films.

The fast-paced action and stunning visuals come about in large part thanks to the game’s underlying technology, the Unreal Engine 3. Developed by Epic Games, a cutting-edge game developer based in Cary, North Carolina, the Unreal Engine is a game development platform providing an array of technologies, content creation tools, and support infrastructure tailored specifically for the unique needs of game developers and creators of other 3D animated content. According to its developers, every aspect of the Unreal Engine is designed to put as much power as possible in the hands of artists and designers.

Not Just Playing Around
First used in 1998 for Epic Games’ own first-person shooter Unreal, the Unreal Engine has since been the basis for many popular games, including BioShock, Medal of Honor: Airborne, Tom Clancy’s Splinter Cell, and Harry Potter and the Philosopher’s Stone.

Epic’s Unreal technology also provides the platform and tools needed to develop other complex 3D projects. HKS, one of the world’s leading architectural firms, is using the Unreal Engine to bring 3D building models to life on projects such as the W Hotel in Dallas and the Dallas Cowboys football stadium. By leveraging the interactive gaming technology of the Unreal Engine, the architects and their clients can walk through and experience virtual buildings in real time at their own pace.

Producers of Nickelodeon’s award-winning children’s TV series LazyTown utilize the Unreal Engine to blend live-action and puppetry on a physical greenscreen set with computer-generated background content in real time. As the actual camera moves around the actors and physical set, the backdrop scene also moves in real time, opening up creative possibilities that simply weren’t achievable before.

In addition to its technical wizardry, the Unreal Engine provides a high degree of portability, supporting multiple platforms, including personal computers running Microsoft Windows, as well as video game consoles such as the Xbox 360 and PlayStation 3. The Unreal Engine’s constantly evolving tool set and support for numerous platforms have made it one of the most popular development systems currently available. But it’s in the world of video games that the engine’s reputation truly shines.

More Demanding than Movies
Video games are among the most demanding applications. To achieve the high degree of interactivity and visual fidelity that players expect, video games require high-end hardware: powerful laptops or desktop machines with multiple processors and sophisticated graphics cards, or the latest gaming consoles. If the games themselves require this kind of horsepower to play, imagine what it takes to create these titles.


LazyTown, Nickelodeon’s award-winning children’s show, uses Epic’s Unreal Engine to combine live-action and puppetry with computer-generated content in real time.


The development of any successful game involves many of the same processes traditionally found in filmmaking. Games begin with preproduction, in which the game designers develop early concept art, come up with the overall design, produce early prototypes and 3D models, and generate elaborate storyboards to help chart the story arc. Once the development team moves into actual production, teams of artists must not only produce complete three-dimensional worlds but also create the actors: the three-dimensional avatars representing the characters the player will control throughout the game, as well as every antagonist, monster, bit player, and extra they interact with. Unlike a film, with its physical sets and live actors, everything in a video game must be created digitally, and the story and interactions must be programmed by teams of technical and creative designers.

Much of this begins by capturing the movements of real actors. Epic does all its own motion capture in-house, in a state-of-the-art mocap facility equipped with more than three dozen Vicon cameras. All the cinematic sequences in the games are recorded using real actors and physical props. Epic’s animators then translate that data into highly polished realistic scenes.

While postproduction may not involve hours in a cutting room, it does require weeks of testing, bug fixing, and endless tweaking. Even when the development team goes home for the day, its computers keep running 24/7, rendering models and scenes to be incorporated into the next day’s production schedule. Some 60 developers, artists, and programmers were involved in the production of Gears of War 2, and they consistently maxed out their hardware.

Hardware Assist
To stay at the forefront of the highly demanding world of game development, Epic Games requires computer workstations that deliver both state-of-the-art performance and unsurpassed reliability. Epic recently began rolling out Lenovo ThinkStation workstations companywide for all its game and game-engine development.
 
The company started using Lenovo workstations in March, coinciding with its exhibition at last year’s Game Developers Conference (GDC), the world’s largest professionals-only game industry event. It was at that trade show that Epic unveiled the latest edition of its game development engine, Unreal Engine 3. One of the highlights of that release is Unreal Lightmass, a global illumination solver that produces high-quality lighting effects, including soft, highly accurate, and realistic shadows.

For the engine’s unveiling, Epic ran this highly demanding function on a “swarm” of Lenovo workstations: a cluster of nine ThinkStations, a mix of D10, S20, and D20 machines, configured and managed using its Unreal Swarm technology. Another feature of the new release, Unreal Swarm is a massively scalable job distribution system optimized for high-speed networks of multi-core PCs. It transparently spreads applications, such as Lightmass, across the entire network, harnessing the computational power of every machine running the Swarm Agent.
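To give a rough sense of the pattern a job distribution system like Swarm embodies, here is a minimal C++ sketch; it is not Epic’s actual Swarm API. A coordinator splits a lighting bake into independent tiles, and agents, simulated here as local threads rather than networked machines, claim work from a shared queue until it is empty. The tile decomposition and all names are illustrative assumptions.

```cpp
// Minimal sketch of distributed job scheduling in the spirit of a system like
// Unreal Swarm. Illustrative only: the real system dispatches work to Swarm
// Agents over the network; here the "agents" are local worker threads.
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

struct Task { int tile_id; };  // one independent chunk of the bake (assumed decomposition)

int main() {
    // Pretend the scene's lighting rebuild splits into 64 independent tiles.
    std::vector<Task> tasks;
    for (int i = 0; i < 64; ++i) tasks.push_back({i});

    std::atomic<std::size_t> next{0};                  // shared cursor into the work queue
    unsigned agents = std::thread::hardware_concurrency();
    if (agents == 0) agents = 4;                       // fallback if the count is unknown

    auto agent = [&](unsigned id) {
        for (;;) {
            std::size_t n = next.fetch_add(1);         // atomically claim the next unclaimed tile
            if (n >= tasks.size()) break;              // queue drained: this agent is done
            // ... compute global illumination for tasks[n] here ...
            std::printf("agent %u baked tile %d\n", id, tasks[n].tile_id);
        }
    };

    std::vector<std::thread> pool;
    for (unsigned i = 0; i < agents; ++i) pool.emplace_back(agent, i);
    for (auto& t : pool) t.join();                     // all tiles finished: lighting rebuilt
    return 0;
}
```

In a networked setup the same pattern applies, with idle machines on the LAN claiming jobs instead of local threads.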

“Swarm ran beautifully on the networked Lenovo ThinkStations at GDC 2009,” an Epic Games representative reported after the trade show. “Performance was smooth and consistent, and we were pleasantly surprised with how cool and quiet the server room remained throughout the show.”

Even on an eight-core system, rebuilding the lighting for large, complex scenes is extremely time-consuming. As a result, in the past, Epic’s designers were hesitant to try out different lighting schemes. But with Unreal Swarm, portions of the Lightmass computation can be performed in parallel and distributed across the entire network, multiplying performance many times over and bringing lengthy operations, such as global illumination, into the realm of iterative development, something previously considered impossible.

Since returning to North Carolina, Epic Games’ developers and artists have used the Lenovo machines to complete work on their new games and upgrades to Unreal Engine 3.

For its ongoing development, Epic Games has standardized on ThinkStation D20 workstations, each equipped with a pair of quad-core Intel Xeon processors. With Hyper-Threading enabled, each workstation can execute 16 threads simultaneously, providing a huge performance increase over the company’s older workstations, a significant factor when rendering the complex scenes in a computer game, video production, or architectural simulation. Epic has already rolled out more than 20 of the new Lenovo ThinkStation D20 workstations and will eventually install them companywide.
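The figure of 16 follows from simple arithmetic, assuming the configuration described above: two sockets, four cores per socket, and two Hyper-Threaded hardware threads per core. The short C++ check below, an illustration rather than anything Epic uses, prints that expected count alongside what the operating system actually reports.

```cpp
// Where the "16 simultaneous" figure comes from on a dual quad-core Xeon with
// Hyper-Threading: 2 sockets x 4 cores x 2 hardware threads per core = 16.
#include <cstdio>
#include <thread>

int main() {
    const unsigned sockets = 2, cores_per_socket = 4, threads_per_core = 2;  // assumed D20 configuration
    std::printf("expected hardware threads: %u\n", sockets * cores_per_socket * threads_per_core);
    std::printf("reported by this machine:  %u\n", std::thread::hardware_concurrency());
    return 0;
}
```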

Epic employees use Lenovo ThinkStations to power all aspects of their work: creating 3D models and animations, programming the actual game play and engine code, and releasing the latest build of Unreal Engine 3 to the company’s many licensees. Development team members transfer hundreds of gigabytes of data on a daily basis. Epic’s art and animation teams use Autodesk’s 3ds Max and Maya, Pixologic’s ZBrush, and Adobe’s Photoshop; the cinematics department utilizes Apple’s Final Cut Pro to create trailers, cut-scenes, and in-game cinematics; and engineering and Q/A rely on Microsoft’s Visual Studio 2008, Perforce’s software configuration management system, Seapine Software’s TestTrack Pro, and internal tools for gameplay balancing and stats tracking, along with programming tools such as Intel Threading Building Blocks.

“With Lenovo’s S20 digital workbench and two powerful Intel Xeon 5500 series processors, Epic’s animators are presented with a premier digital canvas on which to test and refine their ideas,” says Tony Neal-Graves, workstation segment general manager at Intel’s Data Center Group. “It is amazing what one generation of Intel and Lenovo technology delivers. Epic can make virtual reality feel so real, it’s incredible.”

The hardware combination is helping Epic Games and its partners not only to develop entertaining games with stunning visuals, but also to extend the technology into other fields, such as architectural design and more linear visual storytelling. Not bad for a technology originally developed for shooting aliens.