Ray Davis
Issue: Volume 39, Issue 1 (Jan/Feb 2016)


Epic has been pushing the boundaries within VR with its Bullet Train experience.

One of the fundamental goals of game development has always been to bring to life fantastic worlds that we otherwise would never get a chance to live in. From the very beginning, creating games has been about making the player become the hero (or even the enemy, occasionally), giving him or her the opportunity to live out the empowerment fantasy and accomplish feats that are otherwise unattainable for most of us in the real world.

When something like VR comes along and gives us a whole new set of tools to build immersive experiences that completely trample anything we’ve been able to build before, it’s not hard to see how game developers become instantly enamored with the potential in this emerging platform.  

With VR, you get this amazing level of presence (for free!) that both gaming and film have been chasing since … well, forever. Classic gaming experiences, such as the first-person shooter, which have been the mainstay of games for many years now, almost seem absurd once you’ve seen what a VR first-person experience is capable of.

That presence, that immersiveness, is exactly what many developers have been chasing throughout the years with PC/console game development, and it’s incredibly invigorating to see how dramatically VR leaps us forward in what we’re now able to create.

I’ve yet to give a VR demo to a fellow developer who hasn’t come out of the experience immediately churning out a hundred ideas of what we could do or what we should prototype next. The potential of this platform, even if not perfectly realized by today’s hardware, is hugely attractive to anyone who is looking to build interactive digital content.

The Cost of Nirvana 

Of course, the blessings of VR come at a steep price, and there are incredibly brutal obstacles to overcome for the seasoned game developer when building content for this new platform. 

For example, in my career as a game developer, I’ve probably written at least a couple thousand lines of code aiming to manipulate and nudge cameras ever so slightly, to get just the right transitions or just the right framing in the various games I’ve worked on. For years, developers have worked to engineer and master these incredibly complex camera systems, since they were such a crucial tool to use in manipulating player attention and attitude.  

But with VR, you can’t touch the camera at all unless you’re willing to risk making the player ill. All camera control is essentially gone with VR at this point. No longer can the developer simply grab your attention and force your view toward a specific point, to make sure you see some crucial plot point unfold. 

Now we’re forced back to the drawing board, to find new ways to encourage players’ attention so that they witness the important events at hand. The price we end up paying for this whole new level of VR immersiveness apparently is to have some of our most reliable and powerful game development tools taken away from us.
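To see why the camera is simply off-limits, consider a minimal sketch (purely illustrative Python, not any real engine’s API) of how a VR view transform is built: the headset’s tracked pose is composed onto the player’s origin every frame, so game code can move the player through the world, but it must never rotate or offset the view itself.

```python
# Illustrative sketch only: names and structure are assumptions,
# not a real engine's camera API.
from dataclasses import dataclass

@dataclass
class Transform:
    position: tuple  # (x, y, z) in meters
    yaw: float       # rotation about the vertical axis, in degrees

def view_transform(player_origin: Transform, hmd_pose: Transform) -> Transform:
    """Compose the tracked head pose onto the player's world origin.

    The headset, not the game, decides where the player looks.
    """
    return Transform(
        position=tuple(p + h for p, h in zip(player_origin.position,
                                             hmd_pose.position)),
        yaw=player_origin.yaw + hmd_pose.yaw,
    )

# A traditional "look over here!" camera cut would override hmd_pose --
# exactly the kind of manipulation that makes players ill in VR.
```

The takeaway: every trick that worked by seizing the view matrix now has to be replaced with in-world cues (light, sound, motion) that invite the player to turn their own head.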

And to make it even more fun, these constraints aren’t just limited to the design space! To build a compelling VR experience, you have to pay the extra computing cost of stereo rendering, which naively doubles the rendering cost of each frame compared with previous games. On top of that, we’ve discovered that compelling VR requires consistently high frame rates to avoid causing significant discomfort for viewers.

With the last console generation upgrade, most developers in the industry were grudgingly making the transition from building 30 fps games to 60 fps at 1080p resolution. This may not sound like much, but it actually ends up being a considerable challenge from a technical point of view. Then VR comes along and demands even more – render every scene twice for stereo and then run at a guaranteed 90 fps! 

Going from the mentality of aspiring to hit 30 fps “most of the time” in the last generation to now unfailingly needing to hit 90 fps to prevent people from becoming ill turns out to be a monumental leap in engineering effort.
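To make that leap concrete, here is a back-of-the-envelope sketch (Python, purely illustrative) of the per-frame time budgets those targets imply, using the naive assumption from above that stereo doubles the work; real engines share some work between the two eyes, so the true per-view cost is somewhat lower.

```python
# Rough per-frame time budgets for the targets discussed above.

def frame_budget_ms(fps: int) -> float:
    """Milliseconds available to produce one displayed frame."""
    return 1000.0 / fps

targets = [
    ("Last gen console, 30 fps, mono",  30, 1),
    ("Console upgrade, 60 fps, mono",   60, 1),
    ("VR, 90 fps, naive stereo (2x)",   90, 2),
]

for label, fps, views in targets:
    budget = frame_budget_ms(fps)
    per_view = budget / views
    print(f"{label}: {budget:.1f} ms/frame, ~{per_view:.1f} ms per rendered view")
```

Going from roughly 33 ms of headroom per frame to about 11 ms, split across two eye views, is why the engineering effort is so monumental: the budget for each rendered view shrinks by a factor of about six.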

New Level of Gaming

Fortunately, though, we’ve already found these constraints and trade-offs to be absolutely worth it. For the first time in a long time, I find myself once again passionate about solving really hard problems in building VR games. There’s so much undiscovered ahead of us for this new platform, and I’m absolutely excited that this year, in particular, we’ll see a massive influx of new creative talent, now that more people can finally get their hands on VR hardware.  

Epic’s Showdown, a virtual-reality experience, immerses users in the middle
of a street battle.

Veteran and newcomer developers alike are once again on equal footing, and we’re in a situation where the best idea will win – nobody knows quite what that is, but everybody is welcome to try to figure it out.

Furthermore, the VR development community has been incredibly open and forthcoming with its prototypes and lessons learned. It’s as though everyone recognizes that nobody actually knows what they’re doing yet with VR, so we might as well all share our successes and failures to help one another figure it out that much more quickly. It’s incredibly refreshing as a developer, especially when, in recent years, many other developers have seemed more focused on making money than on making something novel.

One other unexpected benefit of the VR platform’s emergence is the surge of new interaction models coming online in support of it. HTC (along with partner Valve) somewhat surprised the industry at GDC 2015 by announcing its motion controllers alongside its Vive HMD. Not long after, Oculus was quick to showcase its own motion-controller answer along with some really compelling content.

From this particular case, I believe we’ve somewhat accidentally identified the minimum requirements for a truly immersive VR experience: a high-quality head-mounted display plus some form of motion-tracked controllers. Without that kind of 1:1 input, you’re left shoehorning archaic forms of input into the new VR medium, and it quickly becomes kludgy, reminiscent of the early days of touchscreen mobile games, when developers thought that virtual D-pads (directional pads) would be a good idea (or at least good enough).

Clearly, mapping physical D-pads to tiny touchscreens didn’t work out so well, and it wasn’t until developers started really embracing the capabilities (and limitations) of the touchscreen that we started to see some of the defining experiences for mobile. 

Motion controllers for VR feel as if they are in very much the same situation, and without being able to use my hands directly in VR, it’s hard not to feel severely handicapped.

Will we discover the killer app of VR in 2016? Maybe not, but I can guarantee we’ll absolutely see some truly amazing games this year. And probably some really astounding evolutions of storytelling from the film industry brought to VR, as well. Maybe there will also be some novel ways for us to communicate with each other, leveraging VR. Who knows?

Or is VR really so boring that any development will only find life in enterprise settings? I doubt it, but no matter what, I’ll be there along with every other passionate developer, trying to figure out all the cool, new things we can do with VR, learning and sharing everything along the way.  

VR will consume and replace all our other methods of interacting with technology. It’s simply a matter of time as to when we’re ready to embrace it. 

Ray Davis is the manager of Epic Games’ West Coast studio, driving developer relations, evangelism, and support for Unreal Engine. He is also Epic’s VR strategist for the Unreal Engine.