Most games are designed with a computer screen or television in mind – but what happens when the screen is attached to your face? What happens when your body is being tracked? There are a lot of things that go into making virtual reality systems work, and they all fundamentally change how games are experienced and designed.
The most obvious change is peripheral vision. With tech like the Oculus Rift, the optics actually bend a stereoscopic image around your eyes. So you feel quite a bit more “present” in the game with VR goggles than you would with a television. Because of the presence factor, the sense of space is drastically different, and players move at a slower pace with more deliberation.
One funny thing I’ve noticed is that so many first person shooters today still involve a lot of running – the player literally sprints everywhere, traversing large and small environments the same way. But is that really realistic? Why are we always running?
So far it has been necessary in order to keep the player engaged – the lack of presence and intimacy in the environment forces the player constantly forward. This incessant running might seem normal on a TV, but in a head-mounted display it feels oddly inhuman. Now that peripheral vision can immerse the player so strongly in the environment, exploring subtle details and intricate spaces becomes that much more interesting.
Exploration, Challenge, and Reward
Indie game developers have been toying with exploration-centric games for some time now. In a recent EDGE Online article, Dan Pinchbeck (designer of Dear Esther) describes how the indie community can embrace VR gaming for this very reason:
“With Dear Esther, and Journey, and Amnesia, they are about drawing the player into a deep rich world and making the sense of being in that world the primary reward of the game experience … the core idea is that you don’t have feedback loops of challenge and reward, it’s a much more integrated, rolling reward.
“In many ways, this isn’t new. Right back in the days of PSOne, games like Tomb Raider used sweeping vistas and orchestral stings to make exploration a massive reward in its own right, and Half-Life did scale in an amazing way – as much as the gunplay, you just wanted to keep exploring Black Mesa. But these indie titles are just all about that, and they are proving again and again that you can cut out all of those short-term loops and still have an amazing experience. So for developers like us, I think the potential for deepening and strengthening that idea is hugely attractive.”
Virtual reality is going to revolutionize spatial storytelling and exploration, and the inherent reward found in this experience is powerful. However, this doesn’t necessarily mean that traditional game mechanics have to be tossed out the window. They just have to be revamped and redesigned to take advantage of new technology.
To use Pinchbeck’s words, traditional feedback loops of challenge and reward can still work well in virtual reality, but it’s going to take a lot of development and iteration over the next few years to figure out just how ludic interactions can change for the better.
If indie developers are embracing virtual reality to enhance exploration-centric game design, I can definitely see mainstream developers embracing VR to build upon traditional play-centric game design in an innovative and refreshing way. Big publishers and developers aren’t that far behind the indie community – already Valve, Id Software, and Epic Games have expressed keen interest in virtual reality, and there’s no doubt they have the capability to capitalize on the opportunity.
While peripheral vision can fundamentally change the way gamers explore their environments, and is thus a focus of the indie community, there is one other technological facet of VR that is inherently bound to action and play, and thus may be more appealing to mainstream developers:
Motion tracking is a broad category, but the most important takeaway is that VR games utilize positional AND orientational tracking of the head. So what happens when the player can freely move or turn her head in any direction?
You can jump, duck, dodge, and do a number of real movements with your body, instead of pressing the X button. There’s a lot of room here for new twists on classic game mechanics – imagine bullet time making a big comeback!
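As a rough illustration of how positional head tracking can replace a button press, here’s a minimal sketch of duck detection. Everything here is hypothetical – the function name, the assumption that the tracker reports head height in meters, and the 0.3 m threshold are all made up for the example, not taken from any particular SDK:

```python
# Hypothetical sketch: turning positional head tracking into a "duck"
# game action, assuming the tracker reports head height in meters.

def is_ducking(head_height, standing_height, threshold=0.3):
    """Return True when the tracked head has dropped far enough below
    the player's calibrated standing height to count as a real duck.

    head_height     -- current tracked head height (meters)
    standing_height -- height calibrated when the player stood upright
    threshold       -- how far the head must drop (meters), tunable
    """
    return (standing_height - head_height) > threshold
```

Instead of mapping “duck” to the X button, the game simply watches the tracked head: a player calibrated at 1.7 m who crouches to 1.2 m is ducking; a small head bob is not.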
Tracking the hands and other appendages also changes game design drastically. First of all, it’s really, really cool to hold up your hands in the real world, while wearing VR goggles, and see virtual hands in front of your face. This is generally called avatar embodiment, and it gives you a 1:1 representation of your body in the virtual world.
We’ve been playing around with avatar embodiment on Project Holodeck quite a bit, where we use Razer Hydra controllers to track the hands in three-dimensional space. Alex Silkin, lead interaction engineer on the team, has developed code that allows players to pull out guns from their hip, or pull swords from their back, so they can shoot and swashbuckle with pirates in our VR game Wild Skies. For the first time, you can look down the sight of a gun like you would in real life, and the direction of the bullet is determined by how accurately you are actually aiming!
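To make the idea concrete: with a tracked controller, the bullet’s direction is just the weapon’s “muzzle forward” vector rotated by the controller’s tracked orientation. This is a generic sketch of that math – the quaternion convention (w, x, y, z) and the forward direction (0, 0, -1) are assumptions for illustration, not the actual Project Holodeck or Hydra SDK code:

```python
# Hypothetical sketch: deriving a bullet direction from a hand-tracked
# controller's orientation, given as a unit quaternion (w, x, y, z).

def cross(a, b):
    """Cross product of two 3D vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q using v' = v + w*t + u x t,
    where u is the quaternion's vector part and t = 2 * (u x v)."""
    w, u = q[0], q[1:]
    t = tuple(2.0 * c for c in cross(u, v))
    uxt = cross(u, t)
    return tuple(v[i] + w * t[i] + uxt[i] for i in range(3))

# Assumed convention: the weapon fires along its local -Z axis.
MUZZLE_FORWARD = (0.0, 0.0, -1.0)

def bullet_direction(controller_orientation):
    """World-space firing direction for the tracked controller."""
    return quat_rotate(controller_orientation, MUZZLE_FORWARD)
```

The point is that the firing direction comes entirely from where the hand actually points, independent of where the head is looking.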
This is just one example of the fundamentally different play experience that motion tracking provides. In most first person shooters, three different kinds of motion are compressed into a single input: the direction you are looking, the direction you are aiming your gun, and the direction you are running.
With motion tracking, these three actions are separated as they would be in real life, and new game mechanics and level design techniques can be developed to take advantage of this new experience.
So what do you think? Are there any other ways that virtual reality is fundamentally changing game design?