In this development update I thought I’d run through some of the things we’ve been doing and learning about with virtual reality.
As you know, our lead engineer Tony attended Unite Boston 2015, and it was a great opportunity not only to get early feedback on the game but also to see how other developers were using VR.
One of the main concerns at present is the 'locomotion issue' – the motion sickness you feel when your eyes think you're moving but your body doesn't feel it.
This was probably the single biggest topic in the VR talks at Unite and the consensus was that VR games need to be designed differently from traditional flat screen games. Most of the games solved this by designing spaces where the player looks around but doesn’t move, such as a small kitchen or a cockpit. Other developers have tried to address this issue by adding a fake nose, keeping the player on rails or using teleport pads to move the player through the virtual world.
However, that wasn't really going to work for the Corridor: we wanted players to be able to explore the world as they would in a traditional first-person game, but with the spatial presence of VR.
As you can imagine, we've done a lot of testing (and retesting) of various mechanics in the Corridor, but one of the key things we focused on is realism. When I say realism I don't just mean from a graphical standpoint but also from the player's. One aspect of that was getting player movement just right, in particular walking and running: we found that keeping these movements as close as possible to how you might move in real life reduced much of the locomotion problem and also added to player presence. We reduced the overall speed and bob that a traditional first-person game might use and kept a more reserved setup. Of course there is the argument that VR will never be a fully comfortable experience for everyone, but that's another discussion, and VR is very much in its infancy.
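To make the idea concrete, here's a minimal sketch of what toning down head bob for VR might look like. This is not the game's actual code, and the amplitude and frequency numbers are illustrative assumptions, not our tuned values:

```python
import math

# Illustrative settings only (hypothetical values, not from the game).
FLAT_SCREEN_BOB_AMPLITUDE = 0.08  # metres of vertical camera offset
VR_BOB_AMPLITUDE = 0.02           # heavily reduced for comfort in VR
BOB_FREQUENCY = 1.8               # walk-cycle bobs per second

def head_bob_offset(time_s, speed_factor, vr_mode):
    """Vertical camera offset for a walk cycle.

    In VR mode the amplitude is cut right down so the camera stays
    close to real head motion, the kind of reserved setup described above.
    """
    amplitude = VR_BOB_AMPLITUDE if vr_mode else FLAT_SCREEN_BOB_AMPLITUDE
    return amplitude * speed_factor * math.sin(2 * math.pi * BOB_FREQUENCY * time_s)
```

The point isn't the exact numbers but the principle: the same walk cycle drives both modes, and only the amplitude changes between flat-screen and VR.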
This realism theme also extended to interactions within the game world. One example is the 'Data Log', a real-world menu object that players can interact with. It records the evidence players find on their journey, their objectives and their inventory items.
As many of you probably know, we've been testing VR on the Oculus Rift, but more recently we thought we'd try the experience on Google Cardboard. If you've not heard of it, Cardboard is a virtual reality platform developed by Google built around a fold-out cardboard mount for a mobile phone; it's intended to be a cheaper alternative to, say, the Oculus.
Virtual Reality Best Practices
We've compiled a collection of virtual reality best practices, or fundamentals, from different sources and from our experience in development so far. A lot of these points derived from trial and error and testing, but hopefully they'll help other developers. Ultimately, see what works best for your game.
- Avoiding manipulation of the camera in general is good practice. Bobbing, shaking or zooming the camera will be uncomfortable for the player. Taking control of camera movements away from the player is one of the biggest causes of a bad VR experience.
- Check your game as much as you can with as many different people as you can to make sure that you are not causing Simulation Sickness.
Some things to avoid:
- Avoid zooming in and out; don't manipulate the camera's field of view (FOV).
- Avoid depth-of-field effects.
- Avoid excessive camera bob.
- Keep your frame rate as high as possible. Oculus has stated that low frame rates are another cause of Simulation Sickness. So make sure to optimize your game as much as possible.
- Target 60 fps for DK1 and about 75 fps for DK2.
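Those frame-rate targets translate directly into per-frame time budgets, which is a useful way to think about optimization. A tiny helper (illustrative only):

```python
def frame_budget_ms(target_fps):
    """Milliseconds available to render one frame at a given target rate."""
    return 1000.0 / target_fps

# DK1 at 60 fps leaves ~16.7 ms per frame;
# DK2 at 75 fps leaves only ~13.3 ms per frame.
```

In other words, hitting the DK2 target means giving up roughly 3.4 ms of your per-frame budget compared to the DK1, which is worth keeping in mind when profiling.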
User Interface Design
Oculus suggests that user interfaces in VR should fit inside the middle third of the user's viewing area. Remember, players shouldn't have to crane their head around to read the UI; they should be able to examine it with small, comfortable head movements. The last thing you want is for them to be straining their eyes.
Depending on the platform you're aiming for, making a simple 'Safe Area' template like this may help in designing your UI elements.
The folks over at Unreal have mentioned that assets that are scaled incorrectly can cause sensory issues for players. They suggest that objects in the virtual space are best viewed when they are in a range of 0.75 to 3.5 metres from the player's camera.
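Both guidelines are easy to enforce in code when placing UI. Here's a hedged sketch (the helper names are ours, not from any engine API) that clamps a UI element into that comfortable distance range and works out the half-angle of the middle third of a given FOV:

```python
def clamp_ui_distance(distance_m, near=0.75, far=3.5):
    """Clamp a UI element's distance from the camera into the
    comfortable 0.75-3.5 m viewing range suggested by Unreal."""
    return max(near, min(far, distance_m))

def safe_area_half_angle(total_fov_deg):
    """Half-angle of the middle third of the view, following the
    Oculus suggestion that UI fit in the central 1/3 of the view.

    E.g. for a 90-degree FOV, UI should stay within +/-15 degrees
    of the view centre.
    """
    return total_fov_deg / 3.0 / 2.0
```

A template like the 'Safe Area' one above is essentially a visual version of these two constraints.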
Controlling the Player
- As players are wearing HMDs (head-mounted displays), they will generally have to rely on tactile feedback (or trying keys) to find controls. So if you're going with a keyboard for input, think about familiar or easy-to-find keys.
- Some developers have noted that a joypad with analogue sticks works best.
- Aim for realistic movements rather than an exaggerated control setup.
- Look-to-select. Use head movement itself as a direct control over world interactions. This isn't always a possibility, though, and again it depends on the experience you're creating.
- Depending on your game and the experience you're aiming for, tracked controllers such as the Oculus Touch may work for manipulating objects in VR.
- Make sure in-game sounds appear to emanate from the correct locations by accounting for head position relative to the output device.
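On that last point, real engines use HRTF-based spatial audio, but the core idea can be sketched very simply: rotate the sound source into head-local space so the panning follows head yaw, then attenuate with distance. This is a rough illustration under those assumptions, not how any particular engine implements it:

```python
import math

def spatialize_gain(source_pos, head_pos, head_yaw_rad, ref_dist=1.0):
    """Rough left/right gains for a mono source on a 2D (x, z) plane.

    Accounts for head position and yaw so sounds stay anchored in the
    world as the player turns. Equal-power panning, inverse-distance
    falloff; a sketch only, not an HRTF.
    """
    dx = source_pos[0] - head_pos[0]
    dz = source_pos[1] - head_pos[1]
    # Rotate the source into head-local space (yaw 0 = facing +z).
    local_x = dx * math.cos(-head_yaw_rad) - dz * math.sin(-head_yaw_rad)
    dist = max(math.hypot(dx, dz), 1e-6)
    attenuation = ref_dist / max(dist, ref_dist)   # inverse-distance falloff
    pan = max(-1.0, min(1.0, local_x / dist))      # -1 = full left, +1 = full right
    left = attenuation * math.sqrt((1.0 - pan) / 2.0)
    right = attenuation * math.sqrt((1.0 + pan) / 2.0)
    return left, right
```

A sound directly ahead comes out equally in both ears; one off to the player's right biases to the right channel, and the bias updates as head yaw changes.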
That's it for the list for now, though it's constantly growing as we build the game, and that about wraps up this development update.
Thanks for stopping by,