Tejas Shroff
11 min read · Sep 6, 2017


Note: This post was originally published on July 14th, 2017.

Below is the final product after 13 hours of prototyping and problem-solving. Read on to follow my journey toward this result.

Beginnings

After developing a paint app in VR, I had some ideas about what I wanted my next project to be. At the time I was playing around with component/bone mapping and was impressed by how simple it was to implement a bow-and-arrow system in VR. Taking what I learned to the next level, I decided to build a small game where the user would control a bow and arrow to shoot at various targets in a virtual space. I also wanted to create a way for the user to easily move around in this virtual space. I knew the locomotion method (the way a user moves around in VR) would be the main challenge of the project, so I tackled it first.

A bow and arrow prototype I built from scratch

One common way for users to move around in VR is through teleportation: the user points their controller at the destination they want to reach, then presses or releases a button to immediately teleport there. However, teleportation in VR breaks the immersion of a technology that is specifically made to be immersive. I instead tried to mimic the touchpad controls featured in other VR games, where movement is determined by where the user places their finger on the Vive touchpad. For example, if the user places their finger on the left side of the touchpad, their character in VR moves left, and the closer the finger sits to the outer edge of the touchpad, the faster the character moves.
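
For reference, here is roughly what that touchpad scheme looks like in code. This is only a minimal sketch, assuming the 2017-era SteamVR Unity plugin's SteamVR_Controller input API (the calls may differ in newer plugin versions); the script name, fields, and speed value are placeholders of mine, not code from the project.

```csharp
using UnityEngine;

// Sketch of the touchpad scheme I tried first (illustrative, not the exact project code).
// Attach to a controller that has a SteamVR_TrackedObject component.
public class TouchpadLocomotion : MonoBehaviour
{
    public Transform cameraRig;     // the SteamVR [CameraRig] root to move
    public Transform head;          // the HMD camera, used for facing direction
    public float maxSpeed = 2f;     // meters per second at the touchpad's outer edge

    private SteamVR_TrackedObject trackedObj;

    void Awake()
    {
        trackedObj = GetComponent<SteamVR_TrackedObject>();
    }

    void Update()
    {
        var device = SteamVR_Controller.Input((int)trackedObj.index);

        // Finger position on the pad, (x, y) in [-1, 1]: the angle picks the direction,
        // the distance from the center picks the speed.
        Vector2 pad = device.GetAxis(Valve.VR.EVRButtonId.k_EButton_SteamVR_Touchpad);
        if (pad.sqrMagnitude < 0.01f) return;

        // Move relative to where the player is looking, flattened onto the ground plane.
        Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        Vector3 right = Vector3.ProjectOnPlane(head.right, Vector3.up).normalized;
        cameraRig.position += (forward * pad.y + right * pad.x) * maxSpeed * Time.deltaTime;
    }
}
```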

In testing these motion controls, I found the touchpad to be extremely sensitive to my fingers, and I often found myself moving in sudden bursts instead of controlled amounts. There had to be a better way to let the user control not only the direction of movement but also precisely how fast they moved.

Using the touchpad to move around made me feel sick. There has to be a better way.

Programming Trigger-Locomotion

My new trigger-locomotion method would work by enabling the user to point one controller in the direction they want to move, then squeeze the Vive trigger button of that controller to move in that direction.

ControllerForward & CameraFollow

In programming this mechanic, I started by getting the direction of where my controllers were pointing. For simplicity, I focused on applying this movement method only to my left controller. I started by creating an empty child object under the left controller and named it “ControllerForward”, then created another object under ControllerForward, and named it “CameraFollow”. The thought process for creating these objects is that I would move CameraFollow in the forward direction of the ControllerForward object, and then have the SteamVR CameraRig smoothly follow this CameraFollow object. Unfortunately, this did not work as expected — because CameraFollow was attached to the left controller, and the y-position of the left controller was always higher than 0, there were scenarios where the SteamVR CameraRig was lifting off the ground.

Tweaking & Solution

I ended up making the ControllerForward object a child of the SteamVR CameraRig, which made it easy to control the smoothness of the camera’s movement, lock the Y position of the camera so it won’t lift off the ground, and still read all controller information from the left controller. The idea behind the CameraFollow object (which is still a child of the ControllerForward object) is that as the trigger button is pressed, the depth information of the trigger (how much the trigger button is pressed) moves the CameraFollow object between 0 and 1 units in the forward direction of the ControllerForward object. So if the trigger button is pressed only 10%, CameraFollow moves only .1 units. This way, the actual SteamVR CameraRig can be coded to smoothly move towards the CameraFollow object as the trigger is pressed.
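
Here is a minimal sketch of how that logic can be expressed, assuming the 2017-era SteamVR Unity plugin for reading the trigger; the script and field names simply mirror the object names above, and the follow speed is an illustrative value rather than the exact code from my project.

```csharp
using UnityEngine;

// Sketch of the trigger-locomotion described above (illustrative, not the exact project code).
// ControllerForward is a child of the SteamVR CameraRig; CameraFollow is a child of ControllerForward.
public class TriggerLocomotion : MonoBehaviour
{
    public Transform cameraRig;          // the SteamVR [CameraRig] root
    public Transform leftController;     // source of the pointing direction
    public Transform controllerForward;  // helper object, child of the CameraRig
    public Transform cameraFollow;       // helper object, child of ControllerForward
    public float followSpeed = 2f;       // tuned so a full squeeze feels like a walking pace

    private SteamVR_TrackedObject trackedObj;

    void Awake()
    {
        trackedObj = leftController.GetComponent<SteamVR_TrackedObject>();
    }

    void Update()
    {
        // Aim the helper where the left controller points, flattened onto the ground
        // plane so the rig's Y position stays locked.
        Vector3 aim = Vector3.ProjectOnPlane(leftController.forward, Vector3.up);
        if (aim.sqrMagnitude > 0.001f)
            controllerForward.rotation = Quaternion.LookRotation(aim.normalized);

        // Trigger depth in [0, 1]: a 10% squeeze pushes CameraFollow 0.1 units forward.
        var device = SteamVR_Controller.Input((int)trackedObj.index);
        float depth = device.GetAxis(Valve.VR.EVRButtonId.k_EButton_SteamVR_Trigger).x;
        cameraFollow.localPosition = Vector3.forward * depth;

        // Smoothly chase CameraFollow, ignoring any vertical offset; because the target
        // sits "depth" units ahead, a deeper squeeze means a faster walk.
        Vector3 target = new Vector3(cameraFollow.position.x, cameraRig.position.y, cameraFollow.position.z);
        cameraRig.position = Vector3.Lerp(cameraRig.position, target, followSpeed * Time.deltaTime);
    }
}
```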

The results were pretty impressive for something I came up with in just a couple of hours: I could smoothly and intuitively move around in VR, and with some adjustments to the camera follow speed to simulate a normal walking pace, I felt little to no VR sickness. In the GIF below, the round capsule and the small square mark the two helper objects (CameraFollow and ControllerForward).

My new locomotion method of using the Vive triggers to move around

Implementation

Now that I had programmed a working locomotion method, I tried to implement it in my original idea of a simple bow-and-arrow game. However, the two-handed mechanic of handling a bow and arrow did not mesh well with using the left controller to move around the space. In addition, the movement was not suitable for large worlds, and the speed of movement could not be too fast or I felt significant waves of VR sickness.

Instead of scrapping this trigger-locomotion method, I thought long and hard about what kind of game would play to its strengths. Trigger-locomotion simulates walking in the real world, letting the user control exactly how fast to walk and in what direction. This controlled, comfortable walking simulation is a natural fit for stealth games, or for one of my favorite genres: horror.

New Direction

Wasting no time, I grabbed the “Horror Hospital Pack” by FoeJofMay off the Unity Asset Store and started to build a small horror environment. When I jumped into VR to see how the trigger-locomotion would hold up in small spaces, the early tests looked really promising; the slower, more methodical movement worked extremely well in eerie, horror-style environments.

First pass at creating a horror-hospital environment

User Interactions

Creating a horror scene is about immersing the player as much as possible by engaging their senses. Unique to VR is how directly it can involve a player’s sense of touch via the motion controllers. Keeping this in mind, the main interaction I wanted the user to control (besides movement) was a small light they could use to illuminate the environment and objects as they moved through the scene. I snagged the “SciFi Handlight” asset by Hit Jones and mapped it to the right controller, making it so the user could wield the light in the right hand and move with the left. I first coded the light to turn on or off when the user pressed down on the touchpad, but then asked myself why the user would ever turn the light off. The point of the experience I was trying to create was to make the user feel unsafe.

Going back to the drawing board, I decided it would be more interactive if the user had to shake their controller to generate more light, much like a glow stick. As time passes, the light slowly fades, forcing the user to shake the controller again to illuminate their space and see where they are going. To do this, I mapped the range and intensity of the user’s light to the angular velocity of the right controller, so the faster the controller is shaken, the farther and brighter the light shines. When the controller’s angular velocity does not meet a specific threshold, the range and intensity of the light decrease over time, creating the effect of a slowly fading light. Although not perfect, it definitely got the job done in forcing player interaction and creating a significantly more effective horror feel.
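
Below is a minimal sketch of this glow-stick mechanic, again assuming the 2017-era SteamVR plugin for reading the controller’s angular velocity; the thresholds, rates, and maximum values are illustrative placeholders rather than the numbers I actually shipped.

```csharp
using UnityEngine;

// Sketch of the glow-stick light: shaking the controller charges the light,
// which otherwise fades over time. All numeric values are illustrative.
[RequireComponent(typeof(Light))]
public class ShakeLight : MonoBehaviour
{
    public SteamVR_TrackedObject rightController;
    public float shakeThreshold = 3f;   // angular velocity (rad/s) below which the light fades
    public float chargeRate = 0.5f;     // how quickly shaking brightens the light
    public float decayRate = 0.3f;      // how quickly the light dims when held still
    public float maxIntensity = 3f;
    public float maxRange = 12f;

    private Light glow;
    private float charge;               // 0..1 "energy" stored in the light

    void Awake()
    {
        glow = GetComponent<Light>();
    }

    void Update()
    {
        var device = SteamVR_Controller.Input((int)rightController.index);
        float shake = device.angularVelocity.magnitude;

        // Shaking above the threshold charges the light; stillness lets it fade.
        if (shake > shakeThreshold)
            charge += (shake - shakeThreshold) * chargeRate * Time.deltaTime;
        else
            charge -= decayRate * Time.deltaTime;
        charge = Mathf.Clamp01(charge);

        // Faster shaking -> a farther and brighter light.
        glow.intensity = charge * maxIntensity;
        glow.range = charge * maxRange;
    }
}
```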

Using the angular velocity of the controller I can increase the light spread

Animation & Triggers

A common way many horror games make their worlds feel alive is by placing animated objects such as crawling monsters, moving doors, and flickering lights. For these actions to happen, there need to be systems in place that keep track of where the user is in a scene and use that information to animate objects when the user enters a certain section. As a programmer, I can place empty objects and use them as triggers to signal when objects in the scene should animate.

I first created a monster that quickly scurries across the ground when the user passes a specific point near the beginning of the scene. Using the True Horror asset by Witch-A-Twin Studios, I dragged in a monster model and created an animation controller for it. From there I dragged in a pre-made animation that makes the monster quickly crawl across the floor, hoping this would scare the user.
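
A minimal version of such a trigger script might look like the sketch below; the animator trigger name “Scurry” and the other names are placeholders I made up for illustration, not the project’s exact code.

```csharp
using UnityEngine;

// Sketch: an invisible trigger volume that fires an animation the first time the
// player's head passes through it. The camera object needs a small collider and a
// kinematic Rigidbody for OnTriggerEnter to register.
[RequireComponent(typeof(Collider))]
public class AnimationTrigger : MonoBehaviour
{
    public Animator targetAnimator;            // e.g. the crawling monster's Animator
    public string animatorTrigger = "Scurry";  // placeholder trigger parameter name
    public string playerTag = "MainCamera";    // tag on the HMD camera's collider

    private bool fired;

    void Reset()
    {
        // A trigger volume should never physically block the player.
        GetComponent<Collider>().isTrigger = true;
    }

    void OnTriggerEnter(Collider other)
    {
        if (fired || !other.CompareTag(playerTag)) return;
        fired = true;
        targetAnimator.SetTrigger(animatorTrigger);
    }
}
```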

The scene view of the monster and its trigger box on the left, and the monster animation controller on the right

I then added some more animations and triggers, including a door opening, multiple light flickers, the appearance of creepy paintings, and a frightening surprise for the user at the end of the level. Taking a step-by-step tour through the code of these behaviors would take some time, but the hardest part of creating these triggers and animations is perfecting the timing and behavior to give the user a real sense of fear and dread.

All the objects and triggers in the scene.

Post-Processing

As someone who notices and appreciates excellent cinematography in film, TV, games, and of course virtual reality, I wanted my work to be intensified by post-processing the scene with different kinds of effects. I specifically used Unity’s Post-Processing Stack FX asset to apply these effects, and although I won’t cover every effect I used, I will briefly touch on a few that I played around with:

Anti-Aliasing

I used anti-aliasing to smooth jagged edges and give the world a more rounded, life-like feel. I could not crank this setting up to the max due to the performance and stability requirements of developing in VR (where 90 fps is the bare minimum at the headset’s native resolution), but the default setting was more than enough to get the job done without sacrificing frames.

Ambient Occlusion & Bloom

Ambient occlusion is an effect that mimics how light in the real world bounces off objects, including how light gets trapped in small areas like the crevices of a couch or the seams of a bed. This effect adds realism at relatively little performance cost. Bloom, on the other hand, is not inherently natural to the eye; it makes lights harsher and more intense, which created a constant feeling of uneasiness.

Depth of Field

Depth of field is an interesting effect that keeps objects in focus sharp while making objects outside the focal range faded and blurry. It looks great in many modern games and functions well as a tool in photography. However, when I tried to apply depth of field in VR, it felt very uncomfortable. I am unsure whether the Unity shader was simply not VR compatible, but enabling DOF was disorienting in VR, and it taught me that effects made for a 2D screen don’t automatically work the same way in VR. As a result, I decided not to use DOF in this scene.

Chromatic Aberration, Vignette, and Lens Dirt

These three effects made the camera look more broken, dusty, and dirty, adding to the eerie horror feel. I did not turn their intensity up to 100%, but tuned them just enough that the camera lens visually reflected the decaying objects in the environment.

Grayscale and Film Grain

These two effects are commonly used to make a camera or scene look older and aged. For example, popular shows like The Walking Dead use film grain to make their scenes feel more in keeping with the post-apocalyptic setting, while films like Mad Max (in its collector’s Blu-ray edition) use grayscale to make the camera feel more grounded in its environment and the action more cinematic.

You can see the results of post-processing below. Not only do the scenes become darker, but they become more visually reflective of the environment I set out to create.

Before Post-Processing:

After Post-Processing:

Sound & Polish

As a musician and horror fan, I understand the importance of good audio in creating surreal, immersive moments. With the help of audio samples embedded in the plugins I had already used, I added 3D sounds to the scene such as water drips, creaks, heartbeats, ambient voices, and uneasy violins.

One sound that posed a challenge was the user’s footsteps. In 2D or 3D games, footstep sounds are usually played on a timer whenever the character is moving. That approach can’t simply be copied into a VR game: if the user merely moves downward (crouching, for instance), a footstep sound should not play, nor should it play if the user shuffles half a step back and then slightly forward. To account for this in a virtual space, I created an object called “FootPos” that snaps to the camera position every time the user moves .6 units away from it, measured only on the X and Z axes. A distance of .6 units is about the length of one step, so when the user takes a step in the physical world, FootPos moves to where the camera (and thus the user) is. In addition, every time FootPos moved (i.e., every time the user took a step), a random footstep sound played from a pre-determined array of footstep sounds.
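
Here is a minimal sketch of that FootPos logic using only core Unity APIs; the script and field names are placeholders of mine, not the project’s exact code.

```csharp
using UnityEngine;

// Sketch of the FootPos footstep logic: when the headset drifts more than one
// step-length away from FootPos on the ground plane, snap FootPos back under
// the camera and play a random footstep sound.
public class FootstepAudio : MonoBehaviour
{
    public Transform hmdCamera;          // the VR camera (the user's head)
    public Transform footPos;            // empty object tracking the last step position
    public AudioSource source;           // plays the footstep clips
    public AudioClip[] footstepClips;    // pre-made array of footstep sounds
    public float stepLength = 0.6f;      // roughly one step, in meters

    void Update()
    {
        // Measure distance on the X/Z plane only, so crouching or standing up
        // (movement on Y) never counts as a step.
        Vector3 camFlat = new Vector3(hmdCamera.position.x, 0f, hmdCamera.position.z);
        Vector3 footFlat = new Vector3(footPos.position.x, 0f, footPos.position.z);

        if (Vector3.Distance(camFlat, footFlat) > stepLength)
        {
            // Snap FootPos back under the camera and play a random clip.
            footPos.position = new Vector3(
                hmdCamera.position.x, footPos.position.y, hmdCamera.position.z);
            source.PlayOneShot(footstepClips[Random.Range(0, footstepClips.Length)]);
        }
    }
}
```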

When the FootPos object is more than .6 units away from the camera on the X/Z plane, its position resets to the camera position and a footstep sound plays.

I consider myself a perfectionist when it comes to things I care about, so I also spent a great deal of time perfecting the placement of objects, making sounds spatial, cleaning up code, and making the final moments of the experience as memorable as possible for the user.

I am always willing to share my creations, and you can find the experience I built here. Keep in mind you’ll need an HTC Vive, a VR-ready computer, and SteamVR!

See you in the next post,

Tejas

