Tejas Shroff
6 min read · Sep 5, 2017


Note: This post was originally published on July 18th, 2017.

Fresh off my previous project, where I created a new VR locomotion system and implemented it in a horror scene, I was inspired to use what I had learned about building environments and post-processing to create a different kind of experience in VR. At the time, I was also gearing up to participate in the Game Maker's Toolkit three-day game jam, and I had long envisioned building a scene so close to reality that it made you forget you were in virtual reality. That was a fairly ambitious idea, but with the time I had, I did my best to work hard, learn new things, and keep pushing the limits of what a VR experience can be. The rest of this post covers my journey in fleshing out this idea from concept to working prototype, but you can go ahead and watch a captured video of my jam submission below. I made this VR experience in about 15–20 hours.

The theme of the GMTK Game Jam was to create something with dual-purpose design, and the example given was a game called Downwell. In Downwell, each of the game's mechanics (jumping, moving, shooting, and so on) serves more than one purpose, making players think strategically about their next move as they play. While what I created is nowhere near as complex, I tried to creatively interpret what it means to incorporate dual-purpose design.

Concept & Iteration

The basic concept I wanted to work towards was an experience where the user dynamically controlled the world by physically looking around in a virtual space, a mechanic that is unique to VR. I got a bit ambitious at the start of the project and began building the scene instead of programming the game mechanic. I added high-density textures and models, as well as a myriad of effects such as rain, dust particles, glare, and post-processing, before writing any of the code needed for the game mechanic to work. However, when I put on my Vive headset to view my work, I was met with poor results. There were numerous dropped frames, and the resolution of the Vive headset made close-up objects look grainy, washing out all the detail in those high-quality textures. I knew I had to take a step back and rethink my approach.

After tinkering with various ideas and scene setups, I found that large objects positioned far away still retained most of their detail when viewed through a Vive, so I decided that my final product would consist of a sun view and a moon view, with the world changing between the two depending on where the user was looking. But before diving into scene creation, I decided to first program this dynamic gaze-view mechanic.

Capturing Gaze Direction

Capturing a user's gaze direction, or knowing where a user is looking at all times in-game, was much more difficult than I had originally expected. After a few hours of brainstorming and trial and error, I ended up with a very simple equation. I found that I could define the user's gaze direction by assigning it to a variable tied to the y-rotation of the “eye” object of the SteamVR camera. I normalized this value to a range between 0 and 1 to track the horizontal direction of the user's gaze. At the start of the game, the value is 0, as the “eye” object calibrates to wherever the user is looking when the SteamVR system initializes. Then, as the user turns their head (and, in turn, the VR headset), the value is set from the absolute value of the rotation around the y-axis.
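To illustrate the idea, here is a minimal sketch of what that calculation could look like in Unity C#. The class and field names are mine, not from the original project, and it assumes the SteamVR “Camera (eye)” transform has been dragged into the eye field:

```csharp
using UnityEngine;

// A minimal sketch of the gaze-tracking idea described above.
// "eye" is assumed to be the Camera (eye) child of the SteamVR [CameraRig];
// names and setup are illustrative, not from the original project.
public class GazeTracker : MonoBehaviour
{
    public Transform eye; // drag the SteamVR "Camera (eye)" here

    [Range(0f, 1f)]
    public float gazeAmount; // 0 = facing the start direction, 1 = facing directly away

    void Update()
    {
        // eulerAngles.y runs 0-360; remap it to a signed angle (-180 to 180),
        // then take the absolute value and normalize to the 0-1 range.
        float signed = Mathf.DeltaAngle(0f, eye.eulerAngles.y);
        gazeAmount = Mathf.Abs(signed) / 180f;
    }
}
```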

As a small prototype to see how I could use this y-rotation value to affect the world, I added a rain effect and wrote the logic needed to fade specific effects in and out. I also created two variables for the purposes of this prototype: rainyPercent and sunnyPercent. rainyPercent tracked how much the user was looking at the rainy side of the world (represented by the blue cube in the clip below), while sunnyPercent tracked how much the user was looking at the sunny side of the world (represented by the red cube in the clip below). Now that I had the main mechanic figured out, it was time to build the environments.

As a user moves their head, this rotation is tracked and measured on a scale between 0 and 1. One of the first demos to test this gaze direction concept.
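To make the fade logic concrete, here is an illustrative sketch of how rainyPercent and sunnyPercent could drive the rain effect, building on the gaze value from the previous sketch; the particle-system wiring is my assumption, not the project's actual code:

```csharp
using UnityEngine;

// Illustrative sketch of the two-variable fade logic; the variable names
// mirror rainyPercent/sunnyPercent from the post, but the particle wiring
// here is an assumption, not the original implementation.
public class WeatherFader : MonoBehaviour
{
    public GazeTracker gaze;     // the tracker from the previous sketch
    public ParticleSystem rain;
    public float maxRainRate = 500f;

    void Update()
    {
        float rainyPercent = gaze.gazeAmount;   // 1 when fully facing the rainy side
        float sunnyPercent = 1f - rainyPercent; // 1 when fully facing the sunny side

        // Scale rain emission with how much the user faces the rainy side.
        var emission = rain.emission;
        emission.rateOverTime = maxRainRate * rainyPercent;
    }
}
```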

A Lesson in VR Optimization

I wanted my project to feature realistic environments in order to make the world as a whole more believable. For the “moon” side of the world, I first added a heavy amount of dust particles and a field of 2K-texture mountain assets I found on the Unity Asset Store. I also added shooting stars, plus planet models of Mars and Venus, to give the night sky more flavor. For the “sun” side, I added a forest full of shrubbery and trees, as well as some sun-ray effects and glare for brighter contrast. After placing those objects and adding those effects, I coded each object so that its position would change as the user moved their head, and so that the indirect light of the scene would also change to cue the night and day views. But when I put on my headset to test the scene, I was again disappointed by the lack of steady performance.
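As a rough sketch of that day/night blend, the same 0–1 gaze value can drive the scene's indirect light and a directional “sun” light; the colors and intensities here are placeholders, and the actual project may have driven these differently:

```csharp
using UnityEngine;

// A rough sketch of the day/night blend described above, assuming the same
// 0-1 gaze value; colors and intensities are placeholder values.
public class DayNightBlend : MonoBehaviour
{
    public GazeTracker gaze;
    public Light sunLight; // the scene's directional light
    public Color nightAmbient = new Color(0.05f, 0.05f, 0.15f);
    public Color dayAmbient = new Color(0.9f, 0.85f, 0.7f);

    void Update()
    {
        float sunnyPercent = 1f - gaze.gazeAmount;

        // Blend the indirect light between the night and day views.
        RenderSettings.ambientLight = Color.Lerp(nightAmbient, dayAmbient, sunnyPercent);
        sunLight.intensity = Mathf.Lerp(0.1f, 1f, sunnyPercent);
    }
}
```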

After researching and reading some articles on how to optimize VR applications, I found that VR is still a very demanding platform to build for because of the rendering performance a scene requires. In technical terms, to drive a VR headset, a PC or laptop needs to render the scene twice, once per eye at roughly 1080p each, at a minimum of 90 frames per second. In stark contrast, modern PC games only need to render a single 1080p image at 30–60 frames per second.
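Some back-of-the-envelope math makes the gap clear: the original Vive's panels are 1080×1200 per eye at 90 Hz, so a VR scene must fill roughly 2 × 1080 × 1200 × 90 ≈ 233 million pixels every second, while a typical 1920×1080 game at 60 frames per second fills about 124 million. That is nearly double the pixel throughput, with only about 11 milliseconds of budget for each frame.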

After learning more about GPU and CPU workloads, static and dynamic batching, and draw calls, I decided to cut down the number of objects in my scene. In fact, I felt I needed to start over, so I opened a new scene in Unity and started from scratch. I first placed only the objects I felt were essential (such as the sun and the moon), then slowly added more while repeatedly testing the game's performance. This method showed me which objects and effects dragged performance down, and which I could comfortably add without any noticeable cost.
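For the “repeatedly testing” part of that loop, even a simple on-screen frame-rate readout is enough to spot an expensive object the moment it is added. This helper is my own illustration rather than code from the project:

```csharp
using UnityEngine;

// Not from the original post: a simple frame-rate readout useful for an
// add-one-object-then-test workflow, shown here purely as an illustration.
public class FpsCounter : MonoBehaviour
{
    float smoothedDelta;

    void Update()
    {
        // Exponentially smooth the frame time so the readout stays stable.
        smoothedDelta = Mathf.Lerp(smoothedDelta, Time.unscaledDeltaTime, 0.1f);
    }

    void OnGUI()
    {
        float fps = 1f / Mathf.Max(smoothedDelta, 0.0001f);
        GUI.Label(new Rect(10, 10, 200, 30), "FPS: " + fps.ToString("F0"));
    }
}
```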

Just the sun and the moon, and some mountains in a fresh scene.

Finishing the Scene

After much trial and error, I managed to finish the scene and optimize its objects and effects to produce an experience that runs at 90+ frames per second. I also composed dynamic music and sounds for the scene: when the user faces the moon side, they hear a mellow piano chord riff and cricket sounds, and when they face the sun side, they hear birds chirping and a brighter version of the same chord pattern, with a melody, strings, and drums added to the mix.
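A sketch of how that dynamic audio could work: keep two looping tracks (the night mix and the day mix) playing in sync, and let the same 0–1 gaze value set their volumes. The field names and setup here are assumptions on my part:

```csharp
using UnityEngine;

// A hedged sketch of the audio crossfade: two looping AudioSources whose
// volumes follow the 0-1 gaze value. The project's real audio setup may differ.
public class GazeAudioMixer : MonoBehaviour
{
    public GazeTracker gaze;
    public AudioSource nightTrack; // piano chords + crickets
    public AudioSource dayTrack;   // same chords + melody, strings, drums, birds

    void Update()
    {
        float sunnyPercent = 1f - gaze.gazeAmount;

        // Crossfade: both tracks loop in sync; only their volumes change.
        dayTrack.volume = sunnyPercent;
        nightTrack.volume = 1f - sunnyPercent;
    }
}
```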

Using what I learned about animation while creating my horror demo, I also animated some birds using a few free assets, so that when the user is facing the sunny side, they also see birds flying toward them.

A complete rendition of the dynamic world concept

Overall, I learned a hefty amount from such a short and small project, and I look forward to participating in other game jams in the future. But for now, I think I need a small break to recharge my creative juices.

See you in the next post,

Tejas
