Designing ‘Virtual Virtual Reality’, One of Mobile VR’s Most Immersive Games Yet


Launched initially on Daydream in early 2017, and now available on Gear VR, Oculus Go, and Oculus Rift, Virtual Virtual Reality features smart interaction design that gives players freedom and control which, combined with a narrative tying it all together, makes it one of the most immersive mobile VR games to date. This guest article by Mitch Mastroni, Interaction Designer at Tender Claws, the studio behind the game, explores how it achieved significant immersion even on more restrictive mobile VR headsets.

Guest Article by Mitch Mastroni

Mitch Mastroni is an Interaction Designer at Tender Claws, where he handles all aspects of systems design and programming across both VR and AR experiences. He pulls from his background in performance art—ranging from improv comedy to jazz percussion—to create compelling interactive experiences. He holds a B.S. in Computer Science: Game Design from UC Santa Cruz, where he developed the 2016 IndieCade finalist Séance. You can find him in the corner of a networking event, waxing poetic about theme park design.

Our game Virtual Virtual Reality is a comedic adventure that is both a love letter to VR and a playful commentary on the tech industry. Players are welcomed by their manager Chaz to Activitude, a virtual service where humans are tasked with assisting AI clients. These AI, which appear in various forms ranging from a temperamental artichoke to a demanding stick of butter, have increasingly bizarre requests for the player to perform. The story unfolds as the player travels between virtual realities, diving deeper and deeper into the machinations of Activitude.

If you haven’t had a chance to play Virtual Virtual Reality, check out the trailer below to get a taste of the game, which also recently launched on the Oculus Rift:

Object Interaction: The Leash

When players pick up objects in Virtual Virtual Reality, they see a curved line connecting their VR controller to the object in question. This ‘leash’ is the only tool that players have at their disposal for the full duration of the game. All other object interactions in the game (plugging a plug into a socket, watering flowers with a watering can, etc.) are performed with the leash. Even simple interactions—like tossing a ball in the air or dragging your manager by his robotic legs—are very satisfying to perform with the leash.

The leash helps the player understand the relationship between the controller’s movement and the object’s movement. It also enhances game feel by giving virtual objects weight. Instead of instantly moving the object to the position where the player’s controller is pointing, the leash applies a constant force to the object in the direction of that position. Heavier objects will take longer to arrive at their destination and will sag the leash downwards. By swiping the trackpad forward and backward, players can also push and pull objects towards and away from themselves, enabling 6DOF object control from a 3DOF controller.
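The force model described above can be sketched in a few lines. This is a 1-D toy simulation, not the game's actual physics code, and the force, damping, and timestep values are illustrative guesses rather than Tender Claws' tuning:

```python
import math

def leash_step(pos, vel, target, mass, dt=0.02, force=40.0, damping=6.0):
    """One physics step: pull the held object toward the controller's target.

    A constant-magnitude force (rather than snapping to the target position)
    gives objects weight: heavier objects accelerate less, so they take
    longer to arrive and lag behind the controller.
    """
    dx = target - pos
    dist = abs(dx)
    accel = (force / mass) * (dx / dist) if dist > 1e-6 else 0.0  # F = m*a
    vel = (vel + accel * dt) * math.exp(-damping * dt)  # damping stops orbiting
    pos = pos + vel * dt
    return pos, vel

def settle_time(mass, target=5.0, tol=0.1):
    """Steps until the object reaches the target in this 1-D toy model."""
    pos, vel = 0.0, 0.0
    for step in range(10_000):
        pos, vel = leash_step(pos, vel, target, mass)
        if abs(target - pos) < tol:
            return step
    return None
```

Running `settle_time` with two different masses shows the behavior the article describes: a heavier object takes noticeably longer to reach the same destination. The trackpad push/pull would simply move the `target` point closer to or farther from the player along the controller's ray.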

Virtual Virtual Reality was originally developed for Daydream VR and its 3DOF controller, leading us to consider control schemes found on other devices with 3DOF controllers (see this article for an introduction to 3DOF vs 6DOF). We were inspired by the ‘Capture Gun’ in Elebits, Konami’s 2006 Wii-exclusive title. Elebits achieved a surprisingly intuitive use of the 3DOF Wiimote that we had yet to see implemented in any game, VR or otherwise. We were pleasantly surprised to find that the leash is also comfortable when using multiple controllers and 6DOF controllers. We designed unique visual and haptic feedback for the leash to fit each of Virtual Virtual Reality’s platforms and to leverage their respective control schemes.



The choice of the leash was also informed by the distance between players and the objects that they interact with. Early VR experiments at Tender Claws led us to constrain object interactions to the “mid-range”: most objects that the player grabs are at least one meter in front of them and no farther than six meters away. This tends to be the most comfortable range for modern VR headsets. Some players have trouble focusing on objects closer than one meter, and beyond six meters there is no clear sense of depth and small objects are clearly pixelated. The leash closes the mental gap between the player and their object of focus, allowing that object to become an extension of the player.
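Keeping held objects inside that comfort band is, at its simplest, a clamp on the leash's push/pull distance. A minimal sketch, using the one-to-six-meter range from the paragraph above (the constant names are ours, not the game's):

```python
# Comfortable "mid-range" for held objects, per the design discussion above.
MIN_GRAB_M = 1.0   # closer than this, some players can't focus
MAX_GRAB_M = 6.0   # farther than this, depth cues and resolution break down

def clamp_leash_distance(distance: float) -> float:
    """Clamp the trackpad push/pull so a held object stays in the mid-range."""
    return max(MIN_GRAB_M, min(MAX_GRAB_M, distance))
```

So a player swiping an object toward their face stops at one meter, and pushing it away stops at six.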

World Interaction: Headsets

The most recognizable gameplay mechanic of Virtual Virtual Reality is the ability to put on and take off any VR headsets in the game at any time. Virtual reality inside of virtual reality. Yes, in fact, it is kind of like Inception.

Early into our development of the headset transition mechanic at a 2015 hackathon, we realized that the experience of taking off and putting on headsets had potential beyond a narrative framing device. We wanted players to interact with headsets as often as possible.

One key characteristic of headset transitions is that they are completely seamless, without any perceivable loading time. To achieve this, every accessible virtual reality, or level, is loaded into memory before its associated headset appears. Although this required significant performance optimizations to reduce the memory footprint of each level, it also led us to an artistic direction that reduced the workload of our artists.
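The preloading scheme can be modeled as a small cache keyed on which levels are reachable from the current one. This is a toy model in plain Python; the actual game presumably manages engine scenes (e.g. Unity additive loading), and the level names here are made up:

```python
class LevelCache:
    """Seamless headset transitions: every level reachable from the current
    one is loaded before its headset ever appears, so putting a headset on
    is just a pointer swap, never a loading screen."""

    def __init__(self, graph, load_fn):
        self.graph = graph      # level -> levels reachable via headsets there
        self.load_fn = load_fn  # the expensive load, paid ahead of time
        self.loaded = {}
        self.current = None

    def enter(self, level):
        # Load the level itself plus every neighbor before handing control over.
        for name in [level, *self.graph.get(level, [])]:
            if name not in self.loaded:
                self.loaded[name] = self.load_fn(name)
        self.current = level

    def put_on_headset(self, level):
        assert level in self.loaded, "transition would hit a loading screen!"
        self.enter(level)  # instant swap; the new level's neighbors preload
```

The memory cost is the sum of all neighboring levels, which is why the article notes that shrinking each level's footprint was a prerequisite.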

We experimented with various visual transitions to reduce the jarring effect of leaving one level and entering another. Ultimately we chose a fisheye lens effect that warps the edges of the screen, paired with a single frame cut between the two levels at the peak of the warping. The fisheye effect is accomplished through the use of a vertex shader: the geometry of the world is actually stretched away from the player to emulate the familiar look.
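The per-vertex stretch can be illustrated on the CPU. This is a guess at the shape of the effect (a radial scale that grows with distance from the eye), not the game's actual shader math, and it operates on one vertex at a time where a real vertex shader would run on the GPU:

```python
import math

def fisheye_warp(vertex, eye, strength):
    """Push world geometry away from the player, more so for distant points,
    emulating a fisheye look. Toy CPU version of a per-vertex shader step."""
    dx = [v - e for v, e in zip(vertex, eye)]
    dist = math.sqrt(sum(d * d for d in dx)) or 1e-6
    scale = 1.0 + strength * dist  # stretch grows with distance from the eye
    return [e + d * scale for e, d in zip(eye, dx)]
```

At `strength = 0` the world is untouched; animating `strength` up to a peak, cutting to the next level on a single frame, and animating back down gives the transition described above.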

The interaction language and logic are consistent for the VR headsets in the game. They can be picked up like any other object in Virtual Virtual Reality. To take off their current headset, the player points their controller at their head and grabs that headset. Drawing attention to the presence of the player’s real headset does not compromise immersion; in fact, it reinforces their connection to the experience.

We decided that the action of moving between virtual realities should be a valid choice at any point. Any headset in the game can be picked up and put on, and at any point you can take off your current headset to ‘go up a level’. These choices are also recognized and validated by other systems in the game. For example, characters will comment on you leaving and returning to their virtual realities, which helps reinforce the relationship between the headset system and the narrative.
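Nested realities with a "go up a level" action map naturally onto a stack: putting on a headset pushes a level, taking the current one off pops back out. A minimal sketch (the level names are ours, for illustration):

```python
class HeadsetStack:
    """Nested virtual realities as a stack. Putting on a headset descends a
    level; taking the current one off returns 'up a level'."""

    def __init__(self, base="Activitude"):
        self._stack = [base]

    @property
    def current(self):
        return self._stack[-1]

    def put_on(self, level):
        self._stack.append(level)

    def take_off(self):
        if len(self._stack) > 1:  # the outermost reality can't be removed
            self._stack.pop()
        return self.current
```

Systems like character dialogue can then react to stack transitions, which is how leaving and returning to a character's reality can be recognized and commented on.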

Localization and Subtitles

We began the process of localizing Virtual Virtual Reality into eight languages after the game launched on Daydream. The spoken and written words of Virtual Virtual Reality are central to the experience and we wanted to give more players an opportunity to comfortably enjoy the game.

The decision to use subtitles instead of recording dialogue in new languages was a matter of resources and quality control. We worked with an extremely talented cast of voice actors who recorded over 3,000 lines of dialogue to bring the characters of Virtual Virtual Reality to life. The task of re-recording and implementing that dialogue in eight additional languages was simply beyond the scope of our team. Instead, we focused our efforts on creating the best subtitle system ever conceived by god or man. Or at least by a mobile VR game in 2017.

The Virtual Virtual Reality subtitle system was designed with two guiding principles. First, subtitles should be comfortably visible at all times. Second, it should always be clear who is speaking. Neither of these are novel concepts (see the game accessibility guidelines and this excellent article by Ian Hamilton), but at the time of development there were virtually no examples of these principles being applied in VR.

The key to our approach is dynamic positioning. The subtitles are repositioned to best fit the direction that the player is looking. When the player is looking at a speaking character, the subtitles appear directly below that character. When the player is looking elsewhere, the subtitles appear at the bottom of the player’s view with an arrow pointing in the direction of the character. The arrow is particularly helpful for players who are hard of hearing. Subtitles smoothly transition between the two states so that reading is never interrupted. Scenes with multiple speaking characters utilize different colored text for additional clarity.
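The "looking at the speaker" test is, at its core, a dot product between the player's gaze direction and the direction to the speaker. A sketch of the state decision, where the 0.8 cosine threshold (about 37°) is an illustrative value rather than the game's tuning:

```python
import math

LOOK_AT_THRESHOLD = 0.8  # cosine cutoff for "looking at the speaker"

def subtitle_anchor(view_dir, to_speaker):
    """Pick a subtitle placement, per the two states described above:
    under the speaker when the player looks at them, otherwise at the
    bottom of the view with an arrow toward the speaker."""
    def normalize(v):
        n = math.sqrt(sum(c * c for c in v)) or 1e-6
        return [c / n for c in v]

    gaze, toward = normalize(view_dir), normalize(to_speaker)
    facing = sum(a * b for a, b in zip(gaze, toward))  # cos of angle between
    if facing >= LOOK_AT_THRESHOLD:
        return "below_speaker", None
    return "bottom_of_view", toward  # arrow points along this direction
```

Smoothly interpolating the subtitle's world position as the state flips is what keeps reading from being interrupted; hysteresis around the threshold would also help avoid flicker when gazing near the cutoff.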

Next Steps

Designing Virtual Virtual Reality was an incredible learning experience for our whole team. We all have backgrounds in gaming but none of us had ever worked on anything quite like this—a dense three-hour narrative adventure in VR. We are currently working on several new projects that leverage our lessons learned from Virtual Virtual Reality and further our integration of systems and narrative. The state of interaction design in VR has come so far in the past few years, and we’re excited to continue exploring and innovating as we create new experiences.

The post Designing ‘Virtual Virtual Reality’, One of Mobile VR’s Most Immersive Games Yet appeared first on Road to VR.
