Google Looks To Develop Next Innovation In Room-Scale VR


Google is looking at ways to fundamentally change how room-scale VR works, based on no fewer than three recent patents spotted at the World Intellectual Property Organization by Android Headlines. The purposes of the three patents vary, but each seems aimed at a common set of goals.

The technology described bears some similarity to currently available room-scale VR, but Google wants to take things further. Namely, the search giant is exploring ways to make mapping out and interacting with a room-sized VR environment more power-efficient, less cumbersome, and more intuitive, with improved immersion into the experience.

Mapping out a room and obstacles


There are aspects of each goal covered in all three patents, and each focuses primarily on the way depth is measured and relayed back to the wearer of a VR headset. Each of the patents describes a system for measuring depth, and for tracking the user and the room, without tethering to a PC and its associated hardware.

Two of the three recent patents pertain specifically to mapping out a room and its objects, and they are very similar in both scope and description. The technology would be utilized in a standalone system similar to Oculus Quest rather than HTC's Vive Pro. That means the headset in each would still depend, to a degree, on depth sensing and on components that are already available, but it would break from present standards in several key ways.

In both patents, the 'pose' of the headset is an integral metric for the maintenance of safety and distance tracking. That encompasses not only the position of the headset within a room but also the orientation of the headset. That's tied in with a point cloud map generated by an initial room scan to ensure the user stays within a given area.


The initial room scan can generate a designated play area automatically based on the space available, including the creation of virtual boundaries above and below the user. The point cloud serves as the basis for depth tracking, allowing lower-power scanning after the initial scan — primarily for the purpose of preventing collisions in a changing environment, such as when a household member or pet wanders into the play space.
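The patents don't publish an algorithm, but the idea of checking the headset's pose against a stored point cloud each frame, rather than re-scanning the whole room, can be sketched roughly. Everything below, including the function names, the toy point cloud, and the warning threshold, is an illustrative assumption, not Google's design:

```python
# Hypothetical sketch: after an initial scan produces a point cloud, each
# frame only compares the headset's current position against stored points
# instead of re-scanning the room, which is where the power savings come from.
import math

def nearest_obstacle_distance(headset_pos, point_cloud):
    """Distance from the headset to the closest scanned point."""
    return min(math.dist(headset_pos, p) for p in point_cloud)

def collision_warning(headset_pos, point_cloud, threshold=0.5):
    """True when the wearer is within `threshold` metres of an obstacle."""
    return nearest_obstacle_distance(headset_pos, point_cloud) < threshold

# A toy point cloud from an "initial scan": two wall points and a table edge.
cloud = [(2.0, 1.5, 0.0), (0.0, 1.5, 2.0), (0.6, 1.4, 0.4)]
print(collision_warning((0.5, 1.6, 0.5), cloud))  # near the table edge: True
```

A real system would also fold in the headset's orientation and velocity (the full "pose") and would spatially index the cloud so each check touches only nearby points, but the per-frame cost stays far below that of a fresh depth scan.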

The 6DoF headset can additionally be linked up with a controller apparatus, enabling estimates of not only the position of the user's arms and hands but of their legs as well.

The result would theoretically be a VR system that tracks a room and possible obstructions accurately and efficiently while preventing awkward instances where an arm or foot accidentally bumps into objects within the designated space. That could help safely open up room-scale VR to more complex environments and reduce requirements for external hardware too.


Making interactions even at a distance more natural

The first two patents also describe user-activated passthrough camera capabilities, so users can quickly assess their real-world surroundings. That ability is coupled with intuitive navigation warnings that blend cleanly into the background in a way that's similar to AR, alerting users when a collision is imminent or when they step into an unmapped portion of the room.

The passthrough serves the added purpose of making the patents more directly applicable to AR, rather than limiting the inventions to VR.


The third patent contains some of the body proportion tracking features found in the other designs and would work well in conjunction with those inventions, but it adds algorithms to make interacting with objects more intuitive. More directly, those algorithms eliminate the need to use preset distances based on the virtual player representation. That means the distance between a user's arm and the object they want to interact with can be more closely tied to the proportions of their real body.

The distance interaction patent serves up UI benefits too, allowing for more natural visual representation of distance in relation to the reach of the headset's wearer. Objects that are being interacted with should, in theory, move and interact more realistically in terms of distance from the user.
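The idea of replacing a preset reach distance with one tied to the user's real proportions can be sketched as follows. The calibration scheme (measuring the headset-to-controller distance at full arm extension) and all names here are assumptions for illustration, not the patent's method:

```python
# Illustrative sketch of proportion-based reach: instead of a fixed preset,
# the system estimates the wearer's real arm length from headset and
# controller positions during a calibration pose, then uses that estimate
# to decide whether a virtual object is within reach.
import math

def estimate_arm_length(headset_pos, controller_pos):
    """Rough arm length from a calibration pose with the arm fully extended."""
    return math.dist(headset_pos, controller_pos)

def can_reach(object_pos, headset_pos, arm_length, slack=0.05):
    """Interaction allowed when the object lies within the user's real reach."""
    return math.dist(headset_pos, object_pos) <= arm_length + slack

# Calibration: headset at ~1.7 m, controller held out in front of the user.
arm = estimate_arm_length((0.0, 1.7, 0.0), (0.0, 1.4, 0.7))
print(round(arm, 2), can_reach((0.0, 1.5, 0.6), (0.0, 1.7, 0.0), arm))
```

In a shipping system the estimate would be refined continuously rather than taken from a single pose, but even this crude version lets the UI place interactable objects relative to the wearer's actual reach instead of an average one.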


Copyright ©2019 Android Headlines. All Rights Reserved.


Daniel has been writing for AndroidHeadlines since 2016. As a Senior Staff Writer for the site, Daniel specializes in reviewing a diverse range of technology products and covering topics related to Chrome OS and Chromebooks. Daniel holds a Bachelor’s Degree in Software Engineering and has a background in Writing and Graphics Design that drives his passion for Android, Google products, the science behind the technology, and the direction it's heading. Contact him at [email protected]
