AH Tech Talk: Is Eye Tracking The Future Of VR?

The website of California startup Eyefluence is anything but subtle. "What if you could rethink human communication? What if you could perceive, peruse, and process information in milliseconds? What if you could accelerate intelligence at the speed of sight?" What if, indeed. It's an incredible proposal, to be sure, and one writer decided to take a whack at it, so to speak. "It's one thing to play the arcade game Whac-A-Mole by swinging around an oversized mallet; it's far easier to whack those moles virtually, controlling the mallet with just your gaze." That's what Rachel Metz, senior editor for MIT's Technology Review, said of her trip to Eyefluence's offices in Milpitas, California, where she got to play with a gizmo that might just be the future of virtual reality, or at least a big part of it. The gizmo in question was Eyefluence's own eye-tracking software, for use in virtual reality, augmented reality, and mixed reality applications. Eyefluence's website is full of taglines and marketing buzz, with lines like "Today's slow and clumsy tap, swipe, nod, point, and talk methods for controlling HMDs have been holding back a market ready for phenomenal growth. The Eyefluence iUi takes eye-tracking to the next level" featured prominently. The bottom line, though, is that we are, literally and figuratively, staring into the eye of a revolution. Many are already predicting that VR, even in its current flawed form, is a piece of the future puzzle of human-technology interaction. With this new interface breakthrough, that puzzle piece may fall into place that much quicker.

Eyefluence is not the only eye-tracking game in town, but according to their website, they might just be the first to market with the breakthrough tech. They've combined the expertise of biologists and technologists to really nail down all of the subtle nuances of eye tracking, but with a bit of a twist. Eyefluence is putting the overall experience above all else during development, saying, "Unlike anything previously envisioned, our entire technology stack is fully integrated with UX at the core. With our 'Top-Bottom-Top' design method, development of our user interface, eye-tracking algorithms, hardware configuration, and circuit board design begins and ends with the user experience." If Metz's experience is anything to judge by, they've done a phenomenal job.

Current control methods for VR are more than a bit immersion-breaking. Google's own Cardboard solution can be controlled via a button on the headset, head tracking, and a Bluetooth device like a keyboard or game controller. In some applications, this works just fine; this writer, in fact, happens to absolutely love playing Sony PSP games with Cardboard and a Sony PS3 controller. With more diverse applications, however, you run into a serious lack of immersion. Competitor Oculus plans to attempt to remedy this with their own controllers, though their headsets will ship with a Microsoft Xbox 360 controller for the time being. HTC, meanwhile, has wand-like controllers for the Vive that, while a bit more natural than a game controller, are still far from ideal. One firm, known as Gest, is creating a solution that can track individual finger movement to allow things like typing, gaming, and other diverse operations using a variety of gestures, but even this doesn't mimic the way you'd use your hands in a natural setting. For the most part, it seems like the better choice for the time being would be to take the hands completely out of the equation until a completely different, perhaps full-body, solution can be dreamed up. With what Eyefluence is cooking up, that could be exactly what happens.

Imagine, for a second, a future where Eyefluence hit the market first and really took off, and the burgeoning Internet of Things movement did the same. Their sensors are on just about every AR and VR product. You wake up in the morning and put on your glasses. With a quick glance at the time in the upper right corner of the heads-up display thereon, the weather and your schedule for the day greet you. As you sit up and climb out of bed, you grunt to the glasses, "What's for breakfast?". After a few milliseconds' pause to connect with your fridge and figure out what's inside, you see before you four panels. One says "Salty", another "Zesty", another "Fruity", and another "Healthy Request". You gaze at "Fruity" and are presented with recipe cards for crepes, fruit waffles, and even exotic fruit-based ramen noodles. You eyeball the noodles on a whim and the card expands, sitting neatly on the right side of your field of vision as you cook. Imagine going about your entire day that way. That's just one possible application of Eyefluence's upcoming technology.
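For the curious, the most common way gaze-driven menus like the one above are built today is "dwell" selection: the interface fires a click when your eyes rest on a target for a set fraction of a second. To be clear, Eyefluence has said its own interface goes beyond dwell timing, so the sketch below is generic gaze-UI background rather than their method; the panel names, coordinates, and half-second threshold are all illustrative assumptions.

```python
# Illustrative sketch of dwell-based gaze selection (NOT Eyefluence's
# proprietary technique). Panels are axis-aligned boxes; a panel is
# "selected" once the gaze stays inside it for DWELL_SECONDS.

DWELL_SECONDS = 0.5  # hypothetical dwell threshold

def hit_test(panels, x, y):
    """Return the name of the panel containing the gaze point, or None."""
    for name, (left, top, right, bottom) in panels.items():
        if left <= x <= right and top <= y <= bottom:
            return name
    return None

def select_by_dwell(panels, samples, dwell=DWELL_SECONDS):
    """samples: iterable of (timestamp, x, y) gaze points in order.
    Returns the first panel the gaze rests on for `dwell` seconds,
    or None if the gaze never settles long enough."""
    current, since = None, None
    for t, x, y in samples:
        panel = hit_test(panels, x, y)
        if panel != current:
            current, since = panel, t  # gaze moved to a new target
        elif panel is not None and t - since >= dwell:
            return panel  # gaze has dwelled long enough
    return None

# Example: four breakfast panels, gaze lingering on "Fruity".
panels = {
    "Salty": (0, 0, 10, 10),
    "Zesty": (20, 0, 30, 10),
    "Fruity": (40, 0, 50, 10),
    "Healthy Request": (60, 0, 70, 10),
}
gaze = [(0.0, 45, 5), (0.2, 44, 6), (0.4, 46, 5), (0.6, 45, 4)]
print(select_by_dwell(panels, gaze))  # → Fruity
```

The weakness of dwell selection is the so-called "Midas touch" problem: everything you look at long enough gets clicked, which is one reason companies like Eyefluence are chasing more deliberate eye-interaction signals.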

About the Author

Daniel Fuller

Senior Staff Writer
Daniel has been writing for Android Headlines since 2015, and is one of the site's Senior Staff Writers. He's been living the Android life since 2010, and has been interested in technology of all sorts since childhood. His personal, educational and professional backgrounds in computer science, gaming, literature, and music leave him uniquely equipped to handle a wide range of news topics for the site. These include the likes of machine learning, voice assistants, AI technology development, and hot gaming news in the Android world. Contact him at [email protected]