Sony & eyeSight Partnership To Go Beyond Xperia Touch


Israeli computer vision startup eyeSight Technologies has partnered with Sony to roll out an update to the Xperia Touch projector that enables touchless gesture control, opening up a new layer of interaction with projected content. That application will apparently be only the first fruit of the partnership. The update is entirely software-based and uses the Xperia Touch's built-in camera. Through computer vision, the camera recognizes a user's hands and how far away they are, allowing interaction without the user having to touch a surface. The technology goes beyond gesture controls; it can also enable devices to see what objects are around a user, who is using the device, and what that person is doing. For the time being, air gesture control is the only part of this technology that the Xperia Touch takes advantage of, but that may well change in the future.

According to eyeSight Technologies CEO Gideon Shmuel, the company's technology and vision run far beyond simple applications like this one. The company's aim is to provide cohesive solutions that combine different methods of contact between man and machine, bringing interactions as close as possible to the level of nuance found in human-to-human communication. The possibilities with a system like this are nearly limitless, depending on the device being given the new functionality and what content creators and developers build for it.

Computer vision, high-definition projection, and advanced AI capabilities are all present in the Sony Xperia Touch with this update, laying the groundwork for a revolutionary user experience. Sony's partnership with eyeSight Technologies will go beyond the Xperia Touch, which means that eyeSight Technologies' distinctive and powerful AI could end up in a range of Sony products, from smartphones to PlayStation accessories. Going forward, this kind of computer vision technology could allow for enhanced interaction between man and machine in a variety of situations and across device ecosystems; changing the TV channel with a hand wave, typing in the air to browse the web from your couch, and operating complex machinery in VR and AR by simply moving your hands as if they're the arms of the machine are just a few possible use cases that spring to mind.


Copyright ©2018 Android Headlines. All Rights Reserved.

Senior Staff Writer

Daniel has been writing for Android Headlines since 2015, and is one of the site's Senior Staff Writers. He's been living the Android life since 2010, and has been interested in technology of all sorts since childhood. His personal, educational, and professional backgrounds in computer science, gaming, literature, and music leave him uniquely equipped to handle a wide range of news topics for the site, including machine learning, voice assistants, and AI technology development news in the Android world. Contact him at [email protected]
