How much fun would it be to interact with an invisible computer? That’s exactly the idea behind Project Soli, which Google’s ATAP division is showing off at their Google I/O break-out session today. We’ve seen Project Soli before: Google demoed it in the sandbox and at a session during Google I/O 2015, and it was pretty exciting then. Now it’s even more exciting. In a nutshell, Project Soli is a way to enable new types of touchless interaction using radar. Google describes it as “one where the human hand becomes a natural, intuitive interface for our devices.” Project Soli can track sub-millimeter motions at high speed and with high accuracy, and Google has now managed to fit it into a smartwatch, which is part of today’s demo.
ATAP worked with LG to bring Project Soli into a smartwatch, redesigning the chip to make it smaller. The thinking behind redesigning the chip, rather than simply building another one that fits into a smartwatch, is that if it can fit into a wearable, it can fit into anything. That seems to be the case here. With the LG smartwatch, you can turn an imaginary dial near the watch to scroll through the UI. You use the smartwatch just as you would if you were touching it, except you’re not touching it. You can also interact away from the smartwatch, which opens up all sorts of new possibilities for features. Google’s ATAP team also worked with JBL to bring Project Soli to a speaker: bring your hand near the speaker and it lights up; flick your fingers and it starts playing music; flick them again and it stops. Pretty simple, but very cool.
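To get a feel for how apps might respond to gestures like the dial turn and finger flick described above, here is a minimal, entirely hypothetical sketch. The gesture names, callback shape, and event loop are assumptions for illustration only; they are not the real Soli SDK.

```python
# Hypothetical gesture handler sketch -- NOT the actual Soli API.
# We assume the radar pipeline delivers named gesture events; the app
# just maps each event to a UI action.

def on_gesture(gesture, state):
    """Map a recognized gesture (assumed name) to an app-state change."""
    if gesture == "virtual_dial":
        # Turning the imaginary dial scrolls the watch UI one step.
        state["scroll"] += 1
    elif gesture == "finger_flick":
        # A flick toggles music playback, as in the JBL speaker demo.
        state["playing"] = not state["playing"]
    return state

# Simulate a short stream of recognized gestures.
state = {"scroll": 0, "playing": False}
for g in ["virtual_dial", "virtual_dial", "finger_flick"]:
    state = on_gesture(g, state)

print(state)  # -> {'scroll': 2, 'playing': True}
```

The point of the sketch is the shape of the interaction: the hard radar and machine-learning work happens below the API line, and the app only sees high-level gesture events.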
Google is launching development kits for Project Soli in 2017, so developers will be able to play around with the technology and bring its functionality to their own hardware. The development kits will include a low-power module design, meaning it won’t use much power, which is especially important in wearables, and it’s small enough to fit in just about anything.