The United States Federal Communications Commission recently approved Google's futuristic hand-gesture radar technology for further testing, allowing the Internet juggernaut to start experimenting with its unique solution aboard aircraft and at higher power levels. Alphabet's subsidiary can now conduct tests under more lenient restrictions on the maximum output power of its devices' transmitters and their effective radiated power. While the approval of in-flight testing is an entirely new development, the permit was granted only for setups wherein Google's solution operates "in closed exclusive on-board communication networks," meaning no actual consumer-grade use cases can be tested on airplanes and other aircraft. The FCC gave Google permission to expand its radar experiments after concluding that the technology in question is extremely unlikely to cause harmful interference to other spectrum incumbents. The solution operates in the 57-64GHz band, and the telecom watchdog determined that allowing Google to ramp up its testing efforts will serve the public interest, explicitly stating that it believes Google will eventually be able to commercialize its wireless, contact-free gesture technology.
Crazy tech isn't created overnight
Originally announced in mid-2015, Google's unconventional initiative — officially called Project Soli — never had massive funding relative to some of the company's other moonshot bets, but its internal support was always consistent, even though news of the effort slowed down over the course of the last couple of years. The firm's Soli sensors are essentially high-tech radars; they emit a broad beam of electromagnetic waves and monitor the scattered energy, a portion of which is reflected back to the source after the original beam meets an object. However, unlike an old-school radar that can only detect distance and object size at certain intervals, Google's system is capable of doing so in real time and has been optimized specifically for relatively small subjects such as hands. Based on the properties of the reflected signal received by Soli sensors, the system can determine subject characteristics such as distance, size, orientation, dynamics, velocity, and even material.
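The two basic quantities any such radar recovers can be sketched with elementary physics: distance follows from the round-trip time of the echo, and radial velocity from the Doppler shift of the reflected wave. The snippet below is a simplified illustration of those textbook relationships only — the function names and sample figures are invented for the example, and this is not Google's actual signal-processing chain.

```python
# Illustrative radar math, NOT Google's Soli implementation.
# Two textbook relationships a millimeter-wave radar relies on:
#   distance = c * round_trip_time / 2
#   radial velocity = doppler_shift * c / (2 * carrier_frequency)

C = 3.0e8  # speed of light, m/s


def range_from_delay(round_trip_s: float) -> float:
    """Distance to the target from the echo's round-trip time.

    The wave travels to the target and back, hence the division by 2.
    """
    return C * round_trip_s / 2


def velocity_from_doppler(doppler_hz: float, carrier_hz: float) -> float:
    """Radial velocity of the target from the Doppler shift.

    A positive result means the target is moving toward the sensor.
    """
    return doppler_hz * C / (2 * carrier_hz)


# A hand roughly 30 cm away returns an echo after about 2 nanoseconds:
distance_m = range_from_delay(2e-9)  # 0.3 m

# At a 60 GHz carrier (inside the 57-64GHz band the article mentions),
# a 400 Hz Doppler shift corresponds to 1 m/s of radial hand motion:
velocity_ms = velocity_from_doppler(400.0, 60e9)  # 1.0 m/s
```

The very short round-trip times and the fine Doppler resolution at 60GHz hint at why this band suits tracking small, fast-moving subjects like fingers.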
While the applications of the tech are so vast that Google already opted to release an SDK for it, allowing third-party developers to experiment with its invention, the Mountain View, California-based firm is presently focused primarily on creating a platform for camera-free gesture recognition that's much more accurate and consistent than alternative solutions. The latest implementation of the Soli sensor comes in the form of a chip that can be used in electronics such as smartphones and wearables. While the mobile industry — Samsung in particular — has been exploring hand gestures for many years now, existing solutions are simplistic at best and unreliable at worst, not to mention that they usually require users to physically touch their devices. The rare attempts at gesture recognition that don't require physical contact usually rely on front-facing smartphone cameras and consequently suffer from reliability issues. Google's Advanced Technology and Projects group not only created the hardware for Project Soli but also developed an entirely new radar sensing paradigm and specialized software meant to make the absolute most of its chip. This comprehensive, ground-up approach to Project Soli's development is the main reason the initiative is progressing at a relatively slow pace, though new applications of Google's tech are expected in the near future.
The tech industry was significantly different back when Project Soli started; in the years that followed, various advancements allowed unprecedentedly powerful artificial intelligence solutions to emerge, promising countless possibilities and prompting many major players to completely rework their long-term business strategies. Google was part of that group, seeing how CEO Sundar Pichai today refers to Alphabet's flagship unit as "an AI company," with machine learning and related technologies now being infused into virtually every project at the firm. As far as Project Soli is concerned, the ATAP division has yet to disclose any major attempts at blending its unconventional gesture recognition with AI, but even in its original form, the tech should allow for extremely convenient use cases such as answering a call and putting the caller on speaker without ever touching one's handset, or adjusting volume by turning an imaginary dial, with Google claiming such interactions will feel natural in no time.