Facebook Patent Looks To Bring 3D Gestures To Applications


In short: Facebook appears to be looking into new ways to incorporate 3D gestures into its applications, based on a recently published patent application surfaced via the World Intellectual Property Organization (WIPO). Published under number US20180260035, the associated documents describe the use of a touchscreen and proximity sensors to accomplish that task. In effect, the software would register a movement that begins on the display and is then swiped away along a given trajectory to trigger a pre-determined response or action in the application itself. However, at least one figure also seems to indicate that actions could be accomplished using specific gestures without touching the display at all. Specifically, a user could start at one point above the display and move toward another before performing what the patent calls a "drop gesture." That would theoretically allow precise interactions with a given app without any contact with the screen.
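To make the hover-then-drop interaction concrete, here is a minimal sketch of how such a gesture might be classified from proximity-sensor readings. Everything here — the `HoverSample` type, the `detect_drop_gesture` helper, and the thresholds — is an illustrative assumption, not anything described in Facebook's actual filing.

```python
# Hypothetical sketch of the hover-then-drop gesture described in the
# patent filing. All names and thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class HoverSample:
    x: float  # horizontal position over the display
    y: float  # vertical position over the display
    z: float  # estimated height above the screen (proximity reading)

def detect_drop_gesture(samples, min_travel=0.2, drop_ratio=0.5):
    """Return True if the finger moved laterally above the display and
    then dropped sharply toward the screen without touching it."""
    if len(samples) < 3:
        return False
    start, end = samples[0], samples[-1]
    # Lateral travel while hovering: distance moved in the x/y plane.
    travel = ((end.x - start.x) ** 2 + (end.y - start.y) ** 2) ** 0.5
    # A "drop": final height falls below a fraction of the starting height.
    dropped = end.z < start.z * drop_ratio
    return travel >= min_travel and dropped

# Example: hover at one point, glide to another, then drop toward the screen.
path = [HoverSample(0.1, 0.1, 1.0),
        HoverSample(0.3, 0.2, 0.9),
        HoverSample(0.5, 0.4, 0.3)]
print(detect_drop_gesture(path))  # True for this trajectory
```

In a real handset this classification would run over a continuous sensor stream rather than a fixed list, but the same idea applies: track lateral movement above the display, then fire the app-defined action when the height reading collapses.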

Background: Gestures, including 3D gestures, have been around for years. One of the earliest examples is Samsung's Air Gesture feature, launched back in 2013 with the Galaxy S4. That included at least one motion similar to what Facebook is proposing, wherein a user could wave a hand above the device to interact with it without touching the hardware itself. Those features continued to be developed, leading to intelligent gesture options on effectively any Android device, with still more arriving in the OS itself with Android 9 Pie. However, gestures are still primarily relegated to screen swipes and taps, or to movement of the smartphone itself. That's due at least in part to the advent of facial recognition-based unlocking, which renders most off-screen gestures obsolete. Moreover, given the increasing focus on security from both the public and government perspectives, those kinds of features are no longer necessarily desirable for most users.

Impact: Taking all of that into consideration, Facebook's apparent focus on in-app gestures goes beyond how such gestures have typically been used. The technology needed to accomplish the task never really went away, but it never quite managed to work its way into widespread use by app developers either. That's despite considerable improvements to the battery of sensors in modern handsets and the advent of AI and machine learning algorithms that could feasibly improve gesture recognition further. Meanwhile, the documentation doesn't give any specific examples of how the gestures might be used, but the possibilities are effectively endless, ranging from starting a voice-to-text response to an instant message to clearing away Facebook, WhatsApp, or Instagram notifications with a wave.



Copyright ©2018 Android Headlines. All Rights Reserved.

Junior Editor

Daniel has been writing for AndroidHeadlines since 2016. As a Senior Staff Writer for the site, Daniel specializes in reviewing a diverse range of technology products and covering topics related to Chrome OS and Chromebooks. Daniel holds a Bachelor’s Degree in Software Engineering and has a background in Writing and Graphics Design that drives his passion for Android, Google products, the science behind the technology, and the direction it's heading. Contact him at [email protected]
