Study: Gesture, Motion Are The Future Of User Interface


The electronic devices that we interact with on a day-to-day basis are in the midst of a paradigm shift in user interface. User experiences are becoming smoother, cleaner, and more intuitive by the year. Gestures, whether on screen or in the air, have long been part of mainstream control paradigms in the mobile world, and a new study from Juniper Research indicates that this trend is only the beginning. While the technology is not exactly mainstream yet, the benefits are clear for use cases like smartwatches, TVs, and video game consoles. The study projects that by the end of 2016, some 168 million devices will feature gesture or motion sensing capabilities tied to controlling the device.

Gesture control has been on the smartphone scene for years, as seen in devices like Motorola’s Moto G family and Samsung’s Galaxy S family. It has never, however, been the focal point of a mainstream phone’s interface. Sailfish OS, and to a lesser extent Nokia’s Z Launcher, played with the idea, but neither really caught on, and certainly neither used the sort of air gestures now finding their way into game consoles, VR, and more advanced concept PCs. According to Juniper Research’s study, that is set to change; the firm projects something in the area of 492 million gesture-enabled devices across all categories by 2021.

The charge reportedly won’t be led by smartphones and PCs; their main points of interaction, such as touchscreens, mice, and keyboards, will likely remain about as popular as they are now unless something changes. Only about 5% of such devices are expected to have gesture capabilities on board during the study’s time frame. Instead, Juniper Research expects less traditional devices, mainly wearables, to drive the trend. Current smartwatch interfaces, for example, rely on buttons, voice commands, and a small screen. This limiting paradigm is expected to give way to gestures, alongside similar shifts like typing with a game controller or using a mouse and keyboard to control a computer’s interface in VR. Together, these changes could move us closer to half a billion gesture-driven devices, or perhaps even a billion or more, in due time.