Researchers at the Fraunhofer Institute for Computer Graphics Research in Germany are working on a new way for smartphone owners to interact with and command their devices. The researchers have developed a prototype earbud that, while allowing you to listen to music, can also recognize changes in the shape of the ear canal and interpret those changes as commands, such as sending a text message or answering a call.
The logic behind the technology is that when a wearer makes a certain facial expression, the ear canal changes shape. If a wearable can recognize that shape, then each distinct facial expression (and the ear canal change it produces) can be mapped to an actionable command. While the prototype is at a very early stage of development, it can reportedly already recognize five different gestures with an accuracy rate of 90 percent. The gestures noted include turning the head, smiling, and winking.
Of course, there will be times when users make a facial expression but do not actually want to send a text or answer a call, and the team is working to account for those cases by making the earbuds more context-aware. In the example given, a facial expression linked to answering a call will only trigger that command when the phone is ringing. This suggests the feature could be largely reactionary, requiring a smartphone prompt in the first place. Overall, this is still only a research project and seems a long way from becoming a commercially available product, although the team working on it does see far-reaching implications for the technology. For instance, while wearable users might like the idea of winking or smiling to text or make a call, the team is also exploring this line of research as an additional tool for people with disabilities. According to the reports, the researchers plan to expand on their progress at a human–computer interaction conference set to take place this May in Colorado.