MIT Develops Crowdsourced Eye Tracking For Phones

Crowdsourcing, the practice of pooling data from many contributors, powers apps, artificial intelligence, and large-scale studies, among other advancements. Sourcing input data and sourcing processing power are two different endeavors, but the idea at their core is the same: teamwork. At MIT, long a bastion of scientific and technological breakthroughs and the minds that create them, that spirit is alive and well in more ways than one. A team of researchers has built an eye tracking system for smartphones, but rather than calling participants into the lab to have their eye movements recorded, they opened the study to just about anybody through a special iOS app. That, along with a push on Amazon's Mechanical Turk, has yielded data from over 1,500 people, compared with the 50 or so participants such a study normally draws.

The system uses a smartphone's front camera, standard equipment on almost all models these days, to continuously observe a user's eyes and work out which movements correspond to focusing on particular areas of the screen. Participants used an app that popped a dot up on the screen; when they looked at it, the dot very briefly turned into a letter, either R or L. The participant then tapped the corresponding section of the screen to prove they had successfully found the dot. A correct tap told the system it could associate the camera data captured at that moment with a gaze at the part of the screen the dot was on. Each user wound up contributing around 1,600 sets of data, on average.
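The collection logic described above can be sketched in a few lines. This is a hypothetical illustration, not the study's actual app code; the function and field names are invented, and the camera frame is stood in for by a placeholder string.

```python
import random

def run_trial(dot_xy, frame, tap):
    """One calibration trial, loosely modeled on the study's app.

    A dot appears at dot_xy and briefly turns into 'L' or 'R'. The user
    taps the matching half of the screen; only trials where the tap
    matches the letter are kept, since a correct tap is the evidence
    that the user was actually looking at the dot when the camera
    frame was captured.
    """
    letter = random.choice(['L', 'R'])
    if tap(letter) == letter:
        # Keep the frame, labeled with the known on-screen gaze point.
        return {'gaze_target': dot_xy, 'frame': frame}
    return None  # discard: no proof the user saw the dot

# An attentive user taps the side that matches the letter shown.
attentive_user = lambda letter: letter

samples = [run_trial((x, 100), f'frame_{x}', attentive_user)
           for x in (10, 200, 390)]
labeled = [s for s in samples if s is not None]
```

Each kept sample pairs a camera frame with a known gaze coordinate, which is exactly the kind of labeled data a gaze-estimation network needs for training.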

In order to avoid the scaling issues typically inherent in neural networks and machine learning, the MIT team used what is called 'dark knowledge': the outputs of a larger network, trained on the previously captured data, are used alongside real-world data to train a smaller network faster. The researchers say that eye tracking, a long-desired but often prohibitively expensive capability, should become much more accessible thanks to this study. Their system has thus far gotten accuracy down to a margin of about half a centimeter, which might just be good enough for developers to use in commercial products.
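At its core, the dark knowledge technique trains a small "student" network to mimic the softened output distribution of a large "teacher" network. A minimal sketch of that idea, using made-up logit values and plain NumPy rather than the researchers' actual models:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Softmax with temperature T; higher T softens the distribution."""
    z = logits / T
    z = z - z.max()          # numerical stability
    e = np.exp(z)
    return e / e.sum()

# Hypothetical teacher (large, slow) and student (small, fast) logits for
# one input. The "dark knowledge" lives in the relative magnitudes the
# teacher assigns to the non-winning classes, not just its top answer.
teacher_logits = np.array([4.0, 1.0, 0.2])
student_logits = np.array([2.5, 0.5, 0.1])

T = 2.0  # temperature exposes the teacher's soft similarity structure
p_teacher = softmax(teacher_logits, T)
p_student = softmax(student_logits, T)

# Distillation loss: cross-entropy between the softened teacher targets
# and the student's softened predictions. Minimizing this (alongside the
# usual loss on real labels) trains the student faster than raw data alone.
distill_loss = -np.sum(p_teacher * np.log(p_student))
```

In practice this loss term is added to the student's ordinary training loss, so the small network benefits from both the real-world data and the larger network's accumulated experience.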

Copyright ©2019 Android Headlines. All Rights Reserved
About the Author

Daniel Fuller

Senior Staff Writer
Daniel has been writing for Android Headlines since 2015, and is one of the site's Senior Staff Writers. He's been living the Android life since 2010, and has been interested in technology of all sorts since childhood. His personal, educational and professional backgrounds in computer science, gaming, literature, and music leave him uniquely equipped to handle a wide range of news topics for the site. These include the likes of machine learning, voice assistants, AI technology development, and hot gaming news in the Android world. Contact him at [email protected]