Facebook A.I. Can Draw Eyes In Photos With Blinking Subjects


Facebook's research division has published a new paper showing how A.I. could be used to fix photos taken while the subject's eyes were closed. The most obvious use would be for selfies or other photos where the subject accidentally blinked just as the shot was taken, and the technique could eventually be built out to work with groups of people. If successful, that could mean an end to hour-long photo sessions when one or two friends or family members can't stop blinking at the wrong moment, and it might even be expanded to fix multiple faces at once. For now, the system is geared toward individual blinkers in single images and draws on machine learning, computer vision, and aspects of AR and VR. Using that toolkit, Facebook's Eye In-Painting software effectively paints natural-looking eyes into an image and blends them seamlessly with the rest of the subject's face.
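The "paint and blend seamlessly" step can be illustrated with a much simpler stand-in than Facebook's neural network: compositing a generated eye patch into the photo through a feathered alpha mask so no hard seam is visible. This is a minimal sketch only; the function names are hypothetical and the real system learns the blending rather than using a fixed mask.

```python
import numpy as np

def feathered_mask(h, w, feather=4):
    """Alpha mask: 1.0 in the interior, fading linearly to 0.0 at the border."""
    y = np.minimum(np.arange(h), np.arange(h)[::-1])
    x = np.minimum(np.arange(w), np.arange(w)[::-1])
    d = np.minimum.outer(y, x)  # each pixel's distance to the nearest edge
    return np.clip(d / feather, 0.0, 1.0)

def blend_patch(photo, patch, top, left, feather=4):
    """Paint `patch` (e.g. a generated open-eye region) into `photo`
    at (top, left), feathering the seam so it blends with the face."""
    h, w = patch.shape[:2]
    alpha = feathered_mask(h, w, feather)[..., None]
    region = photo[top:top + h, left:left + w].astype(float)
    out = photo.astype(float).copy()
    out[top:top + h, left:left + w] = alpha * patch + (1 - alpha) * region
    return out
```

Near the patch border the original pixels dominate, while the patch interior fully replaces the closed-eye region, which is the same qualitative goal the article describes.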

To accomplish that task, Facebook first had its A.I. "learn" what a human face should look like using a database of face images. The company didn't provide an exact size but claims that, with a sufficiently large database, it found it possible to avoid the uncanny valley. Facebook's Eye In-Painting tool then uses machine learning to analyze images of the user with their eyes open, which serve as a benchmark dataset for comparing its rendered eyes against the real thing. The company doesn't say outright whether it needs access to images of the specific individual being edited, but that doesn't appear to be the case. Instead, from those images, the system generates a perceptual code of the person's face and eyes. The original code representing the closed eyes is removed but set aside for later reference, so that facial structure and color tones are maintained.
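The flow described above can be sketched in toy form: compress each eye region into a compact "perceptual code" vector, then hand the generator both the closed-eye code (for facial structure and color tones) and an eyes-open exemplar code (for what the new eyes should look like). Everything here is illustrative; the real system uses a learned neural encoder, not a random projection, and these function names are invented for the sketch.

```python
import numpy as np

def eye_code(eye_crop, dim=64, seed=0):
    """Toy 'perceptual code': project a flattened eye crop to a fixed-size
    vector. A real system would use a learned CNN encoder instead."""
    rng = np.random.default_rng(seed)
    flat = eye_crop.astype(float).ravel()
    proj = rng.standard_normal((dim, flat.size)) / np.sqrt(flat.size)
    return proj @ flat

def inpainting_condition(closed_eye_crop, reference_open_eye_crop):
    """Assemble the conditioning vector the article describes: the closed-eye
    code is kept for structure and color tones, while the eyes-open exemplar
    code says what the generated eyes should look like."""
    return np.concatenate([
        eye_code(closed_eye_crop),          # structure / color from the photo
        eye_code(reference_open_eye_crop),  # exemplar: eyes-open appearance
    ])
```

A generator network would then take this concatenated vector as input and emit the new eye pixels, which is the "new code" generation step the next paragraph covers.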

By combining all of that data, the system generates a new code representing what the subject's eyes would have looked like had they been open when the photo was taken. That means it could feasibly be used for more than just closed eyes in the future. It might be possible to adapt the A.I. to adjust images where the subject was looking the wrong way or distracted, or to automatically recolor or shift tones in images where lighting creates unnatural-looking colors. While a mainstream release is still likely a long way off and the technology raises some ethical questions, this feature could prove extremely useful if and when it does arrive.


Copyright ©2018 Android Headlines. All Rights Reserved.

Junior Editor

Daniel has been writing for AndroidHeadlines since 2016. As a Senior Staff Writer for the site, Daniel specializes in reviewing a diverse range of technology products and covering topics related to Chrome OS and Chromebooks. Daniel holds a Bachelor’s Degree in Software Engineering and has a background in Writing and Graphics Design that drives his passion for Android, Google products, the science behind the technology, and the direction it's heading. Contact him at [email protected]
