Essential Reveals The Engineering Process Behind Its Camera


Essential is finally readying its first Android phone for manufacturing, and a member of the engineering team behind the camera wanted to mark the occasion and build some hype by taking to the company blog to reveal the process and philosophies that led to the camera buyers will see on their units in the near future. According to the post, the team wanted a camera that could sit flush in the phone's thin chassis, but did not want to sacrifice image quality, particularly low-light image quality, in order to get one. It should be noted that the team member penning the post is an image engineering specialist with a PhD in Human Visual Perception, as well as a Master of Science in Color & Imaging.

For still images, the Essential Phone takes advantage of two different cameras. One shoots in color, and no matter how good a color digital camera is, its color filter forces it to essentially "guess" the color of some pixels, interpolating values for others based on its programming, which means it is not completely accurate when it comes to fine detail. That's where the second camera comes in. The true monochrome secondary camera on the back of the Essential Phone captures extremely clear and detailed images, since it has no color filter and no values to interpolate. The phone's camera software then merges the two captures, assigning color data to the highly detailed image from the monochrome camera by using the color camera's picture as a template of sorts. The result, Essential claims, is a crisp, clear photo with good resolution that's also well-colored.
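Essential doesn't publish its fusion algorithm, but a minimal sketch of one common approach to this kind of mono-plus-color merge is to split the color capture into brightness (luma) and color offsets (chroma), then swap in the cleaner monochrome frame as the brightness layer. The function name and the simple BT.601 luma weights below are illustrative assumptions, not Essential's actual pipeline:

```python
import numpy as np

def fuse_mono_color(mono, color):
    """Illustrative mono/color fusion (not Essential's actual algorithm).

    mono:  HxW float array in [0, 1], the sharp monochrome capture
    color: HxWx3 float RGB array in [0, 1], the color capture
    Returns an HxWx3 RGB image that keeps the monochrome detail
    while borrowing color from the color camera.
    """
    # BT.601 luma weights: split the color image into luma + chroma.
    w = np.array([0.299, 0.587, 0.114])
    luma = color @ w                         # HxW brightness of the color shot
    # Chroma offsets: how each channel deviates from its own brightness.
    chroma = color - luma[..., None]
    # Replace the noisier color-camera luma with the clean mono luma.
    return np.clip(mono[..., None] + chroma, 0.0, 1.0)
```

In this toy version the fused image inherits its brightness (and thus its detail) entirely from the monochrome frame, while the hue comes from the color frame's per-channel offsets; a real pipeline would also have to align the two cameras' viewpoints first.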

Tuning the camera to perfection was done in an objective phase and a subjective phase. During the objective phase, the camera program was trained on "golden samples," reference images that essentially contain all of the typical components and attributes an ideal picture would have. The camera hardware and software were tuned until the output could consistently match the golden samples for given conditions in the lab. In the subjective testing phase, on the other hand, the camera went out into the world with a human by its side to view its photos and tune accordingly. Subjective tuning began all the way back in January of this year, and according to the blog post, it won't stop until "the last possible minute."
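The objective phase described above boils down to a search loop: process a test capture with candidate parameters, score each result against the golden sample, and keep the parameters with the lowest error. The toy below uses a single gain parameter and mean squared error purely for illustration; the names and the metric are assumptions, not details from Essential's post:

```python
import numpy as np

def mse(a, b):
    """Mean squared error between two images (lower is better)."""
    return float(np.mean((a - b) ** 2))

def tune_against_golden(capture, golden, gains):
    """Toy 'objective phase': exhaustively try candidate gain values
    and keep whichever output best matches the golden sample."""
    best_gain, best_err = None, float("inf")
    for g in gains:
        candidate = np.clip(capture * g, 0.0, 1.0)  # toy pipeline: gain only
        err = mse(candidate, golden)
        if err < best_err:
            best_gain, best_err = g, err
    return best_gain, best_err
```

A real tuning loop would sweep many interacting parameters (exposure, noise reduction, color matrices) and use perceptual metrics rather than raw MSE, but the match-the-reference structure is the same.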



Copyright ©2017 Android Headlines. All Rights Reserved.

Senior Staff Writer

Daniel has been writing for Android Headlines since 2015, and is one of the site's Senior Staff Writers. He's been living the Android life since 2010, and has been interested in technology of all sorts since childhood. His personal, educational and professional backgrounds in computer science, gaming, literature, and music leave him uniquely equipped to handle a wide range of news topics for the site, including machine learning, voice assistants, and AI technology development in the Android world. Contact him at [email protected]
