Mantis Shrimp-Inspired Camera Could Improve Autonomous Cars


In short: Researchers from the University of Illinois at Urbana–Champaign and Washington University in St. Louis have developed a new camera that could improve dynamic range by as much as 10,000 times over the sensors found in current self-driving vehicles. The design is inspired by one of nature’s most complex eyes, that of the mantis shrimp, and represents a complete departure from how current sensors work, according to the associated study published in The Optical Society’s journal Optica. Getting there wasn’t necessarily an easy process, either. To begin with, the camera’s photodiodes were adjusted to operate in forward bias, which effectively means the diodes allow current to be generated rather than impeding it, as they do in a typical reverse-bias camera. That produces a logarithmic, rather than linear, response to light intensity. The changes don’t stop at the photodiodes, however. Nanomaterials were also deposited onto the imaging chip’s surface for increased polarization sensitivity, and the researchers then had to develop a new series of image-processing steps to handle the additional noise those nanomaterials introduce.
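The dynamic-range benefit of a logarithmic response can be illustrated with a minimal sketch. This is not the researchers’ actual pipeline; the functions, full-scale value, and intensity bounds below are illustrative assumptions showing why a linear sensor clips bright scenes while a logarithmic one compresses many orders of magnitude of intensity into the same output range.

```python
import numpy as np

def linear_response(intensity, full_scale=1e3):
    """Illustrative linear sensor: output saturates once intensity
    exceeds the assumed full-scale value."""
    return np.clip(intensity, 0.0, full_scale) / full_scale

def log_response(intensity, i_min=1e-1, i_max=1e6):
    """Illustrative logarithmic sensor: output spans the full range
    from i_min to i_max, compressed on a log scale."""
    i = np.clip(intensity, i_min, i_max)
    return np.log10(i / i_min) / np.log10(i_max / i_min)

# Intensities spanning six orders of magnitude.
intensities = np.array([1e0, 1e2, 1e4, 1e6])
print(linear_response(intensities))  # everything above 1e3 saturates to 1.0
print(log_response(intensities))     # evenly spaced, nothing clipped
```

Note how the linear sensor maps the two brightest inputs to the same saturated value, losing all contrast between them, while the logarithmic response keeps them distinguishable.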

The results of those efforts are substantial. Not only are the resulting images far more detailed in terms of dynamic range, but the camera’s ability to capture polarization information in a single snapshot means that cars equipped with the technology could better handle situations that have previously posed serious challenges to autonomous vehicles. It would allow an AI driver to detect obstacles through haze or fog and to transition from dark to bright or bright to dark environments without trouble. In turn, that could eliminate failures where environmental conditions prevent traditional cameras from seeing objects or people because of similarities in color or brightness, and would allow vehicles to “see” as much as three times further.

Background: Machine vision in self-driving cars has mostly depended on traditional camera systems, a limitation implicated in several accidents. At least one of those involved a Tesla-built vehicle that failed to stop in time because it couldn’t ‘see’ an obstacle whose color blended with the brightness of the sky. LiDAR has presented a partial solution, augmenting cameras with detection systems that perform better in certain lighting, but it doesn’t work as well as might be hoped in rain or other adverse weather. Development has therefore centered on new ways to integrate the two systems, and the complexity of AI driving algorithms has grown accordingly. Even so, that approach hasn’t proven to be a complete solution to the problems surrounding driverless vehicles, nor has it alleviated every safety concern.

Impact: The new camera most likely won’t show up in autonomous vehicle systems in the near future, as further research is needed. However, the researchers say it could eventually be mass-produced for as little as $10 per camera. What’s more, its uses could extend beyond self-driving cars into other areas that AI is being developed to explore. The study points to the detection of cancer cells as one example but, as with all new technologies, there are probably countless unforeseen ways a breakthrough such as this could be applied.