OmniVision’s New OV6211 Sensor Brings 3µm Pixels to the Table for Gesture Controls

April 2, 2014 - Written By Nick Sutrich

Gesture controls are something of a paradox in and of themselves. They’ve been on people’s minds ever since Tom Cruise started waving his arms around to control an entire computer system, but their implementation thus far has been less than stellar. Sometimes gestures work great, as we’ve seen in some examples on Microsoft’s Kinect 2.0 for the Xbox One, and other times they fall flat on their proverbial faces. Samsung has been the big pusher of gestures in the mobile space, and it really hasn’t done any better than anyone else. Sure, it’s cute to be able to wave your hand over your phone to answer it when your fingers are coated in BBQ sauce, but real-life usage scenarios are few and far between, and most people just end up turning these features off to save that precious battery life. OmniVision, a leading manufacturer of digital imaging sensors whose CMOS sensors power devices like the HTC One and Google’s crazy Project Tango, has unveiled a new sensor that might help with both of these problems, spotty accuracy and battery drain, in the future.

The new sensor is the OV6211, and it features a resolution of 400×400 pixels with some seriously massive 3µm pixels. Just to put this in perspective, most phone sensors nowadays ship with pixels somewhere around 1.5µm, and the HTC One’s UltraPixel camera has 2µm pixels. With pixels this big the sensor is going to take in a rather large amount of light, producing more accurate images and therefore better gesture recognition by nature. Given its resolution, this sensor isn’t going to be used for anything outside of gesture controls, eye tracking, depth and motion detection, or biometric applications. The sensor also features an ultra-low 15µA power mode and always-on capability, waking from its ultra-low-power sleep mode only when changes in light are detected. On top of that, it uses a global shutter to capture images, giving it significantly more accurate results in low-light and fast-moving situations. Given that people aren’t going to be waving their hands around in slow motion, this sort of technology needs to be accurate in the harshest conditions. When will we see these sensors in phones? Not likely until at least fall, and even then that’s probably pushing it, so 2015 is probably the first time we’ll actually see this in a production device.
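To see why those pixel-size numbers matter, note that a photosite’s light-gathering area scales with the square of its pitch. A quick back-of-the-envelope comparison (purely illustrative arithmetic, not figures from OmniVision) looks like this:

```python
# Illustrative sketch: comparing photosite areas as a rough proxy for
# per-pixel light gathering. Assumes square pixels; real sensors also
# differ in fill factor, microlenses, and quantum efficiency.

def pixel_area_um2(pitch_um: float) -> float:
    """Area of a square photosite with the given pitch, in square microns."""
    return pitch_um ** 2

ov6211 = pixel_area_um2(3.0)      # OV6211: 3 µm pixels -> 9.0 µm²
typical = pixel_area_um2(1.5)     # typical 2014 phone pixel -> 2.25 µm²
ultrapixel = pixel_area_um2(2.0)  # HTC One UltraPixel -> 4.0 µm²

print(ov6211 / typical)      # 4.0  -> four times the area of a 1.5 µm pixel
print(ov6211 / ultrapixel)   # 2.25 -> over twice the area of an UltraPixel
```

So even against the HTC One’s famously large pixels, the OV6211’s photosites collect more than twice the light per pixel, which is where the low-light accuracy claim comes from.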

[Image: OV6211 block diagram]