Now scientists can celebrate, as they no longer need to purchase an expensive air sensor just to see the level of air pollution in their town. How? Quoting Gizmag:
Visibility sends a user's photo of the sky as a small black-and-white file to a central computer, along with data from their phone's GPS, compass, clock and accelerometer. The computer compares the luminance value of the sky in the photo to algorithmic models for the specific time and coordinates at which the phone data indicates the image was taken. If the sky in the photo isn't as bright as the model says it should be, it means that some of the sunlight wasn't making it through the atmosphere, because it was blocked by haze aerosols. Not only does the computer then send the user a report on the level of air pollution, but it also stores the information (without the user's identity) to augment pollution maps for the area.
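The comparison step described above can be sketched as a minimal calculation. To be clear, this is an illustration of the general idea (a luminance deficit relative to a clear-sky model), not the app's actual algorithm; the function name and the 0-255 brightness scale are assumptions.

```python
def estimate_haze(measured_luminance, model_luminance):
    """Return the fraction of expected light lost to haze aerosols.

    measured_luminance: average sky brightness from the photo (0-255)
    model_luminance: brightness a clear-sky model predicts for the
                     photo's GPS coordinates and timestamp (0-255)
    """
    if model_luminance <= 0:
        raise ValueError("model luminance must be positive")
    # If the sky is dimmer than the model predicts, attribute the
    # deficit to sunlight blocked by haze aerosols.
    deficit = max(model_luminance - measured_luminance, 0)
    return deficit / model_luminance

# Example: the model predicts a brightness of 200 but the photo
# measures only 150, so a quarter of the expected light was blocked.
print(estimate_haze(150, 200))  # 0.25
```

A sky brighter than the model predicts simply reads as zero haze here; the real system presumably handles that case, along with calibration across phone cameras, far more carefully.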
There is some user fallibility to factor in, such as holding the camera at the correct orientation to the sky when snapping a photo, which is why the app has a built-in guide that helps the user orient the phone.
The researchers have tested the app in Los Angeles, California and Phoenix, Arizona -- two notoriously polluted cities -- and the app's readings compare favorably to the air quality data published by the EPA. So in essence, this is a potentially very useful app even if you're not a scientist.