
A smartphone app that measures pollutants in the air

A new smartphone application lets users keep tabs on air quality and see the level of pollutants around them.

Computer scientists at the University of Southern California hope that as many users as possible will download and try it to help improve the software. Currently the app works on smartphones running the Android system and will soon be widely available through Android app stores. An iPhone app is in the works.

The basic principle of the Visibility app is simple, according to the paper documenting the work by USC computer science professor Gaurav Sukhatme.

The user takes a picture of the sky while the sun is shining; the image can then be compared to established models of sky luminance to estimate visibility. Visibility is directly related to the concentration of harmful “haze aerosols,” tiny particles from dust, engine exhaust, mining or other sources in the air. Such aerosols turn the blue of a sunlit clear sky gray. There is one caveat: it has to be the right picture. The visibility/pollution models depend on the viewing geometry of the image and the position of the sun.
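To make that comparison concrete, here is a minimal sketch of the kind of clear-sky luminance model such an analysis could rely on, using the CIE standard clear-sky formula with sky type 12 coefficients. This is an illustrative stand-in, not the specific model used in the USC paper.

```python
import math

# Coefficients for the CIE Standard General Sky, type 12 ("standard clear sky,
# low luminance turbidity"). Illustrative choice; the paper's model may differ.
A, B, C, D, E = -1.0, -0.32, 10.0, -3.0, 0.45

def gradation(zenith_angle):
    """Luminance gradation from horizon to zenith; zenith_angle in radians."""
    return 1.0 + A * math.exp(B / math.cos(zenith_angle))

def indicatrix(sun_distance):
    """Scattering indicatrix; sun_distance = angle between sky element and sun (radians)."""
    return (1.0 + C * (math.exp(D * sun_distance) - math.exp(D * math.pi / 2))
            + E * math.cos(sun_distance) ** 2)

def relative_luminance(zenith_angle, sun_distance, sun_zenith):
    """Luminance of a sky element relative to the zenith luminance for a clear sky."""
    return (indicatrix(sun_distance) * gradation(zenith_angle)) / (
        indicatrix(sun_zenith) * gradation(0.0))
```

A hazy sky deviates from this clear-sky prediction; fitting a turbidity-like parameter to the measured deviation across the photographed patch of sky is one way to turn the picture into a visibility estimate.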

The Visibility app works because modern smartphones contain a rich set of sensors that include cameras, GPS systems, compasses and accelerometers, in addition to the powerful communication capabilities that are inspiring a slew of intelligent phone applications ranging from personal health monitoring to gaming and social networking.

Sameera Poduri, a postdoctoral researcher in Sukhatme’s lab, explained that the accelerometer in the phone (the sensor that detects how the user is holding the phone and determines whether it displays information vertically or horizontally) can “guide the user to point the camera in exactly the right direction.” The picture must be all or mostly sky, which makes the human user’s judgment critical. “Several computer vision problems that are extremely challenging to automate are trivially solved by a human. In our system, segmenting sky pixels in an arbitrary image is one such problem. When the user captures an image, we ask him [or her] to select a part of the image that is sky,” the research paper notes.
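As a rough illustration of how an accelerometer reading can guide the camera, the sketch below estimates how far above the horizon the rear camera is pointing and nudges the user toward a target elevation. The axis convention (Android-style: x to the right, y toward the top of the screen, z out of the screen) and the target angle are assumptions for the example, not details taken from the app.

```python
import math

def camera_elevation_deg(ax, ay, az):
    """Approximate elevation of the rear camera above the horizon, in degrees,
    from a stationary accelerometer reading (ax, ay, az).
    Assumes Android-style axes with the rear camera looking along -z."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0.0:
        raise ValueError("no gravity reading")
    # While the phone is held still, the accelerometer measures the reaction to
    # gravity, i.e. a vector pointing "up" in device coordinates. The rear
    # camera looks along -z, so its elevation is asin of the -z component of "up".
    return math.degrees(math.asin(max(-1.0, min(1.0, -az / g))))

def pointing_hint(elevation_deg, target_deg=45.0, tolerance_deg=5.0):
    """Simple guidance toward a target elevation so the frame is mostly sky.
    The target and tolerance here are arbitrary example values."""
    if elevation_deg < target_deg - tolerance_deg:
        return "tilt the camera up"
    if elevation_deg > target_deg + tolerance_deg:
        return "tilt the camera down"
    return "hold steady and take the picture"
```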

The accelerometers and the compass on the phone capture its orientation in three dimensions, while the GPS data and the time are used to compute the exact position of the sun. The application automatically computes the camera and solar orientation and uploads this data, along with the image (a small, 100KB black-and-white file), to a central computer. The central computer analyzes the image to estimate pollutant content, returns a message to the user, and registers the information (user identities are anonymized).

The system could potentially help fill in the many blanks in existing maps of air pollution. So far the results are promising, but they indicate that several improvements are possible. Sukhatme added: “We’re sure we can improve it if we get people trying it and testing it and sending data.”
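For a sense of how the sun’s position can be derived from GPS coordinates and a timestamp, here is a low-precision sketch based on standard solar-geometry approximations (the NOAA-style formulas below are an assumed stand-in; the article does not describe the app’s actual computation).

```python
import math

def solar_position(lat_deg, lon_deg, day_of_year, utc_hours):
    """Approximate solar elevation and azimuth (degrees) from latitude,
    longitude (degrees, east positive), day of year, and UTC time in hours.
    Low-precision sketch; fine for orienting a camera, not for ephemerides."""
    # Fractional year in radians.
    gamma = 2.0 * math.pi / 365.0 * (day_of_year - 1 + (utc_hours - 12.0) / 24.0)

    # Equation of time (minutes) and solar declination (radians).
    eqtime = 229.18 * (0.000075 + 0.001868 * math.cos(gamma)
                       - 0.032077 * math.sin(gamma)
                       - 0.014615 * math.cos(2 * gamma)
                       - 0.040849 * math.sin(2 * gamma))
    decl = (0.006918 - 0.399912 * math.cos(gamma) + 0.070257 * math.sin(gamma)
            - 0.006758 * math.cos(2 * gamma) + 0.000907 * math.sin(2 * gamma)
            - 0.002697 * math.cos(3 * gamma) + 0.00148 * math.sin(3 * gamma))

    # True solar time (minutes) and hour angle (radians).
    tst = utc_hours * 60.0 + eqtime + 4.0 * lon_deg
    ha = math.radians(tst / 4.0 - 180.0)

    lat = math.radians(lat_deg)
    cos_zen = (math.sin(lat) * math.sin(decl)
               + math.cos(lat) * math.cos(decl) * math.cos(ha))
    zen = math.acos(max(-1.0, min(1.0, cos_zen)))
    elevation = 90.0 - math.degrees(zen)

    # Azimuth measured clockwise from north; undefined when the sun is at zenith.
    sin_zen = math.sin(zen)
    if sin_zen < 1e-9:
        return elevation, 0.0
    cos_az = (math.sin(decl) - math.sin(lat) * math.cos(zen)) / (math.cos(lat) * sin_zen)
    azimuth = math.degrees(math.acos(max(-1.0, min(1.0, cos_az))))
    if ha > 0:  # afternoon: the sun is west of due south (northern hemisphere)
        azimuth = 360.0 - azimuth
    return elevation, azimuth
```

With the solar elevation and azimuth known, the camera orientation from the accelerometer and compass can be expressed relative to the sun, which is exactly the viewing geometry the luminance comparison above needs.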
