A New Way to See the World

Jan 10, 2006


UCLA Researchers Create Low-power Vision Sensors for Embedded Networks

By Marlys Amundson

Vision is among the most powerful ways we have to sense the world around us. However, because of excessive power demands, embedded sensor networks have been unable to leverage imaging to monitor an environment in detail.

In such a system, large numbers of inexpensive sensor nodes could be deployed to detect and reveal information about the environment, including the presence of an intruder in a secure area, movements of people on public transit, or changes in the natural environment.

Modern camera-equipped sensors, although very small and capable of producing very high quality images, can require a great deal of energy to operate.

Mohammad Rahimi, a researcher in the Center for Embedded Networked Sensing (CENS), computer science professor Deborah Estrin, and electrical engineering professors Mani Srivastava and John Villasenor are investigating a number of energy-aware networked image sensor platforms and applications. One such platform, Cyclops, bridges the gap between commercial cameras and sensor network capabilities.

“The model we are using is common to embedded sensing: deploying a large number of small devices operating at small data rates,” explains Srivastava. “Cyclops uses a low-complexity vision sensor capable of tracking variations in light, shape, number, or color while limiting power demands on the network.”

A team of engineers and scientists in the UCLA Henry Samueli School of Engineering and Applied Science has created Cyclops, a tiny platform that attaches to Mica motes, which are commonly used as nodes in wireless sensor networks. The researchers adapted small cameras similar to those used in cell phones and developed the circuitry and software that enable the nodes to process images in context and report any new information. To create the Cyclops platform, UCLA is partnering with Agilent, which has expertise in manufacturing small, integrated cameras.

“Local intelligence on board the individual motes enables them to notify only for specific events,” says Srivastava. “The image sensor collects information, processes it on site, and then sends the relevant information in response to queries from users on the network.”
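The query-driven, event-filtered loop Srivastava describes can be sketched roughly as follows. This is an illustrative Python sketch only, not the actual Cyclops firmware, which runs on resource-constrained motes; the frame size matches the resolution discussed below, and the function names, thresholds, and reporting format are hypothetical.

import numpy as np

FRAME_SHAPE = (160, 160)      # resolution discussed later in the article
DIFF_THRESHOLD = 20           # per-pixel change (0-255) treated as significant
MIN_EVENT_PIXELS = 200        # how many changed pixels constitute an "event"

def capture_frame(intruder=False):
    """Stand-in for the camera driver: a flat scene, optionally with a bright object."""
    frame = np.full(FRAME_SHAPE, 64, dtype=np.uint8)
    if intruder:
        frame[60:100, 60:100] = 200   # simulated object entering the field of view
    return frame

def detect_event(frame, background):
    """Compare a new frame against a background frame and summarize any change."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    changed = diff > DIFF_THRESHOLD
    if changed.sum() < MIN_EVENT_PIXELS:
        return None                   # nothing interesting; stay silent and save energy
    ys, xs = np.nonzero(changed)
    return {"pixels_changed": int(changed.sum()),
            "centroid": (float(ys.mean()), float(xs.mean()))}

def sensing_loop(observations):
    """Process frames locally and report only events, not raw images."""
    background = capture_frame()
    for intruder in observations:
        event = detect_event(capture_frame(intruder), background)
        if event is not None:
            print("report to network:", event)   # a few bytes leave the node, not a full image

if __name__ == "__main__":
    sensing_loop([False, False, True, False, True])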

Villasenor’s team is applying its experience in image processing to develop reliable, inexpensive vision systems designed to use as little energy as possible, making them ideal for embedded networks.

“We’re also interested in the question of how much can be done locally vs. how much processing is done elsewhere,” says Villasenor. “Should the nodes process the images and only ship the end data, which takes very little energy to send but requires very sophisticated image processing on site? Or should they ship the image, which requires more bandwidth but less processing on the sensor itself?”

To limit power requirements for the system and make it sustainable over time, the UCLA researchers have reduced the resolution of the images captured by the motes to approximately 160-by-160 pixels.
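That choice has a direct effect on how much data ever needs to leave a node. As a rough, back-of-the-envelope figure (assuming one byte per pixel for a monochrome image), a single 160-by-160 frame is about 25 kilobytes, while an event report carrying little more than a timestamp, an object count, and a position could fit in a few dozen bytes, roughly a thousandfold reduction in radio traffic per observation.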

“We don’t need perfect images to gather useful data,” says Rahimi, “just enough to detect key elements in the environment. We can make up for any loss in quality by having multiple cameras in a single area, providing multiple perspectives of objects and individuals in a given space.”

In addition to the imaging sensors, the Cyclops system includes a central computer that manages queries from users and reports results from the network, and a database that tracks permanent environmental context information. As with other projects in CENS, all of the tools the team has developed are open source.

Computer science undergraduate students Shaun Ahmadian, David Zats, and Juan Garcia helped develop Cyclops’ application software and participated in a test deployment in which they tracked traffic patterns and movement in a constrained area.

The UCLA research group plans to develop a full testbed, deploying 60 Cyclops units in the Mildred E. Mathias Botanical Garden on campus. The local, outdoor setting will allow the team to easily test new algorithms, database behavior, and network activity under realistic conditions.

“We’re also working on several possible applications for Cyclops,” says Rahimi, “including smart environments and biological studies. A system using infrared cameras would alert researchers in CENS when birds are detected in nest boxes at the James San Jacinto Mountains Reserve, and help them track population changes. The infrared sensors would compensate for the low-light conditions without disturbing the birds.”

When fully developed, imaging systems based on platforms such as Cyclops could serve a range of applications, including monitoring traffic flow on public transportation systems, securing sensitive military areas, or tracking how individuals move through a museum exhibit.
