Sunday, June 13, 2010

Tiny sensors tucked into cell phones could map airborne toxins in real time

A tiny silicon chip that works a bit like a nose may one day detect dangerous airborne chemicals and alert emergency responders through the cell phone network.

If embedded in many cell phones, its developers say, the new type of sensor could map the location and extent of hazards like gas leaks or the deliberate release of a toxin.

"Cell phones are everywhere people are," said Michael Sailor, professor of chemistry and biochemistry at the University of California, San Diego who heads the research effort. "This technology could map a chemical accident as it unfolds."

In collaboration with Rhevision, Inc., a small startup company in San Diego, Sailor's research group at UCSD has finished the first phase of the sensor's development and has begun work on a prototype that will link to a cell phone.



Caption: Hundreds of separate spots on this flake of silicon can be engineered to change color in response to many different chemicals. By capturing the pattern of color changes with a new kind of supermacro lens, researchers plan to create a versatile sensor, small enough to fit into a cell phone, that can recognize a wide variety of chemical hazards.

Credit: Sailor Lab/UCSD.

The sensor, a porous flake of silicon, changes color when it interacts with specific chemicals. By manipulating the shape of the pores, the researchers can tune individual spots on the silicon flake to respond to specific chemical traits.
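For a rough sense of how a single spot could report a chemical: porous silicon layers are commonly described as thin-film interference filters, so vapor condensing in the pores raises the layer's effective refractive index and shifts the reflected color. The sketch below uses that textbook picture with made-up numbers; it is an illustration, not the group's published model.

    # Rough sketch: how a porous-silicon spot's color could shift when vapor
    # condenses in its pores. Assumes a simple thin-film interference model
    # (peak wavelength = 2 * effective_index * thickness), a common textbook
    # description rather than the researchers' actual method.

    def peak_wavelength_nm(effective_index: float, thickness_nm: float) -> float:
        """First-order interference peak for a thin porous layer at normal incidence."""
        return 2.0 * effective_index * thickness_nm

    # Hypothetical numbers for illustration only.
    empty_pores = peak_wavelength_nm(effective_index=1.30, thickness_nm=200.0)   # air-filled pores
    filled_pores = peak_wavelength_nm(effective_index=1.38, thickness_nm=200.0)  # vapor in the pores

    print(f"peak shifts from {empty_pores:.0f} nm to {filled_pores:.0f} nm "
          f"({filled_pores - empty_pores:+.0f} nm) -> visible color change")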

"It works a little like our nose," Sailor said. "We have a set of sensory cells that detect specific chemical properties. It's the pattern of activation across the array of sensors that the brain recognizes as a particular smell. In the same way, the pattern of color changes across the surface of the chip will reveal the identity of the chemical."

Already their chips can distinguish between methyl salicylate, a compound used to simulate the chemical warfare agent mustard gas, and toluene, a common additive in gasoline. Potentially, they could discriminate among hundreds of different compounds and recognize which might be harmful.
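To make the "pattern of color changes" idea concrete, here is a minimal sketch of fingerprint matching: each candidate chemical gets a reference pattern of per-spot shifts, and an observed pattern is assigned to the nearest reference. The fingerprints and the nearest-neighbor rule are illustrative assumptions, not the team's actual algorithm.

    # Minimal sketch of the "nose" idea: match an observed pattern of per-spot
    # color shifts against reference fingerprints. Fingerprint values and the
    # nearest-neighbor matching are assumptions for illustration.

    import math

    # Hypothetical reference fingerprints: wavelength shift (nm) at four spots.
    FINGERPRINTS = {
        "methyl salicylate": [32.0, 5.0, 18.0, 2.0],
        "toluene":           [4.0, 27.0, 6.0, 15.0],
    }

    def identify(observed: list[float]) -> str:
        """Return the reference chemical whose fingerprint is closest (Euclidean distance)."""
        return min(FINGERPRINTS, key=lambda name: math.dist(observed, FINGERPRINTS[name]))

    print(identify([30.0, 6.0, 17.0, 3.0]))  # -> methyl salicylate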

A megapixel camera smaller than the head of a pencil eraser captures the image from the array of nanopores in Sailor's chip.

To focus on the fine-scale detail in their optical array, the team uses a new kind of supermacro lens that works more like an animal's eye than a camera lens. The lens, developed by Rhevision, uses fluid rather than bulky moving parts to change its shape, and therefore its focus.

"The beauty of this technology is that the number of sensors contained in one of our arrays is determined by the pixel resolution of the cell phone camera. With the megapixel resolution found in cell phone cameras today, we can easily probe a million different spots on our silicon sensor simultaneously. So we don't need to wire up a million individual sensors," Sailor said. "We only need one. This greatly simplifies the manufacturing process because it allows us to piggyback on all the technology development that has gone into making cell phone cameras lighter, smaller, and cheaper."

Sensitivity to additional chemicals is on the way. One of the top priorities for emergency responders is carbon monoxide, a deadly gas that firefighters can't smell in the midst of a sooty fire. Sensors on their masks could let them know when to switch to self-contained breathing devices, Sailor said. Similar sensors might warn miners of the buildup of explosive gases.
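A mask-mounted warning of that kind ultimately reduces to simple alert logic, along these lines; the 50 ppm threshold below is a placeholder for illustration, not a figure from the article or the researchers.

    # Toy sketch of the alert logic a mask-mounted sensor might run: compare a
    # carbon monoxide reading against a warning threshold. The threshold is a
    # placeholder, not a recommendation.

    CO_WARNING_PPM = 50.0  # placeholder threshold for illustration

    def should_switch_to_scba(co_ppm: float, threshold: float = CO_WARNING_PPM) -> bool:
        """True if the reading says it's time to switch to self-contained air."""
        return co_ppm >= threshold

    for reading in (12.0, 48.0, 75.0):
        print(reading, "ppm ->", "ALERT" if should_switch_to_scba(reading) else "ok")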

Adrian Garcia Sega, a graduate student in Sailor's laboratory, is leading the effort to develop the sensors. Gordon Miskelly, deputy director of forensic science at the University of Auckland in New Zealand, developed the imaging-array sensing methodology. Yu-Hwa Lo, professor of electrical and computer engineering at UC San Diego's Jacobs School of Engineering and founder of Rhevision, developed the lens. Truong Nguyen, professor of electrical and computer engineering at the Jacobs School, is developing the computing algorithms that discriminate between different patterns.

The project is funded by the Department of Homeland Security.

Contact: Michael Sailor, scinews@ucsd.edu. Web: University of California - San Diego
