Smart Headlight That Sees Through Rain

Intel, Carnegie Mellon Devise Prototype Based on Intel Xeon Processor

By Mark LaPedus, Contributing Editor

A group led by Intel Corp. and Carnegie Mellon University has provided more details about the development of an embedded, smart automotive headlight designed to see through rain and snow.

Traditional headlights illuminate raindrops and snowflakes, making them flicker and sometimes distracting the driver. In contrast, the smart headlight is capable of avoiding illuminating the precipitation to improve driver visibility, according to researchers, who evaluated the system using an Intel processor and a 2006 Honda Civic headlight at night with no street lights present.

Researchers from Carnegie Mellon, Mines ParisTech and Texas Instruments first presented a paper on the subject in April at the IEEE International Conference on Computational Photography (ICCP). The work was supported by grants from the Samsung Advanced Institute of Technology, the Intel Science and Technology Center and others.

At Intel's annual research event in San Francisco on Tuesday (June 26), the chip giant and Carnegie Mellon provided more details on the technology and gave a demonstration. The current smart headlight consists of separate parts, including a projector, camera, beamsplitter and an embedded processor. Processing was performed on a 3.2-GHz Intel Xeon processor with 8GB of RAM running a 64-bit Windows Vista operating system. Researchers also constructed a drop-generation test bed.

Researchers also used a monochrome camera with an Ethernet interface capable of capturing 120 frames per second over a 120 x 244-pixel region. The projector is a Viewsonic PJD6251 DLP with a resolution of 1024 x 768 at 120Hz.

In the lab, a co-located camera-projector system images and illuminates rain or snow. The particles are first detected by illuminating them for a few milliseconds; their locations are then predicted by an algorithm, and the light rays that would intersect them are reactively turned off.
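
In rough outline, one capture-to-reaction cycle might look like the Python sketch below. It is illustrative only: the threshold-based detector, the safety margin around each drop and all the names are assumptions, not the researchers' actual code, and the 16-pixel-per-frame fall speed is borrowed from the single-drop experiment described later in this article.

```python
# Illustrative sketch of a detect-predict-deactivate cycle for a co-located
# camera-projector system. All names, the threshold detector and the safety
# margin are hypothetical; this is not the CMU/Intel prototype code.
import numpy as np

def headlight_step(camera_frame, px_per_frame=16, latency_frames=2):
    """One capture-to-reaction cycle on a single camera frame.

    camera_frame   : 2D array from the co-located camera
    px_per_frame   : assumed near-constant fall speed, in pixels per frame
    latency_frames : how far ahead to predict, to cover the system latency
    """
    # 1. Detect bright streaks (illuminated drops) with a simple threshold.
    threshold = camera_frame.mean() + 3 * camera_frame.std()
    detected = np.argwhere(camera_frame > threshold)

    # 2. Predict where each drop will be by the time the projector reacts,
    #    assuming near-constant downward velocity.
    predicted = [(r + px_per_frame * latency_frames, c) for r, c in detected]

    # 3. Build the projector mask: start fully lit, then darken the rays that
    #    would intersect the predicted drop positions (with a small margin).
    mask = np.ones(camera_frame.shape, dtype=bool)
    rows, cols = mask.shape
    for r, c in predicted:
        if 0 <= r < rows:
            mask[max(r - 2, 0):min(r + 3, rows), max(c - 2, 0):min(c + 3, cols)] = False
    return mask  # True = keep the projector pixel on
```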

"The entire process from capture to reaction takes about 13ms," said Mei Chen, principal investigator for the Intel Science and Technology Center in Embedded Computing (ISTC-EC). ISTC-EC is part of Intel Labs Pittsburgh, which is working on research projects in cloud computing systems and embedded real-time intelligent systems.

In a stationary setting in the lab, researchers captured images of 4mm-diameter water drops illuminated by a halogen lamp at different distances from the camera. Light intensity was measured as a function of distance using a 1024 x 1024 image and a 4ms exposure time.

In a simulation of a system moving at 30 km/hr, the technology achieved an accuracy of 69% or less. The smart headlight is currently 70% accurate, but the eventual goal is to devise a system with 90% accuracy, Chen said.

However, the system with a 13ms latency can't handle extreme conditions like snow without losing too much light; a system with 1.5ms latency is required in extreme conditions. "The other goal is to make (the smart headlight) into an integrated system," she added.
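
As a rough back-of-the-envelope illustration of why latency matters (the 9 m/s figure below is an assumed textbook terminal velocity for large raindrops, not a number from the researchers), a particle travels roughly ten times farther during a 13ms reaction than during a 1.5ms one, so the slower system must darken a much larger region of the beam per particle:

```python
# Back-of-the-envelope: how far a particle falls during the system's latency.
# The ~9 m/s terminal velocity is an assumed textbook value for large raindrops,
# not a figure from the CMU/Intel study.
def fall_distance_cm(latency_ms, terminal_velocity_m_s=9.0):
    return terminal_velocity_m_s * (latency_ms / 1000.0) * 100.0

print(fall_distance_cm(13.0))  # ~11.7 cm of travel during a 13ms reaction
print(fall_distance_cm(1.5))   # ~1.4 cm during a 1.5ms reaction
```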

Researchers are also improving the prediction algorithm. In one experiment, a drop is first imaged with a 5ms camera exposure time. Some 14 frames are needed for the drop to traverse the field of view (FOV), with the drop falling at a near-constant velocity of 16 pixels per frame. In this effort, researchers achieved 99.7% light throughput and 83.6% accuracy.
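
Using the figures quoted above (a near-constant fall speed of 16 pixels per frame and about 14 frames to cross the FOV), the extrapolation step amounts to something like the following sketch; the function is hypothetical and ignores horizontal drift:

```python
# Hypothetical constant-velocity extrapolation for a single falling drop.
# Numbers match the experiment described above: the drop crosses the field of
# view in about 14 frames, moving roughly 16 pixels per frame.
PX_PER_FRAME = 16
FRAMES_TO_CROSS_FOV = 14

def predict_track(first_row, first_col, frames_ahead):
    """Predict the drop's (row, col) position `frames_ahead` frames after detection,
    assuming near-constant downward velocity and negligible horizontal drift."""
    return first_row + PX_PER_FRAME * frames_ahead, first_col

# Example: predicted positions for the drop's pass through the FOV.
track = [predict_track(0, 120, k) for k in range(FRAMES_TO_CROSS_FOV)]
print(track[:3])  # [(0, 120), (16, 120), (32, 120)]
```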

It is more difficult to predict and avoid illuminating 16 drops emitted at a rate of 32 drops per second, the equivalent of a strong thunderstorm. In this case, some 65 frames are needed for the 16 drops to traverse the FOV. The smart headlight still achieved a light throughput of 98.1%, but accuracy dropped to 54.14%. The main bottleneck is the misclassification of pixels as drops, according to researchers.
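
Both figures of merit reduce to simple counts. The sketch below assumes plausible definitions (light throughput as the fraction of projector pixels left on, accuracy as the fraction of true drop pixels that were not illuminated); the paper's exact formulas may differ:

```python
# Assumed definitions (the paper's exact formulas may differ):
#   light throughput = fraction of projector pixels that stay on
#   accuracy         = fraction of true drop pixels that were not illuminated
import numpy as np

def evaluate(projector_mask, true_drop_mask):
    """projector_mask: True where a projector pixel is ON.
    true_drop_mask : True where a real drop actually is."""
    light_throughput = projector_mask.mean()
    drops = true_drop_mask.sum()
    still_lit = (projector_mask & true_drop_mask).sum()
    accuracy = 1.0 - still_lit / drops if drops else 1.0
    return light_throughput, accuracy

# The researchers cite misclassification of pixels as drops (false positives)
# as the main bottleneck in the multi-drop scenario.
```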


Mark LaPedus has covered the semiconductor industry since 1986, including five years in Asia when he was based in Taiwan. He has held senior editorial positions at Electronic News, EBN and Silicon Strategies. Most recently, he worked as the semiconductor editor at EE Times.