Researchers at Purdue University have developed a new method that advances robotics and autonomy by improving on traditional machine vision and perception.
Zubin Jacob, an associate professor of Electrical and Computer Engineering, and research scientist Fanglin Bao developed the technique, called HADAR (heat-assisted detection and ranging). The approach was recently featured on the cover of the peer-reviewed journal Nature, and a video showcasing HADAR is available on YouTube.
By 2030, it is expected that one in every ten vehicles will be automated and that around 20 million robot helpers will be serving people. These autonomous agents will use advanced sensors to gather information about their surroundings and make decisions without human intervention. However, having many agents perceive a scene simultaneously presents significant challenges.
Traditional active sensors such as LiDAR, radar, and sonar emit signals to collect 3D information about a scene. While effective, they face limitations and risks that grow as they are scaled up, including signals from many agents interfering with one another. Video cameras, by contrast, passively use natural light, but they struggle in darkness and in adverse conditions such as fog or rain.
Traditional thermal imaging, which captures heat radiation emitted by objects in a scene, is fully passive and can see through darkness and challenging weather. However, it suffers from a “ghosting effect” that produces textureless images, which makes it poorly suited to machine perception.
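In simplified radiometric terms, the radiance a thermal camera records at each pixel mixes an object's own heat emission with environmental radiation scattered off its surface. A sketch of such a model (the notation here is illustrative, not the paper's exact formulation) is

S(ν) = e(ν) · B(ν, T) + [1 − e(ν)] · X(ν),

where B(ν, T) is the blackbody (Planck) spectrum of an object at temperature T, e(ν) is its emissivity, and X(ν) is the radiation arriving from the surroundings. The smooth emission term typically dominates, while most of the geometric texture rides on the much weaker scattered term, which is why raw thermal images look washed out.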
HADAR combines thermal physics, infrared imaging, and machine learning to create fully passive, physics-aware machine perception. It can recover texture from cluttered heat signals, accurately distinguish the temperature, emissivity, and texture (TeX) of objects, and see through darkness as if it were daylight.
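As a rough illustration of the idea only (not the Purdue team's algorithm), the Python sketch below uses the simplified two-term model above to recover a single pixel's temperature and emissivity from radiance measured in a few infrared bands. The band wavelengths, temperatures, and the assumption of a wavelength-independent emissivity are all hypothetical.

```python
# Toy illustration of a TeX-style decomposition: given thermal radiance
# measured in several infrared bands, jointly estimate a pixel's temperature
# (T) and emissivity (e). Wavelengths, temperatures, and the two-term
# radiometric model are illustrative assumptions, not the HADAR algorithm.
import numpy as np
from scipy.optimize import least_squares

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23  # Planck, speed of light, Boltzmann

def planck(lam, T):
    """Blackbody spectral radiance at wavelength lam (m) and temperature T (K)."""
    return (2 * H * C**2 / lam**5) / np.expm1(H * C / (lam * KB * T))

# Hypothetical long-wave infrared bands (8-14 micrometres).
bands = np.array([8e-6, 10e-6, 12e-6, 14e-6])
T_env = 270.0                 # assumed temperature of the surroundings (K)
T_true, e_true = 300.0, 0.7   # ground-truth pixel parameters for the demo

# Simulated measurement: direct emission plus scattered environmental radiation.
measured = e_true * planck(bands, T_true) + (1 - e_true) * planck(bands, T_env)

def residuals(params):
    # Difference between the two-term model and the measured radiance.
    T, e = params
    model = e * planck(bands, T) + (1 - e) * planck(bands, T_env)
    return model - measured

fit = least_squares(residuals, x0=[280.0, 0.5],
                    bounds=([200.0, 0.0], [400.0, 1.0]))
T_hat, e_hat = fit.x
print(f"estimated T = {T_hat:.1f} K, emissivity = {e_hat:.2f}")
```

Decomposing the measurement into emitted and scattered contributions in this way, done at scale and with machine learning, is the kind of step that lets the texture-bearing part of the signal be recovered rather than discarded.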
The researchers tested HADAR in a nighttime off-road scene and successfully recovered textures, including fine details like water ripples and bark wrinkles.
Although the current sensor used in HADAR is large and heavy because it must capture many colors of invisible infrared radiation, the team is working to shrink it and speed up its data collection so it can be used in self-driving cars and robots. HADAR TeX vision is initially aimed at automated vehicles and robots operating in complex environments, but its potential extends to agriculture, defense, geosciences, healthcare, and wildlife monitoring.
- Eurekalert