Purdue thermal imaging innovation allows AI to see through pitch darkness like broad daylight
From Purdue University:
The patent-pending innovation sees texture and depth and perceives physical attributes of people and environments
Researchers at Purdue University are advancing the world of robotics and autonomy with their patent-pending method that improves on traditional machine vision and perception.
Zubin Jacob, the Elmore Associate Professor of Electrical and Computer Engineering in the Elmore Family School of Electrical and Computer Engineering, and research scientist Fanglin Bao have developed HADAR, or heat-assisted detection and ranging. Their research was featured on the cover of the July 26 issue of the peer-reviewed journal Nature. A video about HADAR is available on YouTube. Nature also has released a podcast episode that includes an interview with Jacob.
Jacob said it is expected that by 2030, one in 10 vehicles will be automated and 20 million robot helpers will be serving people.
“Each of these agents will collect information about its surrounding scene through advanced sensors to make decisions without human intervention,” said Jacob. “However, simultaneous perception of the scene by numerous agents is fundamentally prohibitive.”
Traditional active sensors like LiDAR, or light detection and ranging, radar, and sonar emit signals and subsequently receive them to collect 3D information about a scene. These methods have drawbacks that increase as they are scaled up, including signal interference and risks to people’s eye safety. In comparison, video cameras that work based on sunlight or other sources of illumination are advantageous, but low-light conditions such as nighttime, fog, or rain present a serious impediment.
Traditional thermal imaging is a fully passive sensing method that collects invisible heat radiation originating from all objects in a scene. It can sense through darkness, inclement weather, and solar glare. But Jacob said fundamental challenges hinder its use today.
“Objects and their environment constantly emit and scatter thermal radiation, leading to textureless images famously known as the ‘ghosting effect,’” Bao said. “Thermal pictures of a person’s face show only contours and some temperature contrast; there are no features, making it seem like you have seen a ghost. This loss of information, texture, and features is a roadblock for machine perception using heat radiation.”
HADAR combines thermal physics, infrared imaging, and machine learning to pave the way to fully passive and physics-aware machine perception.
“Our work builds the information-theoretic foundations of thermal perception to show that pitch darkness carries the same amount of information as broad daylight. Evolution has made human beings biased toward the daytime. Machine perception of the future will overcome this long-standing dichotomy between day and night,” said Jacob.
“HADAR vividly recovers the texture from the cluttered heat signal and accurately disentangles temperature, emissivity, and texture, or TeX, of all objects in a scene. It sees texture and depth through the darkness as if it were day and also perceives physical attributes beyond RGB (red, green, and blue) visible imaging or conventional thermal sensing. It is surprising that it is possible to see through pitch darkness like broad daylight,” said Bao.
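To make the idea of "disentangling TeX" more concrete, the sketch below fits a temperature and an emissivity for a single pixel under a deliberately simplified gray-body radiation model, S_λ = e·B_λ(T) + (1 − e)·X_λ, where B_λ(T) is Planck's blackbody radiance and X_λ is the radiance scattered from the environment. This is a conceptual illustration only, not the HADAR algorithm reported in Nature: the assumption of a known environmental term X, a wavelength-independent emissivity, and all function names and band choices are ours.

```python
import numpy as np
from scipy.optimize import least_squares

# Physical constants (SI units)
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance B_lambda(T), W * m^-3 * sr^-1."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * KB * temp_k)
    return a / np.expm1(b)

def tex_residuals(params, wavelengths, measured, env_radiance):
    """Residuals of the simplified model S = e*B(T) + (1 - e)*X per band."""
    temp_k, emissivity = params
    model = emissivity * planck_radiance(wavelengths, temp_k) \
            + (1.0 - emissivity) * env_radiance
    return model - measured

def fit_pixel(wavelengths, measured, env_radiance):
    """Estimate (temperature, emissivity) for one pixel from multi-band data."""
    result = least_squares(
        tex_residuals,
        x0=[300.0, 0.9],                      # initial guess: ~room temp, high emissivity
        bounds=([200.0, 0.0], [400.0, 1.0]),  # plausible T (K) and physical e range
        args=(wavelengths, measured, env_radiance),
    )
    return result.x  # (temperature in K, emissivity)

# Toy usage: 8 long-wave-infrared bands between 8 and 14 micrometers.
wavelengths = np.linspace(8e-6, 14e-6, 8)
env_radiance = planck_radiance(wavelengths, 290.0)        # assumed environmental term X
measured = 0.7 * planck_radiance(wavelengths, 310.0) \
           + 0.3 * env_radiance                           # synthetic pixel: T=310 K, e=0.7
print(fit_pixel(wavelengths, measured, env_radiance))     # recovers roughly (310, 0.7)
```

In the simplified model, the "texture" that conventional thermal cameras wash out lives in the (1 − e)·X term; the many infrared colors Bao mentions below are what make the per-pixel fit well posed.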
The team tested HADAR TeX vision using an off-road nighttime scene.
“HADAR TeX vision recovered textures and overcame the ghosting effect,” said Bao. “It recovered fine textures such as water ripples, bark wrinkles, and culverts in addition to details about the grassy land.”
Planned improvements to HADAR focus on reducing the size of the hardware and increasing the speed of data collection.
“The current sensor is large and heavy since HADAR algorithms require many colors of invisible infrared radiation,” said Bao. “To apply it to self-driving cars or robots, we need to bring down the size and price while also making the cameras faster. The current sensor takes around one second to create one image, but for autonomous cars we need a frame rate of around 30 to 60 hertz, or frames per second.”
HADAR TeX vision’s initial applications are automated vehicles and robots that interact with humans in complex environments. The technology could be further developed for agriculture, defense, geosciences, health care, and wildlife monitoring applications.
Jacob and Bao disclosed HADAR TeX to the Purdue Innovates Office of Technology Commercialization, which has applied for a patent on the intellectual property. Industry partners seeking to further develop the innovations should contact Dipak Narula, [email protected], about 2020-JACO-68773.
Jacob and Bao have received funding from DARPA to support their research. The Office of Technology Commercialization awarded Jacob $50,000 through its Trask Innovation Fund to further develop the research.