Invisible Headlights is a programme being run by the Defense Advanced Research Projects Agency (DARPA) with the aim of finding ways for autonomous vehicles to navigate in complete darkness using only passive sensors.
DARPA has selected four industry and university research teams to examine how ambient thermal light can be exploited to create a passive 3D sensor that will allow autonomous vehicles to navigate their course.
The following research teams will address these challenges:
- Areté Associates will perform virtual analyses to understand the ambient spectral and polarimetric infrared environment and develop low-contrast-capable 3D vision algorithms.
- Kitware will use customised, multi-band hyperspectral infrared cameras combined with artificial intelligence and machine learning algorithms to estimate local 3D scene structure and semantics for navigation.
- A team led by the Massachusetts Institute of Technology (MIT) is working to scale superconducting nanowire single photon detector (SNSPD) technology into a very low noise infrared sensing array.
- Purdue University is developing new types of ultrafast, spin-based sensors and 3D vision approaches that exploit properties of the ambient thermal environment.
“These teams are pursuing innovative approaches to exploit the infrared spectrum,” said Joe Altepeter, Invisible Headlights programme manager in DARPA’s Defense Sciences Office. “They are exploring ways to capture more scene information using new devices, improved algorithms, and increased measurement diversity.”
In addition, a government team led by the Army C5ISR Center’s Night Vision and Electronic Sensors Directorate (NVESD) is accelerating the research teams’ work by assembling an unprecedented set of ground hyperspectral and polarimetric imagery with associated 3D ground truth.
During Phase 1, programme performers are tasked with studying whether thermal emissions contain sufficient information to enable autonomous driving in very dark conditions. In Phase 2, teams will design and test sensors and algorithms to show that real systems can measure enough information for 3D vision. The final phase will build and test passive demonstration systems that will be compared with active sensors in field tests.
by Andrew Drwiega