DARPA has developed a new sensor that can see through clouds, operating just as effectively as current systems do in clear weather.
The Extremely High Frequency (EHF) targeting sensor successfully demonstrated its ability to capture real-time video through clouds, marking the completion of its flight tests.
Ultimately, the Pentagon is hoping to create a system that can be mounted on different types of aircraft to capture high-resolution video of targets on the ground, regardless of the weather conditions.
The EHF targeting sensor comes from DARPA’s Video Synthetic Aperture Radar (ViSAR) program, which launched in 2013.
According to the agency, the sensor could be used to provide overhead support to ground troops.
It could ultimately be integrated with a movable gimbal for full-motion video of the target, in cloudy or clear skies.
‘The recent flight of the ViSAR sensor marked a major program milestone toward our goal, proving that we can take uninterrupted live video of targets on the ground even when flying through or above clouds,’ said Bruce Wallace, program manager in DARPA’s Strategic Technology Office.
‘The EO/IR sensors on board the test aircraft went blank whenever clouds obscured the view, but the synthetic aperture radar tracked the ground objects continuously throughout the flight.’
Current airborne weapon systems rely on electro-optical and infrared (EO/IR) sensors, which go blind when clouds block the view.
Synthetic aperture radar, by contrast, can see through clouds, but existing radar systems cannot capture high-resolution video, according to DARPA.
The new sensor, if built to fit in a standard EO/IR sensor gimbal, would be the first to do both, maintaining frame rates fast enough to track moving targets on the ground.
For the tests, the researchers flew the sensor aboard a modified DC-3 aircraft at low and medium altitudes, then compared the ViSAR imagery against data from the aircraft's EO and IR sensors.
In the next phase, they'll integrate the sensor into an aircraft equipped with a complete battle management system, capable of carrying out real-time target engagement.
‘Refining the ViSAR sensor’s visualization software to provide operators a representation they’re used to seeing is the next step in the program,’ said Wallace.
‘We don’t want operators in the back of an aircraft to need special radar training to interpret the sensor’s data – we are working to make the visual interface as easy as existing EO/IR sensor displays.’