Airbus details Project EAGLE for autonomous aircraft
By Helicopters Staff
The helicopter division of Airbus is developing a new system called EAGLE to serve as the eyes of autonomous aircraft. The system has not yet been deployed on fully autonomous vehicles, the company explains, but it describes EAGLE as an important step toward making such aircraft commercially viable.
EAGLE stands for Eye for Autonomous Guidance and Landing Extension and serves as a real-time video processing system for aircraft. In flight, Airbus explains, it collects data from sources such as cameras and analyzes that information with a computer algorithm that has been trained to use the imagery in conjunction with autopilot functions.
“It’s a complete loop,” said Nicolas Damiani, senior expert in systems simulation at Airbus Helicopters. “EAGLE provides information to the autopilot. The autopilot displays how it intends to manage the trajectory to the target point. And the pilot monitors the parameters to make sure they are coherent with the image that is being acquired by EAGLE.”
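The loop Damiani describes can be sketched in a few lines of Python. Everything here is illustrative: the class and function names, the data shapes, and the toy guidance rule are assumptions for the sake of the sketch, not Airbus's actual interfaces.

```python
# Hypothetical sketch of the perceive-plan-monitor loop described above.
# EAGLE supplies a target fix from imagery; the autopilot derives a
# trajectory correction; the pilot would monitor both on a display.
from dataclasses import dataclass

@dataclass
class TargetFix:
    bearing_deg: float    # horizontal angle to the helipad in the image
    elevation_deg: float  # vertical angle to the helipad in the image

def autopilot_command(fix: TargetFix, glideslope_deg: float = 4.0):
    """Toy guidance rule: steer toward the target and hold the slope."""
    heading_correction = fix.bearing_deg                   # turn toward the pad
    pitch_correction = glideslope_deg + fix.elevation_deg  # hold the approach slope
    return heading_correction, pitch_correction

# One "frame" of the loop: a fix comes in, a correction goes out.
fix = TargetFix(bearing_deg=2.0, elevation_deg=-0.5)
heading, pitch = autopilot_command(fix)
print(heading, pitch)  # 2.0 3.5
```

The point of the sketch is only the data flow: vision output feeds the autopilot, and the same imagery is shown to the pilot so the two can be checked against each other.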
The company explains that fully automating the approach reduces pilot workload during a critical phase of flight. The current EAGLE prototype uses three cameras, each producing around 14 million pixels, housed in a gyrostabilised gimbal pod.
The research team led by Damiani worked toward what it describes as an ambitious test case: detect a helipad 2,000 metres away on a shallow four-degree approach slope. Airbus says the resolution needed to zero in on such a target explains the current configuration of EAGLE's cameras.
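For a sense of scale, the test-case geometry can be checked with basic trigonometry. The helipad diameter below is an assumed value, not from the article; it is included only to show how small such a target appears at that range.

```python
import math

slant_range_m = 2000.0  # distance to the helipad in the test case
slope_deg = 4.0         # shallow approach slope from the article

# Height above the pad at the start of the approach
height_m = slant_range_m * math.sin(math.radians(slope_deg))
print(round(height_m))  # ~140 m

# Assumed helipad diameter (hypothetical) to gauge angular size
pad_diameter_m = 20.0
angular_size_deg = math.degrees(math.atan2(pad_diameter_m, slant_range_m))
print(round(angular_size_deg, 2))  # ~0.57 deg: a tiny target, hence the
                                   # need for a high-resolution narrow camera
```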
As the aircraft (in the test, an H225) starts its approach, Airbus explains, the camera with a narrow field of view sends its input to the system. At medium range, the system switches to a camera with a larger field of view, according to Airbus, then switches again during the landing to a camera equipped with a fish-eye lens.
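The range-based camera handover might look something like the sketch below. The handover distances are invented for illustration; Airbus does not publish the actual thresholds.

```python
# Hypothetical sketch of the narrow -> wide -> fish-eye handover the
# article describes. The threshold distances are assumptions.
def select_camera(range_to_pad_m: float) -> str:
    if range_to_pad_m > 1000:   # far out: narrow field of view for resolution
        return "narrow"
    elif range_to_pad_m > 100:  # mid-approach: larger field of view
        return "wide"
    else:                       # final landing: fish-eye lens
        return "fisheye"

for r in (2000, 500, 50):
    print(r, select_camera(r))
```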
“EAGLE is compatible with different camera options,” said Damiani, noting the optronics are pure prototypes at this stage. “Because we need to analyse 14 million pixels at around 30 Hz, its interface is made for high-resolution video streams. But EAGLE is also capable of using input from standard cameras.” This may mean using a simpler optical setup based on cameras already installed on today’s helicopters, Airbus explains, to automate approaches when it is not necessary to see the target in such detail.
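Damiani's figures imply a substantial data rate, which a back-of-the-envelope calculation makes concrete. The bytes-per-pixel figure is an assumption (8-bit monochrome); the article does not state the pixel format.

```python
# Data rate implied by "14 million pixels at around 30 Hz".
pixels_per_frame = 14_000_000  # ~14 million pixels per camera
frame_rate_hz = 30             # "around 30 Hz"
bytes_per_pixel = 1            # assumption: 8-bit monochrome

pixel_rate = pixels_per_frame * frame_rate_hz
print(pixel_rate)                          # 420,000,000 pixels/s
print(pixel_rate * bytes_per_pixel / 1e6)  # 420.0 MB/s per camera
```

Even under this conservative assumption, a single camera produces hundreds of megabytes per second, which is why the interface is built for high-resolution streams.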
On the computing side, EAGLE’s core processors are described as the centre of the project’s focus: before EAGLE can be industrialised, the algorithms and the processing unit – made up of 768 shading units dedicated to graphics and 12 processors – must first be certified.
“As soon as an algorithm is based on image processing, it is difficult to get it certified because it relies on technologies that are currently hard to verify,” said Damiani. “The paradox is that the algorithms, especially those based on machine learning, rely on training the algorithm to learn by itself. In the end, even if the performance is good, it’s very difficult to predict its result because of operational conditions that differ from the image sets used for the training.”