DARVIS - Data Fusion of K/X Band Radar with Visual Sensors for Airborne Platforms

To date, the complementary properties of the various sensor technologies used in aviation have been insufficiently combined. In future, these complementary characteristics are to be exploited through optimal sensor fusion. Different methods for calibration and synchronization will be investigated and optimized, and the results will be incorporated into a sensor technology model for collision avoidance in aviation.

Short Description

Motivation

The rapid development of unmanned aerial vehicles offers great potential across the entire value chain of the aircraft industry, extending to industrial end users. Continually emerging, highly innovative fields of application provide important impetus for deploying unmanned systems in real-world applications.

It is already foreseeable that future applications will feature a higher degree of automation and will operate in civil airspace beyond line of sight (BLOS). It is important that the development of these technologies and applications takes place in close cooperation with regulatory bodies so that safety remains paramount.

On the regulatory side, various authorities and committees are working to create the necessary framework conditions; among other things, they have identified a clear need for research and development of collision avoidance systems for unmanned aerial vehicles.

Objectives

  • Investigation of the optimal fusion level
  • Calibration and synchronisation procedures

Content

Previous methods and procedures focus on combining only the final results: because the individual sensors process their data in parallel, the complementary properties of the different sensor technologies cannot be fully exploited.

In this project, the optimal fusion level is investigated systematically in order to make the best use of the complementary characteristics of the individual sensors. An information-theoretic approach is taken: the application of the information bottleneck method, which is regarded as a theoretical basis of deep convolutional neural networks, to data fusion is examined.
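
For orientation, the information bottleneck objective in its standard form (general notation, not project-specific) seeks a compressed representation T of the raw sensor data X that retains as much information as possible about the task-relevant variable Y, which for collision avoidance could be, for example, the presence and state of an intruder:

    \min_{p(t \mid x)} \; I(X;T) - \beta\, I(T;Y), \qquad \beta > 0

where I(·;·) denotes mutual information and the trade-off parameter β balances compression of X against preservation of information about Y.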

The research addresses robust data fusion of the multimodal sensors of a collision avoidance system for an EASA CS-23 aircraft, consisting of several K-band radars, one X-band radar, and several visual and thermal-infrared sensors. A particular focus is placed on creating technological added value beyond the current state of the art with the technologies used, as well as on providing a basis for regulatory decision-making.

Methodology

The following detection technologies are considered, grouped by cooperation and sensing mode:

  • Cooperative, active systems: TCAS, TAS, FLARM, radio/ATC
  • Cooperative, passive systems: ADS-B, TCAD
  • Uncooperative, active systems: LIDAR, RADAR
  • Uncooperative, passive systems: electro-optical sensors, thermal-infrared sensors, acoustic sensors
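
As an illustration only, this classification can be expressed as a simple lookup structure; the sketch below merely mirrors the list above, and all names are illustrative rather than project code:

    # Illustrative sketch (Python): detection technologies for collision avoidance,
    # keyed by (cooperation, sensing mode). Mirrors the classification above.
    DETECTION_TECHNOLOGIES = {
        ("cooperative", "active"): ["TCAS", "TAS", "FLARM", "radio/ATC"],
        ("cooperative", "passive"): ["ADS-B", "TCAD"],
        ("uncooperative", "active"): ["LIDAR", "RADAR"],
        ("uncooperative", "passive"): ["electro-optical", "thermal-infrared", "acoustic"],
    }

    # Example lookup: technologies available for uncooperative, passive detection
    print(DETECTION_TECHNOLOGIES[("uncooperative", "passive")])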

Expected results

The overall goal of multisensor fusion is to increase the robustness and reliability of the sensor network by combining the individual sensors. Radar and visual sensors contribute different, complementary information to this combination: visual sensors offer high resolution and information content, while radar offers reliable range measurement.
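
As a minimal sketch of this complementarity (assuming already calibrated, time-synchronized sensors sharing a common reference frame; function names and conventions are illustrative, not taken from the project), a precise camera bearing can be combined with a precise radar range to yield a 3-D position estimate:

    # Illustrative sketch (Python, not project code): fuse a radar range measurement
    # with a camera bearing measurement into one Cartesian position estimate.
    # Assumes calibrated, time-synchronized sensors in a common reference frame.
    import numpy as np

    def fuse_range_bearing(radar_range_m, cam_azimuth_rad, cam_elevation_rad):
        # The camera contributes the precise direction (high angular resolution),
        # the radar contributes the precise distance (reliable range).
        direction = np.array([
            np.cos(cam_elevation_rad) * np.cos(cam_azimuth_rad),
            np.cos(cam_elevation_rad) * np.sin(cam_azimuth_rad),
            np.sin(cam_elevation_rad),
        ])
        return radar_range_m * direction  # 3-D position in the common frame

    # Example: intruder at 1200 m range, 5 deg right of boresight, 2 deg above the horizon
    position = fuse_range_bearing(1200.0, np.deg2rad(5.0), np.deg2rad(2.0))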

In particular, the data fusion of the K-band and X-band radars as well as the visual and thermal-infrared sensors of a collision avoidance system for an EASA CS-23 category aircraft will be investigated.

Project Partners

Funding program: Clean Sky II