Data Fusion in Aviation – Enabling Resilience in Airspaces with OPENSIGHT
Data fusion in aviation mission support has the potential to transform both reactive and predictive mission development. With the growing complexity and challenges faced by operations teams in modern airspace, every advantage should be pursued to create a safer, more proactive environment.
With civil, commercial and defence operators all sharing the same airspace, airspace management is becoming more complex, increasing the need for more advanced coordination tools. Data fusion can alleviate this issue by drawing on a wide range of data-gathering techniques and sources, including radar and satellite data.
The issue, though, is how to integrate all these different sources and data-acquisition techniques to create meaningful, usable information in real time. This is where multisensor fusion comes in: it takes all available data and analyses it to make a virtual ‘3D’ map of the environment around us, particularly improving spatial awareness.
In this article, we’ll investigate what multimodal data fusion is and how it can integrate different sources of information to provide both ground and air crews with the vital information they need to improve spatial awareness, safety, and efficiency. We’ll examine how it’s being used to develop viable, innovative solutions to the challenges faced, and look at the three levels of data fusion in aviation. We’ll also look at how FlySight’s groundbreaking OPENSIGHT platform fits into the future of multimodal data fusion in aviation.
What is data fusion?
Data fusion in aviation refers to the integration of information from multiple sources. This can include radar, ADS-B, weather feeds, infrared and multispectral cameras and, increasingly, Augmented or Enhanced Reality and AI resources. Add to this information from the aircraft’s onboard systems, ground control, and external sensors, and you have a wide-ranging, comprehensive real-time image of the operational environment.
Data fusion in aviation is classified as low, medium, or high, which we’ll examine in more detail shortly. Modern multisensor fusion evolved alongside advances in networked computing and digital communications, as the potential for streaming multiple sources of information became apparent, even though humans effectively ‘data fuse’ information all the time (audio, visual, sensory and so on). In the 1980s, the Joint Directors of Laboratories created the JDL Data Fusion Model, which separates the processes used in data fusion into distinct levels; later extensions brought the model to the seven levels listed below:
- Level 0: Source Preprocessing (also referred to as Data Assessment)
- Level 1: Object Assessment
- Level 2: Situation Assessment
- Level 3: Impact Assessment (Threat Refinement)
- Level 4: Process Refinement (Resource Management)
- Level 5: User Refinement (Cognitive Refinement)
- Level 6: Mission Refinement (Mission Management)
While the model was initially criticised for failing to account for the human element in the loop, it now fits particularly well with modern data collection and collation methodologies, especially given the growing use of AI and AR data.
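To make the taxonomy concrete, one way a fusion pipeline might reference these levels is by tagging each processing stage with its JDL level. The short Python sketch below is purely illustrative; the enum and the describe helper are hypothetical, not part of any specific product or standard API:

```python
from enum import IntEnum

class JDLLevel(IntEnum):
    """The extended JDL data fusion levels listed above."""
    SOURCE_PREPROCESSING = 0  # Data Assessment
    OBJECT_ASSESSMENT = 1
    SITUATION_ASSESSMENT = 2
    IMPACT_ASSESSMENT = 3     # Threat Refinement
    PROCESS_REFINEMENT = 4    # Resource Management
    USER_REFINEMENT = 5       # Cognitive Refinement
    MISSION_REFINEMENT = 6    # Mission Management

def describe(level: JDLLevel) -> str:
    """Return a readable label for a pipeline stage, e.g. for logging."""
    return f"Level {int(level)}: {level.name.replace('_', ' ').title()}"

print(describe(JDLLevel.SITUATION_ASSESSMENT))  # Level 2: Situation Assessment
```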
The three levels of data fusion in aviation
That original model has been refined over the years, and today aviation practitioners commonly describe a three-tiered multisensor fusion process.
- Level 1 – Data Alignment and Correlation – Also known as low-level fusion, this involves collecting raw data from multiple sources to create a general overview of the environment. This could include merging radar and ADS-B data for accurate tracking (see the short sketch below), along with weather data, flight information from ground-based resources, and other data.
- Level 2 – Situational Assessment – This intermediate level is when both human and, increasingly, AI-assisted systems interpret the fused data, giving them an overall understanding of the real-time situation. At this level, interpreters are on the lookout for patterns that give them a deeper understanding of the situation, such as whether an aircraft is on the correct flight path according to its plan.
- Level 3 – Impact Assessment and Decision Support – This high-level data fusion process focuses on projecting future conditions and supporting mission decision-making. It involves evaluating current conditions to build a predictive model that is as accurate as possible.
With the high volumes of data from sources such as satellites and LiDAR now available, Level 3 data fusion analysis can be incredibly accurate, producing a detailed, broad-spectrum analysis of a situation. This, in turn, allows for greater mission success, whether in military, search and rescue, disaster management, or law enforcement roles such as crowd control or surveillance.
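As an illustration of the low-level (Level 1) step, the sketch below pairs radar plots with ADS-B reports using a naïve nearest-neighbour gate, so that a fused track can carry both the radar position and the ADS-B identity. The data structures, the 2 km gate, and the local grid assumption are all hypothetical placeholders rather than the algorithm of any particular system:

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Plot:
    source: str    # "radar" or "adsb"
    x_km: float    # position on a shared local grid, already aligned in space and time
    y_km: float
    callsign: str | None = None  # ADS-B carries identity; primary radar usually does not

GATE_KM = 2.0  # hypothetical association gate: maximum distance for a match

def correlate(radar: list[Plot], adsb: list[Plot]) -> list[tuple[Plot, Plot | None]]:
    """Pair each radar plot with the closest ADS-B report inside the gate."""
    fused = []
    for r in radar:
        best, best_dist = None, GATE_KM
        for a in adsb:
            dist = hypot(r.x_km - a.x_km, r.y_km - a.y_km)
            if dist < best_dist:
                best, best_dist = a, dist
        fused.append((r, best))  # None means no ADS-B report matched this plot
    return fused

tracks = correlate(
    radar=[Plot("radar", 10.2, 4.9)],
    adsb=[Plot("adsb", 10.0, 5.0, callsign="GABCD")],
)
print(tracks[0][1].callsign)  # GABCD
```

Real systems replace this with statistically rigorous association and tracking filters, but the principle of aligning sources on a common frame and then correlating detections is the same.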
The advantages of data and sensor fusion
Data fusion in aviation mission support has numerous benefits for mission planning, operations, and assessment. These benefits trickle down to improve communication, enhance safety, and foster better collaboration between new and legacy technologies.
The ability to combine multiple data sources through sensor fusion and present them in a single view on intuitive, already-familiar technology clarifies communication between airborne crews and ground support. This interoperable data can be accessed quickly and easily through mission consoles such as FlySight’s OPENSIGHT system, which uses plug-in modules to allow users to create a customised system suited to their specific needs.
This improved method of data presentation enhances situational awareness by delivering more accurate information quickly and precisely when it is needed. This leads to better real-time decision-making and faster target identification, even in challenging landscapes or those altered by a major disaster, for example. By overlaying multimodal data onto a base map, operators can access crucial information from onboard and external sensors, gaining a more complete understanding of their surroundings.
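The following minimal sketch illustrates the layering idea in the simplest possible terms; the class names, the tile-set identifier, and the overlay contents are hypothetical examples, not OPENSIGHT’s actual data model or API:

```python
from dataclasses import dataclass, field

@dataclass
class Overlay:
    name: str                             # e.g. "power lines" or "thermal targets"
    features: list[tuple[float, float]]   # (lat, lon) points derived from one sensor

@dataclass
class MapView:
    base: str                             # identifier of the topographic base map
    overlays: list[Overlay] = field(default_factory=list)

    def add(self, layer: Overlay) -> None:
        """Add another sensor-derived layer to the shared operational picture."""
        self.overlays.append(layer)

    def describe(self) -> str:
        return f"{self.base} + " + ", ".join(o.name for o in self.overlays)

view = MapView(base="topo-tiles-v1")
view.add(Overlay("thermal targets", [(43.72, 10.40)]))
view.add(Overlay("power lines", [(43.70, 10.38), (43.71, 10.41)]))
print(view.describe())  # topo-tiles-v1 + thermal targets, power lines
```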
Data fusion in aviation – the cons
There are, however, still some issues to address regarding data fusion in aviation. The biggest of these is caused by how effective data collection has become—the problem of too much information. Crucial data elements can get lost in the background ‘noise’ of superfluous details irrelevant to the mission. Sifting through piles of ‘big data’ takes time and risks presenting inaccurate information that could hinder rather than help.
AI is a helpful tool for sifting through this data, recognising patterns, and fine-tuning the collection and presentation of relevant data using machine learning, but it would be unwise to rely solely on AI in this instance. There always needs to be a pair of ‘human eyes’ examining the data, too.
There are also some issues with legacy systems: sensor fusion, data collection, and presentation technologies must be fully interoperable. Fortunately, advances in the development of platforms such as OPENSIGHT are making this connectivity much easier and more accurate.
Can malicious actors hack aviation data fusion systems? Potentially, yes, although it’s unlikely. If collected data is stored in external locations, such as the Cloud, there is always the slight possibility that a highly advanced attacker could access and tamper with it. Thanks to robust security systems and firewalls, the likelihood of that is slight, but it should never be wholly discounted. Multisensor fusion systems therefore need to be resilient, both to environmental factors during use and to software ‘glitches’, and resistant to outside attack.
The OPENSIGHT platform – a new approach to data fusion in aviation
Aerial platforms face the challenge of minimal space. So any multimodal fusion system that can integrate easily into existing hardware is a winner. OPENSIGHT is just such a system, using sensor fusion to connect to, understand, and present important flight and environmental data within mission parameters. Gathering data from various sources, it provides operators with a fully integrated, real-time picture of their surroundings, including information that would not usually be available to the user.
The software integrated into OPENSIGHT’s Mission Console superimposes multiple layers onto a topographic map. This data can include anything from known occupancy data to target identification to hazard data, such as power lines (a genuine concern for helicopter operators in particular), to geospatial data that can re-render altered terrain affected by a major disaster.
This may sound complicated, but OPENSIGHT has the advantage of presenting this fused data in an accessible, easy-to-use way that integrates with legacy hardware operators are already familiar with. The result is an intuitive, efficient, and compact system that reduces training time.
OPENSIGHT is an Augmented Reality platform that is customisable according to the operator’s needs. With multiple plug-ins that can process high-resolution video feeds and dehaze them for greater clarity, multispectral data fusion from various sources, and geospatial overlays, OPENSIGHT represents the next step in data fusion in aviation mission operations.
You can find out more by downloading the OPENSIGHT brochure or by watching our videos, which provide a deeper understanding of this innovative technology’s capabilities. Alternatively, you can contact us directly to discuss how OPENSIGHT can fit into your operational procedures and take your mission development and success to the next level. Contact us today!



