The principle of aerial surveillance is not a new concept. However, as technology advances and new techniques, equipment, and software are introduced, manned and unmanned aerial surveillance systems are playing an increasingly essential role in both military and law enforcement operations. Aerial intelligence not only helps operators deliver more efficient and effective surveillance, but it can also actively save lives, especially when utilised in Search & Rescue operations.
In this article, we’ll look at how multi-sensor fusion can be used for enhanced aerial intelligence, primarily in helicopters but also in unmanned aerial vehicles (UAVs) such as drones. We’ll study the different types of sensors used in multi-sensor fusion, their advantages, and the methodology behind integrating these systems into airborne platforms such as helicopters. We’ll also examine their real-world applications, look at the future of sensor integration and data fusion, and introduce you to the systems developed by FlySight that are taking sensor integration to the next level.
What are data fusion and sensor integration?
Data fusion and sensor integration are synergistic processes that combine information gathered from multiple sources, including daylight cameras, video capture, infrared imagers, and the various other sensors onboard both autonomous drones and manned platforms such as helicopters.
Sensor integration is the process of combining multiple sensors into a single system and arranging them to work synchronously, capturing everything from the movement of targeted vehicles to temperature data (when searching for hot spots in wildfire zones, for example), as well as LiDAR and radar returns.
Once the information from the various sensors has been gathered, data fusion homogenises it into a multi-layered picture that can include both raw sensor data and contextual information already available in other databases. The fused result can then be used to support better, more precise decision-making, often in real time, allowing operations managers to formulate a response based on actual data rather than supposition. In aerial surveillance and intelligence operations, this is primarily high-level fusion that can determine a threat or identify a target with far greater accuracy.
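To make the idea concrete, here is a minimal sketch of one common fusion step: combining two independent estimates of the same quantity (say, a range to a target reported by an EO camera track and by radar) using inverse-variance weighting, so the less noisy sensor carries more weight. The function name and figures are illustrative assumptions, not any particular system’s method.

```python
# Minimal sketch: fusing two independent estimates of the same quantity
# (e.g. a range from an EO camera track and from radar) by inverse-variance
# weighting -- the less noisy sensor gets proportionally more influence.

def fuse_estimates(estimates):
    """estimates: list of (value, variance) pairs, one per sensor."""
    weights = [1.0 / var for _, var in estimates]
    fused = sum(w * v for (v, _), w in zip(estimates, weights)) / sum(weights)
    fused_var = 1.0 / sum(weights)
    return fused, fused_var

# Illustrative numbers: the camera reports 102.0 m (variance 25.0), the
# radar reports 98.0 m (variance 4.0); the fused result leans on the radar.
rng, var = fuse_estimates([(102.0, 25.0), (98.0, 4.0)])
print(f"fused range: {rng:.1f} m (variance {var:.2f})")
```

Real systems fuse many more inputs (tracks, imagery, map layers) and handle timing and alignment between sensors, but the principle is the same: weight each source by how much it can be trusted.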
The challenges of legacy aerial intelligence and surveillance systems
When it comes to manned aerial platforms such as helicopters, the biggest challenge faced by crews is space. With exceptionally limited cockpit room, any additional sensors must be integrated into existing hardware and be intuitive to use. By running new software on existing systems, no further demands are placed on the aircraft’s operational capacity, and there is no bulky hardware to occupy precious space onboard.
Previously, legacy aerial intelligence and surveillance systems have relied on the human eye. Throw bad weather, such as fog or smoke, into the mix, and that visual means of collecting data is immediately rendered useless. This alone has caused many missions to be aborted before the aircraft has even taken off.
Other issues include low-light conditions and the challenges of night aerial surveillance, as well as the inability to observe targets inside buildings or other structures.
Finally, the limited airtime available to both human-crewed aircraft, such as helicopters, and unmanned drones means that every second of surveillance must yield relevant data.
How can advanced multi-sensor integration help overcome these challenges?
Advances in multi-sensor integration have removed the worry of overburdening the limited space within a cramped helicopter cockpit. Sensors can now be integrated with far greater bandwidth, meaning a single unit can provide multispectral surveillance capabilities rather than several different units having to be interlinked. Miniaturisation also means that, while increasingly powerful, the hardware behind the sensors has shrunk, enabling surveillance operations to fit more capability into a smaller space.
These sensors are also highly intuitive, minimising the training time required to familiarise crews with their operation. Because they’re integrated into already familiar hardware using intuitive touchscreen operating systems, human operators can quickly acquire the skills to use them, increasing efficiency and maximising every second in the air.
Different types of sensors for multi-sensor fusion
EO/IR (Electro-Optical/Infrared) – These sensors are the foundation of most multi-sensor fusion systems. They utilise a range of hardware, including various cameras, to combine visible-light and infrared detection. This, in turn, provides comprehensive imaging and tracking capabilities, regardless of light levels and whether conditions are clear or obscured by haze, fog or smoke.
LiDAR (Light Detection and Ranging) – LiDAR emits rapid laser pulses and times their reflections to measure and map distances (see the short ranging sketch after this list). When collated, these measurements produce exceptionally detailed and precise 3D maps, creating topographical mapping that can be overlaid with Augmented Reality data to build a multi-layered picture of the terrain. Airborne LiDAR is particularly useful in aerial surveillance and Search & Rescue, helping crews determine the layout of a location even if explosions, fire or earthquakes have drastically altered it.
Synthetic Aperture Radar (SAR) – This powerful remote sensing technology uses radar signals to produce clear, very high-resolution images of the terrain. Working at a variety of wavelengths, it has a host of real-world applications, from disaster monitoring to environmental uses such as tracking deforestation, assessing the stability of bridges, and urban mapping. In an aerial surveillance context, it can also be used to track targets and map terrain, building a more accurate picture of a potential target zone before operational action or the deployment of ground teams.
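As a quick illustration of the LiDAR ranging principle mentioned above (a simple sketch only; real airborne systems also correct for aircraft attitude, scan angle and atmospheric effects, all omitted here):

```python
# Sketch of the LiDAR ranging principle: a laser pulse travels to the
# surface and back, so range = (speed of light * round-trip time) / 2.

C = 299_792_458.0  # speed of light, m/s

def pulse_range(round_trip_seconds):
    """Distance to the reflecting surface for a single laser return."""
    return C * round_trip_seconds / 2.0

# A return arriving ~6.67 microseconds after emission is roughly 1 km away.
print(f"{pulse_range(6.67e-6):.1f} m")
```

Millions of such returns, each tagged with the aircraft’s position and the beam’s direction, are what build up the detailed 3D terrain maps described above.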
The benefits of combining sensor integration and data fusion
In an aviation application, the benefits of combining sensor integration and data fusion are clear. Any operational controller knows that more data means greater accuracy, but only if that data can be sifted. The real benefit of these modern systems is that, by combining AI with human analysts, superfluous information can be eliminated, leaving only the data necessary for operational decision-making. This reduces the likelihood of false positives and leads to more informed, refined decisions.
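As a hypothetical illustration of that filtering step (the data structure and thresholds below are assumptions made for the sketch, not any specific product’s logic), fused detections might only be passed to the operator when corroborated by more than one sensor with sufficient combined confidence:

```python
# Hypothetical sketch: only detections corroborated by several sensors,
# with a high enough fused confidence, are passed on to the operator --
# one simple way superfluous data and false positives can be screened out.

from dataclasses import dataclass

@dataclass
class Detection:
    target_id: str
    sensors: set          # which sensors reported this target
    confidence: float     # fused confidence score, 0.0 to 1.0

def actionable(detections, min_sensors=2, min_confidence=0.8):
    return [d for d in detections
            if len(d.sensors) >= min_sensors and d.confidence >= min_confidence]

feed = [
    Detection("vehicle-1", {"eo", "ir", "radar"}, 0.94),  # corroborated
    Detection("hotspot-3", {"ir"}, 0.55),                 # single sensor, weak
]
for d in actionable(feed):
    print(f"{d.target_id}: confidence {d.confidence:.2f}")
```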
For crews onboard airborne surveillance platforms, as well as crewless aerial surveillance vehicles, combined sensor integration and data fusion provide enhanced situational awareness and faster target identification, leading to more efficient operations with greater accuracy, even in adverse or challenging conditions.
Real-world applications
The manoeuvrability, low-level flying capability, and extended flight range of modern helicopters make them excellent aerial surveillance platforms. These advantages, combined with the exceptional skill and training of helicopter crews, make them the ideal partner for integrated sensor systems and data fusion. Increasingly, larger drones are also proving particularly well-suited to integrated systems, thanks to the compact nature of modern sensors and their compatibility with current drone technology.
In addition to aerial intelligence and surveillance, combined sensor integration and data fusion can play a crucial role in various operational theatres, including military ISR missions, border and coastal surveillance and monitoring, and Search & Rescue missions. Sensors such as thermal and multispectral cameras also make these systems practical for monitoring natural disasters and locating potential hot spots within a forest fire, for example. Combined sensor integration and data fusion also play a role in law enforcement surveillance, crowd control, and urban policing. With so much potential, it’s no surprise that this form of multi-layered surveillance is becoming the norm for both manned and unmanned aerial surveillance and intelligence gathering.
What does the future look like for sensor integration and data fusion?
The future of combined sensor integration and data fusion takes us to a new level of 21st-century technology: AI and integrated Augmented Reality. Combining AI’s ability to ‘crunch’ vast amounts of data very quickly with the ability to upload real-time data to the Cloud enables analysts to collate, assess, and initiate a response based on accurate, multispectral data.
The technology can also be applied to autonomous AI ISR platforms and integrated seamlessly with edge computing.
The future, however, is already here. FlySight’s OPENSIGHT Mission Console platform for helicopters has been at the cutting edge of modern aerial surveillance for some time, incorporating combined sensor integration and data fusion into its console systems. The result is a system that uses data fusion and advanced sensors to improve real-time situational awareness on rotorcraft and UAV platforms.
OPENSIGHT Mission Console is specifically designed to support aerial payload operators, enhancing mission operations and making every moment count. It provides the means to collect vital data across a broad spectrum, regardless of the conditions. OPENSIGHT is an Augmented Reality platform that delivers a turnkey solution for almost any application. Its ability to manage high-resolution video flows makes it an invaluable platform, delivering real-time data for crews to integrate into their mission parameters. It augments and improves the operator’s geospatial situational awareness by superimposing multiple synthetic information layers. While this sounds complex, it is achieved through intuitive, operator-friendly systems that require minimal training time.
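As a purely illustrative sketch of the general idea of layered overlays (hypothetical code, not FlySight’s actual API or implementation), a console might composite whichever synthetic layers the operator has enabled on top of each video frame:

```python
# Hypothetical sketch (not FlySight's actual API): superimposing synthetic
# information layers on a live video frame. Each layer turns geo-referenced
# data into drawable annotations, and the console composites whichever
# layers the operator has enabled.

class Layer:
    def __init__(self, name, enabled=True):
        self.name = name
        self.enabled = enabled

    def annotations(self, frame_geometry):
        # A real system would project geo-data into pixel coordinates here.
        return [f"{self.name} annotation"]

def composite(frame, layers, frame_geometry=None):
    """Return the frame together with annotations from every enabled layer."""
    overlays = [a for layer in layers if layer.enabled
                for a in layer.annotations(frame_geometry)]
    return frame, overlays

layers = [Layer("terrain contours"), Layer("points of interest"),
          Layer("traffic", enabled=False)]
frame, overlays = composite("video-frame", layers)
print(overlays)  # only the enabled layers contribute
```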
You can find out more by downloading our OPENSIGHT catalogue, or contact us now to learn more about the OPENSIGHT Mission Console solution for aviation.