
Military Sensor Fusion technology enhances situational awareness of Air, Sea and Space-based platforms and Remote Intelligence

The military typically operates in demanding, dynamic, semi-structured and large-scale environments. This reality makes it difficult to detect, track, recognize/classify, and respond to all entities within the volume of interest, increasing the risk of a late response, or no response at all, to those that pose an actual threat. A key challenge facing military operators in these contexts is the focusing of attention and effort: how to make the most effective use of the available but scarce sensing and processing resources to gather the most relevant information from the environment and fuse it in the most efficient way. Adaptive data fusion and sensor management can aid these information-gathering and fusion processes by automatically allocating, controlling, and coordinating sensing and processing resources to meet mission requirements.
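
Sensor management of this kind is essentially a resource-allocation problem. As a minimal sketch (not any fielded algorithm), the Python snippet below greedily assigns each sensor to the target offering the highest expected information gain; the sensors, targets, and gain model are all invented for illustration.

```python
# Minimal sketch of greedy sensor-to-target allocation (illustrative only).
# Assumption: "information gain" is approximated by how much a sensor could
# reduce a track's uncertainty, falling off with distance and sensor noise.

def expected_gain(sensor, target):
    """Toy gain model: closer, less noisy sensors yield more gain."""
    distance = abs(sensor["position"] - target["position"])
    if distance > sensor["max_range"]:
        return 0.0
    return target["uncertainty"] / (1.0 + sensor["noise"] * distance)

def allocate(sensors, targets):
    """Greedily pair each sensor with its best remaining target."""
    assignments = {}
    remaining = list(targets)
    for sensor in sorted(sensors, key=lambda s: s["noise"]):
        if not remaining:
            break
        best = max(remaining, key=lambda t: expected_gain(sensor, t))
        if expected_gain(sensor, best) > 0.0:
            assignments[sensor["name"]] = best["name"]
            remaining.remove(best)
    return assignments

sensors = [{"name": "radar", "position": 0.0, "max_range": 100.0, "noise": 0.1},
           {"name": "eo", "position": 10.0, "max_range": 30.0, "noise": 0.05}]
targets = [{"name": "t1", "position": 20.0, "uncertainty": 5.0},
           {"name": "t2", "position": 80.0, "uncertainty": 9.0}]
print(allocate(sensors, targets))  # {'eo': 't1', 'radar': 't2'}
```

In a real system the gain function would come from the tracking filter (for example, expected covariance reduction) and the assignment would be re-solved continuously as the situation evolves.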


Rising attacks and threats from adversaries have led to increased demand for improved, digitalized information in military applications. The concept of sensor fusion has been discussed for years, but it has recently been deployed in a range of military tools including weapon systems, combat vehicles, target detection and enemy identification systems. With the increased adoption of sensor fusion software in the military market, applications are no longer limited to target detection and identification of potential threats, but extend to military flight planning, data logistics, air tasking orders and others. The most productive areas in the field of military sensor fusion are the deployment of fused sensor technology in autonomous weapons (including bombs and night vision rifles), biological agent and bomb detection systems, and surveillance systems. Sensor fusion technology is now capable of integrating with a wide variety of defence products including drones, spacecraft, missiles, military vehicles, ships, marine systems, satellites and rockets.


Sensor fusion is defined as “software that combines multiple sensors to deliver enhanced performance”. Multi-sensor data fusion (MSDF) systems combine data from various types of sensors to obtain a comprehensive picture of the situation. MSDF may also be programmed to make inferences from the given information to create new data. The goal behind this technology is to increase the effectiveness of military operations by giving a more complete, integrated view of the situation, enabling a quicker response while reducing the errors caused by any individual sensor's failure. The US military relies upon diverse sensing sources on the battlefield; because so many resources provide soldiers with information, it incorporates probability algorithms into sensor fusion systems to process the sensors’ information effectively. Civilian sensor fusion technology includes applications in robotics, geospatial analysis, and business intelligence.
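
As a minimal illustration of the probabilistic weighting described above, the sketch below fuses two noisy range measurements of the same target by weighting each inversely to its variance, a standard building block of Kalman-style estimation; the sensor values and variances are made-up numbers.

```python
# Inverse-variance weighted fusion of two independent noisy measurements
# of the same quantity (standard result for Gaussian measurement errors).

def fuse(m1, var1, m2, var2):
    """Return the fused estimate and its (smaller) variance."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * m1 + w2 * m2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Radar reports 1020 m (variance 100); an EO rangefinder reports 1005 m (variance 25).
estimate, variance = fuse(1020.0, 100.0, 1005.0, 25.0)
print(estimate, variance)  # 1008.0 20.0 -- more certain than either sensor alone
```

Note that the fused variance (20) is lower than that of either sensor alone, which is the quantitative sense in which fusion improves on any single source.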


Two types of knowledge are relied upon in multi-sensor data fusion: data and information. The two terms may appear synonymous; however, data and information originate from different sources: sensors obtain data, whereas information is already available in existing databases. The Department of Defense’s Data Fusion Group in the Joint Directors of Laboratories (JDL) created an early definition of an MSDF’s function, describing the method from a military perspective as a “multilevel, multifaceted process dealing with the automatic detection, association, correlation, estimation, and combination of data and information from multiple sources”. This definition states that MSDF collects and combines data from many sensors to draw conclusions about the current conditions of an area under observation.
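
The stages named in the JDL definition suggest a processing pipeline. The skeleton below is a hypothetical arrangement of those stages for illustration, not the JDL model itself; the report fields and thresholds are invented, and association/correlation are collapsed into one grouping step.

```python
# Hypothetical pipeline skeleton echoing the JDL stages: detection ->
# association/correlation -> estimation -> combination.

def detect(raw_reports):
    """Keep sensor returns strong enough to be treated as real objects."""
    return [r for r in raw_reports if r["snr"] > 3.0]

def associate(detections):
    """Group detections believed to come from the same object."""
    groups = {}
    for d in detections:
        groups.setdefault(d["track_id"], []).append(d)
    return groups

def estimate(group):
    """Estimate an object's state from one group (here: mean position)."""
    return sum(d["position"] for d in group) / len(group)

def combine(groups):
    """Combine per-object estimates into one situation picture."""
    return {tid: estimate(g) for tid, g in groups.items()}

reports = [{"track_id": 1, "snr": 5.0, "position": 10.2},
           {"track_id": 1, "snr": 4.0, "position": 9.8},
           {"track_id": 2, "snr": 1.0, "position": 50.0}]  # too weak; dropped
print(combine(associate(detect(reports))))  # {1: 10.0}
```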


A more general explanation of the function of MSDF states that, “in the context of its usage in the society, it encompasses the theory, techniques and tools created and applied to exploit the synergy in the information acquired from multiple sources (sensor[s], databases, information gathered by humans, etc.) in such a way that the resulting decision or action is in some sense better than would be possible if any of these sources were used individually without such synergy exploitation”. This definition postdates the JDL explanation and provides a broader view of MSDF’s varied abilities. This later perspective allows for the technical difference between data and information by listing both sensors and databases as sources of knowledge. Raw data is obtained from the MSDF network’s sensors, which collect data about the environment by measuring different physical characteristics of the target area. Multiple sensors also measure the same physical characteristics of the observed area to ensure accuracy.


Information provided by a database assists the MSDF in identifying the goal as specified by a human user’s parameters. Raw data, by contrast, undergoes analysis and combination with other types of raw data to become new information the MSDF can use to draw more accurate and detailed conclusions. The final output of an MSDF system is an accurate piece of dynamic intelligence that gives the user more information than could be gained by looking at the raw data from the individual sensors. This information/data fusion process encompasses data, information, sensors, and logic, providing accurate inferences to be used by humans for strategic decisions in warfare or civilian life. Different interpretations of MSDF functions allow the process to be used in many applications.


A current military application of MSDF is the US Navy’s Cooperative Engagement Capability (CEC) system, which provides the Navy with tracking and air defense capabilities. Another use of military MSDF occurs in the Ground-Based Midcourse Defense (GMD) ballistic missile defense system, which uses radar to detect nuclear warheads’ trajectories, in combination with ground-based interceptor missiles to prevent the warheads’ detonation. The US military is also working on a prototype called the Guardian Angel Project, used to sense improvised explosive devices (IEDs) in war zones. MSDF is versatile enough to have both military and civilian uses; in civilian life, it is incorporated into robotics.


A future goal is to incorporate this robotic technology into surgery. The general idea behind the MSDF process involves sensors, a database of known information, a process for determining which combinations of data and information are reliable, and an area under study. This broad-spectrum formula allows MSDF technology to be universally applicable. Determining the specific function of an MSDF requires the user to define the context in which it will be used, e.g. what provides raw data and what constitutes information.
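
Read this way, the “broad-spectrum formula” is essentially a per-deployment configuration. The sketch below captures it as a plain data structure; every field name and value is invented for illustration.

```python
# Illustrative container for the generic MSDF recipe: sensors (raw data),
# a database (prior information), a reliability test, and the area under
# study. All names here are hypothetical.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class MSDFContext:
    sensors: List[str]                   # what provides raw data
    database: str                        # where prior information lives
    is_reliable: Callable[[dict], bool]  # which data/info combinations to trust
    area_of_interest: str                # the region under study

maritime = MSDFContext(
    sensors=["radar", "ais_receiver", "eo_camera"],
    database="known_vessel_registry",
    is_reliable=lambda report: report.get("confidence", 0.0) > 0.8,
    area_of_interest="strait_approach_sector_7",
)
print(maritime.is_reliable({"confidence": 0.9}))  # True
```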


The physical aspects of MSDF technology are the sensors responsible for raw data collection and transmission throughout the system. A defining characteristic of an MSDF system is reliability under conditions humans cannot endure. These sensors, therefore, must be capable of withstanding extreme stresses such as extreme temperatures, high winds, high pressures, storms, and fires. In military applications, sensors must also be able to operate without enemy detection, perhaps while hidden from view or at long distances.


A major engineering challenge is designing these durable sensors. However, no matter how resilient the sensors’ physical covering, other aspects may contribute to individual sensor failure in the system. Energy sources must be chosen when designing MSDFs to ensure the system will not fail from lack of energy, and sensor placement must be determined so that sensors do not interfere with one another’s data transmissions. MSDF information must exhibit three basic characteristics in order to provide the most accurate data: redundancy, cooperation, and complementarity. For redundancy, several sensors must measure the same critical aspects of the observed area to provide the user with accuracy.


To achieve cooperation, all aspects of the observed area must be measured. Combining different types of measurements of an environment, such as those from motion sensors and photo sensors, yields complementary information. Design challenges are inherent in MSDFs: engineers must design the strongest physical covering for the sensors and choose their energy sources and placement while minimizing the number of sensors used. This minimal set of sensors must still collect the maximum amount of data possible while the whole system remains undetected. Using as few sensors as possible in an MSDF system also saves fuel and energy, helping to limit its environmental footprint. Another dilemma is that each MSDF must be specifically tailored to the environment in which it will be used.
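
As a rough illustration of the three characteristics, the sketch below checks an invented sensor set against the phenomena to be observed: redundancy (some phenomenon covered by more than one sensor), cooperation (every required phenomenon covered), and complementarity (sensors observing different things).

```python
# Illustrative check of redundancy, cooperation, and complementarity.
# "measures" maps each hypothetical sensor to the phenomena it observes.

measures = {
    "radar_a": {"position", "velocity"},
    "radar_b": {"position"},          # redundant with radar_a on position
    "ir_cam": {"heat_signature"},     # complementary modality
}
required = {"position", "velocity", "heat_signature"}

counts = {}
for phenomena in measures.values():
    for p in phenomena:
        counts[p] = counts.get(p, 0) + 1

redundancy = any(n >= 2 for n in counts.values())  # same thing measured twice
cooperation = required.issubset(counts)            # everything is measured
complementarity = len({frozenset(v) for v in measures.values()}) >= 2

print(redundancy, cooperation, complementarity)  # True True True
```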


Multisensor Fusion Implementations

Implementation of “multi-sensor data fusion” in the RAFALE translates into accurate, reliable and strong tracks, uncluttered displays, reduced pilot workload, quicker pilot response, and, eventually, increased situational awareness.

It is a fully automated process carried out in three steps:

  • Establishing consolidated track files and refining primary information provided by the sensors,
  • Overcoming individual sensor limitations related to wavelength / frequency, field of regard, angular and distance resolution, etc., by sharing track information received from all the sensors,
  • Assessing the confidence level of consolidated tracks, suppressing redundant track symbols and decluttering the displays.
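
A toy version of this consolidation process might look like the sketch below: reports from several sensors are associated when they fall within a gating distance, merged into one consolidated track, and scored by how many sensors agree, which also suppresses redundant track symbols. The report format, distances, and thresholds are invented.

```python
# Toy track consolidation: merge per-sensor reports of the same target,
# suppress duplicate tracks, and score confidence by sensor agreement.

ASSOC_DISTANCE = 2.0  # km; closer reports are assumed to be one target

def consolidate(reports):
    """reports: list of (sensor_name, position_km) -> consolidated tracks."""
    tracks = []
    for sensor, pos in reports:
        for track in tracks:
            if abs(track["position"] - pos) < ASSOC_DISTANCE:
                track["sensors"].add(sensor)
                n = len(track["sensors"])
                track["position"] += (pos - track["position"]) / n  # running mean
                break
        else:
            tracks.append({"position": pos, "sensors": {sensor}})
    for track in tracks:
        track["confidence"] = len(track["sensors"]) / 3.0  # 3 sensors fitted
    return tracks

reports = [("radar", 40.1), ("irst", 40.6), ("esm", 39.8), ("radar", 75.0)]
for track in consolidate(reports):
    print(track)  # one track confirmed by 3 sensors, one radar-only track
```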


“The sensor fusion process produces a unique track of a single target which may be reported by several sensors simultaneously, each one providing a subset of target attributes which are compiled to produce an as complete as possible view of the target,” Friemer says. Algorithms weigh the reliability of each report before merging them to produce a fused target identity and priority.
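
The identity-fusion step described in the quote can be sketched as reliability-weighted voting over candidate target classes, as below; the sensor names, class labels, and reliability weights are purely illustrative.

```python
# Illustrative reliability-weighted vote over target identity declarations.
# Each sensor reports a class label plus a reliability weight in [0, 1].

reports = [
    ("radar", "fighter", 0.6),
    ("irst", "fighter", 0.7),
    ("esm", "airliner", 0.3),
]

def fuse_identity(reports):
    scores = {}
    for _sensor, label, weight in reports:
        scores[label] = scores.get(label, 0.0) + weight
    total = sum(scores.values())
    # Normalize so the result reads as a rough belief over classes.
    return {label: s / total for label, s in scores.items()}

belief = fuse_identity(reports)
print(max(belief, key=belief.get), belief)  # fighter {'fighter': 0.8125, ...}
```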


IARPA asks for image processing technology using sensor fusion for air- and space-based remote intelligence

The goal of the Space-based Machine Automated Recognition Technique (SMART) program is to automate the quantitative analysis of space-based imagery to perform broad-area search for natural and anthropogenic events and to characterize their extent and progression in time and space. The SMART program aims to develop capabilities in the spectral and temporal domains, enabling seamless integration and fusion (i.e., absolute calibration) of data from multiple sensors to deliver a comprehensive representation of evolving natural or anthropogenic events. Examples of such events include heavy construction, urban development, crop disease propagation, forest fire, insect or battle damage, human migration, mining, logging, and farming, as well as natural events such as flooding, mudslides, or earthquakes.


The SMART program will require innovations in new computing approaches and calibration techniques in order to rapidly and reliably compare thousands of images from multiple sensors registered in space and time. The SMART program will also leverage algorithmic approaches to: search for specific activities; detect and monitor activities over time and over broad areas; and characterize the progression of events and activities temporally and categorically.
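
A bare-bones version of the time-series detection idea is sketched below: given a stack of co-registered images of one area, flag pixels whose latest value departs strongly from their own history. NumPy is used for brevity, and the imagery and threshold are synthetic.

```python
# Minimal sketch of broad-area change detection on a co-registered image
# time series: flag pixels that deviate from their per-pixel history.

import numpy as np

rng = np.random.default_rng(0)
stack = rng.normal(100.0, 5.0, size=(12, 64, 64))  # 12 epochs of a 64x64 scene
stack[-1, 20:30, 40:50] += 40.0                    # simulate new construction

history = stack[:-1]
mean = history.mean(axis=0)
std = history.std(axis=0) + 1e-6                   # avoid division by zero
z = (stack[-1] - mean) / std                       # per-pixel z-score vs history

changed = z > 4.0                                  # detection threshold (arbitrary)
print("changed pixels:", int(changed.sum()))       # ~100: the injected 10x10 patch
```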


IARPA also seeks algorithms to detect and characterize events or activities over areas larger than 8,000 square meters using space-based time-series imagery. IARPA would like proposers to identify standards, such as software libraries, data type definitions, architecture guidelines, and software processes, that are easy to follow and efficient to carry out. SMART essentially will rely on geographical information from satellite and aircraft cameras, and will develop multi-spectral and multi-temporal sensor processing to overlay data from infrared and multispectral sensors to make the intelligence analyst’s job easier.


The idea is to reduce the uncertainties inherent in single-sensor data, and to shrink the sheer volume of intelligence imagery that can overwhelm analysts, by developing Big Data tools that help analysts work through that imagery, IARPA officials say.


The program will focus on two primary technical areas: data fusion, and algorithms to detect and characterize events. Offerors may address one or both technical areas in their proposals. SMART will be a four-year project extending from August 2020 to July 2024.


Data fusion seeks to quantify data quality and cross-sensor inconsistencies in time-series satellite images and to develop automated data calibration techniques by blending geometric and radiometric correction, cloud masking, pixel quality assessment, gridding, and collection management. The algorithms area seeks to detect and characterize events or activities over areas larger than 8,000 square meters using space-based time-series imagery.
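
One small piece of such calibration, relative radiometric normalization between two sensors, can be sketched as a linear gain/offset fit over pixels both sensors saw cloud-free, as below; the scene, the miscalibration, and the cloud mask are all synthetic.

```python
# Sketch of relative radiometric normalization: fit gain/offset on clear,
# overlapping pixels, then harmonize sensor B to reference sensor A.

import numpy as np

rng = np.random.default_rng(1)
scene = rng.uniform(0.0, 1.0, size=(32, 32))  # "true" surface reflectance
sensor_a = scene.copy()                       # reference sensor
sensor_b = scene * 0.85 + 0.05                # miscalibrated second sensor
cloud_b = rng.random((32, 32)) < 0.2          # synthetic cloud mask for B

clear = ~cloud_b
gain, offset = np.polyfit(sensor_b[clear], sensor_a[clear], 1)
sensor_b_cal = sensor_b * gain + offset       # B harmonized to A

print(round(gain, 3), round(offset, 3))                    # ~1.176, ~-0.059
print(bool(np.abs(sensor_b_cal - sensor_a).max() < 1e-6))  # True
```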


Proposals shall identify standards, such as software libraries, data type definitions, architecture guidelines, and software processes, that will be easy to follow and efficient to implement. IARPA officials say they expect to create an infrastructure that will enable analysts to access, calibrate, and process data streams from several satellites simultaneously for time-series analysis.


Military Sensor Fusion Market

The military sensor fusion market is set to grow to $921M by 2030. Major factors driving the growth of the market include the increasing demand for battlefield information, owing to rising attacks and threats from adversaries. Advancements in sensor fusion technologies have led to the development of new applications beyond target recognition, such as military flight planning, data logistics, and air tasking orders.


Leading companies featured in the report that are developing military sensor fusion include Aechelon Technology, LLC; Analog Devices; BAE Systems plc; Curtiss-Wright Corporation; Esterline Technologies Corporation; General Dynamics Corporation; General Micro Systems; Honeywell International; Kongsberg Gruppen; Logos Technologies, LLC; Lockheed Martin Corporation; Millennium Engineering and Integration Company; MEMSIC, Inc.; Nurjana Technologies; QinetiQ; Raytheon Company; Renesas Electronics Corporation; Safran Group; and TE Connectivity.


References and Resources also include:

https://www.militaryaerospace.com/sensors/article/14167837/remotesensing-sensor-fusion-imageprocessing
