
Navies plan to use Intelligent Virtual Assistant (IVA) and Deep Machine Learning (DML) technologies to enhance situational awareness and battlespace decision making

Autonomous systems are increasingly critical to several current and future Department of Defense (DoD) mission needs. For example, the U.S. Army Robotics and Autonomous Systems (RAS) Strategy for 2015-2040 identifies a range of capability objectives, including enhanced situational awareness, cognitive workload reduction, force protection, cyber defense, and logistics, that rely on autonomous systems and higher levels of autonomy.

 

Achieving higher levels of autonomy in uncertain, unstructured, and dynamic environments, however, increasingly involves data-driven machine learning techniques that bring many open systems-science and systems-engineering challenges.

 

Navies are now planning to use Artificial Intelligence (AI) and Deep Machine Learning (DML) technologies to enhance situational awareness and battlespace decision making.

 

Britain’s Royal Navy is to use artificial intelligence situational awareness software to help humans assess threats in a maritime combat system demonstrator. British warships are to employ a voice-controlled system along the lines of Apple’s Siri assistant, the First Sea Lord, Adm Sir Philip Jones, said.

 

Jones said the Royal Navy was witnessing the rapid speed at which warfare is being transformed by IT and had to move to embrace it. He cited the new Type-31 frigates, scheduled for deployment in 2023, as an example, with IT being integrated into their weapons systems as well as the running of the ship and offshore logistics.

 

“What this means in practice is that the Type 31e will feature different app-based tools which can access the ship’s data. These will be operated from a series of touchscreen displays, Siri-style voice-controlled assistants and perhaps even augmented reality technology,” Jones said on Tuesday. “This is not a gimmick or a fad. As modern warfare becomes ever faster, and ever more data driven, our greatest asset will be the ability to cut through the deluge of information to think and act decisively.”
While most people tend to ask voice assistant Siri questions such as “What’s the weather going to be like today?”, the navy may have more ambitious questions such as “Who fired that missile?” – but Jones did not elaborate.
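As a purely illustrative sketch of how such a voice query might be routed to a ship's own data, consider the short Python fragment below. The event log, track identifiers, and keyword matching are invented for this example and do not represent any actual Type 31e software.

from dataclasses import dataclass

@dataclass
class TrackEvent:
    track_id: str
    event: str          # e.g. "missile_launch"
    bearing_deg: float
    source_sensor: str

# Toy stand-in for the ship's combat-system event log.
EVENT_LOG = [
    TrackEvent("T-042", "missile_launch", 217.0, "radar_1"),
    TrackEvent("T-051", "surface_contact", 133.0, "eo_camera"),
]

def answer(question: str) -> str:
    # Very crude keyword-based intent matching, for illustration only.
    q = question.lower()
    if "fired" in q or "launch" in q:
        launches = [e for e in EVENT_LOG if e.event == "missile_launch"]
        if not launches:
            return "No launch events recorded."
        e = launches[-1]
        return (f"Track {e.track_id} launched at bearing {e.bearing_deg:.0f}, "
                f"detected by {e.source_sensor}.")
    return "Sorry, I cannot answer that yet."

print(answer("Who fired that missile?"))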

 

The US Marine Corps seeks to leverage advanced artificial intelligence (AI) technologies to reduce information overload, improve situational awareness (SA) and collaboration, and aid in commander decision-making. It has issued a solicitation with the objective of developing an AI-based Command and Control (C2) digital assistant that uses advanced computing techniques, such as machine learning and natural language processing, to answer complex mission-specific questions and enhance battlespace decision making.

Joe Marino, CEO of Rite-Solutions, Inc., envisages the use of Artificial Intelligence (AI) and Deep Machine Learning (DML) technologies to enable a decision support system for submarine Commanding Officers (COs): an “Intelligent Virtual Assistant (IVA) with a huge compendium of relevant information and past performance experience, and the ability to apply this knowledge almost instantaneously. This IVA would work in concert with their CO to provide fast, reliable, and trusted situational awareness. It would identify probable enemy intent and the corresponding, appropriate Course of Action based on hundreds of wargame exercises and the accumulated experience of the best COs.”

 

“This Watson-like assistant would quickly integrate disparate information sources, offer observations and advice, and provide risk/reward analyses associated with alternative Courses of Action. This IVA could also attend to a myriad of necessary but less important tasks, leaving their CO with more bandwidth to concentrate on critical, tactical issues. Also, using DML technologies, an IVA would quickly increase in capability over time and learn to tailor its decision-making processes to each CO’s personal decision style to optimize the human-machine relationship. It would become a trusted member of their CO’s combat team—a virtual participant that would add tremendous value to the team’s fighting effort.”
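A minimal sketch of the kind of risk/reward comparison such an IVA might present is shown below. It is not Rite-Solutions' design; the courses of action, their scores, and the per-CO risk-tolerance weight standing in for a learned "decision style" are all assumptions made for illustration.

from dataclasses import dataclass

@dataclass
class CourseOfAction:
    name: str
    expected_reward: float  # 0..1, e.g. estimated probability of mission success
    risk: float             # 0..1, e.g. estimated probability of counter-detection

def rank_coas(coas, risk_tolerance: float):
    """Rank COAs by reward penalized by risk.

    risk_tolerance in [0, 1] stands in for the decision style the IVA is said
    to learn for each CO: higher values discount risk less heavily.
    """
    def score(c: CourseOfAction) -> float:
        return c.expected_reward - (1.0 - risk_tolerance) * c.risk
    return sorted(coas, key=score, reverse=True)

coas = [
    CourseOfAction("Close and trail contact", 0.7, 0.5),
    CourseOfAction("Hold position and observe", 0.4, 0.1),
    CourseOfAction("Break contact and report", 0.3, 0.05),
]

for c in rank_coas(coas, risk_tolerance=0.3):
    print(f"{c.name}: reward={c.expected_reward}, risk={c.risk}")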

Royal Navy’s STARTLE project: Roke Manor Research (Roke) is integrating AI software into a maritime combat system demonstrator

STARTLE is machine situational awareness software that continuously monitors and evaluates potential threats using a combination of artificial intelligence techniques. Inspired by the way the human brain works, it emulates the mammalian conditioned-fear response mechanism. By rapidly detecting and assessing potential threats, the software significantly augments human operator situational awareness in complex environments, and its continuous learning should allow it to filter out anomalies over time.

 

If integrated into existing warship sensor suites, it would support the Principal Warfare Officer by intelligently processing multiple sources of information, whilst cueing systems to assess and confirm potential threats.  In increasingly complex and dynamic mission environments, it could allow the command team to make better informed decisions faster.  The vital seconds it contributes to decision making could be the difference between success and failure.
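A hypothetical sketch of the two-stage behaviour described above, a fast "startle" trigger on streaming sensor data followed by a slower assessment step that would cue other systems, is given below. It is not Roke's STARTLE implementation; the thresholds and readings are invented.

import statistics
from collections import deque

class StartleMonitor:
    def __init__(self, window: int = 20, trigger_sigma: float = 3.0):
        self.history = deque(maxlen=window)
        self.trigger_sigma = trigger_sigma

    def observe(self, reading: float) -> None:
        # Fast path: flag readings far outside the recent baseline.
        if len(self.history) >= 5:
            mean = statistics.mean(self.history)
            stdev = statistics.pstdev(self.history) or 1e-6
            if abs(reading - mean) > self.trigger_sigma * stdev:
                self.assess(reading, mean)
        self.history.append(reading)

    def assess(self, reading: float, baseline: float) -> None:
        # Slow path: stand-in for cueing other sensors and classifying the threat.
        print(f"STARTLE: reading {reading:.1f} vs baseline {baseline:.1f} -- cue assessment sensors")

monitor = StartleMonitor()
for r in [10, 11, 9, 10, 12, 11, 10, 48, 11, 10]:  # 48 simulates a sudden spike
    monitor.observe(r)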

 

According to Dabbah, this technology can be integrated into existing warship sensor suites. “Not only will it help to speed reactions, it will also help to support and justify actions taken.”

 

STARTLE is being integrated into the MoD’s Open Architecture Combat System, which is intended to show the utility of research ideas in a representative combat system in a realistic environment. In addition to maritime defence systems, STARTLE can also be adapted for autonomous vehicles, and health and usage monitoring applications.

US Naval Research Laboratory working on Cognitively Inspired Decision Making for Visualization

The major objective of this project is to investigate the potential advantages of using a cognitively-based approach to autonomous decision making at multiple levels in a command structure. For this project, the Polyscheme inferencing architecture is used to construct a cognitive model of a scenario. The intent of this approach is to facilitate the presentation of autonomous reasoning to human decision makers in ways that allow them to rely on and/or revise decisions that have been made at lower levels.

 

This research will promote rapid situational understanding of the battle space and facilitate the decision maker’s ability to intervene and override the system as needed. The project includes the development of 1) a cognitive modeling paradigm for making autonomous decisions at critical junctures in a network involving multiple strata of information, and 2) appropriate displays of high-level information that afford decision makers the ability to readily grasp, interact with, and, if needed, alter the underlying reasoning upon which they are expected to act.
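The sketch below illustrates, with invented names and data, the second of these ideas: lower-level autonomous decisions carry their supporting rationale upward so a human decision maker can inspect and override them. It is not the Polyscheme architecture itself, only an illustration of the interaction it is meant to support.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Decision:
    level: str                       # e.g. "sensor", "platform", "commander"
    action: str
    rationale: str
    overridden_by: Optional[str] = None
    children: List["Decision"] = field(default_factory=list)

    def explain(self, indent: int = 0) -> None:
        # Present the decision and its supporting reasoning, marking overrides.
        mark = f" [OVERRIDDEN: {self.overridden_by}]" if self.overridden_by else ""
        print("  " * indent + f"{self.level}: {self.action} ({self.rationale}){mark}")
        for child in self.children:
            child.explain(indent + 1)

    def override(self, new_action: str, reason: str) -> None:
        # Record a human revision while keeping the original rationale visible.
        self.overridden_by = reason
        self.action = new_action

sensor = Decision("sensor", "classify contact as hostile", "RF signature match 0.92")
platform = Decision("platform", "recommend intercept", "hostile classification received",
                    children=[sensor])
platform.override("shadow and report", "commander judged classification ambiguous")
platform.explain()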


Artificial Intelligence (AI)-based C2 Digital Assistant

The US Navy plans to develop an artificial intelligence (AI)-based Command and Control (C2) digital assistant that uses advanced computing techniques, such as machine learning and natural language processing, to answer complex mission-specific questions and enhance battlespace decision making.

 

The cognitive demands of future network-centric forces are overwhelming, and commanders often get caught in the weeds and suffer from the glare of information. Intelligent assistants such as Apple’s Siri, Google Now, or Facebook’s M are commonplace in commercial industry, yet similar products do not exist for military commanders tasked with managing an increasingly complex battlespace.

 

New big data computing techniques such as predictive analytics, deep machine learning, distributed rules engines, and real-time contextual search can significantly ease the information burden and enable more effective and efficient decision making. These techniques not only identify patterns across multiple data sets but also recommend courses of action and evaluate proposed actions.
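To make the rules-engine strand of this concrete, the sketch below evaluates a couple of declarative condition/action rules against incoming records and surfaces recommended actions. The rules, field names, and data are invented for illustration and are not drawn from any fielded system.

RULES = [
    {
        "name": "possible_IADS",
        "condition": lambda rec: rec["type"] == "rf_emission"
        and rec["band"] == "S" and rec["pulse_rate_hz"] > 300,
        "recommendation": "Re-route aviation assets; task ELINT collection",
    },
    {
        "name": "low_fuel_state",
        "condition": lambda rec: rec["type"] == "logistics" and rec["fuel_pct"] < 20,
        "recommendation": "Schedule tanker support within 30 minutes",
    },
]

def evaluate(record: dict):
    # Return (rule name, recommendation) pairs for every rule the record fires.
    return [(r["name"], r["recommendation"]) for r in RULES if r["condition"](record)]

incoming = [
    {"type": "rf_emission", "band": "S", "pulse_rate_hz": 450, "bearing": 212},
    {"type": "logistics", "fuel_pct": 15, "unit": "VMFA-121"},
]
for rec in incoming:
    for name, advice in evaluate(rec):
        print(f"{name}: {advice}")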

 

The aim of AI techniques embedded in an intelligent decision support system such as the proposed Command and Control (C2) digital assistant is to enable computer automation while emulating human capabilities as closely as possible.

 

The C2 digital assistant is envisioned to be integrated into the Common Aviation Command and Control System (CAC2S), an Acquisition Category I (ACAT I) Major Automated Information System (MAIS) that modernizes the air command and control suite in support of the Marine Aircraft Wings. The program replaces and modernizes the currently fielded, stovepiped, and rapidly obsolescing aviation C2 equipment and facilities that support the Marine Air-Ground Task Force in Joint and combined air operations today. The C2 digital assistant enhances the Command Tools function of CAC2S.

 

The AI-based C2 digital assistant will be a secure, open architecture system that runs continuously in the background and learns from its environment. It will utilize open-source libraries, software development kits (SDKs), and application programming interfaces (APIs) to the greatest extent possible and employ well-defined, well-documented interfaces to maximize modularity and extensibility.
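A hypothetical sketch of that modularity goal is shown below: a small, well-defined adapter interface through which new data sources can be plugged into the assistant without touching its core. The interface and the example feed are assumptions made for illustration and do not reflect any actual CAC2S or C2 digital assistant API.

from abc import ABC, abstractmethod
from typing import Iterable, List

class DataSourceAdapter(ABC):
    """Contract that every pluggable data source must satisfy."""

    name: str = "unnamed"

    @abstractmethod
    def poll(self) -> Iterable[dict]:
        """Return any new records since the last poll, as plain dicts."""

class SensorFeedAdapter(DataSourceAdapter):
    name = "example_sensor_feed"  # invented example source

    def __init__(self) -> None:
        self._queue = [{"track_id": "T-7", "kind": "air", "speed_kts": 420}]

    def poll(self) -> Iterable[dict]:
        records, self._queue = self._queue, []
        return records

class Assistant:
    def __init__(self) -> None:
        self.sources: List[DataSourceAdapter] = []

    def register(self, source: DataSourceAdapter) -> None:
        self.sources.append(source)

    def tick(self) -> None:
        # One pass of the continuously running background loop.
        for source in self.sources:
            for record in source.poll():
                print(f"[{source.name}] ingested {record}")

assistant = Assistant()
assistant.register(SensorFeedAdapter())
assistant.tick()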

 

It will be capable of interpreting ad hoc natural language queries with minimal training and will learn progressively as the historical behaviors of both friendly and hostile forces are observed over time. By searching through vast troves of persistent unstructured data, the AI-based C2 digital assistant will greatly improve warfighting outcomes and enable commanders to compose “what if” queries based on intelligence information, local and remote sensor data, logistics and weapons information, and battle damage assessment activities.
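As a minimal illustration of searching unstructured data with an ad hoc query, the sketch below ranks free-text reports by keyword overlap with the question asked. A fielded assistant would use far richer natural language processing; the reports here are fabricated.

import re
from collections import Counter

REPORTS = [
    "Intermittent S-band emissions detected near grid 38S, consistent with acquisition radar.",
    "Convoy of fuel trucks observed moving north along route GOLD at 0430.",
    "Battle damage assessment: runway at objective PYTHON cratered, 60 percent usable.",
]

def tokenize(text: str) -> Counter:
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def search(query: str, top_k: int = 2):
    # Score each report by how many query terms it shares, highest first.
    q = tokenize(query)
    scored = []
    for report in REPORTS:
        overlap = sum((q & tokenize(report)).values())
        if overlap:
            scored.append((overlap, report))
    return [report for _, report in sorted(scored, reverse=True)[:top_k]]

for hit in search("what radar emissions have we detected near grid 38S?"):
    print(hit)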

 

For example, disparate radio frequency emissions scattered across the battlefield may indicate the presence of an Integrated Air Defense System (IADS) and pose a threat to aviation assets. Querying previously detected RF emissions through the C2 digital assistant would expose the presence of IADS assets and alert the Commander to take appropriate action.
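The sketch below illustrates that example with invented coordinates, emitter types, and thresholds: previously detected RF emissions are binned into coarse geographic cells, and any cell where several distinct emitter types co-locate is flagged as a possible IADS for the commander's attention.

from collections import defaultdict

DETECTIONS = [
    {"lat": 34.12, "lon": 44.35, "emitter": "early_warning_radar"},
    {"lat": 34.15, "lon": 44.31, "emitter": "target_acquisition_radar"},
    {"lat": 34.10, "lon": 44.38, "emitter": "fire_control_radar"},
    {"lat": 35.90, "lon": 45.70, "emitter": "navigation_radar"},
]

def flag_possible_iads(detections, cell_deg: float = 0.25, min_types: int = 3):
    # Group detections into cell_deg x cell_deg cells; flag cells with
    # several distinct emitter types (int() floors positive coordinates).
    cells = defaultdict(set)
    for d in detections:
        key = (int(d["lat"] / cell_deg), int(d["lon"] / cell_deg))
        cells[key].add(d["emitter"])
    return [(key, sorted(types)) for key, types in cells.items() if len(types) >= min_types]

for cell, emitters in flag_possible_iads(DETECTIONS):
    print(f"ALERT: possible IADS in cell {cell}: {', '.join(emitters)}")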

 

Program Structure

PHASE I: The small business will develop a concept for a high-level information architecture and componentized system design to meet the requirements for the AI-based C2 digital assistant described above.

 

PHASE II: Based on the results of Phase I and the Phase II development plan, the company will develop a scaled prototype of the C2 digital assistant for evaluation and testing. The prototype will be evaluated to determine its capability in meeting the performance goals defined in the Phase II development plan and its ability to assist with efficient and effective decision making in a tactical environment. System performance will be demonstrated through prototype evaluation, modeling and simulation, and use case analysis.

 

Phase II will be classified to the SECRET level. Battlefield data such as Tactical Digital Information Links (TADIL), sensor data, composite tracking data, and tactical intelligence information are examples of data that a C2 digital assistant will evaluate and assess. Testing is envisioned to be part of CAC2S Developmental Testing and Follow-on Operational Test and Evaluations at the Weapons and Tactics Instructor Courses in Yuma, AZ. Evaluation results will be used to refine the prototype implementation into an initial design that will meet Marine Corps requirements.

 

PHASE III DUAL USE APPLICATIONS: If Phase II is successful, the small business will be expected to support transitioning the technology for Marine Corps use in operational command posts and C2 agencies. The company will develop and integrate a full-scale AI-based C2 digital assistant for evaluation to determine its effectiveness in an operationally relevant environment.

The company will provide test and validation support to certify and qualify the system for integration into C2 systems such as the Common Aviation Command and Control System (CAC2S).

Private Sector Commercial Potential: The potential for commercial application of an extensible AI-based digital assistant is high. Possible avenues for employment include search and rescue, first responder applications, law enforcement, homeland security, special operations, cyber defense, and Internet of Things (IoT) applications for consumers and businesses.


References and resources also include:

http://www.navysbir.com/n16_2/N162-074.htm

http://www.rite-solutions.com/artificial-intelligence-machine-learning-innovation-imperatives-for-the-navy/

https://www.nrl.navy.mil/itd/aic/content/adaptive-testing-autonomous-systems

https://www.theguardian.com/uk-news/2017/sep/12/british-navy-warships-to-use-voice-controlled-system-like-siri
