Military operations are becoming increasingly diverse in nature. To cope with new and more demanding tasks, the military has researched new tools for use during operations and during training for those operations. Numerous goals have driven this research over the past several decades, and many military requirements and capabilities have specifically driven the development of augmented reality (AR) systems.
Virtual Reality (VR) creates a digital environment that replaces the user’s real-world environment; it is more about what users feel or experience in that world than how they connect with it. Augmented Reality (AR) overlays digitally created content onto the user’s real-world environment, for instance by projecting sales and inventory data onto products on store shelves. Mixed reality (MR) is a blend of VR and AR, creating an environment in which digital and physical objects can interact; for example, MR will allow marketers to put virtual products in consumers’ hands and gauge their responses. VR and AR are now coming together to impact business and enterprise, and it is reported that 500 million VR headsets are expected to be sold by 2025.
The overall goal of the Battlefield Augmented Reality System (BARS) was to do for the dismounted warfighter what the Super Cockpit and its successors had done for the pilot. Initial funding came from the Office of Naval Research. The challenges associated with urban environments were a particular concern: a complex 3D environment, a dynamic situation, and loss of line-of-sight contact between team members. Unambiguously referencing landmarks in the terrain and integrating unmanned systems into an operation can also be difficult for distributed users. All of these examples show how situation awareness (SA) is impaired in military operations in urban terrain (MOUT). The belief was that the equivalent of a head-up display would help solve these problems. By networking the mobile users together and with a command center, BARS could assist a dispersed team in establishing collaborative situation awareness.
This raises numerous issues in system configuration. BARS includes an information database that can be updated by any user, since sharing information across the area of an operation is a critical component of team SA; an information distribution system was designed so that these updates would be sent across the network. According to NRL, BARS was also made able to communicate with semi-automated forces (SAF) software to address training applications. Commercially available hardware components were chosen so that BARS could be easily upgraded as improved hardware became available. UI components allowed routes to be drawn on the terrain in the command center application and assigned to mobile users, or drawn by mobile users and suggested to the commander or sent directly to other mobile users. Typical AR system issues such as calibration were also investigated, alongside specific research efforts on the UI and human factors aspects of BARS.
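The replicated-database idea behind BARS, in which any user's update is propagated so a dispersed team converges on one shared picture, can be sketched roughly as follows. This is a minimal illustration only; the class, field, and method names are invented for the sketch and are not from BARS itself.

```python
import itertools
from dataclasses import dataclass
from typing import Any

@dataclass(frozen=True)
class Update:
    """One change to the shared tactical picture (fields illustrative)."""
    author: str   # user who made the change
    seq: int      # per-author sequence number, used to deduplicate
    key: str      # object being updated, e.g. a route or a landmark
    value: Any    # new payload

class SharedInfoDB:
    """One user's replica of a team-wide information database.

    Any user may publish an update; replicas flood it to their peers,
    so every networked user ends up with the same shared picture.
    """

    def __init__(self, user):
        self.user = user
        self.state = {}             # key -> latest value seen
        self.peers = []             # directly reachable replicas
        self._seen = set()          # (author, seq) pairs already applied
        self._seq = itertools.count()

    def connect(self, other):
        """Create a bidirectional network link between two replicas."""
        self.peers.append(other)
        other.peers.append(self)

    def publish(self, key, value):
        """Apply a local change and propagate it to the whole team."""
        self._receive(Update(self.user, next(self._seq), key, value))

    def _receive(self, upd):
        uid = (upd.author, upd.seq)
        if uid in self._seen:       # already applied; stop the flood here
            return
        self._seen.add(uid)
        self.state[upd.key] = upd.value
        for peer in self.peers:     # forward to every connected replica
            peer._receive(upd)
```

Under this scheme a route drawn at the command center and a report filed by a mobile user both end up in every replica, which is the essence of the collaborative SA described above.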
Augmented Reality for UK Navy
BAE Systems hopes to begin operational trials of an augmented-reality system onboard a Royal Navy warship next year as part of a £20 million ($27 million) investment the defense contractor is making in advanced combat systems technology. Company officials said at a briefing in London Nov. 22 that they planned tests of augmented reality for a bridge watch officer role early in 2019 and expected the technology to be tried operationally during the second half of the year.
The augmented-reality glasses would allow the officer of the watch to blend real-world visuals with data generated by sensors, such as radars and sonars, laid over the top in a similar fashion to the digital helmet displays used by combat jet pilots. BAE officials said the company is using technology from its new Striker II pilot’s helmet to help develop the system for the Royal Navy.
A U.S. Navy surface warfare officer has invented a helmet system that could help revolutionize warship gunnery operations, according to service officials. Lt. Robert McClenning’s Unified Gunnery System concept is an augmented reality (AR) helmet that fuses information from a ship’s gunnery liaison officer and weapons system into an easy-to-interpret visual format for the gunner manning a naval gun system, Navy officials said.
According to a previous report by Defense Systems, the helmet visually synthesizes data from a ship’s weapons systems with information feeds from a ship’s gunnery liaison officer. A GunnAR prototype was demonstrated to select groups within the Navy late last year, said Heidi Buck, Director of the Battlespace Exploitation of Mixed Reality.
US Navy
In 2019, the US Navy tested a platform based on augmented reality (AR) that is designed to significantly improve combat training. The platform is sponsored by the Office of Naval Research (ONR) Global TechSolutions. Sailors tested the tactically reconfigurable artificial combat enhanced reality (TRACER) project at the Center for Security Forces (CENSECFOR) Detachment Chesapeake, on Naval Support Activity Northwest Annex in Currituck County, North Carolina. Other government partners in the project include the Naval Surface Warfare Center (NSWC) Dahlgren and the US Army Combat Capabilities Development Command. Industry partners who contributed to the development of TRACER include Magic Leap Horizons and Haptech.
Key components of the TRACER system include the Magic Leap One AR headset, a backpack processor, and Haptech’s instrumented weapon, which is capable of delivering realistic recoil. The system uses software developed by Magic Leap Horizons as part of the US Army’s Augmented Reality Dismounted Soldier Training (ARDST) project, which will provide advanced weapons tracking capability. Trainers using the system will also be able to create simulation scenarios for security personnel. TRACER project lead Dr Patrick Mead said: “Our training system is built mostly from commercial-off-the-shelf products, so we are using widely available gaming gear.”
US Navy awards contract to AVATAR Partners for AR-based technology in Sep 2020
AVATAR Partners, an innovator in virtual, augmented and mixed reality (extended reality or XR, collectively) software solutions for heavy-duty industry and defense, has received a contract to provide its Simplified, Intelligent AR Quality Assurance (SIA-QA) solution for the US Navy. The SIA-QA solution will focus on supporting aircraft wiring maintenance for the Naval Air Systems Command (NAVAIR) Boeing V-22 Osprey aircraft. This AR-based, automatic QA system combines instruction and AI-based performance assessment in a single solution to help inspectors rapidly observe “as-is” versus “should be” conditions to speed installation and increase accuracy.
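The core of such a QA workflow is the comparison of the detected "as-is" condition against the specified "should-be" condition. A loose sketch of that comparison step is below; the real SIA-QA system derives the as-is state from AR object recognition, whereas here both states are given as plain data, and the function and data structures are invented for illustration.

```python
def qa_discrepancies(should_be, as_is):
    """Compare the expected ("should-be") pin-to-pin wiring against the
    detected ("as-is") wiring and list every discrepancy for the inspector.

    Both arguments map a connector pin to the pin it is wired to.
    """
    issues = []
    for pin, expected in should_be.items():
        found = as_is.get(pin)
        if found is None:
            issues.append(f"{pin}: missing (expected {expected})")
        elif found != expected:
            issues.append(f"{pin}: wired to {found}, expected {expected}")
    # Connections present on the aircraft but absent from the spec.
    for pin in as_is.keys() - should_be.keys():
        issues.append(f"{pin}: unexpected connection ({as_is[pin]})")
    return sorted(issues)
```

In an AR overlay, each returned discrepancy would be rendered at the physical location of the offending pin, letting the inspector see the mismatch in place rather than cross-referencing a paper diagram.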
Michael Davis, technical director, AVATAR Partners, tells IndustryWeek that the dynamic and highly variable nature of the inspection environment brings challenges to implementing object and photo recognition to determine discrepancies in aircraft harness installation. “These variances in the environment could include different lighting conditions, personnel and support equipment movement and multiple aircraft configurations,” says Davis. “Understanding the expected boundaries of the environment within the AR experience will allow the solution to capture and adjust through the use of machine learning.”
Maintainer errors can be highly expensive or life threatening. Moreover, time-to-maintain is often insufficient to support aircraft readiness. The AVATAR Partners SIA-QA solution includes refresher training for aircraft maintenance technicians who have completed tours of duty away from the system for which they were trained. “We are pleased to have the opportunity of helping to improve aircraft readiness through on-demand technician training that provides knowledge at the point of need,” said AVATAR Partners CEO and Founder Marlo Brooke. “Applying military-grade augmented reality provides the resources needed to establish critical training regimens that support a risk-managed framework for preparation and deployment.”
Royal Canadian Navy to use Kognitiv Spark’s Augmented Reality software to improve vessel repairs and maintenance, reported in 2019
Kognitiv Spark has announced that it will be providing the Royal Canadian Navy (RCN) with the opportunity to test drive a Mixed Reality Remote Assistant Support (MIRRAS) system, as part of a project that aims to improve maintenance and repairs aboard active naval vessels. The project aims to validate technology adopted from Kognitiv Spark, whose software is designed for use with the Microsoft HoloLens. The software leverages augmented reality, mixed reality, and the integration of artificial intelligence to improve efficiencies with ship operations including repairs, maintenance and knowledge transfer.
This system can be used by RCN Marine Technicians and Weapons Engineering Technicians to ensure that RCN ships remain at a high level of readiness for routine training and operational deployments. “Innovation and technological advancement are critical to the future of the Royal Canadian Navy. We are continually seeking new ways to leverage emerging technologies in order to enhance our performance alongside and at sea,” said RAdm Casper Donovan, Director General Future Ship Capability, Royal Canadian Navy. “The Mixed Reality Remote Assistant Support system is an exciting tool, because it may provide our sailors with the opportunity to explore a new, and potentially much more efficient way of conducting onboard maintenance.”
For remote maintenance, a subject-matter expert using this system can see what the HoloLens wearer sees from anywhere in the world. The expert can provide guidance using real-time voice and video, interactive 3D holograms and content, and live IoT data. Alternatively, the technician can use locally stored data to assist with routine tasks when a remote expert is not available. The holographic support is designed to improve decision making by facilitating decisive action and reducing errors by providing clarity and certainty of comprehension, according to Kognitiv Spark.
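The remote-versus-local fallback described above can be sketched as a simple decision rule: use the live expert session when connectivity allows, otherwise serve content cached on the headset. The function, parameters, and field names below are hypothetical illustrations, not Kognitiv Spark's actual API.

```python
def get_task_guidance(task_id, remote_expert=None, local_store=None):
    """Fetch guidance for a maintenance task.

    Prefer a live remote-expert session when one is online; otherwise
    fall back to locally stored holographic content for routine tasks.
    """
    if remote_expert is not None and remote_expert.get("online"):
        # Live session: the expert sees the wearer's view and responds
        # with voice, video, 3D holograms, and live IoT data.
        return {"source": "remote", "content": remote_expert["advise"](task_id)}
    if local_store is not None and task_id in local_store:
        # Offline: reuse content cached on the headset.
        return {"source": "local", "content": local_store[task_id]}
    raise LookupError(f"no guidance available for task {task_id!r}")
```

The design point is that the technician's workflow is the same either way; only the source of the guidance changes with connectivity, which matters on a ship that may spend long periods without a reliable link ashore.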
“Our solution delivers a toolset that can take advantage of the most powerful visual processing system on the planet, the human brain,” said Duncan McSporran, a former military officer and the Co-Founder and COO at Kognitiv Spark. “3D interactive content is more easily interpreted than paper manuals, and therefore reduces any mental fatigue the soldiers, sailors or air force personnel might be facing under harsh and stressful conditions. Our software allows them to make better-informed decisions more rapidly, with all the information and resources possible in a secure system. We are delighted to deploy our Canadian solution, in support of the vision of the Royal Canadian Navy to introduce cutting edge technology.”
Kognitiv Spark was awarded the contract due in part to their security features and reputation from ongoing work with the Canadian Army and the Royal Canadian Air Force. The company also recently won a Microsoft IMPACT award for Innovation with Hardware, a NATO Defence Innovation Challenge award for Mobile Apps for Defence Users and the Atlantic Canada Aerospace and Defence Association Industry Excellence Award.
Kognitiv Spark specializes in industrial task support software primarily in the industrial manufacturing, aerospace & defence, and oil & gas sectors. The company has partnered with hardware providers, and utilizes augmented and mixed reality to deliver its holographic worker support services with interactive 3D content, artificial intelligence, and live IoT data.
References and Resources also include:
https://www.industryweek.com/technology-and-iiot/article/21142049/us-navy-sees-augmented-reality