Many future military operations are expected to occur in urban environments. Delivering better situational awareness to the dismounted warfighter is extremely difficult in these environments because of the complex three-dimensional terrain, the rapidly changing situation, the loss of line-of-sight contact with team members, and the difficulty of unambiguously referencing landmarks. Maps are currently used for this purpose, but they draw a user’s attention away from the environment and cannot directly represent the three-dimensional nature of the terrain.
A mobile augmented reality system can overcome these problems. Such a system consists of a wearable computer, a tracking system, and a see-through head-mounted display. It tracks the position and orientation of the user’s head and superimposes graphics and annotations that are aligned with real objects in the user’s field of view. With this approach, complicated spatial information can be directly registered to the environment. For example, the name of a building could appear as a “virtual sign post” attached directly to the side of the building.
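As a rough sketch of how such alignment works, the snippet below takes a tracked head position and orientation, rotates a world-fixed annotation point into the head frame, and projects it through a simple pinhole model onto display pixels. The axis conventions, 40-degree field of view, and 640×480 resolution are illustrative assumptions, not details of any fielded system.

```python
import math

def project_annotation(user_pos, yaw_deg, pitch_deg, point,
                       fov_deg=40.0, width=640, height=480):
    """Project a world-fixed annotation into a see-through display.

    World frame: x = east, y = north, z = up.  Yaw is measured
    clockwise from north; pitch is positive looking up (illustrative
    conventions).  Returns (u, v) pixel coordinates, or None if the
    point is behind the viewer.
    """
    y, p = math.radians(yaw_deg), math.radians(pitch_deg)
    # Head-frame basis vectors expressed in world coordinates.
    fwd = (math.sin(y) * math.cos(p), math.cos(y) * math.cos(p), math.sin(p))
    right = (math.cos(y), -math.sin(y), 0.0)
    up = (-math.sin(y) * math.sin(p), -math.cos(y) * math.sin(p), math.cos(p))
    # Offset from the viewer to the annotation point.
    d = tuple(pt - u0 for pt, u0 in zip(point, user_pos))
    dot = lambda a, b: sum(ai * bi for ai, bi in zip(a, b))
    depth = dot(d, fwd)
    if depth <= 0.0:  # behind the viewer: nothing to draw
        return None
    # Pinhole projection: focal length derived from the field of view.
    focal = (width / 2.0) / math.tan(math.radians(fov_deg) / 2.0)
    u = width / 2.0 + focal * dot(d, right) / depth
    v = height / 2.0 - focal * dot(d, up) / depth
    return (u, v)
```

A point directly ahead of the viewer lands at the centre of the display, and points behind the viewer are culled, which is the essence of keeping a “virtual sign post” glued to the side of a real building as the head turns.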
To succeed in their missions, military forces require a robust, multi-faceted picture of their operational environment, including the location, nature and activity of both threats and allied forces around them. Augmented reality technology is making this kind of rich, real-time situational awareness increasingly available for aircraft, submarines, tanks and other vehicle-assigned forces, along with a capacity to deploy precision armaments more safely, quickly and effectively. Military augmented reality systems are even being used by divers underwater.
Infantry squads, however, have fallen behind, because many of these systems are too bulky to carry on the frontline, says DARPA. Under the Squad X Core Technologies (SXCT) program, DARPA envisions future supersoldiers using advanced technologies, such as augmented reality (AR), to intuitively understand and control their complex mission environments, enabling them to combat advanced adversaries across the globe.
US Soldiers’ “Tactical Augmented Reality,” or TAR
A novel technology called “Tactical Augmented Reality,” or TAR, is now helping Soldiers precisely locate their positions, as well as the locations of friends and foes, said Richard Nabors, an associate at CERDEC. It even enables them to see in the dark, all with a heads-up display device that looks like night-vision goggles, or NVG, he added. In essence, TAR replaces both NVG and GPS, and it does much more.
TAR does the geo-registration automatically, he said. Geo-registration is the alignment of an observed image with a geodetically calibrated reference image. Staff Sgt. Ronald Geer, a counterterrorism non-commissioned officer at CERDEC’s Night Vision and Electronic Sensors Directorate, said that with TAR, Soldiers no longer have to look down at a GPS device. In fact, they no longer need a separate GPS device at all, because with TAR the image appears in an eyepiece mounted to the Soldier’s helmet in the same way NVGs are mounted. What they see, he said, is the terrain in front of them, overlaid with a map.
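As a rough illustration of the geodetic side of geo-registration, the sketch below converts a nearby landmark’s latitude and longitude into local east/north offsets from the Soldier’s position, using a flat-earth approximation that is adequate over squad-scale distances. This is purely illustrative of the idea, not CERDEC’s actual algorithm.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius in metres

def geodetic_to_local_enu(lat_ref, lon_ref, lat, lon):
    """Approximate east/north offsets (in metres) of a point relative
    to a reference position, using a flat-earth (equirectangular)
    model.  Once expressed in local metres, a landmark can be drawn
    on a map overlay aligned with the observer's view."""
    d_lat = math.radians(lat - lat_ref)
    d_lon = math.radians(lon - lon_ref)
    north = d_lat * EARTH_RADIUS_M
    # Longitude lines converge toward the poles, hence the cosine.
    east = d_lon * EARTH_RADIUS_M * math.cos(math.radians(lat_ref))
    return east, north
```

A full geo-registration pipeline would refine this coarse GPS-based estimate by matching the observed camera image against the calibrated reference imagery; the coordinate conversion above is only the first step.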
Furthermore, Geer pointed out that the eyepiece is connected wirelessly to a tablet the Soldiers wear at the waist, and wirelessly to a thermal sight mounted on their rifle or carbine. When a Soldier points his or her weapon, the image of the target, plus details such as the distance to the target, can be seen through the eyepiece. The eyepiece even supports a split screen: if the rifle is pointed rearward while the Soldier is looking forward, the display shows both views, he said. A Soldier behind a wall or other obstacle could also lift the rifle over the wall and see through the sight via the heads-up display without exposing his or her head.
Finally, Geer said that TAR’s wireless system allows a Soldier to share his or her images with other members of the squad. The tablet allows Soldiers to input information they need or to share their own information with others in their squad.
David Fellowes, an electronics engineer at CERDEC, said that the key technological breakthrough was miniaturizing the image to fit into the one-inch-by-one-inch eyepiece. Current commercial technology compresses images into sizes small enough for tablet and cell-phone-sized screens, but getting a high-definition image into such a tiny eyepiece was a challenge that could not be met with commercial off-the-shelf hardware.
US Soldiers’ ARC4 system
US Soldiers use a Google Glass-like augmented reality system designed for the battlefield. Called ARC4, it allows commanders to send maps and other information directly into the soldier’s field of vision. The device attaches to a military helmet and can even be integrated with weapon control systems. The firm behind it, Applied Research Associates, says the system was developed as part of a six-year project with substantial investment from the US Defense Advanced Research Projects Agency (DARPA).
Rather than looking down at a 2D map or chest-worn computer, the soldier sees virtual icons (such as navigation waypoints, friendly/blue forces, and aircraft) overlaid on the real-world view. ‘You are able to perform your mission with high awareness of your surroundings, with enhanced safety and speed, and in close coordination with team members,’ ARA claims.
The software uses Global Positioning System data, helmet-camera data, and inertial sensors to “geo-register” the soldier’s field of view. That allows symbols designating waypoints, points of interest, and friendly forces to be projected on what the soldier sees, as well as a navigational “compass” showing the direction to tracked objects when they’re not in view. Additionally, a 3-D model of terrain can be superimposed on the real world to help in navigation.
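The off-screen “compass” behaviour can be sketched with two small helpers: one computes the compass bearing from the soldier to a tracked object, and the other decides whether the object falls inside the display’s horizontal field of view, or which side the direction indicator should point to. The conventions and the 40-degree field of view are assumptions for illustration, not ARA’s implementation.

```python
import math

def bearing_deg(east, north):
    """Compass bearing (degrees clockwise from north) to an offset
    expressed in local east/north metres."""
    return math.degrees(math.atan2(east, north)) % 360.0

def offscreen_cue(user_heading_deg, target_bearing_deg, fov_deg=40.0):
    """Return 'in-view' if the target lies inside the display's
    horizontal field of view, else 'left' or 'right' to indicate
    which way the compass indicator should point."""
    # Signed angular difference, wrapped into (-180, 180].
    diff = (target_bearing_deg - user_heading_deg + 180.0) % 360.0 - 180.0
    if abs(diff) <= fov_deg / 2.0:
        return "in-view"
    return "right" if diff > 0 else "left"
```

The wrap-around in `offscreen_cue` matters: a soldier facing 350 degrees with a target at 30 degrees should be cued a short turn to the right, not a long sweep to the left.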
British engineers from BAE Systems developing ‘Mixed’ Reality
British engineers from BAE Systems are working in collaboration with academics at the UK’s University of Birmingham to develop applications for this ground-breaking technology concept, which “mixes together” the real and virtual worlds to allow operators to take real-time control of their environments.
The revolutionary concept, called ‘mixed reality’, would allow commanders to see themselves and their surroundings along with virtual images, video feeds, objects and avatars, seamlessly bringing together the critical battlefield elements in a single place. The technology is brought to life by an ‘Oculus Rift’-style headset that allows military commanders to direct military assets, such as troops and Unmanned Air Vehicles, across a virtual representation of the landscape, whether for real operations or simply as part of a training solution.
Nick Colosimo, Futurist at BAE Systems, said: “We’re already seeing virtual and augmented reality becoming more commonplace in consumer products, and the possibilities it offers the armed forces are hugely exciting. Our unique approach will identify the optimal balance between the real world and the virtual – enhancing the user’s situational awareness to provide battle-winning and life-saving tools and insights wherever they may be.
Professor Bob Stone, Simulation & Human Factors Specialist at the University of Birmingham, said: “Being able to physically manipulate virtual objects in the real world has been challenging scientists for 40 years. Since my first virtual reality experience at NASA nearly 30 years ago, the technology has evolved from the primitive head-mounted displays and computers to today’s world where we can interact with complex virtual objects, integrated in real-time with real-world scenarios.”
For now the research is focused on two concepts: a portable command centre roughly the size of a briefcase, and a ‘wearable cockpit’.
Portable Command Centre
The Portable Command Centre concept uses commercial technology to create a semi-virtual environment that can be transported in a briefcase and set up anywhere from within a tent to an office to tackle emergency scenarios such as an outbreak of fire or an act of terrorism.
Users put on a virtual reality headset and interactive gloves – and a mixed reality control station appears around them. Users can monitor situations anywhere in the world, zooming in and manipulating environments, directing troops and pulling in virtual video screens that allow them to monitor news channels and feeds from UAVs.
As well as this, users can bring in artificially intelligent avatars that monitor the entire environment, provide real-time voice updates and even offer advice when asked.
BAE Systems Q-Warrior: Google Glass for Military
BAE Systems have already developed Q-Warrior, a full-color, 3D heads-up display designed to provide soldiers in the field with rapid, real-time situational awareness. Q-Warrior consists of a high-resolution transparent display that overlays data and a video stream over the soldier’s view of the world. Q-Warrior also includes enhanced night vision, waypoints and routing information, and the ability to identify hostile and non-hostile forces, track personnel and assets, and coordinate small unit actions.
“Q-Warrior increases the user’s situational awareness by providing the potential to display ‘eyes-out’ information to the user, including textual information, warnings and threats,” Paul Wright, the soldier systems business development lead at BAE Systems’ Electronic Systems, said in a statement.
Q-Warrior is initially expected to be deployed with Special Forces and at the section commander level, but BAE says it expects the technology to eventually reach all soldiers.
“This is likely to be within non-traditional military units with reconnaissance roles, such as Forward Air Controllers/Joint Tactical Aircraft Controllers (JTACS) or with Special Forces during counter terrorist tasks,” said Wright. “The next level of adoption could be light role troops such as airborne forces or marines, where technical systems and aggression help to overcome their lighter equipment.”
Augmented Reality for the Navy
A U.S. Navy surface warfare officer has invented a helmet system that could help revolutionize warship gunnery operations, according to service officials. Lt. Robert McClenning’s Unified Gunnery System concept is an augmented reality (AR) helmet that fuses information from a ship’s gunnery liaison officer and weapons systems into an easy-to-interpret visual format for the gunner manning a naval gun system, Navy officials said.
According to an earlier report by Defense Systems, a prototype of the system, called GunnAR, was demonstrated to select groups within the Navy late last year, said Heidi Buck, director of the Navy’s Battlespace Exploitation of Mixed Reality (BEMR) Lab.
Major Software Company Strengthens Augmented Reality Market Grip
Osterhout Design Group (ODG), which sold a large portfolio of its augmented-reality patents to Microsoft, is currently working with the Office of Naval Research (ONR) and TechSolutions to give U.S. Marine signals intelligence (SIGINT) specialists an augmented environment for the battlefield, designing augmented-reality glasses that create digital overlays of real-time information.
The X-6 system now under development has stereoscopic optics to provide a virtual display and built-in communication devices for transferring and receiving data. The glasses weigh only 4.5 ounces, far less than many commercial headsets.
ODG and TechSolutions modified the X-6 glasses with a toggling weapons-mounted interface that allows Marines to take their positions and fire their weapons accurately. The operating system is Android, which allows the Marines to test and deploy new applications easily. Other applications include directional markers, maps of the surrounding environment, friendly-force tracking, and alerts that can be sent to specific groups of soldiers.
Navy Dive Helmet Display Emerges as Game-Changer
The Navy’s prototype Divers Augmented Visual Display (DAVD) is a high-resolution, see-through head-up display (HUD) embedded directly inside a diving helmet. The system gives divers a real-time visual display of everything from sonar images showing their location to text messages, diagrams, photographs and even augmented-reality video. Having real-time operational data makes them more effective and safer in their missions, providing expanded situational awareness and increased accuracy in navigating to a target such as a ship, downed aircraft or other object of interest.
Naval Sea Systems Command (00C3) is in the process of developing enhanced sensors — such as miniaturized high resolution sonar and enhanced underwater video systems — to enable divers to ‘see’ in higher resolution up close, even when water visibility is near zero. These enhanced underwater vision systems would be fed directly into the DAVD HUD.
The DAVD HUD system can be used for various diving missions, including ship husbandry, underwater construction, and salvage operations. The same system can eventually be used by first responders and the commercial diving community.