
US Army’s future training capability is the Synthetic Training Environment (STE) to conduct realistic multi-echelon / multidomain battles and mission command training

Today’s military forces encounter evolving national threats, economic constraints, and a changing operational environment with a complex, multifaceted, and uncertain security landscape spanning a range of political, military, state, and non-state actors. Advancing and sustaining the U.S. Department of Defense’s (DoD) operational readiness against these threats requires a portfolio of training capabilities that supports a learning continuum from individual and staff through collective training. This portfolio must create the training environment that prepares the total force to accomplish a diverse and complex set of missions demanding an ever-changing combination of military engagement, security cooperation, and deterrence competencies.


The Synthetic Training Environment will resolve shortcomings of the Multiple Integrated Laser Engagement System (MILES), which has been used to support direct-fire force-on-force training since 1980. “MILES was a device used for training outside, but if I hide behind a bush, you can’t shoot through the bush, which is not realistic,” says Meggitt’s Shavers. “Or you had to use blanks or simulation, but now they want to know how to have more weapons and hiding behind something isn’t a block.”


The Army today cannot simulate realistic multi-domain operations training from the individual soldier through the brigade combat team in live training environments at home station, at Maneuver Combat Training Centers (MCTCs), or at deployed locations; these live environments will be integrated into the Synthetic Training Environment. MILES cannot replicate the ballistic trajectory of munitions, simulate a munition’s effect on impact, or engage targets using indirect fire, such as artillery or mortars. As a result, only half of the small arms and munitions assigned to a light infantry platoon can be represented accurately in live force-on-force training. The same is true for 40 percent of brigade combat team weapons effects. “We’ve been dealing with deployable systems for some time, but with improved computational power and new COTS technologies, we can provide high fidelity software in a much smaller footprint and reduced cost,” says Lenny Genna, president of the military training sector at L3Harris Technologies in Arlington, Texas. “The technology provides the capability to do a lot, but in some cases, you want tactile feel that isn’t fully there yet.”


The last decade has brought about significant changes in the way personnel are trained, thanks to the development of simulated or synthetic training capabilities. Virtual reality (VR) and augmented reality (AR) – layering visuals over a real environment view – are exciting advances shaping the way military service men and women, from all of the armed forces, are trained. “Training using mobile apps, VR, and AR technology will eventually replace regular training techniques,” says Reddick, reflecting on how this type of training is changing the way we learn. “In VR and AR, you can accomplish any task with no risk to your body. We’re able to recreate almost any task or mission that is asked of us in the virtual world and provide multiple outcomes, feedback, and analytics for each and every motion the user does.”


The Army’s future training capability is the Synthetic Training Environment (STE). The STE will be a single, interconnected training system that provides a Common Synthetic Environment in which units from squad through Army Service Component Command (ASCC) can train in the most appropriate domain – live, virtual, constructive, and gaming – or in all four simultaneously. This training capability will enable Army units and leaders to conduct realistic multi-echelon / multidomain combined arms maneuver and mission command training, increasing proficiency through repetition. Units can then master collective training tasks in the live environment. According to the US Army, the STE is designed to provide a cognitive, collective, multi-echelon training and mission rehearsal capability for the operational, institutional, and self-development training domains. The environment will “keep pace with and adapt to the rapid development of technologies” as part of the Army’s ‘Big Six’ modernization priorities and builds on the US Department of Defense’s annual spend of $14bn in this field.


“As technology progresses, we are noticing an increase in realism and interaction formats that allow users to experience training almost identically to how they would experience it in real life, with the advantage of being able to stop, pause and reset the training experience,” Reddick says, adding that the possibility to simulate and reset on a constant basis, for a nearly unlimited user base and at low cost, is where much of the appetite for VR and AR comes from across both the private and public sectors. “The ability to interact with a training workflow, from start to finish, with the ability to reset that workflow instantly for the next user, or have a group of people play out the experience at the same time and track all their results, is where the value of computer training comes in”, he adds. “We can simulate any military experience and, with our analytics and unique eye tracking technology, even determine how it is affecting the users mentally and physically.”


The Common Synthetic Environment, targeted for initial operational capability by September 2021 and full operational capability by September 2023, will provide the software, applications, and services necessary to enable and support next generation systems, including the Reconfigurable Virtual Collective Trainer, Soldier/Squad Virtual Trainer, and Live Training Environment.


The Synthetic Training Environment is being designed to simulate not only weapons effects at all ranges, but also the feel of each weapon’s discharge, enabling warfighters to have confidence in their training and mission rehearsals on deployment, before entering combat. Combining live environment training with the Synthetic Training Environment ecosystem enables users to measure training goals against actual performance. “That’s a huge part of being able to collect that information and provide that information back to the soldier, not only objectively but also with their trainers so they have the objective and the subjective information together,” says Kevin Hellman, capabilities developer for the Synthetic Training Environment at the Army Combined Arms Center – Training (CAC-T) at Fort Leavenworth, Kan.


Army Futures Command Synthetic Training Environment Cross-Functional Team (STE CFT)

As one of eight Army Futures Command CFTs designed to accelerate and enable Army modernization priorities, the STE CFT brings together industry and academic experts, influential government partners and both seasoned and novice Army Soldiers to inform synthetic training requirements and cultivate shared capabilities.


The STE Information System, a virtual training suite being developed by the STE Enterprise, utilizes 3D imagery of terrain and high-resolution graphics to replicate the rigor and complexity of a fast-paced, multi-domain operational environment. Within this system, the CFT is able to harness its One World Terrain mapping program to import actual terrain data and visually transport the end user to anywhere in the world.

“The key is the ability to rapidly build the terrain and the operational environment into the STE and deliver it quickly to the warfighter,” said Brig. Gen. William R. Glaser, director of the STE CFT. “Leaders can then use the STE to conduct reconnaissance, war games and rehearsals.”

Layered into STE programs are artificial intelligence and machine learning processes that help accurately simulate warfighting elements, including weapons movements, enemy threats and combat stressors. The programs not only construct and replicate tough and realistic scenarios for Soldiers, but also collect detailed data on how Soldiers react under pressure, further informing training needs and operational planning methods and continually increasing training thresholds.

Importantly, the STE CFT also seeks to integrate virtual training tools into existing Army systems and live training exercises, generating a whole-of-resources approach to delivering the best training programs possible to Soldiers.

For example, the CFT is currently refining its Squad Immersive Virtual Trainer (SiVT), a mixed-reality training tool with a head-up display and cutting-edge technology that allows Soldiers to use their organic weapons for training scenarios. SiVT is delivered through the Integrated Visual Augmentation System (IVAS) and integrates simulated images with views of a Soldier’s actual surroundings.

Once SiVT is fielded with IVAS, Soldiers will use it to conduct, collectively as a squad, multiple iterations of battle drills and rehearse for combat before executing operations. Use of the SiVT for these transportive training scenarios will require minimal preparation on the part of the trainer and will also offer the ability for leaders to conduct near-immediate after action review.

In addition, each STE tool is built intentionally to be effective, efficient, easy to use and encompassing – meaning readily available to all types of Soldiers at practically any home or deployed location. This inherent flexibility and ability to deliver at scale means the future possibilities for STE are many.

“When a training capability like the STE becomes so essential that commanders demand it to support reconnaissance, wargaming, rehearsal and AAR capability while deployed, then we will have achieved our end state,” Glaser said.


Army’s synthetic training programs gearing up for important test events

Over the next fiscal year, the Synthetic Training Environment Cross-Functional Team, part of Army Futures Command, expects to run a series of major tests on its three key programs, aimed at helping the Army change how it trains for the fight of the future. These three programs are One World Terrain, the Reconfigurable Virtual Collective Trainer, and the IVAS-Squad Immersive Virtual Trainer.

One World Terrain is one of the STE-CFT’s most critical programs. The program provides soldiers with a three-dimensional terrain capability that can virtually replicate Earth’s terrain and simulate the complexities of the operational environment. OWT integrates with the Training Simulation Software and Training Management Tool (TSS/TMT) to create the Synthetic Training Environment-Information System, which allows soldiers to virtually train at their point of need.


The program is on schedule to deliver a minimum viable capabilities release (MVCR) for squads, companies and platoons in the first quarter of fiscal 2023, according to Brig. Gen. William Glaser, director of the Synthetic Training Environment Cross-Functional Team.

“The capabilities include well-formed format (WFF) 3D and derivative data products for consumption by the TSS/TMT platform,” Glaser said in an email to Breaking Defense. “The scale, resolution and structure of the data will [enable squad/company/platoon] training in Q1FY23.”



Common Synthetic Environment (CSE)

In March 2019, the U.S. Army released the service’s Common Synthetic Environment (CSE) statement of need, which outlined the Synthetic Training Environment (STE) the Army sees as its future training capability. The Synthetic Training Environment enables tough, iterative, dynamic, and realistic multi-echelon/combined arms maneuver, mission rehearsal, and mission command collective training in support of multi-domain operations, the statement reads. The training environment will provide units the repetitions necessary to accelerate individual through unit skill and collective task proficiency, resulting in achieving and sustaining training readiness. It provides complex operational environment representations anytime and anywhere in the world. The Synthetic Training Environment will deliver collective training, accessible at the Point-of-Need (PoN), in the operational, self-development, and institutional training domains.


The focus is one interconnected training capability that provides a Common Synthetic Environment that delivers a comprehensive, collective training and mission rehearsal capability, the statement continues. The Common Synthetic Environment is composed of three foundational capabilities: One World Terrain (OWT), Training Management Tool (TMT) and Training Simulation Software (TSS). The Common Synthetic Environment enables the convergence of the live, virtual, constructive and gaming environments into the Synthetic Training Environment.
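To make the relationship among the three foundational capabilities concrete, the sketch below shows one hypothetical way OWT, TMT, and TSS could compose into a Common Synthetic Environment. All class names, methods, and the training flow are illustrative assumptions, not the actual STE software design.

```python
# Hypothetical composition of the three foundational CSE capabilities.
# Class and method names are illustrative only.
from dataclasses import dataclass


@dataclass
class OneWorldTerrain:
    """Terrain service: supplies 3D terrain for an area of interest."""
    def terrain_for(self, area: str) -> str:
        return f"3D terrain tiles for {area}"


@dataclass
class TrainingManagementTool:
    """Plans and tracks the training event and its objectives."""
    def plan(self, unit: str, task: str) -> dict:
        return {"unit": unit, "task": task, "status": "scheduled"}


@dataclass
class TrainingSimulationSoftware:
    """Runs the simulation against a terrain set and a training plan."""
    def run(self, terrain: str, plan: dict) -> dict:
        # Later keys override earlier ones, so status flips to "complete".
        return {"terrain": terrain, **plan, "status": "complete"}


@dataclass
class CommonSyntheticEnvironment:
    owt: OneWorldTerrain
    tmt: TrainingManagementTool
    tss: TrainingSimulationSoftware

    def train(self, unit: str, task: str, area: str) -> dict:
        plan = self.tmt.plan(unit, task)
        terrain = self.owt.terrain_for(area)
        return self.tss.run(terrain, plan)


cse = CommonSyntheticEnvironment(OneWorldTerrain(),
                                 TrainingManagementTool(),
                                 TrainingSimulationSoftware())
result = cse.train("1st Squad", "react to contact", "NTC Fort Irwin")
```

The point of the sketch is the dependency direction: the simulation software consumes products from both the terrain service and the management tool, which is why the statement describes the CSE as the convergence layer for live, virtual, constructive, and gaming environments.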


The Common Synthetic Environment (CSE) is the unified simulation environment Units and Soldiers use for training. The CSE provides Soldiers and Units a realistic (e.g., physics-based effects) digital representation of the dynamic Operational Environment (OE) and the military capabilities in the scenario, to support collective training from squad through ASCC. Within the CSE, there are two conceptually different ways in which units in a virtual environment will need to interact with the STE: semi-immersive and immersive interfaces, described below.



Another signature program in the Synthetic Training Environment team’s portfolio is the Reconfigurable Virtual Collective Trainer (RVCT), a mobile, transportable virtual training system that will allow soldiers and Army aviators to virtually train together. In August, the RVCT-Ground systems successfully integrated more than 157 dismounted soldiers while supporting company training during an operational assessment.


The RVCT program is currently working towards a rapid fielding decision in December 2022 and if approved, then production will begin, Glaser said. Rapid fielding would be followed by continued integration with the Synthetic Training Environment-Information System as the program prepares for its initial operational test and evaluation in August 2023.


The last program, the IVAS-Squad Immersive Virtual Trainer, is a collective training system that will integrate with IVAS in the future.

Virtual Semi-Immersive User Interface and Hardware

Virtual Semi-Immersive interfaces are common ‘keyboard and mouse’ interfaces into a virtual three-dimensional (3D) representation of a training environment. While commonly referred to as ‘keyboard and mouse’, this form may include additional peripherals, such as controllers and joysticks, to enhance training, but it is typically not intended to provide a full ‘form, fit and function’ representation of training conditions. This form of low-overhead, reconfigurable training enables crews/teams through Brigade Combat Team to interact with the Common Synthetic Environment (CSE) and a digital representation of the Mission Command Information System (MCIS) interfaces and platforms for all Warfighting Functions (WfF), along with a dismounted Soldier capability.


The CSE is the unified simulation environment in which the training takes place. The interface will stimulate sight, sound, and touch modalities. Sight allows the Soldier to see the CSE (both two-dimensional [2D] overhead and 3D first/third-person views), sound allows the Soldier to hear and provide voice input into the CSE, and touch allows the Soldier to interact with the CSE. The quality of stimulation is a low-fidelity approximation of what the Soldier experiences in the live environment.

Virtual Immersive User Interface and Hardware

Virtual Immersive trainers will seek a higher level of ‘form, fit, and function’ for the training audience than the semi-immersive systems. These interfaces into the CSE replace the immersive Combined Arms Tactical Trainers (CATT) found in the Army inventory. However, unlike the large overhead of current CATT trainers, the STE will need low overhead, reconfigurable, and transportable trainers to facilitate training anytime, anywhere. To accomplish this, the STE will require the use of innovative Mixed Reality and Natural User Interface technologies to deliver the following capabilities:

- Software-centric implementation
- Capitalization on rapid advancements in commercial mixed-reality technologies
- Low sustainment and concurrency costs
- Scalable interfaces to support training, without disruption, at the PoN
- Rapid concurrency updates driven through software rather than hardware changes
- Immersive collective training experiences that support suspension of trainee disbelief
- Accurate visual and haptic system representation (e.g., sensors, weapons, survivability capabilities, communications) to prevent negative training transfer or habit formation
- Natural fields of view
- The breadth of tactical trainers supporting Ground and Air Simulation

Ground: This reconfigurable and transportable trainer enables ground platform crews/teams through Battalion Task Force to interact with the CSE and a digital representation of the MCIS interfaces and platforms for all WfFs. The immersive trainer provides a motion tracking capability and select high-fidelity physical platform controls for crew members. The interface will stimulate sight, sound, and touch modalities. Sight provides the Soldiers a natural field of view and allows the Soldiers to see the CSE from first-person perspectives, sound allows the Soldier to hear and provide voice input into the CSE, and touch allows the Soldier to use physical and tactile controls of systems, subsystems, components, and mission command information system interfaces to interact with the CSE.

Key considerations for ground immersive training include:
- Vehicle Commander: weapon system control and sensor controls.
- Driver capabilities: steer vehicle, change gear (e.g., forward, reverse), accelerate vehicle, brake vehicle, and control/view dashboard.
- Gunner (combat vehicle): weapon system control and sensor controls.
- Loader: loader’s periscope, loading main weapon systems, loader’s weapon systems, radios.
- Gunner/Air Guard (wheeled vehicle): grip, aim, fire, and reload weapon.
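One way to reason about these crew-station requirements is as a role-to-controls mapping that a trainer configuration could consume. The sketch below is a hypothetical data structure built from the considerations above; the dictionary layout and function are illustrative only, not an actual STE interface.

```python
# Illustrative mapping of ground crew stations to the controls an
# immersive trainer must replicate. Roles and controls come from the
# key considerations listed above; the structure itself is hypothetical.
CREW_STATION_CONTROLS = {
    "vehicle_commander": {"weapon_system", "sensors"},
    "driver": {"steering", "gear_selection", "accelerator", "brake", "dashboard"},
    "gunner_combat": {"weapon_system", "sensors"},
    "loader": {"periscope", "main_weapon_loading", "loader_weapons", "radios"},
    "gunner_air_guard": {"grip", "aim", "fire", "reload"},
}


def controls_for_crew(stations):
    """Union of controls a reconfigurable trainer must model for a crew manifest."""
    required = set()
    for station in stations:
        required |= CREW_STATION_CONTROLS[station]
    return required
```

A reconfigurable trainer could use such a mapping to decide which physical and tactile controls to enable for a given vehicle crew, e.g. `controls_for_crew(["driver", "loader"])`.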

Air: This reconfigurable and transportable trainer enables aviation crews/teams through Battalion Task Force to interact with the CSE and a digital representation of the MCIS interfaces and platforms for all WfFs. The immersive trainer provides a motion tracking capability and select high-fidelity physical platform controls for pilot, co-pilot, and non-rated crew members. The interface will stimulate sight, sound, and touch modalities. An accurate representation of crew sensory inputs and feedback is critical. The relatively increased danger from crew error in aviation platforms necessitates an expectation of higher fidelity in Air immersive trainers.

Flight, weapon controls, and non-crewmember controls must provide highly accurate tactile control and switch options relative to the aircraft’s digital operational flight program (OFP) capabilities and be in the correct location relative to where the crew member is standing or sitting (e.g., collective is always on the left side, cyclic between the legs), to prevent negative training and habit transfer.

Pilot/Co-Pilot capabilities include dual flight controls to allow the pilot or co-pilot/gunner to fly the aircraft safely (cyclic, collective, pedals). They also include unique weapon systems interfaces, such as the Target Acquisition and Designation Sight (TADS) Electronic Display and Control (TEDAC) for the Attack Helicopter (AH); the TEDAC for the AH-64 is only at the co-pilot/gunner position. Non-Rated Crewmember capabilities include unique weapon interfaces (i.e., door gun) for the Utility Helicopter (UH) and Cargo Helicopter (CH); a unique Intercommunications System (ICS) switch and handheld push-to-talk capability (UH, CH); and unique hoist controls (UH and CH). Unique cargo hook view space (CH) and hoist operations must provide a minimum level of tactile and visual feedback to ensure awareness of proper operations.

Unmanned Aircraft System (UAS) capabilities will include the realistic representation of unmanned systems, to include all kinetic and non-kinetic battlefield effects, as well as the appropriate affordances for user/operator interactions, in order to facilitate collective training.

Global Terrain/One World Terrain (OWT) Capability

The Global Terrain research effort is a demonstration of the global terrain capabilities needed to achieve the STE vision. This concept would ultimately include a cloud-based service that delivers a common synthetic representation of the whole Earth to include the air, land (includes subterranean), sea (includes undersea), space, and cyber domains that units will use for collective training. The STE’s Global Terrain will be delivered over the network to training audiences at home station, while deployed, and at the institution.

Global Terrain Capabilities include:

- A digital globe with all terrain available, to include full 2D, 3D, and parametric information on all the buildings/structures on the planet, including interiors and subterranean features.
- Soldier-level fidelity of terrain available on a global scale.
- Training without boundaries that allows seamless integration of physical training areas into global-scale wrap-around exercises in the virtual and constructive training domains.
- Reuse and integration of a variety of data sources, from the reuse of existing training simulation terrain, such as Synthetic Environment – Core (SE-CORE) home station databases; to the importation of the Army’s Standard Shareable Geospatial Foundation (SSGF) and the use of open-source data; to the collection and processing of organic terrain collection data, such as drone-captured photogrammetry.
- The ability to export 3D mesh-based terrain to 2D vector- and raster-based terrain systems.

The Global Terrain Capability concept delivers a geographical representation of the entire 3D world in a geo-referenced ellipsoid representation of the Earth. The goal for data fidelity is to provide sub-centimeter resolution and accuracy in terrain, to support full live-synthetic entity interaction in a ‘fair-fight’ environment. OWT will need to provide the best available terrain representation, from geo-typical to geo-specific, based on authoritative data, while making use of innovative approaches in procedural terrain generation and sensor fusion to constantly improve the quality of the available global terrain.
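A geo-referenced ellipsoid representation of the Earth rests on standard geodesy. As a minimal, self-contained illustration of the kind of transform a whole-Earth terrain engine performs constantly, the sketch below converts WGS84 geodetic coordinates (latitude, longitude, height) to Earth-centered, Earth-fixed (ECEF) Cartesian coordinates; the ellipsoid constants are the published WGS84 values, while the function itself is a generic textbook implementation, not OWT code.

```python
import math

# WGS84 ellipsoid constants
A = 6378137.0              # semi-major axis, meters
F = 1 / 298.257223563      # flattening
E2 = F * (2 - F)           # first eccentricity squared


def geodetic_to_ecef(lat_deg, lon_deg, h_m):
    """Convert WGS84 latitude/longitude/height to ECEF (x, y, z) in meters."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    # Prime vertical radius of curvature at this latitude
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)
    x = (n + h_m) * math.cos(lat) * math.cos(lon)
    y = (n + h_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - E2) + h_m) * math.sin(lat)
    return x, y, z


# A point on the equator at the prime meridian sits one semi-major
# axis from Earth's center along the x axis.
print(geodetic_to_ecef(0.0, 0.0, 0.0))  # → (6378137.0, 0.0, 0.0)
```

Working in a single geo-referenced frame like ECEF is what allows live and synthetic entities from different sources to interact on the same globe without per-exercise coordinate fixups.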


Additionally, training units will need a capability that allows runtime editing of exercise-specific environments to set the conditions needed to meet training objectives. Configuring operational variables (Political, Military, Economic, Social, Information, Infrastructure, Physical Environment, and Time [PMESII-PT]) that represent the Operational Environment (OE) enables the CSE to represent unique OE complexities, providing a realistic training experience without artificial limitations.
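The PMESII-PT variables lend themselves to a simple editable configuration. The sketch below is a hypothetical illustration of runtime scenario editing: the field names come from the doctrinal acronym, but the data structure, default values, and edit mechanism are assumptions for illustration only.

```python
# Hypothetical runtime-editable PMESII-PT operational variables.
# Field names follow the doctrinal acronym; everything else is illustrative.
from dataclasses import dataclass, replace


@dataclass(frozen=True)
class OperationalVariables:
    political: str = "stable governance"
    military: str = "near-peer threat"
    economic: str = "functioning markets"
    social: str = "mixed urban/rural population"
    information: str = "contested media environment"
    infrastructure: str = "degraded road network"
    physical_environment: str = "arid desert"
    time: str = "72-hour operation"


# Baseline scenario, then a runtime edit that changes training conditions
# without rebuilding the exercise; the frozen baseline stays intact.
baseline = OperationalVariables()
edited = replace(baseline,
                 physical_environment="dense urban terrain",
                 time="14-day operation")
```

Keeping the baseline immutable and deriving edited variants is one natural way to let a trainer reset conditions between iterations, which matches the repetition-driven training model the statement of need describes.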





About Rajesh Uppal
