In today’s military landscape, personnel face the challenge of performing increasingly complex tasks while interacting with sophisticated machines and platforms. To address this, DARPA launched the Perceptually-enabled Task Guidance (PTG) program in 2021, aimed at developing advanced AI technologies to provide real-time guidance and support to military personnel. By leveraging cutting-edge technologies like machine learning, computer vision, and augmented reality, PTG aims to create intuitive, context-aware, and personalized guidance systems for complex tasks.
Today, military personnel are expected to perform a growing number of complex tasks while interacting with increasingly sophisticated machines and platforms. Mechanics, for example, are asked to repair a wider variety of equipment, and medics are asked to perform an expanded number of procedures over extended periods of time.
DARPA launched the Perceptually-enabled Task Guidance (PTG) program in 2021 to explore new ways of providing guidance and support to military personnel who perform complex tasks in challenging environments.
Enhancing Military Task Performance:
The primary goal of the PTG program is to empower mechanics, medics, and other specialists to perform tasks within and beyond their skillsets by providing just-in-time feedback and step-by-step instructions for physical tasks. Using AI that perceives the environment, reasons about physical tasks, and models the user, PTG seeks to make soldiers more versatile by increasing their knowledge and expanding their skillsets, and more proficient by reducing their errors and enhancing their productivity. The program focuses on leveraging cutting-edge technologies such as machine learning, computer vision, and augmented reality to create guidance systems that are more intuitive, context-aware, and personalized, helping personnel perform tasks more effectively and with greater safety.
“In the not too distant future, you can envision military personnel having a number of sensors on them at any given time – a microphone, a head-mounted camera – and displays like augmented reality (AR) headsets,” said Dr. Bruce Draper, a program manager in DARPA’s Information Innovation Office (I2O). “These sensor platforms generate tons of data around what the user is seeing and hearing, while AR headsets provide feedback mechanisms to display and share information or instructions. What we need in the middle is an assistant that can recognize what you are doing as you start a task, has the prerequisite know-how to accomplish that task, can provide step-by-step guidance, and can alert you to any mistakes you’re making.”
Artificial intelligence (AI)-enabled assistants have the potential to aid users as they work to expand their skillsets and increase their productivity. However, today’s virtual assistants are not designed to provide advanced levels of individual support or real-time knowledge sharing.
Under the PTG program, DARPA will explore the development of methods, techniques, and technologies for AI assistants capable of helping users perform complex physical tasks.
“Increasingly we seek to develop technologies that make AI a true, collaborative partner with humans,” said Draper. “Developing virtual assistants that can provide substantial aid to human users as they complete tasks will require advances across a number of machine learning and AI technology focus areas, including knowledge acquisition and reasoning.”
The development of AI-enabled agents is not new territory for DARPA. In addition to investing in the advancement of AI technology for more than 50 years, DARPA funded the creation of the technologies that underlie today’s virtual assistants, such as Siri. In the early 2000s, DARPA launched the Personalized Assistant that Learns (PAL) program. Under PAL, researchers created cognitive computing systems to make military decision-making more efficient and more effective at multiple levels of command.
The goal is to develop virtual “task guidance” assistants that can provide just-in-time visual and audio feedback to help human users expand their skillsets and minimize their errors.
PTG is not interested in supporting the development of new sensors, computing hardware, or augmented reality headsets; small and potentially wearable devices are already available on the commercial market.
The Perceptually-enabled Task Guidance (PTG) program aims to address some of the key challenges associated with providing guidance and support to military personnel who perform complex tasks in challenging environments. Some of these challenges include:
- Cognitive overload: In high-stress, high-risk environments, military personnel can experience cognitive overload, making it difficult to process and respond to information effectively. The PTG program aims to address this by developing guidance systems that adapt to the cognitive and perceptual abilities of individual users: by analyzing a user’s capabilities, the system can tailor the guidance it provides to that user’s needs and preferences.
- Limited situational awareness: Military personnel often operate in environments where situational awareness is limited, making it difficult to navigate, communicate, and make decisions. The PTG program aims to address this by developing guidance systems that provide real-time updates on the environment and the task at hand, helping users make more informed decisions. Such systems should be highly interactive, monitoring user performance in real time and adjusting the guidance they provide based on user actions and feedback.
- Limited communication: In some environments, military personnel may have limited communication capabilities, making it difficult to receive guidance and support. The PTG program aims to address this issue by developing guidance systems that can operate in low-bandwidth or disconnected environments, ensuring that users can still receive guidance and support even in challenging conditions.
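The disconnected-operation challenge above can be made concrete with a small sketch. The following is purely illustrative, assuming a simple local-cache fallback strategy; the class and function names are invented for this article and are not drawn from any actual PTG system:

```python
# Illustrative sketch: a guidance client that degrades gracefully when the
# network link drops. All names here are hypothetical.

class GuidanceClient:
    def __init__(self, fetch_remote):
        self.fetch_remote = fetch_remote  # callable that may raise ConnectionError
        self.cache = {}                   # task_id -> instructions, synced beforehand

    def get_guidance(self, task_id):
        try:
            instructions = self.fetch_remote(task_id)  # prefer fresh guidance
            self.cache[task_id] = instructions          # refresh the local cache
            return instructions
        except ConnectionError:
            # Disconnected: fall back to whatever was cached before going offline.
            return self.cache.get(task_id, "No cached guidance available.")

def flaky_server(task_id):
    # Simulates a severed link in an austere environment.
    raise ConnectionError("no link")

client = GuidanceClient(flaky_server)
client.cache["tourniquet"] = "Apply 2-3 inches above the wound."
print(client.get_guidance("tourniquet"))  # falls back to the cached instructions
```

The design choice sketched here, pre-syncing task material so the assistant remains useful offline, is one plausible way to meet the low-bandwidth requirement described above.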
To accomplish its objectives, PTG is divided into two primary research areas. The first is focused on fundamental research into addressing a set of interconnected problems: knowledge transfer, perceptual grounding, perceptual attention, and user modeling. The second is focused on integrated demonstrations of those fundamental research outputs on militarily-relevant use case scenarios. Specifically, the program will explore how the task guidance assistants could aid in mechanical repair, battlefield medicine, and/or pilot guidance.
PTG technology will exploit recent advances in deep learning for video and speech analysis, automated reasoning for task and/or plan monitoring, and augmented reality for human-computer interfaces. However, these technologies by themselves are insufficient. To create task guidance assistants, PTG is looking for novel approaches to integrated technologies that address four key (and interconnected) problems:
- Knowledge Transfer. Assistants need to automatically acquire task knowledge from instructions intended for humans, with an emphasis on checklists, illustrated manuals, and training videos;
- Perceptual Grounding. Assistants need to align their perceptual inputs – the objects, settings, actions, sounds, and words they recognize – with the terms they use to describe and model tasks, so that observations (percepts) can be mapped to task knowledge (concepts);
- Perceptual Attention. Assistants must pay attention to percepts that are relevant to current tasks, while ignoring extraneous stimuli. Assistants must also respond to unexpected, but salient, events that may alter a user’s goals or suggest a new task; and,
- User Modeling. PTG assistants must be able to determine how much information to share with a user and when to do so. This requires developing and integrating an epistemic model of what the user knows, a physical model of what the user is doing, and a model of their attentional and emotional states.
Because these four problems are not independent, the PTG program does not divide task guidance technology into four separate research areas; instead, it expects integrated approaches and solutions that collectively take on all four challenges. To give just one example, there is a strong interaction between knowledge transfer and perceptual grounding: if knowledge transfer translates instructions into a small predetermined library of terms, perceptual grounding becomes easy, whereas if knowledge transfer adopts whatever terms appear in a manual, perceptual grounding is challenging.
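To make the interplay between the four problem areas concrete, they can be sketched as stages of a single guidance loop. The sketch below is purely illustrative; every class, field, and function name is invented for this article and does not describe any PTG performer's system:

```python
# Illustrative sketch of how the four PTG problem areas might interlock
# in one guidance loop. All names here are hypothetical.

from dataclasses import dataclass, field

@dataclass
class TaskStep:
    instruction: str        # step text acquired from a manual (knowledge transfer)
    expected_percepts: set  # concepts the assistant should observe (perceptual grounding)

@dataclass
class UserModel:
    known_steps: set = field(default_factory=set)  # epistemic model of what the user knows

def guidance_loop(steps, observe, user):
    """Walk the user through a task, step by step."""
    for i, step in enumerate(steps):
        # User modeling: give detailed guidance only for unfamiliar steps.
        if step.instruction not in user.known_steps:
            print(f"Step {i + 1}: {step.instruction}")
        # Perceptual attention: poll only for percepts relevant to this step.
        seen = observe(step.expected_percepts)
        # Perceptual grounding: compare observed concepts against task knowledge.
        missing = step.expected_percepts - seen
        if missing:
            print(f"  Check: expected to see {sorted(missing)}")
        else:
            user.known_steps.add(step.instruction)

# Toy usage with a scripted "sensor" standing in for real perception.
steps = [
    TaskStep("Remove the access panel", {"panel", "screwdriver"}),
    TaskStep("Disconnect the battery", {"battery", "cable"}),
]
observe = lambda relevant: relevant & {"panel", "screwdriver", "battery"}
user = UserModel()
guidance_loop(steps, observe, user)
```

Even this toy version shows the coupling the program highlights: the vocabulary chosen during knowledge transfer (the `expected_percepts` terms) directly determines how hard the grounding comparison is.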
AMIGOS, a new DARPA-funded program, aims to use AI and augmented reality to build real-time training manuals.
In December 2021, Xerox’s research division PARC, in collaboration with the University of California Santa Barbara, the University of Rostock, and augmented reality company Patched Reality, was awarded a $5.8 million contract by DARPA, the Pentagon’s blue-sky projects wing. The goal is to build a system that can guide users through complex operations beyond their existing knowledge, such as letting a mechanic repair a machine they’ve never seen before.
“Augmented reality, computer vision, language processing, dialogue processing and reasoning are all AI technologies that have disrupted a variety of industries individually but never in such a coordinated and synergistic fashion,” Charles Ortiz, the principal investigator for AMIGOS, said in a Xerox release. “By leveraging existing instructional materials to create new AR guidance, the AMIGOS project stands to accelerate this movement, making real-time task guidance and feedback available on-demand.”
AMIGOS falls under the broader goals of DARPA’s Perceptually-enabled Task Guidance research, which operates with the understanding that humans, with finite capacity, cannot possibly learn everything about every physical task they’ll be asked to perform before they are sent into the field. And yet, those humans will have to treat novel injuries, or repair unfamiliar machines, almost by definition of military work.
For now, the program will work on creating two components for AMIGOS. The first, as Xerox describes it, is an offline tool that ingests text from manuals and other instructional material, such as videos, and prepares the step-by-step guide needed for a task. The second is an online component that uses AI to adapt those ingested instructions into a real-time instructional guide, generating an updated guide for the user at the speed of interaction. It is, at once, an exercise in learning and a teaching tool.
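The offline/online split can be sketched in a few lines. This is a minimal illustration of the division of labor described above, not AMIGOS itself; the function names and the intermediate JSON format are assumptions made for this article:

```python
# Illustrative sketch of the offline/online split described for AMIGOS.
# All names and data formats here are invented.

import json

def ingest_manual(manual_text):
    """Offline component: parse instructional text into an ordered step list."""
    steps = [line.strip() for line in manual_text.splitlines() if line.strip()]
    return json.dumps({"steps": steps})  # the prepared, ingested representation

def next_instruction(ingested, completed_count):
    """Online component: serve the next step based on the user's progress."""
    steps = json.loads(ingested)["steps"]
    if completed_count >= len(steps):
        return "Task complete."
    return steps[completed_count]

manual = """Drain the hydraulic fluid
Replace the filter
Refill and check for leaks"""

ingested = ingest_manual(manual)        # done ahead of time, offline
print(next_instruction(ingested, 0))    # prints "Drain the hydraulic fluid"
print(next_instruction(ingested, 2))    # prints "Refill and check for leaks"
```

The point of the split is that the expensive ingestion work happens once, ahead of time, while the online component only has to select and adapt instructions at interaction speed.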
Since then, DARPA has awarded PTG contracts and grants to researchers from NYU, Northrop Grumman, and more.
The PTG performer teams, with subcontractors in parentheses:
- Kitware (Columbia University; University of California, Berkeley; University of Texas at Austin)
- PARC (University of California, Santa Barbara; University of Rostock)
- Northeastern University (University of California, Santa Barbara; Stony Brook University)
- New York University
- University of Texas at Dallas (University of California, Irvine; University of Florida)
- Stevens Institute of Technology (Purdue University; University of Michigan; University of Rochester)
- University of Florida (Northeastern University; Topos Institute; Texas A&M University; University of Arizona)
- Raytheon Technologies (Valkyries Austere Medical Solutions)
- Northrop Grumman (University of Central Florida)
- Red Shred (Third Insight)
- MIT Lincoln Laboratory
Impressive Progress and Future Outlook:
Researchers participating in the PTG program recently came together at MIT’s Hacker Reactor to demonstrate the first versions of their AI assistants. For this first demo, the teams weren’t expected to show how their AI assistants could help fly helicopters; instead, they demonstrated how the systems could help someone follow a cooking recipe.
“[Cooking is] visually quite complex,” said Draper. “There’s specialized terminology. There are specialized devices, and there’s a lot of different ways it can be accomplished, so it’s a really good practice domain for all kinds of other highly skilled tasks.”
These early demonstrations have shown promising results. The PTG program is still in the research and development phase, and it will likely take several years before the technology is fully developed and deployed in the field, but it has already generated significant interest and attracted leading researchers in human-computer interaction and artificial intelligence. Beyond mechanical repair and battlefield medicine, the technology could benefit a wide range of military applications, including search and rescue, reconnaissance, and mission planning, significantly improving the safety and effectiveness of military personnel who perform complex tasks in challenging environments.