In the trenches
Pentagon looking for augmented cognition troop trainer

Published 28 April 2010

Today’s troops need to be as cognitively ready as they are physically, if not more. They also have to spend more time on the ground in urban settings, interacting with locals and canvassing for information. The Pentagon is therefore looking for an immersive troop trainer that includes voice-recognition technology and picks up on vocal tone and facial gestures.

The Pentagon is looking to better train its troops — by scanning their minds as they play video games. Adaptive, mind-reading computer systems have been a work-in-progress among military agencies for at least a decade. Katie Drummond writes that in 2000, DARPA launched Augmented Cognition, a program that sought to develop computers that used EEG scans to adjust how they displayed information — visually, orally, or otherwise — to avoid overtaxing one realm of a troop’s cognition.

The U.S. Air Force also took up the idea, by trying to use EEGs to “assess the operator’s actual cognitive state” and “avoid cognitive bottlenecks before they occur.”

Drummond notes that zeroing in on brainpower reflects the changing tactics of modern warfare: today’s troops need to be as cognitively ready as they are physically, if not more. They also have to spend more time on the ground in urban settings, interacting with locals and canvassing for information. This is where virtual cultural trainers come in handy: troops are prepped in language, social norms, and cultural sensitivity before they even leave their base.

The trainers are quickly becoming more sophisticated. As Peter Singer notes, the Pentagon is already using “three-dimensional experiences that hit multiple senses,” including, in one case, a wearable collar that emits key odors.

Now, the Office of the Secretary of Defense (OSD) is soliciting small business proposals for an even more immersive trainer, one that includes voice-recognition technology, and picks up on vocal tone and facial gestures. The game would then react and adapt to a war-fighter’s every action. For example, if a player’s gesture “insults the local tribal leader,” the trainee would “find that future interactions with the population are more difficult and more hostile.”
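The solicitation’s example, where insulting a tribal leader sours all later encounters, amounts to a persistent reputation mechanic. A minimal sketch of that idea, with gesture names and weights invented purely for illustration (none of this comes from the OSD solicitation):

```python
# Hypothetical reputation mechanic: each player action shifts standing
# with the simulated population, and standing gates how hostile future
# interactions are. Gesture names and point values are illustrative.

GESTURE_EFFECTS = {
    "respectful_greeting": +2,
    "offer_aid": +3,
    "insult_tribal_leader": -5,   # the solicitation's example misstep
}

class PopulationState:
    def __init__(self) -> None:
        self.standing = 0  # negative -> hostile, positive -> cooperative

    def register(self, gesture: str) -> None:
        """Apply a recognized gesture's effect; unknown gestures are neutral."""
        self.standing += GESTURE_EFFECTS.get(gesture, 0)

    def hostility(self) -> str:
        """Translate accumulated standing into an interaction mood."""
        if self.standing < -3:
            return "hostile"
        if self.standing < 2:
            return "wary"
        return "cooperative"
```

A single insult here pushes standing to -5, so every subsequent interaction starts from a "hostile" footing until the trainee earns the points back.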

Most importantly, the new programs would react to the warrior’s own physiological and neurological cues, monitored via EEG, eye tracking, heart and respiration rate, and other physiological markers. Based on those metrics, the game would adapt in difficulty and “keep trainees in an optimal state of learning.”
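The closed loop described above — blend physiological readings into a workload estimate, then nudge difficulty to keep the trainee between boredom and overload — can be sketched as follows. All metric names, weights, and thresholds are assumptions for illustration, not values from the OSD solicitation:

```python
# Hypothetical adaptive-difficulty loop: estimate cognitive load from
# physiological markers, then adjust the simulation so the trainee stays
# in an "optimal state of learning." Weights and bands are guesses.

from dataclasses import dataclass

@dataclass
class Readings:
    eeg_workload: float   # normalized 0..1 load estimate from EEG
    heart_rate: float     # beats per minute
    resp_rate: float      # breaths per minute

def workload_index(r: Readings) -> float:
    """Blend markers into a single 0..1 load score (weights are invented)."""
    hr = min(max((r.heart_rate - 60) / 60, 0.0), 1.0)   # 60-120 bpm -> 0..1
    rr = min(max((r.resp_rate - 12) / 12, 0.0), 1.0)    # 12-24 /min -> 0..1
    return 0.6 * r.eeg_workload + 0.25 * hr + 0.15 * rr

def adapt_difficulty(current: float, r: Readings,
                     low: float = 0.4, high: float = 0.7,
                     step: float = 0.1) -> float:
    """Ease off when the trainee is overtaxed; push harder when bored."""
    load = workload_index(r)
    if load > high:
        current -= step   # overloaded: reduce difficulty
    elif load < low:
        current += step   # under-challenged: increase difficulty
    return min(max(current, 0.0), 1.0)
```

Called once per update tick, this keeps the trainee inside the assumed 0.4–0.7 load band; readings inside the band leave the difficulty unchanged.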

The OSD is not ready to use neuro-based systems in the war zone, but the agency does want to capitalize on advances in neuroscience that have assigned meaningful value to intuitive decision-making. As the OSD solicitation points out, troops often need to make fast-paced decisions in high-stress environments, with limited information and context. Well-reasoned, analytic decisions are rarely possible — which would make intuition, if it were reliable, an ideal tool to give American troops the upper hand.

This is where neuroscience comes in, Drummond writes. OSD wants simulated games that use EEGs to monitor the cognitive patterns of trainees, particularly at what’s thought to be the locus of neurally based, intuitive decision-making — the basal ganglia. In his seminal paper on the neuroscience of intuition, Harvard’s Matthew Lieberman notes that the ganglia can “learn temporal patterns that are predictive of events of significance, regardless of conscious intent … as long as exposure is repeatedly instantiated.”

“By using neural monitoring to supervise a trainee’s progress in their simulated world, the military could bolster the odds that snap decisions in the real-world will be based on more than just a gut feeling,” Drummond concludes.