Billion-dollar, decade-long initiatives in the U.S. and Europe to map and simulate the entire human brain will change information technology fundamentally, and aerospace is unlikely to remain untouched. Advances in neurotechnology are already having an impact, as methods of monitoring the brain are applied to improving the performance of pilots, air traffic controllers and system operators.
“We are already seeing promising results from initial studies,” says Santosh Mathan, principal scientist at Honeywell Labs in Seattle. “A lot of our work focuses on neural sensing—sensing brain activity—with the aim of improving human performance.
“We are in this line of research because our technology is used in challenging task contexts—systems that support soldiers, or pilots in advanced flight decks,” he says. “Computers are being adopted in unconventional settings, but humans always remain a crucial component, and there are many vulnerabilities of humans that can cause the whole system to fail.”
Areas of concern include information overload. “You can overwhelm a person with processing so much information that they are unable to perform the task,” Mathan says. Another is attention. “Are we creating systems that allow our users to stay engaged and remain a critical part of the system, or are they outside the loop and contributing to the system failing?”
Designing systems without considering human limitations can have several consequences, he says. For operators these include higher training costs and loss of efficiency and safety. For manufacturers they include higher certification and support costs.
Tools now used to make sure system designs have a low impact on users tend to involve behavioral observation, Mathan says—putting people in a realistic task context, observing their performance and making an inference about how effective the system is. This is time-consuming, requires domain experts and can be costly.
“We use subjective ratings a lot. Pilots use the system and provide a questionnaire response, but there are all kinds of biases related to retrospection, sensitivities about what you disclose, and these subjective issues get in the way,” he says. “So we are interested in tools that are objective, automated, fine-grained and can give us insight into the cognitive state of the user as they interact with the systems we design.”
Research shows brain activity can be a source of this information, Mathan says. One example is functional magnetic resonance imaging (fMRI) of individuals performing low- and high-difficulty tasks, which shows that many more regions of the brain are active during a difficult task. “When performing a task that is familiar and well-practiced, the regions active are just those necessary to perform the motor aspects of the job. It’s all automated. But the moment it is unfamiliar or more difficult, there is a lot more reasoning happening,” he says. But clinical imaging equipment used for this research is impractical for system development, so work has centered on obtaining brain-activity information with more practical sensors. “Our efforts have focused on using EEG [electroencephalography] technology as the basis for making inferences about cognitive state,” Mathan says.
As currents flow through the billions of neurons in the brain, they set up electrical fields, and the associated voltages can be detected at the surface of the scalp. “You can sense those minor voltage fluctuations and make some inferences about what’s going on inside the brain,” he says.
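The article does not describe how Honeywell turns these voltage fluctuations into an estimate of cognitive state. Purely as a rough illustration, the sketch below computes a workload proxy widely used in EEG research: the ratio of frontal theta-band power to parietal alpha-band power, which tends to rise as mental demand increases. The sampling rate, electrode sites, and the theta/alpha ratio itself are assumptions for the example, not details of Mathan's work.

```python
# Minimal sketch of an EEG workload proxy, assuming band-power features
# (frontal theta rising, parietal alpha falling under load). This is NOT
# Honeywell's classifier, which the article does not describe.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz

def band_power(signal, fs, low, high):
    """Average power in a frequency band, estimated with Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= low) & (freqs <= high)
    return np.trapz(psd[mask], freqs[mask])

def workload_index(frontal_eeg, parietal_eeg, fs=FS):
    """Crude workload proxy: frontal theta (4-8 Hz) over parietal alpha (8-12 Hz)."""
    theta = band_power(frontal_eeg, fs, 4, 8)
    alpha = band_power(parietal_eeg, fs, 8, 12)
    return theta / alpha  # higher values suggest higher mental workload

# Synthetic 10-s signals standing in for scalp recordings.
t = np.arange(0, 10, 1 / FS)
rng = np.random.default_rng(0)
frontal = np.sin(2 * np.pi * 6 * t) + 0.5 * rng.standard_normal(t.size)
parietal = 0.5 * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
print(f"Workload index: {workload_index(frontal, parietal):.2f}")
```

In practice, research systems of the kind Mathan describes typically train machine-learning classifiers on many such spectral features across electrodes, calibrated to each user, rather than relying on a single ratio.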
Ten years ago, a lab system resembled a swim cap with many electrodes and wires, making it difficult for the test subject to move. “We are beginning to see and use systems that are much more practical,” Mathan says. A wireless EEG system from Advanced Brain Monitoring (ABM), for example, has the circuitry integrated into thin plastic strips and fits under a helmet.