PLAYA VISTA, Calif. — Imagine future American Warfighters in the midst of a mission, leveraging technology to maintain a new level of situational awareness. This may be possible thanks to a new suite of software tools that taps into what a Soldier or Sailor sees and feels.
U.S. Army researchers developed a suite of tools under a decade-long research program that focused on how brain function and eye tracking can be used to predict situational awareness.
Researchers developed software that exploits gaze and physiological data to provide real-time estimates of human situational awareness, drawing on a systematic collection of measurements via what they call the lab streaming layer, or LSL. This data-collection ecosystem addresses the analytic difficulty of combining information from different types of sensors.
It also synchronizes physiological data from a suite of sensors that monitor eye movements, breathing patterns, and other physiological responses during experiments designed to mimic realistic mission events.
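The core idea of synchronizing streams that arrive at different rates can be pictured with a simplified sketch. This is not the actual LSL API or TACK code; the sensor names, sampling rates, and values below are hypothetical, and the alignment shown (nearest shared timestamp) is just one common approach.

```python
# Simplified illustration of timestamp-based alignment across two sensor
# streams, in the spirit of what a lab-streaming-layer setup provides.
# Stream names, rates, and values are hypothetical examples.
from bisect import bisect_left

def align_streams(reference, other):
    """For each (timestamp, value) sample in `reference`, attach the
    `other`-stream sample whose timestamp is closest."""
    other_ts = [t for t, _ in other]
    aligned = []
    for t, value in reference:
        i = bisect_left(other_ts, t)
        # Pick the nearer of the two neighboring samples in `other`.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(other)]
        j = min(candidates, key=lambda k: abs(other_ts[k] - t))
        aligned.append((t, value, other[j][1]))
    return aligned

# Hypothetical streams: an eye tracker near 60 Hz, respiration near 10 Hz.
gaze = [(0.000, "target_A"), (0.016, "target_A"), (0.033, "target_B")]
breath = [(0.00, 0.42), (0.10, 0.45)]
print(align_streams(gaze, breath))
```

In practice a system like LSL timestamps each sample at the source, so downstream analysis can perform exactly this kind of cross-sensor alignment regardless of each device's native rate.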
Researchers use the software to quantify, predict, and enhance squad-level shared situational awareness with Tactical Awareness via Collective Knowledge, or TACK.
“When we use TACK software tools, we can know exactly when and what someone looked at, along with the physiological changes happening concurrently, including pupil size and signals from heart, brain, and many other sensors,” said Dr. Russell Cohen Hoffing, a research scientist supporting TACK, who works at the U.S. Army Combat Capabilities Development Command, known as DEVCOM, Army Research Laboratory’s western regional site in California.
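One way to picture quantifying squad-level shared situational awareness is a shared-attention score: the fraction of synchronized time samples in which every squad member's gaze rests on the same target. This is a hypothetical illustration of the concept, not TACK's actual metric or data.

```python
# Hypothetical sketch of a squad-level shared-attention score: the
# fraction of time samples in which all members' gaze is on the same
# target. Illustrative only; not TACK's actual computation.
def shared_attention(gaze_by_member):
    """gaze_by_member: one equal-length gaze-target sequence per squad
    member, sampled at the same synchronized instants."""
    samples = list(zip(*gaze_by_member))
    if not samples:
        return 0.0
    agreed = sum(1 for targets in samples if len(set(targets)) == 1)
    return agreed / len(samples)

# Three squad members, four synchronized gaze samples (hypothetical).
squad = [
    ["door", "door", "window", "door"],
    ["door", "door", "window", "vehicle"],
    ["door", "window", "window", "door"],
]
print(shared_attention(squad))  # 2 of 4 samples agree -> 0.5
```

A real metric would likely weight partial overlap and fuse physiological signals as well, but even this toy score shows why the cross-sensor synchronization described above is a prerequisite: gaze sequences can only be compared across members if their samples share a common timeline.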
Cohen Hoffing said he extensively relies on TACK tools and LSL to do data collection and analysis. He’s bringing together DEVCOM ARL colleagues with researchers from the U.S. Army Aeromedical Research Laboratory and the Naval Research Laboratory to find synergy and collaborate on experiments for multidomain operations.