AFRL HUMAN Lab and Aptima Decode Warfighter Big Data to Improve Mission Performance
Interpreting neuro-physiological and simulation data leads to improved technologies and training
Orlando, FL & Woburn, MA, November 25, 2013 – Big data isn’t just generated by computers; it’s also generated by humans. Signals from the brains and bodies of warfighters contain a wealth of information that, if properly decoded, can inform new technologies to better support their missions.
But what do biological measures such as heart rate and eye movement reveal about the performance of an Unmanned Aerial Vehicle (UAV) operator? And how can this data be interpreted to improve training and the systems and interfaces operators rely on?
To explore and solve these challenges, Aptima is partnering with the Air Force Research Laboratory’s Human Universal Measurement and Assessment Network (HUMAN) Laboratory. In this test bed facility, operators engage in simulated UAV missions while their performance is measured and their neuro-physiological functions assessed, providing insights on how stressors, such as cognitive overload, can impact mission performance. This capability for collecting and interpreting data is envisioned to transition to UAV operations, enabling a single, better-equipped operator to control multiple UAVs, rather than requiring a team to manage one.
“It’s critical to know at what points an operator has too many simultaneous tasks or UAVs under their control before performance diminishes,” said Michael J. Paley, President of Aptima. “This requires keen insight into their state, and when and why they may be experiencing overload. By assessing brain, body, and performance data together you can understand the dynamics between the human operator, the technical systems they’re using, and the mission demands they’re facing. This isn’t a challenge limited to UAV operations. This application will more generally enable the quantified warrior throughout a wide range of critical missions like cyber security, intelligence, and small unit operations,” Paley added.
These insights are being used to augment the UAV control systems to better support the human operator. That can include, for example, modifying and improving the interfaces in these information-intensive operations, and automating tasks when the operator might encounter overload.
Big data feedback is also being used to develop ‘adaptive’ training, which can adjust a scenario to match an individual trainee’s aptitude and focus on the mission skills to be honed, such as those for UAV surveillance or targeting. The result will be a more efficient and effective training environment.
The HUMAN Lab in Action
Instrumented with sensors that monitor neural activity, heart rate, eye movement, respiration, and galvanic skin response, a UAV operator engages in a mission in the HUMAN Lab while seated at the Vigilant Spirit Control Station, a multi-UAV control system being used in realistic mission simulations to evaluate operator-interface technologies. During the exercise, Aptima’s PM Engine™ software collects raw data from the simulator and the neuro-physiological sensors. Its algorithms interpret the data, producing a real-time visual display that maps the operator’s performance and biological measures moment by moment.
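As a rough illustration of the kind of data fusion described above, a real-time workload estimate might combine normalized physiological features into a single index. This is only a sketch: PM Engine’s actual features, weights, and thresholds are not public, so the baselines and coefficients below are hypothetical.

```python
# Illustrative only: a toy fusion of physiological features into a
# 0-1 workload index. The feature baselines, weights, and overload
# threshold are assumptions, not Aptima's PM Engine algorithm.

def _clamp01(x):
    """Clamp a value into the [0, 1] range."""
    return min(max(x, 0.0), 1.0)

def workload_index(heart_rate_bpm, pupil_diameter_mm, eeg_theta_alpha_ratio):
    """Combine normalized features into a 0-1 workload estimate."""
    # Normalize each feature against assumed resting/peak baselines.
    hr = _clamp01((heart_rate_bpm - 60.0) / 60.0)       # 60-120 bpm
    pupil = _clamp01((pupil_diameter_mm - 2.0) / 4.0)   # 2-6 mm
    eeg = _clamp01(eeg_theta_alpha_ratio / 2.0)         # ratio 0-2
    # Weighted average; weights are illustrative, not calibrated.
    return 0.3 * hr + 0.3 * pupil + 0.4 * eeg

def is_overloaded(index, threshold=0.8):
    """Flag when the fused index crosses an assumed overload threshold."""
    return index >= threshold
```

In practice, a system like this would recompute the index on each sensor sample and drive a moment-by-moment display, triggering interface changes or task automation when the overload flag fires.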
“EEG and heart rate can be good correlates of cognitive states for many people, and can help identify when an operator is experiencing an overload or a lull. These are insights you don’t get from simple observation, or from only reviewing simulator data after the fact,” added Kevin Durkee, Aptima’s Lead for Human System Performance Assessment. “In both training and operational settings, it’s ideal for the human to be in the ‘sweet spot’ of engagement. Information that is too complex or too plentiful prevents effective learning and performance; too little engagement, and attention and learning proficiency suffer as well.”
“This new generation of neuro-physiological measures establishes a new gold standard for evaluating and optimizing the fit between humans and complex technical systems,” said Scott Galster, Director of the AFRL HUMAN Lab. “This ‘sense-assess-augment’ model has far-reaching applications, not just to improve UAV training and operations, but other socio-technical systems, such as those on submarines or in nuclear power plants.”
WHERE: Aptima I/ITSEC Booth #707
WHAT: “Sense, Assess, Augment” Demonstration of Vigilant Spirit Control Station and PM Engine Software Measuring Brain, Body and Performance Data of UAV Operators
WHEN: 2-5 December 2013, Orange County Convention Center, Orlando, FL
WHO: AFRL HUMAN Laboratory and Aptima, Inc.