IHMC researchers Anil Raj and Sergey Drakunov work to improve interactivity and situation awareness between individuals and the technological systems with which they interact. Their methods use integrated multisensory, multimodal, and neural interfaces both to help people understand the behavior and state of a device or system and to enable the automation in those systems to dynamically optimize the assistance it provides. The resulting augmented solutions can improve human-machine team performance on both simple and complex tasks. The team has developed augmented displays to support aerospace and motorsports applications, dismounted soldiers, diving, teleoperated robots, control of swarms of unmanned aerial vehicles (UAVs), and sensorimotor assistive devices for individuals with impairments of vision, balance, or hearing, or with musculoskeletal weakness (using powered and passive wearable exoskeletons). Testing has confirmed that these systems decrease cognitive workload and training time while improving task performance.