Visualization & Human-Centered Displays

A human-centered approach to visualization amplifies and extends human perceptual, cognitive, and performance capabilities by casting information into a form that exploits the strengths of the human visual system's information processing.

New visual displays being developed by IHMC researchers Larry Bunch, Tom Eskridge, Jeff Bradshaw, David Still, Matt Johnson, and Michael Vignati draw on current vision and neuroscience research to reinvent the presentation of information by harnessing the sophisticated processing power of the human visual system. These displays present data in a form the operator can understand at a glance. They also tap the full spectrum of the user's vision, including the periphery, so that most of the needed information can be taken in without fixating on each element individually. As a result, the user can absorb and understand more information with less cognitive effort.
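
One well-established finding behind this idea, shown here as a minimal sketch rather than IHMC's display code, is that visual acuity falls off roughly linearly with eccentricity, so a symbol pushed into peripheral vision must be enlarged to stay legible. The function name `peripheral_scale` and the default `e2` value are illustrative assumptions; the doubling eccentricity varies by task.

```python
def peripheral_scale(eccentricity_deg: float, e2: float = 2.0) -> float:
    """Return the size multiplier needed to keep a display symbol
    legible at a given visual eccentricity (degrees from fixation).

    Uses the common rule of thumb from vision research that the
    required stimulus size grows roughly linearly with eccentricity;
    e2 is the eccentricity at which the required size doubles
    (illustrative, task-dependent value).
    """
    return 1.0 + eccentricity_deg / e2

# A symbol placed 20 degrees into the periphery should be drawn about
# 11x larger than the same symbol at fixation to remain equally legible.
print(peripheral_scale(20.0))  # 11.0
```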

Reducing the cognitive effort required to maintain operator situational awareness is critical in many real-time domains. For example, fast-moving aircraft such as airliners and military jets demand considerable cognitive effort from the pilot, especially during uncommon or emergency situations. David Still at IHMC is working with leading defense contractors to transition OZ technology into the military cockpit, enabling superior flying performance with greater capacity for handling secondary tasks.

Much of the information critical to flying is available by looking out the window. Interpreting the visual cues involved in hovering a helicopter, however, requires many hours of flight time, and conventional instruments do not present hover information in a readily usable form. Without these visual cues, novice pilots struggle to maintain a hover, and even seasoned pilots can become disoriented. The helicopter cockpit display developed at IHMC allows even novice pilots to maintain a hover in simulation with no external visual cues.
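
One way such a display can work, sketched minimally below under stated assumptions (the function `hover_drift_cue` and its parameters are hypothetical, not the actual IHMC design), is to map the helicopter's horizontal drift velocity onto the offset of an on-screen indicator, so any departure from a stable hover is visible at a glance.

```python
import math

def hover_drift_cue(vx: float, vy: float, max_speed: float = 2.0):
    """Map horizontal drift velocity (m/s, body frame) into the
    normalized screen offset of a hover indicator.

    A centered indicator means a stable hover; its displacement shows
    the direction and rate of drift directly, so the pilot can null
    the drift without integrating several separate instruments.
    """
    speed = math.hypot(vx, vy)
    if speed == 0.0:
        return (0.0, 0.0)
    scale = min(speed, max_speed) / max_speed  # clamp to the display edge
    return (scale * vx / speed, scale * vy / speed)

# Example: drifting 0.5 m/s right and 0.2 m/s forward puts the cue
# about a quarter of the way toward the display edge.
print(hover_drift_cue(0.5, 0.2))  # (0.25, 0.1)
```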

We have extended the visualization principles from the OZ project with the Coactive Design method to develop innovative interfaces that enable effective human-machine interaction. For example, in support of cyber security, Jeffrey Bradshaw, Larry Bunch, and Michael Vignati have developed a Network Observatory that supports sensemaking through observability, predictability, and directability, as illustrated in the sketch below.
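
The three Coactive Design requirements can be made concrete as a software interface. The sketch below is a hypothetical illustration of the pattern, not the Network Observatory's actual API; the class and method names are assumptions for this example.

```python
from typing import Callable, Dict

class NetworkAgentView:
    """Hypothetical sketch of the three Coactive Design requirements
    (observability, predictability, directability) as an interface."""

    def __init__(self, name: str, status: Dict[str, int]):
        self.name = name
        self._status = status
        self._policy: Callable[[Dict[str, int]], str] = lambda s: "monitor"

    def observe(self) -> Dict[str, int]:
        # Observability: expose the pertinent internal state of the
        # automated component so the human can see what it is doing.
        return dict(self._status)

    def predict(self) -> str:
        # Predictability: report the action the component will take
        # next, so the human can anticipate its behavior.
        return self._policy(self._status)

    def direct(self, policy: Callable[[Dict[str, int]], str]) -> None:
        # Directability: let the human redirect the component's
        # behavior at runtime.
        self._policy = policy

# Usage: observe current state, check the planned action, then redirect.
agent = NetworkAgentView("sensor-7", {"alerts": 3})
print(agent.observe())   # {'alerts': 3}
print(agent.predict())   # monitor
agent.direct(lambda s: "quarantine" if s["alerts"] > 2 else "monitor")
print(agent.predict())   # quarantine
```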

This approach has also been employed by Matthew Johnson in an unmanned aerial vehicle (UAV) interface designed to support human-UAV team navigation in cluttered environments, as well as in an unmanned ground vehicle (UGV) interface that enables more effective obstacle avoidance during teleoperation.
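
A common way a teleoperation interface supports obstacle avoidance is to overlay the vehicle's predicted path on the operator's video feed. The sketch below uses a standard kinematic bicycle model for that projection; this is an assumed technique chosen for illustration, not necessarily the one used in IHMC's UGV interface, and all names and parameter values are hypothetical.

```python
import math

def predicted_path(speed: float, steering_angle: float,
                   wheelbase: float = 1.2, horizon: float = 3.0,
                   steps: int = 12):
    """Project the vehicle's position over the next few seconds from
    the current speed (m/s) and steering angle (rad), using a kinematic
    bicycle model. Drawing these points over the camera feed shows the
    teleoperator whether the current command clears nearby obstacles."""
    x, y, heading = 0.0, 0.0, 0.0
    dt = horizon / steps
    points = []
    for _ in range(steps):
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
        heading += (speed / wheelbase) * math.tan(steering_angle) * dt
        points.append((round(x, 2), round(y, 2)))
    return points

# Example: 1.5 m/s with a 0.3 rad steering angle curves the path left.
print(predicted_path(1.5, 0.3))
```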

Our approach also played an important role in IHMC's success in the DARPA Robotics Challenge, enabling successful completion of all eight challenge tasks: driving, egress, opening doors, turning valves, using power tools to cut through walls, walking over rough terrain, climbing stairs, and tackling unknown challenges.