Applications of tactile
multi-sensory displays in dynamic environments
In aviation and other dynamic environments, reliance on single-sensory-channel (visual) interfaces unnecessarily leads to mishaps due to loss of situation awareness. Endsley defined situation awareness (SA) as "the perception of elements in the environment within a volume of time and space, the comprehension of their meaning and the projection of their status in the near future." Unfortunately, current interfaces designed to enhance SA have resulted mainly from improving the ergonomics of single-channel displays. Recently, multiple resource theory, described by Wickens and others, has led researchers to develop multi-sensory displays that improve ergonomics by distributing different types of information streams across multiple human sensory-cognitive pathways. While the concept of using tactile displays for information transduction is not new, advances in technology have enabled systems robust enough for real-world operations. Miniaturized, efficient tactile transducers, computers, and sensing devices are employed in novel interfaces that exploit tactile somatosensory perception to improve SA and, thereby, safety in applications for aircraft, diving, surgery, and many other human-in-the-loop systems. Novel control systems currently under development integrate management of the tactile, visual, and audio interfaces to optimize understanding and decision making under high stress and workload, approaching truly human-centered interfacing.