Free keywords:
-
Abstract:
Automated systems are often designed to reduce "cognitive workload" in their users. What does this mean? In my talk, I present research demonstrating how theory and psychophysiological methods (i.e., eye tracking, heart rate, and EEG/ERP) can lend better definition to the cognitive mechanisms that human-machine interfaces support. Three examples will be provided from the task domains of (1) instrumented cockpit displays, (2) auditory cues for task management, and (3) manual assembly instructions. The first study shows that overt attention planning across single-sensor-single-instrument displays is compromised by high anxiety and working memory load, suggesting a need for glass cockpit systems that adapt to user states [1]. The second study uses ERP methods to re-investigate whether auditory notifications originally designed to support and cue task management should necessarily favor verbal commands over iconic sounds [2]. Contrary to the designers' original findings, we found that verbal commands were more likely to capture early attention, while auditory icons resulted in stronger working memory updating. The third study demonstrates how we used EEG methods to understand the cognitive benefits of an in situ display system that provides manual assembly instructions in real time [3]. Specifically, we showed that it reduces working memory load in its users. Altogether, these examples illustrate the diversity of "cognitive workload" across scenarios. While "cognitive workload" is an intuitive concept, I will argue that it is not a useful one for designing automated systems, especially as we strive to automate cognitive processes intended to replace or augment our own.