Abstract:
Driving a car requires us to switch between tasks proficiently, for example between lane-keeping and spotting jaywalkers. Recent automation has relieved us from performing some of these tasks. Unfortunately, it does not yet absolve us of responsibility when things go wrong. We are still expected to be vigilant for instances when automation could fail and to intervene when necessary (SAE Level 2; e.g., Tesla Model S). How do we keep the human in-the-know when the human is no longer in-the-loop? In my talk, I will present eye-tracking and EEG/ERP research that addresses how task switching is performed in the context of manual steering. Following this, I will discuss the implications of this research for coordinating turn-taking between the human user and autonomous vehicles, particularly with regard to the design of in-vehicle notifications.