Abstract:
How does manual steering affect our ability to process auditory information? This question has been neglected for at least two reasons. First, manual responses, with the exception of speech production, are commonly considered to rely on resources separate from those of auditory processing (Wickens, 2002). Second, safety-critical concerns mandate investigations into how auditory processing could impair manual steering, not the other way around. Nonetheless, our ability to process auditory information while steering contributes to our situational awareness and keeps us alert to anomalous events in our surroundings. Research findings often imply that it is driving itself that interferes with our ability to cognitively process information that is unrelated to driving. For example, Radeborg and colleagues (1999) found that a driving task could impair recall and semantic judgment of auditory sentences. In this light, it could be argued that automated driving would allow us to converse more coherently over a mobile phone. In my talk, I will present EEG/ERP research from my lab that investigates how manual steering reduces neural responses to auditory stimuli. We have reported that involuntary ERP amplitudes to novel sounds are generally reduced during manual steering (Scheer, Bülthoff, & Chuang, 2016). More recently, we have found that auditory ERPs are diminished only by manipulating aspects of steering that require executive function, namely the complexity of vehicle handling, and not by the external unpredictability of the tracked goal. In this regard, it might be sufficient to automate only those driving tasks that involve executive functions. These are likely to include tactical and strategic maneuvers, such as lane-changing or exit selection, as opposed to lane-keeping or headway maintenance. However, it is the latter, namely lateral and longitudinal control, that tends to be the focus of self-driving technology (Trimble, Bishop, Morgan, & Blanco, 2014).