Meeting Abstract

Manual steering diminishes brain responses to unrelated auditory processing

MPS-Authors

Chuang, L
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society

Scheer, M
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society

Citation

Chuang, L., & Scheer, M. (2017). Manual steering diminishes brain responses to unrelated auditory processing. In 2. Kongress Fachgruppe Verkehrspsychologie: Immer mehr Technik - von Smartphone zu Automaten (p. 5).


Cite as: https://hdl.handle.net/21.11116/0000-0000-C60E-2
Abstract
How does manual steering affect our ability to process auditory information? This question has been neglected for at least two reasons. First, manual responses, with the exception of speech production, are commonly considered to rely on resources separate from auditory processing (Wickens, 2002). Second, safety-critical concerns mandate investigations into how auditory processing could impair manual steering, and not the other way around. Nonetheless, our ability to process auditory information while steering contributes to our situational awareness and our ability to remain alert to anomalous events in our surroundings. Research findings often imply that it is driving itself that interferes with our ability to cognitively process information unrelated to driving. For example, Radeborg and colleagues (1999) found that a driving task could impair recall and semantic judgment of auditory sentences. In this light, it could be argued that automated driving would allow us to converse more coherently over the mobile phone. In my talk, I will present EEG/ERP research from my lab that investigates how manual steering reduces neural responses to the processing of auditory stimuli. We have reported that involuntary ERP amplitudes to novel sounds are generally reduced during manual steering (Scheer, Bülthoff, & Chuang, 2016). More recently, we have found that auditory ERPs are diminished only by manipulating aspects of steering that require executive function, namely the complexity of vehicle handling rather than the external unpredictability of the tracked goal. In this regard, it might be sufficient to automate only those driving tasks that involve executive functions. These are likely to include tactical and strategic maneuvers, such as lane-changing or exit selection, as opposed to lane-keeping or headway maintenance. However, it is the latter, namely lateral and longitudinal control, that tends to be the focus of self-driving technology (Trimble, Bishop, Morgan, & Blanco, 2014).