Abstract:
Models of perceptual decision making based on dynamic stimuli, such as random dot motion, are predominantly concerned with how evidence for a stimulus is accumulated over time (e.g., Wang, 2008; Beck, 2008). However, it remains unclear how the brain derives this evidence from the sensory dynamics. While it is conceivable that simple feature-detecting neurons can, for example, directly signal evidence for motion in a specific direction, it is less clear how evidence for complex motion, such as human movement, is computed from sensory input. We present a model of the lower-level perceptual system that is based on probabilistic inference for dynamical systems (Friston, 2008) and can provide input to higher-level decision-making systems. We illustrate this mechanism using a random dot motion paradigm, where we (i) consider simple uni-directional motion as typically used in neuroscience experiments and (ii) show that the same system can also infer, i.e., recognize, complex dot motion generated by human movement (cf. point-light walkers) in an online fashion. The present model is implemented by a neuronal network and computes stable percepts rapidly, thereby enabling both fast decision (reaction) times and high accuracy. We suggest that combining the present model with recent models of evidence accumulation in perceptual decision making may allow neurobiologically plausible decision-making strategies to be applied to real-world stimuli such as movements generated by humans.
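To make the division of labor described above concrete, the following is a minimal illustrative sketch, not the paper's neuronal-network implementation: a Kalman filter stands in for the lower-level perceptual system, inferring a hidden dot position under two competing motion hypotheses, and the accumulated log-likelihood ratio between the hypotheses stands in for the decision variable of an evidence-accumulation model. All parameter values, the linear-Gaussian dynamics, and the two-hypothesis setup are arbitrary assumptions for the demo.

```python
import numpy as np

# Hypothetical demo parameters (not from the paper).
rng = np.random.default_rng(0)
dt, n_steps = 0.01, 300
true_drift = 0.5  # rightward motion generates the data

# Noisy 1-D "dot position" observations under rightward motion.
x = np.cumsum(true_drift * dt + 0.02 * rng.standard_normal(n_steps))
y = x + 0.1 * rng.standard_normal(n_steps)

def predictive_loglik(y, drift, dt, q=0.02**2, r=0.1**2):
    """Kalman filter assuming a fixed drift; returns the summed
    one-step-ahead predictive log-likelihood of the observations.
    q, r: process and observation noise variances (assumed known)."""
    m, p, ll = 0.0, 1.0, 0.0
    for obs in y:
        m, p = m + drift * dt, p + q                  # predict
        s = p + r                                     # innovation variance
        ll += -0.5 * (np.log(2 * np.pi * s) + (obs - m) ** 2 / s)
        k = p / s                                     # Kalman gain
        m, p = m + k * (obs - m), (1 - k) * p         # update
    return ll

# Evidence accumulation: log-likelihood ratio of "right" vs "left" motion,
# i.e., the kind of signal a drift-diffusion-style accumulator integrates.
evidence = predictive_loglik(y, +0.5, dt) - predictive_loglik(y, -0.5, dt)
decision = "right" if evidence > 0 else "left"
```

The point of the sketch is the interface: the perceptual stage performs online inference on the stimulus dynamics and emits a momentary evidence signal, while the decision stage merely integrates that signal to a bound; richer generative models (e.g., of human movement) would replace the linear filter without changing the accumulator.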