Uta Noppeney

"See what you hear - Constructing a representation of the world from vision and audition"


To form a coherent percept of the environment, the brain needs to integrate sensory signals that come from a common source and segregate those that come from different sources. Human observers have been shown to integrate sensory signals in line with Bayesian Causal Inference, taking into account the uncertainty about the world's causal structure. Over the past decade, evidence has accumulated that multisensory integration is not deferred to later processing in association cortices but starts already in primary, putatively unisensory, areas. Given this multitude of multisensory integration sites, characterizing their functional similarities and differences is of critical importance. Our research demonstrates that multisensory integration emerges in a functional hierarchy, with temporal coincidence detection in primary sensory areas, informational integration in association areas, and decisional interactions in prefrontal areas. Audiovisual interactions in low-level sensory areas are mediated by multiple mechanisms, including feedforward thalamocortical connections, direct connections between sensory areas, and top-down influences from higher-order association areas. Combining Bayesian modelling and multivariate decoding, we demonstrate that the brain integrates sensory signals in line with Bayesian Causal Inference by simultaneously encoding multiple perceptual estimates along the cortical hierarchy. Critically, only at the top of the hierarchy, in the anterior intraparietal sulcus, is the uncertainty about the world's causal structure taken into account, and sensory signals are combined weighted by their sensory reliability and task relevance, as predicted by Bayesian Causal Inference.
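To make the Bayesian Causal Inference scheme mentioned in the abstract concrete, the sketch below illustrates audiovisual localisation in the style of standard causal-inference models with Gaussian likelihoods and a Gaussian spatial prior (as in Körding et al., 2007). All parameter values, variable names, and the model-averaging readout are illustrative assumptions for this sketch, not details taken from the talk.

```python
import numpy as np

def bci_estimate(x_a, x_v, sigma_a=2.0, sigma_v=1.0, sigma_p=10.0, p_common=0.5):
    """Illustrative Bayesian Causal Inference for audiovisual localisation.

    x_a, x_v  : noisy auditory and visual location measurements
    sigma_a/v : sensory noise (reliability = 1 / sigma**2); values are assumptions
    sigma_p   : width of a zero-mean Gaussian spatial prior
    p_common  : prior probability that both signals share one cause
    """
    va, vv, vp = sigma_a**2, sigma_v**2, sigma_p**2

    # Likelihood of the measurements under a common cause (C = 1),
    # obtained by integrating out the shared source location.
    denom1 = va * vv + va * vp + vv * vp
    like_c1 = np.exp(-0.5 * ((x_a - x_v)**2 * vp + x_a**2 * vv + x_v**2 * va) / denom1) \
              / (2 * np.pi * np.sqrt(denom1))

    # Likelihood under independent causes (C = 2): each signal explained separately.
    like_c2 = np.exp(-0.5 * (x_a**2 / (va + vp) + x_v**2 / (vv + vp))) \
              / (2 * np.pi * np.sqrt((va + vp) * (vv + vp)))

    # Posterior probability of a common cause (the causal-structure uncertainty).
    post_c1 = like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))

    # Reliability-weighted fusion if the signals share a cause ...
    s_fused = (x_a / va + x_v / vv) / (1 / va + 1 / vv + 1 / vp)
    # ... and segregated estimates if they do not.
    s_a_seg = (x_a / va) / (1 / va + 1 / vp)
    s_v_seg = (x_v / vv) / (1 / vv + 1 / vp)

    # Final estimates: average over the two causal structures, weighted by
    # their posterior probability (model averaging, one possible readout).
    s_a = post_c1 * s_fused + (1 - post_c1) * s_a_seg
    s_v = post_c1 * s_fused + (1 - post_c1) * s_v_seg
    return post_c1, s_a, s_v

# Example: a small audiovisual conflict is mostly fused; a large one is segregated.
print(bci_estimate(x_a=3.0, x_v=1.0))    # high P(common cause), estimates pulled together
print(bci_estimate(x_a=15.0, x_v=1.0))   # low  P(common cause), estimates stay apart
```

The reliability-weighted combination in the common-cause branch corresponds to the kind of computation the abstract attributes to the top of the cortical hierarchy, where causal-structure uncertainty and sensory reliability jointly determine the final perceptual estimate.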

Time & Location

Apr 20, 2015 | 04:00 PM - 06:00 PM

KL 32/202