Alexis studies multisensory perception using psychophysics, data modeling, fMRI and MEG techniques.

In the past, neuroscience investigated perception as a modular process, in which visual perception was considered independent of audition and touch (and vice versa). However, we cannot ignore that we live in a multisensory environment, where sensory information is often correlated across modalities. Our brain is a highly flexible processing system, capable of taking advantage of these cross-modal correlations. Over the last 20 years, multisensory research has demonstrated that it is impossible to understand perception in one sensory modality without considering its interactions with the others: perception is the result of a synergistic interplay between multiple sensory modalities.
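One standard way to make this synergy concrete is reliability-weighted (maximum-likelihood) cue combination, a textbook model of multisensory integration. The sketch below is purely illustrative; the stimulus locations and noise levels are hypothetical and not drawn from any particular study.

```python
import math

# Reliability-weighted (maximum-likelihood) cue combination: a textbook
# model of multisensory integration. All values here are hypothetical.

sigma_v, sigma_a = 1.0, 3.0   # visual / auditory noise (standard deviations)
s_v, s_a = 9.5, 12.0          # location estimates from each modality alone

# Each cue is weighted by its relative reliability (inverse variance).
w_v = (1 / sigma_v**2) / (1 / sigma_v**2 + 1 / sigma_a**2)
w_a = 1 - w_v

s_combined = w_v * s_v + w_a * s_a                            # fused estimate
sigma_combined = math.sqrt(1 / (1 / sigma_v**2 + 1 / sigma_a**2))

print(f"combined estimate: {s_combined:.2f}")   # pulled toward vision (9.75)
print(f"combined noise: {sigma_combined:.2f}")  # below either cue alone (0.95)
```

The fused estimate is both biased toward the more reliable cue and less noisy than either cue alone, which is one quantitative sense in which perception across modalities is synergistic.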

I am interested in understanding how the human brain encodes, retains, and integrates information from multiple sensory modalities. More specifically, I believe that sensory modalities exchange and combine information via a predictive-coding mechanism, in which sensory predictions generated in one modality are contrasted with the sensory evidence arriving in the others. More recently, my interests have broadened to investigating how the brain maintains multisensory events in memory.
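The sketch below is a minimal, toy illustration of that idea, with hypothetical values and a simple error-driven update rule; it is not a description of any specific model.

```python
import random

# Minimal predictive-coding loop (illustrative only; all values hypothetical).
# A prediction about one modality (e.g., where a sound should be, given what
# was seen) is compared with incoming sensory evidence; the mismatch, the
# prediction error, updates the internal estimate.

random.seed(0)

true_location = 10.0   # hypothetical event location (degrees of azimuth)
estimate = 0.0         # current internal estimate of the location
learning_rate = 0.3    # how strongly prediction errors revise the estimate

for t in range(15):
    prediction = estimate                         # top-down prediction
    evidence = random.gauss(true_location, 1.0)   # noisy auditory sample
    prediction_error = evidence - prediction      # bottom-up mismatch signal
    estimate += learning_rate * prediction_error  # error-driven update

print(f"final estimate: {estimate:.2f}")  # converges toward the true location
```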

To test these hypotheses, I apply psychophysics, data modeling, fMRI, and MEG techniques.