Electrophysiological evidence for a multisensory speech-specific mode of perception.
Authors
Abstract
We investigated whether the interpretation of auditory stimuli as speech or non-speech affects audiovisual (AV) speech integration at the neural level. Perceptually ambiguous sine-wave speech (SWS) replicas of natural speech were presented to listeners who were either in 'speech mode' or 'non-speech mode'. At the behavioral level, incongruent lipread information led to an illusory change of the sound only for listeners in speech mode. The neural correlates of this illusory change were examined in an audiovisual mismatch negativity (MMN) paradigm with SWS sounds. In an oddball sequence, 'standards' consisted of SWS /onso/ coupled with lipread /onso/, and 'deviants' consisted of SWS /onso/ coupled with lipread /omso/. The AV deviant induced a McGurk-MMN for listeners in speech mode, but not for listeners in non-speech mode. These results demonstrate that the illusory change of the sound induced by incongruent lipread information evoked an MMN, which presumably arises at a pre-attentive stage of sensory processing.
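To make the described paradigm concrete, the sketch below illustrates how such an audiovisual oddball sequence and the deviant-minus-standard difference wave (the conventional MMN measure) could be set up. This is a minimal illustration, not the authors' stimulus or analysis code: the trial count, deviant probability, no-consecutive-deviants constraint, and epoch-array shapes are assumptions for the example and are not taken from the paper.

```python
import random
import numpy as np

# Hedged sketch of the audiovisual oddball (MMN) paradigm described in the
# abstract: standards are SWS /onso/ paired with lipread /onso/, deviants are
# SWS /onso/ paired with lipread /omso/. All numeric parameters below are
# illustrative assumptions, not values reported in the paper.

STANDARD = ("SWS /onso/", "lipread /onso/")   # congruent AV pair
DEVIANT = ("SWS /onso/", "lipread /omso/")    # incongruent (McGurk-type) AV pair


def make_oddball_sequence(n_trials=500, p_deviant=0.15, seed=0):
    """Pseudorandom trial list in which deviants never occur back to back."""
    rng = random.Random(seed)
    trials, previous_was_deviant = [], True   # start with at least one standard
    for _ in range(n_trials):
        if not previous_was_deviant and rng.random() < p_deviant:
            trials.append(DEVIANT)
            previous_was_deviant = True
        else:
            trials.append(STANDARD)
            previous_was_deviant = False
    return trials


def mmn_difference_wave(standard_epochs, deviant_epochs):
    """Deviant-minus-standard difference wave.

    Both inputs are hypothetical EEG epoch arrays of shape
    (n_trials, n_channels, n_samples); the MMN is conventionally read off
    this difference wave at fronto-central electrodes.
    """
    return (np.asarray(deviant_epochs).mean(axis=0)
            - np.asarray(standard_epochs).mean(axis=0))
```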
Similar resources
The Default Mode of Primate Vocal Communication and Its Neural Correlates
It’s been argued that the integration of the visual and auditory channels during human speech perception is the default mode of speech processing (Rosenblum, 2005). That is, speech perception is not a capacity that is ‘piggybacked’ on to auditory-only speech perception. Visual information from the mouth and other parts of the face is used by all perceivers and readily integrates with auditory s...
Multisensory Vocal Communication and Its Neural Bases in Nonhuman Primates
Face-to-face communication is a temporally extended, multisensory process. In human speech, the auditory component is, for the most part, the result of vocal fold movements that release sound energy. This sound energy travels up through the oral and nasal cavities, where it is radiated out of the lips and nostrils. Changes in the internal and external shape of these cavities lead to different s...
Neural correlates of multisensory integration of ecologically valid audiovisual events
A question that has emerged over recent years is whether audiovisual (AV) speech perception is a special case of multi-sensory perception. Electrophysiological (ERP) studies have found that auditory neural activity (N1 component of the ERP) induced by speech is suppressed and speeded up when a speech sound is accompanied by concordant lip movements. In Experiment 1, we show that this AV interac...
Audio-visual speech perception is special.
In face-to-face conversation speech is perceived by ear and eye. We studied the prerequisites of audio-visual speech perception by using perceptually ambiguous sine wave replicas of natural speech as auditory stimuli. When the subjects were not aware that the auditory stimuli were speech, they showed only negligible integration of auditory and visual stimuli. When the same subjects learned to p...
Journal: Neuropsychologia
Volume 50, Issue 7
Pages: -
Publication date: 2012