Search results for: audiovisual distraction
Number of results: 14,265
Orienting responses to audiovisual events in the environment can benefit markedly by the integration of visual and auditory spatial information. However, logically, audiovisual integration would only be considered successful for stimuli that are spatially and temporally aligned, as these would be emitted by a single object in space-time. As humans do not have prior knowledge about whether novel...
BACKGROUND A prevailing view is that audiovisual integration requires temporally coincident signals. However, a recent study failed to find any evidence for audiovisual integration in visual search even when using synchronized audiovisual events. An important question is what information is critical to observe audiovisual integration. METHODOLOGY/PRINCIPAL FINDINGS Here we demonstrate that te...
Audiovisual resources are expanding rapidly. Large volumes of such complex material can lead to serious problems in resource description, representation, and organization. Supported by the National Social Science Foundation of China, the project entitled Innovative Study on Audiovisual Metadata and its Retrieval has been conducted since 2002. During the preparation phase, th...
Intermodal binding between affective information that is seen as well as heard triggers a mandatory process of audiovisual integration. In order to track the time course of this audiovisual binding, event-related brain potentials were recorded while subjects saw a facial expression and concurrently heard an auditory fragment. The results suggest that the combination of the two inputs is early in tim...
Both auditory and audiovisual speech synthesis have been the subject of many research projects over the years. Unfortunately, in recent years very little research has focused on synthesis for the Dutch language. For audiovisual synthesis in particular, hardly any available system or resource can be found. In this paper we describe the creation of a new extensive Dutch speech database, containi...
This study examined fMRI activation when perceivers either passively observed or observed and imitated matched or mismatched audiovisual ("McGurk") speech stimuli. Greater activation was observed in the inferior frontal gyrus (IFG) overall for imitation than for perception of audiovisual speech and for imitation of the McGurk-type mismatched stimuli than matched audiovisual stimuli. This unique...
In this study we investigate previous claims that a region in the left posterior superior temporal sulcus (pSTS) is more activated by audiovisual than unimodal processing. First, we compare audiovisual to visual-visual and auditory-auditory conceptual matching using auditory or visual object names that are paired with pictures of objects or their environmental sounds. Second, we compare congrue...
Seeing articulatory movements influences perception of auditory speech. This is often reflected in a shortened latency of auditory event-related potentials (ERPs) generated in the auditory cortex. The present study addressed whether this early neural correlate of audiovisual interaction is modulated by attention. We recorded ERPs in 15 subjects while they were presented with auditory, visual, a...
Speech perception engages both auditory and visual modalities. Limitations of traditional accuracy-only approaches in the investigation of audiovisual speech perception have motivated the use of new methodologies. In an audiovisual speech identification task, we utilized capacity (Townsend and Nozawa, 1995), a dynamic measure of efficiency, to quantify audiovisual integration. Capacity was used...