Recent studies have identified between-trial priming effects in visual search tasks, but often with constraints on the possible similarities or changes across successive trials, and usually with the main emphasis on effects of target repetition. Here we sought a more thorough characterization of between-trial priming effects in speeded visual search, where observers determined target presence or absence among a set of distractors. The results show that several separable priming effects have a major influence on visual search performance. Facilitation was evident when a target was repeated between trials, but there was also strong priming from repetition of distractor types, even across successive trials on neither of which a target was presented. Search also proceeded faster when the same distractor types were repeated, even if the current target differed from the preceding target. We also investigated the possible impact of role-reversals for particular display items, from target on one trial to distractor on the next, and vice versa. Such role-reversals substantially affected search performance, over and above the effects of repetition per se, when repetition was held constant.
In temporal ventriloquism, auditory events can illusorily attract the perceived timing of a visual onset [1-3]. We investigated whether the timing of a static sound can also influence spatio-temporal processing of visual apparent motion, induced here by visual bars alternating between opposite hemifields. Perceived direction typically depends on the relative timing of the visual left-right and right-left flashes (e.g., rightwards motion dominates when left-to-right interflash intervals are shortest [4]). In our new multisensory condition, interflash intervals were equal, but auditory beeps could slightly lag the right flash but slightly lead the left flash, or vice versa. This auditory timing strongly influenced perceived visual motion direction, despite providing no spatial auditory motion signal whatsoever. Moreover, prolonged adaptation to such auditorily driven apparent motion produced a robust visual motion aftereffect in the opposite direction when measured in subsequent silence. Control experiments argued against accounts in terms of possible auditory grouping or possible attentional capture. We suggest that the motion arises because the sounds change perceived visual timing, as we separately confirmed. Our results provide a new demonstration of multisensory influences on sensory-specific perception [5], with the timing of a static sound influencing spatio-temporal processing of visual motion direction.
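The logic of this manipulation can be sketched numerically: if each beep pulls the perceived time of its flash part-way toward itself (temporal ventriloquism), physically equal interflash intervals become unequal in perception, which then biases apparent motion direction. The capture weight and all timings below are illustrative assumptions, not values from the study.

```python
# Toy model: beeps attract perceived visual timing (temporal ventriloquism),
# turning physically equal interflash intervals into unequal perceived ones.
# The capture weight of 0.5 and the flash/beep times are illustrative only.

def perceived_times(flash_times, beep_times, capture=0.5):
    """Shift each perceived flash time a fraction `capture` toward its beep."""
    return [f + capture * (b - f) for f, b in zip(flash_times, beep_times)]

# Left/right flashes alternate every 250 ms: physically ambiguous motion.
flashes = [0.0, 250.0, 500.0, 750.0]    # L, R, L, R (ms)
beeps   = [-50.0, 300.0, 450.0, 800.0]  # lead the left flashes, lag the right

p = perceived_times(flashes, beeps)
intervals = [round(b - a) for a, b in zip(p, p[1:])]
print(intervals)  # -> [300, 200, 300]: R->L intervals now shortest
```

With these hypothetical values the right-to-left perceived intervals become the shortest, so this toy account predicts leftward motion dominance, mirroring the paper's logic that sounds need only change perceived visual timing, not provide any spatial signal.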
The brain should integrate related but not unrelated information from different senses. Temporal patterning of inputs to different modalities may provide critical information about whether those inputs are related. We studied the effects of temporal correspondence between auditory and visual streams on human brain activity with functional magnetic resonance imaging (fMRI). Streams of visual flashes with irregularly jittered, arrhythmic timing could appear on the right or left, either alone or together with a stream of auditory tones that coincided perfectly with the flashes (highly unlikely by chance) or followed a different erratic, arrhythmic pattern with the same temporal statistics (noncoincident with vision); an auditory stream could also appear alone. fMRI revealed blood oxygenation level-dependent (BOLD) increases in multisensory superior temporal sulcus (mSTS), contralateral to a visual stream coincident with an auditory stream, and BOLD decreases for noncoincidence relative to unisensory baselines. Contralateral primary visual cortex and auditory cortex were also affected by audiovisual temporal correspondence or noncorrespondence, as confirmed in individual subjects. Connectivity analyses indicated enhanced influence from mSTS on primary sensory areas, rather than vice versa, during audiovisual correspondence. Thus, temporal correspondence between auditory and visual streams affects a network of both multisensory (mSTS) and sensory-specific areas in humans, including even primary visual and auditory cortex, with stronger responses for corresponding, and thus related, audiovisual inputs.
Combining information across modalities can affect sensory performance. We studied how co-occurring sounds modulate behavioral visual detection sensitivity (d'), and neural responses, for visual stimuli of higher or lower intensity. Co-occurrence of a sound enhanced human detection sensitivity for lower- but not higher-intensity visual targets. Functional magnetic resonance imaging (fMRI) linked this to boosts in activity levels in sensory-specific visual and auditory cortex, plus multisensory superior temporal sulcus (STS), specifically for a lower-intensity visual event paired with a sound. Thalamic structures in the visual and auditory pathways, the lateral and medial geniculate bodies respectively (LGB, MGB), showed a similar pattern. Subject-by-subject psychophysical benefits correlated with the corresponding fMRI signals in visual, auditory, and multisensory regions. We also analyzed differential "coupling" patterns of the LGB and MGB with other regions across the experimental conditions. Effective-connectivity analyses showed enhanced coupling of these sensory-specific thalamic bodies with the affected cortical sites during enhanced detection of lower-intensity visual events paired with sounds. Coupling strength of visual and auditory thalamus with cortical regions, including STS, covaried parametrically with the psychophysical benefit for this specific multisensory context. Our results indicate that multisensory enhancement of detection sensitivity for low-contrast visual stimuli by co-occurring sounds reflects a brain network involving not only the established multisensory STS and sensory-specific cortex but also visual and auditory thalamus.
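Detection sensitivity d' is the standard signal-detection measure used in such studies: the difference between the z-transformed hit rate and false-alarm rate. A minimal sketch, with hypothetical trial counts and one common log-linear correction for extreme rates:

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Detection sensitivity d' = z(hit rate) - z(false-alarm rate).

    A small log-linear correction keeps both rates away from 0 and 1,
    where the inverse normal CDF diverges.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# Hypothetical counts: 100 target-present and 100 target-absent trials.
print(round(d_prime(hits=70, misses=30,
                    false_alarms=10, correct_rejections=90), 2))  # -> 1.78
```

Because d' separates sensitivity from response bias, a sound that genuinely sharpens detection of a faint flash raises d', whereas a mere shift in willingness to say "present" would not.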
Recent research on multisensory perception suggests a number of general principles for crossmodal integration, and that the standard model in the field (feedforward convergence of information) must be modified to include a role for feedback projections from multimodal to unimodal brain areas.
While the frontal eye fields (FEF) are traditionally associated with eye movements, recent work indicates possible roles in controlling selective visual processing. We applied 10 Hz bursts of transcranial magnetic stimulation (TMS) over left or right human FEF while subjects performed a partial-report task that allowed quantitative estimates of top-down control and other parameters affecting visual performance. Participants selectively reported digits in a relevant color (targets) but not those in an irrelevant color (nontargets) from a brief masked display. A target could appear alone or together with an accompanying item (nontarget or target) in the same or opposite hemifield. Targets were normally identified better when presented with a nontarget than with another target, indicating prioritization of task-relevant targets and thus top-down control. We found this usual pattern of results without TMS, and also with TMS over left FEF. However, during right FEF TMS, the detrimental impact of accompanying distractors increased. Formal analysis in terms of Bundesen's (1990) theory of visual attention confirmed that right FEF TMS diminished the top-down control parameter for both hemifields, indicating an FEF role in top-down selection even for targets defined by the nonspatial property of color. Direct comparison with our previous findings for parietal TMS (Hung et al., 2005) confirmed the distinct role of FEF in top-down control, plus right-hemisphere predominance for this in humans.
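In Bundesen's theory of visual attention, top-down control is commonly summarized by a parameter α: the ratio of the attentional weight of a distractor to that of a target, with α = 0 meaning perfect selection and α = 1 meaning none. A toy sketch of why a raised α (as observed under right-FEF TMS) hurts target processing; the capacity value and α values below are purely hypothetical, not fitted parameters from the study:

```python
def processing_rates(weights, capacity=60.0):
    """Split a fixed total processing capacity (items/s) across display
    items in proportion to their attentional weights (TVA-style)."""
    total = sum(weights.values())
    return {item: capacity * w / total for item, w in weights.items()}

def display(alpha):
    # One target plus one distractor; distractor weight = alpha * target weight.
    return {"target": 1.0, "distractor": alpha}

normal = processing_rates(display(alpha=0.2))  # intact top-down control
tms    = processing_rates(display(alpha=0.8))  # control diminished (illustrative)

print(round(normal["target"], 1), round(tms["target"], 1))  # -> 50.0 33.3
```

With α closer to 1, a larger share of fixed capacity is diverted to the distractor, so the target's processing rate (and hence its report probability from a brief masked display) drops, which is the pattern the formal analysis attributed to right FEF TMS.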
Regions in human frontal cortex may have modulatory top-down influences on retinotopic visual cortex, but to date neuroimaging methods have only been able to provide indirect evidence for such functional interactions between remote but interconnected brain regions. Here we combined transcranial magnetic stimulation (TMS) with concurrent functional magnetic resonance imaging (fMRI), plus psychophysics, to show that stimulation of the right human frontal eye-field (FEF) produced a characteristic topographic pattern of activity changes in retinotopic visual areas V1-V4, with functional consequences for visual perception.
FEF TMS led to activity increases for retinotopic representations of the peripheral visual field, but to activity decreases for the central field, in areas V1-V4. These frontal influences on visual cortex occurred in a top-down manner, independently of visual input. TMS of a control site (vertex) did not elicit such visual modulations, and saccades, blinks, or pupil dilation could not account for our results. Finally, the effects of FEF TMS on activity in retinotopic visual cortex led to a behavioral prediction that we confirmed psychophysically by showing that TMS of the frontal site (again compared with vertex) enhanced perceived contrast for peripheral relative to central visual stimuli.
Our results provide causal evidence that circuits originating in the human FEF can modulate activity in retinotopic visual cortex, in a manner that differentiates the central and peripheral visual field, with functional consequences for perception. More generally, our study illustrates how the new approach of concurrent TMS-fMRI can now reveal causal interactions between remote but interconnected areas of the human brain.
Neuroimaging can address activity across the entire brain in relation to cognition, but is typically correlative rather than causal. Brain stimulation can target a local brain area causally, but without revealing the entire network affected. Combining brain stimulation with concurrent neuroimaging allows a new causal approach to how interplay between extended networks of brain regions can support cognition. Brain stimulation does not affect only the targeted local region but also activity in remote interconnected regions. These remote effects depend on cognitive factors (e.g. task-condition), revealing dynamic changes in interplay between brain areas. We illustrate this with examples from top-down modulation of visual cortex, response-competition, inter-hemispheric rivalry and motor tasks; but the new approach should be applicable to many domains of cognition.
High and low spatial frequency information in visual images is processed by distinct neural channels. Using event-related functional magnetic resonance imaging (fMRI) in humans, we show dissociable roles of such visual channels for processing faces and emotional fearful expressions. Neural responses in fusiform cortex, and effects of repeating the same face identity upon fusiform activity, were greater with intact or high-spatial-frequency face stimuli than with low-frequency faces, regardless of emotional expression. In contrast, amygdala responses to fearful expressions were greater for intact or low-frequency faces than for high-frequency faces. Activation of the pulvinar and superior colliculus by fearful expressions occurred specifically with low-frequency faces, suggesting that these subcortical pathways may provide coarse fear-related inputs to the amygdala.
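Stimuli of this kind are typically produced by filtering an image's spatial-frequency content. A minimal sketch of Gaussian low- and high-pass filtering in the Fourier domain; the cutoff and the random test image are purely illustrative, not the study's actual filter parameters:

```python
import numpy as np

def sf_filter(image, cutoff, mode="low"):
    """Gaussian low- or high-pass filter in the 2-D Fourier domain.
    `cutoff` is the spatial frequency (cycles/image) at which the
    Gaussian envelope falls to ~60% of its peak."""
    h, w = image.shape
    fy = np.fft.fftfreq(h)[:, None] * h  # cycles per image, vertical
    fx = np.fft.fftfreq(w)[None, :] * w  # cycles per image, horizontal
    radius2 = fx**2 + fy**2
    lowpass = np.exp(-radius2 / (2 * cutoff**2))
    gain = lowpass if mode == "low" else 1.0 - lowpass
    return np.real(np.fft.ifft2(np.fft.fft2(image) * gain))

rng = np.random.default_rng(0)
img = rng.standard_normal((64, 64))      # stand-in for a face image
lsf = sf_filter(img, cutoff=6, mode="low")   # coarse, blurred version
hsf = sf_filter(img, cutoff=6, mode="high")  # fine edge detail

# Complementary gains mean the two bands sum back to the original image.
print(np.allclose(lsf + hsf, img))  # -> True
```

Because the two gains sum to one at every frequency, the low- and high-pass versions carry complementary information, which is what lets such stimuli isolate the coarse (magnocellular-style) versus fine channels contrasted in the study.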
Although much traditional sensory research has studied each sensory modality in isolation, there has been a recent explosion of interest in causal interplay between different senses. Various techniques have now identified numerous multisensory convergence zones in the brain. Some convergence may arise surprisingly close to low-level sensory-specific cortex, and some direct connections may exist even between primary sensory cortices. A variety of multisensory phenomena have now been reported in which sensory-specific brain responses and perceptual judgments concerning one sense can be affected by relations with other senses. We survey recent progress in this multisensory field, foregrounding human studies against the background of invasive animal work and highlighting possible underlying mechanisms. These include rapid feedforward integration, possible thalamic influences, and/or feedback from multisensory regions to sensory-specific brain areas. Multisensory interplay is more prevalent than classic modular approaches assumed, and new methods are now available to determine the underlying circuits.