In a visual–tactile interference paradigm, subjects judged whether tactile vibrations arose on a finger or thumb (upper vs. lower locations), while ignoring distant visual distractor lights that also appeared in upper or lower locations. Incongruent visual distractors (e.g. a lower light combined with upper touch) disrupt such tactile judgements, particularly when appearing near the tactile stimulus (e.g. on the same side of space as the stimulated hand). Here we show that actively wielding tools can change this pattern of crossmodal interference. When such tools were held in crossed positions (connecting the left hand to the right visual field, and vice versa), the spatial constraints on crossmodal interference reversed, so that visual distractors in the opposite visual field now disrupted tactile judgements most for a particular hand. This phenomenon depended on active tool-use, developing with increased experience in using the tool. We relate these results to recent physiological and neuropsychological findings.
The way in which humans represent their own bodies is critical in guiding their interactions with the environment. To achieve successful body–space interactions, the body representation is tightly connected with that of the space immediately surrounding it through efficient visuo-tactile crossmodal integration. Such a body–space integrated representation is not fixed, but can be dynamically modulated by the use of external tools. Our study aims to explore the effect of using a complex tool, namely a functional prosthesis, on crossmodal visuo-tactile spatial interactions in healthy participants. Using the crossmodal visuo-tactile congruency paradigm, we found that prolonged training with a mechanical hand capable of distal hand movements and providing sensory feedback induces a pattern of interference between visual stimuli close to the prosthesis and touches on the body, a pattern not observed after brief training. These results suggest that after extensive, but not short, training the functional prosthesis acquires a visuo-tactile crossmodal representation akin to that of real limbs. This finding adds to previous evidence for the embodiment of functional prostheses in amputees, and shows that their use may also improve the crossmodal combination of somatosensory feedback delivered by the prosthesis with visual stimuli in the space around it, thus effectively augmenting the patients' visuomotor abilities.
•A functional prosthesis can be incorporated in healthy subjects' body representation.
•After training, visual and tactile events are spatially remapped to the new location.
•This representation arises after extensive training, but not after brief training.
•Slow-paced processes are responsible for complex changes in body representation.
•Training with sensory feedback may promote embodiment of prostheses in amputees.
How to reduce pain is a fundamental clinical and experimental question. Acute pain is a complex experience which seems to emerge from the co-activation of two main processes, namely the nociceptive/discriminative analysis and the affective/cognitive evaluation of the painful stimulus. Recently, it has been found that pain threshold increases following the visual magnification of the body part targeted by the painful stimulation. This finding is compatible with the well-known notion that body representation and perceptual experience rely on complex, multisensory factors. However, the level of cognitive processing and the physiological mechanisms underlying this analgesic effect are still to be investigated. In the present work we found that, following the visual magnification of a body part, the skin conductance response (SCR) to an approaching painful stimulus increases before contact and decreases following the real stimulation, compared to the non-distorted view of the hand. By contrast, an unspecific SCR increase is found when the hand is visually shrunk. Moreover, a reduction of subjective pain experience was found specifically for the magnified hand in explicit pain ratings. These findings suggest that the visual increase of body size enhances the cognitive, anticipatory component of pain processing; such an anticipatory reaction reduces the response to the subsequent contact with the noxious stimulus. The present results support the idea that cognitive aspects of pain experience rely on the multisensory representation of the body, and could be usefully exploited to induce a significant reduction of subjective pain experience.
In order to determine precisely the location of a tactile stimulus presented to the hand it is necessary to know not only which part of the body has been stimulated, but also where that part of the body lies in space. This involves the multisensory integration of visual, tactile, proprioceptive, and even auditory cues regarding limb position. In recent years, researchers have become increasingly interested in the question of how these various sensory cues are weighted and integrated in order to enable people to localize tactile stimuli, as well as to give rise to the 'felt' position of our limbs, and ultimately the multisensory representation of 3-D peripersonal space. We highlight recent research on this topic using the crossmodal congruency task, in which participants make speeded elevation discrimination responses to vibrotactile targets presented to the thumb or index finger, while simultaneously trying to ignore irrelevant visual distractors presented from either the same (i.e., congruent) or a different (i.e., incongruent) elevation. Crossmodal congruency effects (calculated as performance on incongruent − congruent trials) are greatest when visual and vibrotactile stimuli are presented from the same azimuthal location, thus providing an index of common position across different sensory modalities. The crossmodal congruency task has been used to investigate a number of questions related to the representation of space in both normal participants and brain-damaged patients. In this review, we detail the major findings from this research, and highlight areas of convergence with other cognitive neuroscience disciplines.
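As a concrete illustration of the measure just described, the crossmodal congruency effect (CCE) is simply the difference between mean performance on incongruent and congruent trials. The sketch below computes it from reaction times; the function name and the trial data are invented for demonstration and do not come from any of the studies reviewed here.

```python
# Minimal sketch of computing a crossmodal congruency effect (CCE).
# A positive CCE (slower responses on incongruent trials) indexes
# crossmodal interference from the visual distractors.

def crossmodal_congruency_effect(trials):
    """Mean RT on incongruent trials minus mean RT on congruent trials.

    `trials` is a list of (reaction_time_ms, condition) pairs, where
    condition is "congruent" or "incongruent".
    """
    congruent = [rt for rt, cond in trials if cond == "congruent"]
    incongruent = [rt for rt, cond in trials if cond == "incongruent"]
    return sum(incongruent) / len(incongruent) - sum(congruent) / len(congruent)

# Invented example data: vibrotactile elevation judgements with visual distractors.
trials = [
    (420, "congruent"), (435, "congruent"), (410, "congruent"),
    (480, "incongruent"), (495, "incongruent"), (470, "incongruent"),
]
print(crossmodal_congruency_effect(trials))  # larger values = stronger interference
```

The same subtraction can be applied to error rates instead of reaction times; reviews using this task typically report both, or a combined inverse-efficiency score.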
Tools for the body (schema)
Maravita, Angelo; Iriki, Atsushi
Trends in Cognitive Sciences, 02/2004, Volume 8, Issue 2
Journal Article, Peer reviewed
What happens in our brain when we use a tool to reach for a distant object? Recent neurophysiological, psychological and neuropsychological research suggests that this extended motor capability is followed by changes in specific neural networks that hold an updated map of body shape and posture (the putative ‘Body Schema’ of classical neurology). These changes are compatible with the notion of the inclusion of tools in the ‘Body Schema’, as if our own effector (e.g. the hand) were elongated to the tip of the tool. In this review we present empirical support for this intriguing idea from both single-neuron recordings in the monkey brain and behavioural performance of normal and brain-damaged humans. These relatively simple neural and behavioural aspects of tool-use shed light on more complex evolutionary and cognitive aspects of body representation and multisensory space coding for action.
Sensory attenuation (SA), the dampened perception of self-generated sensory information, is typically associated with reduced event-related potential signals, such as for the N1 component of auditory event-related potentials. SA, together with efficient monitoring of intentions and actions, should facilitate the distinction between self-generated and externally generated sensory events, thereby optimizing interaction with the world. According to many, SA is deficient in schizophrenia. The question arises whether altered SA reflects a sufficient mechanism to explain positive symptoms such as auditory hallucinations. A systematic association of reduced auditory SA in hallucinating patients would support this hypothesis.
We conducted a series of meta-analyses on 15 studies on auditory SA in which the N1 component of event-related potential–electroencephalogram signals was measured during talking (self-generated sensory signals condition) or when listening to prerecorded vocalizations (externally generated sensory signals condition).
We found that individuals with schizophrenia did show some auditory SA because their N1 signal was significantly attenuated in talking conditions compared with listening conditions. However, the magnitude of such attenuation was reduced in individuals with schizophrenia compared with healthy control participants. This phenomenon generalizes independently of the stage of the disease, the severity of positive symptoms, and whether patients have auditory hallucinations or not.
These findings suggest that reduced SA cannot be a sufficient mechanism for explaining positive symptoms such as auditory hallucinations in schizophrenia. Because reduced SA was also present in participants at risk of schizophrenia, reduced SA may represent a risk factor for the disorder. We discuss the implications of these results for clinical-cognitive models of schizophrenia.
The sense of agency is the experience of being the author of self-generated actions and their outcomes. Both clinical manifestations and experimental evidence suggest that the agency experience and the mechanisms underlying agency attribution may be dysfunctional in schizophrenia. Yet, studies investigating the sense of agency in these patients show seemingly conflicting results: some indicated under-attribution of self-agency (coherently with certain positive symptoms), while others suggested over-attribution of self-agency. In this review, we assess whether recent theoretical frameworks can reconcile these divergent results. We examine whether the identification of agency abnormalities in schizophrenia might depend on the measure of self-agency considered (depending on the specific task requirements) and the available agency-related cues. We conclude that all these aspects are relevant to predict and characterize the type of agency misattribution that schizophrenia patients might show. We argue that one particular model, based on the predictive coding theory, can reconcile the interpretation of the multifarious phenomenology of agency manifestations in schizophrenia, paving the way for testing agency disorders in novel ways.
•Literature shows both over- and under-attribution of self-agency in schizophrenia.
•Patients’ agency seems to depend on the measure used and available agency cues.
•A recent predictive coding model is the most effective in reconciling mixed results.
In mirror reflections, visual stimuli in near peripersonal space (e.g., an object in the hand) can project the retinal image of far, extrapersonal stimuli "beyond" the mirror. We studied the interaction of such visual reflections with tactile stimuli in a cross-modal congruency task. We found that visual distractors produce stronger interference on tactile judgments when placed close to the stimulated hand, but observed indirectly as distant mirror reflections, than when directly observed in equivalently distant far space, even when in contact with a dummy hand or someone else's hand in the far location. The stronger visual–tactile interference for the mirror condition implies that near stimuli seen as distant reflections in a mirror view of one's own hands can activate neural networks coding peripersonal space, because these visual stimuli are coded as having a true source near the body.
•Fingers have preferential associations with relative spatial positions.
•Finger-space associations are found in tactile tasks with and without vision.
•Finger-space associations are identifiable at the conceptual level.
•The thumb is associated with bottom and the index finger is associated with top.
•The relationship between body and space mapping is bi-directional.
The representation of the body in the brain is constantly updated to allow optimal sensorimotor interactions with the external world. In addition to dynamic features, body representation holds stable features that are still largely unknown. In the present work we explored the hypothesis that body parts have preferential associations with relative spatial locations. Specifically, in three experiments, we found consistent preferential associations between the index finger and the top position, and between the thumb and the bottom position. This association was found in a tactile sensory discrimination task, which was conducted both with and without vision, as well as at the implicit conceptual association level. These findings show that body parts and spatial locations are stably associated. Therefore, not only are body segments dynamically mapped in space for perception and action, but they also hold intrinsic spatial information that contributes to somatosensory spatial processing.
The role of arm posture in the Uznadze haptic aftereffect is investigated: two identical test stimuli (i.e., spheres, TS) clenched simultaneously appear haptically different in size after the hands have been adapted to two spheres (adapting stimuli, AS) differing in size: the hand adapted to a small AS feels the TS bigger than the hand adapted to a big AS. In two experiments, participants evaluated the haptic impressions of two TS after adaptation by finding their match on a visual scale. In Experiment 1, all tasks were carried out with arms either uncrossed or crossed. In Experiment 2, only the matching task was performed with arms either uncrossed or crossed, while adaptation was conducted by continuously changing arm posture from uncrossed to crossed and vice versa. The illusion occurred irrespective of arm posture; however, its magnitude was smaller when adaptation was carried out in the classical condition of uncrossed arms. Results are discussed in light of two functional mechanisms: low-level somatotopic mapping (i.e., stimuli conformation) and high-level factors (i.e., arm posture) that could modulate haptic perception.
Public Significance Statement: Does proprioceptive information regarding the position of the hands affect haptic size processing? This study shows that the position of the hands in external space does not affect the direction of the Uznadze haptic illusion, but it can influence its magnitude. Hence, the work provides further understanding regarding the role of the egocentric spatial location of body parts (i.e., arms) in the construction of haptic experiences of size through bimanual processing. Furthermore, the results presented support the use of the crossmodal method employed by Daneyko et al. (2020) for measuring the magnitude of the Uznadze illusion under different types of manipulation.