Sharing others’ emotional states may facilitate understanding their intentions and actions. Here we show that networks of brain areas “tick together” in participants who are viewing similar emotional events in a movie. Participants’ brain activity was measured with functional MRI while they watched movies depicting unpleasant, neutral, and pleasant emotions. After scanning, participants watched the movies again and continuously rated their experience of pleasantness–unpleasantness (i.e., valence) and of arousal–calmness. Pearson’s correlation coefficient was used to derive multisubject voxelwise similarity measures (intersubject correlations; ISCs) of functional MRI data. Valence and arousal time series were used to predict the moment-to-moment ISCs computed using a 17-s moving average. During movie viewing, participants’ brain activity was synchronized in lower- and higher-order sensory areas and in corticolimbic emotion circuits. Negative valence was associated with increased ISC in the emotion-processing network (thalamus, ventral striatum, insula) and in the default-mode network (precuneus, temporoparietal junction, medial prefrontal cortex, posterior superior temporal sulcus). High arousal was associated with increased ISC in the somatosensory cortices and in the visual and dorsal attention networks comprising the visual cortex, bilateral intraparietal sulci, and frontal eye fields. Seed-voxel–based correlation analysis confirmed that these sets of regions constitute dissociable functional networks. We propose that negative valence synchronizes individuals’ brain areas supporting emotional sensations and understanding of another’s actions, whereas high arousal directs individuals’ attention to similar features of the environment. By enhancing the synchrony of brain activity across individuals, emotions may promote social interaction and facilitate interpersonal understanding.
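As a rough illustration of the sliding-window ISC analysis described above, the following is a minimal Python sketch, assuming voxelwise BOLD time series have already been extracted; the subject count, number of TRs, and window length in TRs are hypothetical placeholders rather than values from the study.

```python
import numpy as np

def moving_window_isc(data, win_len):
    """Moment-to-moment inter-subject correlation (ISC) for one voxel.

    data    : array (n_subjects, n_timepoints) of BOLD time series
    win_len : window length in samples (e.g., a 17-s window expressed in TRs)
    Returns an array (n_windows,) with the mean pairwise Pearson r per window.
    """
    n_subj, n_time = data.shape
    iscs = []
    for start in range(n_time - win_len + 1):
        seg = data[:, start:start + win_len]
        r = np.corrcoef(seg)                      # n_subj x n_subj correlation matrix
        upper = r[np.triu_indices(n_subj, k=1)]   # unique subject pairs only
        iscs.append(upper.mean())
    return np.array(iscs)

# Hypothetical usage: 20 subjects, 300 TRs; a 17-s window expressed in TRs
# (here assumed to be 10 TRs, which depends on the actual TR).
bold = np.random.randn(20, 300)
isc_ts = moving_window_isc(bold, win_len=10)
# The resulting ISC time course could then be regressed on the valence and
# arousal rating time series, e.g., with np.linalg.lstsq or an OLS model.
```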
• Naturalistic stimuli are powerful stimuli for neuroimaging studies.
• Temporal receptive windows and event segmentation are vital short-term memory mechanisms.
• Suspense and perspective taking disclose fundamental attentional mechanisms in the brain.
• Emotions can be classified based on distributed patterns of brain activity.
• A priori information robustly influences how the brain processes social interactions.
Using movies and narratives as naturalistic stimuli in human neuroimaging studies has yielded significant advances in the understanding of cognitive and emotional functions. The relevant literature was reviewed, with emphasis on how the use of naturalistic stimuli has helped advance scientific understanding of human memory, attention, language, emotions, and social cognition in ways that would have been difficult otherwise. These advances include the discovery of a cortical hierarchy of temporal receptive windows, which supports processing of dynamic information that accumulates over several time scales, such as immediate reactions vs. slowly emerging patterns in social interactions. Naturalistic stimuli have also helped elucidate how the hippocampus supports segmentation and memorization of events in day-to-day life, and they have afforded insights into the attentional brain mechanisms underlying our ability to adopt specific perspectives during natural viewing. Further, neuroimaging studies with naturalistic stimuli have revealed the role of the default-mode network in narrative processing and in social cognition. Finally, by robustly eliciting genuine emotions, these stimuli have helped elucidate the brain basis of both basic and social emotions, which appear to manifest as highly overlapping yet distinguishable patterns of brain activity.
Musical training is known to modify cortical organization. Here, we show that such modifications extend to subcortical sensory structures and generalize to processing of speech. Musicians had earlier and larger brainstem responses than nonmusician controls to both speech and music stimuli presented in auditory and audiovisual conditions, evident as early as 10 ms after acoustic onset. Phase-locking to stimulus periodicity, which likely underlies perception of pitch, was enhanced in musicians and strongly correlated with length of musical practice. In addition, viewing videos of speech (lip-reading) and music (instrument being played) enhanced temporal and frequency encoding in the auditory brainstem, particularly in musicians. These findings demonstrate practice-related changes in the early sensory encoding of auditory and audiovisual information.
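One common way to quantify phase-locking to stimulus periodicity in brainstem responses is the spectral amplitude of the averaged response at the stimulus fundamental frequency. The sketch below illustrates that idea on a simulated waveform; it is a generic stand-in and not necessarily the exact measure used in the study, and the sampling rate and fundamental frequency are hypothetical.

```python
import numpy as np

def f0_phase_locking(response, fs, f0):
    """Crude phase-locking index: spectral amplitude of the averaged brainstem
    response at the stimulus fundamental frequency (f0).

    response : 1-D averaged response waveform
    fs       : sampling rate in Hz
    f0       : stimulus fundamental frequency in Hz
    """
    spectrum = np.abs(np.fft.rfft(response)) / len(response)
    freqs = np.fft.rfftfreq(len(response), d=1.0 / fs)
    return spectrum[np.argmin(np.abs(freqs - f0))]

# Hypothetical example: a 200-ms averaged response sampled at 20 kHz,
# simulated as a noisy 100-Hz periodicity.
fs, f0 = 20000, 100
t = np.arange(0, 0.2, 1.0 / fs)
response = 0.5 * np.sin(2 * np.pi * f0 * t) + 0.2 * np.random.randn(t.size)
print("amplitude at f0:", f0_phase_locking(response, fs, f0))
```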
Discrete Neural Signatures of Basic Emotions. Saarimäki, Heini; Gotsopoulos, Athanasios; Jääskeläinen, Iiro P.; et al. Cerebral Cortex (New York, N.Y.: 1991), vol. 26, no. 6, June 2016. Journal article; peer-reviewed; open access.
Categorical models of emotions posit neurally and physiologically distinct human basic emotions. We tested this assumption by using multivariate pattern analysis (MVPA) to classify brain activity patterns of 6 basic emotions (disgust, fear, happiness, sadness, anger, and surprise) in 3 experiments. Emotions were induced with short movies or mental imagery during functional magnetic resonance imaging. MVPA accurately classified emotions induced by both methods, and the classification generalized from one induction condition to another and across individuals. Brain regions contributing most to the classification accuracy included medial and inferior lateral prefrontal cortices, frontal pole, precentral and postcentral gyri, precuneus, and posterior cingulate cortex. Thus, specific neural signatures across these regions hold representations of different emotional states in multimodal fashion, independently of how the emotions are induced. Similarity of subjective experiences between emotions was associated with similarity of neural patterns for the same emotions, suggesting a direct link between activity in these brain regions and the subjective emotional experience.
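A minimal sketch of across-participant MVPA of the kind described above, using scikit-learn with leave-one-subject-out cross-validation; the data shapes, the linear SVM, and the absence of any feature selection are illustrative assumptions rather than the study's exact pipeline.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical data: one activity pattern (n_voxels) per trial,
# labels are the six basic emotions, groups identify the participant.
n_trials, n_voxels = 600, 1000
X = np.random.randn(n_trials, n_voxels)          # trial-wise activity patterns
y = np.random.randint(0, 6, n_trials)            # emotion labels (0..5)
groups = np.repeat(np.arange(20), 30)            # 20 participants, 30 trials each

# Leave-one-subject-out cross-validation tests whether classification
# generalizes across individuals, as in across-participant MVPA.
clf = make_pipeline(StandardScaler(), LinearSVC(max_iter=5000))
scores = cross_val_score(clf, X, y, cv=LeaveOneGroupOut(), groups=groups)
print("mean across-participant accuracy:", scores.mean())
```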
We investigated the neural underpinnings of timbral, tonal, and rhythmic features of a naturalistic musical stimulus. Participants were scanned with functional magnetic resonance imaging (fMRI) while listening to a stimulus with a rich musical structure, a modern tango. We correlated temporal evolutions of timbral, tonal, and rhythmic features of the stimulus, extracted using acoustic feature extraction procedures, with the fMRI time series. Results corroborate those obtained with controlled stimuli in previous studies and highlight additional areas recruited during musical feature processing. While timbral feature processing was associated with activations in cognitive areas of the cerebellum, and sensory and default mode network cerebrocortical areas, musical pulse and tonality processing recruited cortical and subcortical cognitive, motor and emotion-related circuits. In sum, by combining neuroimaging, acoustic feature extraction and behavioral methods, we revealed the large-scale cognitive, motor and limbic brain circuitry dedicated to acoustic feature processing during listening to a naturalistic stimulus. In addition to these novel findings, our study has practical relevance as it provides a powerful means to localize neural processing of individual acoustical features, be it those of music, speech, or soundscapes, in ecological settings.
► Novel paradigm combines fMRI, acoustic feature extraction and behavioral psychology.
► Timbre recruits cerebellar cognitive areas, sensory and DMN-related cortical areas.
► Rhythm and tonality recruit limbic regions, cognitive and somatomotor areas.
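A minimal sketch of the feature-to-fMRI correlation approach described in the abstract above, assuming librosa for acoustic feature extraction (the study used its own feature extraction procedures) and synthetic placeholder data for both the audio and the BOLD signal.

```python
import numpy as np
import librosa                      # assumed here for illustration only
from scipy.signal import resample

# Placeholder 60-s audio signal standing in for the actual musical stimulus.
sr = 22050
t = np.linspace(0, 60, 60 * sr, endpoint=False)
y = np.sin(2 * np.pi * 220 * t)

# Extract one timbral feature, spectral centroid, frame by frame.
centroid = librosa.feature.spectral_centroid(y=y, sr=sr)[0]   # (n_frames,)

# Resample the feature time course to the fMRI sampling rate (one value per TR);
# the number of volumes here is a hypothetical placeholder.
n_trs = 30
feature_tr = resample(centroid, n_trs)

# Correlate the feature with every voxel's BOLD time series (vectorized Pearson r).
bold = np.random.randn(n_trs, 5000)            # placeholder voxel data (TRs x voxels)
f = (feature_tr - feature_tr.mean()) / feature_tr.std()
b = (bold - bold.mean(axis=0)) / bold.std(axis=0)
r_map = f @ b / n_trs                          # one correlation value per voxel
# r_map can then be thresholded and projected back onto the brain.
```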
Human behaviour is context-dependent, based on predictions and influenced by the environment and other people. We live in a dynamic world where both the social stimuli and their context are constantly changing. Similar dynamic, natural stimuli should, in the future, be increasingly used to study social brain functions, with parallel development of appropriate signal-analysis methods. Understanding dynamic neural processes also requires accurate, time-sensitive characterization of the behaviour. To go beyond the traditional stimulus–response approaches, brain activity should be recorded simultaneously from two interacting subjects to reveal why human social interaction is critically different from merely reacting to each other. This theme issue on “Attending to and neglecting people” contains original work and review papers on person perception and social interaction. The articles cover research from neuroscience, psychology, robotics, animal interaction research and microsociology. Some of the papers are co-authored by scientists who presented their own, independent views at the recent Attention and Performance XXVI conference but were brave enough to join forces with a colleague having a different background and views. In the future, information needs to converge across disciplines to provide us with a more holistic view of human behaviour, its interactive nature, and the temporal dynamics of our social world.
Neurophysiological and psychological models posit that emotions depend on connections across widespread corticolimbic circuits. While previous studies using pattern recognition on neuroimaging data have shown differences between various discrete emotions in brain activity patterns, less is known about differences in functional connectivity. Thus, we employed multivariate pattern analysis on functional magnetic resonance imaging data (i) to develop a pipeline for applying pattern recognition to functional connectivity data, and (ii) to test whether connectivity patterns differ across emotion categories. Six emotions (anger, fear, disgust, happiness, sadness, and surprise) and a neutral state were induced in 16 participants using one-minute-long emotional narratives with natural prosody while brain activity was measured with functional magnetic resonance imaging (fMRI). We computed emotion-wise connectivity matrices both for whole-brain connections and for 10 previously defined functionally connected brain subnetworks, and we trained an across-participant classifier to categorize the emotional states based on whole-brain data and on each subnetwork separately. The whole-brain classifier performed above chance level for all emotions except sadness, suggesting that different emotions are characterized by differences in large-scale connectivity patterns. When focusing on connectivity within the 10 subnetworks, classification was successful within the default mode system, and there it succeeded for all emotions. We thus show preliminary evidence for consistently different sustained functional connectivity patterns across emotion categories, particularly within the default mode system.
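A minimal sketch of the connectivity-classification pipeline outlined above: emotion-wise ROI-to-ROI correlation matrices are vectorized and classified across participants. The ROI count, block length, and linear SVM are illustrative assumptions, not the study's exact choices.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

def connectivity_features(ts):
    """Vectorize the upper triangle of an ROI x ROI correlation matrix.

    ts : array (n_timepoints, n_rois) for one emotion block of one participant.
    """
    corr = np.corrcoef(ts.T)
    iu = np.triu_indices_from(corr, k=1)
    return corr[iu]

# Hypothetical data: 16 participants x 7 states (6 emotions + neutral),
# each block containing 30 timepoints over 100 ROIs.
rng = np.random.default_rng(0)
X, y, groups = [], [], []
for subj in range(16):
    for state in range(7):
        ts = rng.standard_normal((30, 100))
        X.append(connectivity_features(ts))
        y.append(state)
        groups.append(subj)
X, y, groups = np.array(X), np.array(y), np.array(groups)

# Across-participant classification of emotional state from connectivity patterns.
scores = cross_val_score(LinearSVC(max_iter=5000), X, y,
                         cv=LeaveOneGroupOut(), groups=groups)
print("mean accuracy:", scores.mean())
```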
Seeing articulatory movements influences perception of auditory speech. This is often reflected in a shortened latency of auditory event-related potentials (ERPs) generated in the auditory cortex. The present study addressed whether this early neural correlate of audiovisual interaction is modulated by attention. We recorded ERPs in 15 subjects while they were presented with auditory, visual, and audiovisual spoken syllables. Audiovisual stimuli consisted of incongruent auditory and visual components known to elicit a McGurk effect, i.e., a visually driven alteration in the auditory speech percept. In a Dual task condition, participants were asked to identify spoken syllables whilst monitoring a rapid visual stream of pictures for targets, i.e., they had to divide their attention. In a Single task condition, participants identified the syllables without any other tasks, i.e., they were asked to ignore the pictures and focus their attention fully on the spoken syllables. The McGurk effect was weaker in the Dual task than in the Single task condition, indicating an effect of attentional load on audiovisual speech perception. Early auditory ERP components, N1 and P2, peaked earlier to audiovisual stimuli than to auditory stimuli when attention was fully focused on syllables, indicating neurophysiological audiovisual interaction. This latency decrement was reduced when attention was loaded, suggesting that attention influences early neural processing of audiovisual speech. We conclude that reduced attention weakens the interaction between vision and audition in speech.
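A simple way to compare early ERP component latencies between conditions is to locate the largest deflection within a component window. The sketch below illustrates this on simulated averaged waveforms; it is only a generic stand-in for the study's actual peak-detection procedure, and the time windows and waveforms are hypothetical.

```python
import numpy as np

def peak_latency(erp, times, tmin, tmax):
    """Latency (s) of the largest-magnitude deflection within a time window,
    a simple way to compare N1/P2 peak latencies between conditions.

    erp   : 1-D averaged ERP waveform
    times : matching time axis in seconds
    """
    mask = (times >= tmin) & (times <= tmax)
    idx = np.argmax(np.abs(erp[mask]))
    return times[mask][idx]

# Hypothetical example: compare N1 latency (80-150 ms window) between simulated
# audiovisual and auditory-only averages, the former peaking slightly earlier.
times = np.arange(-0.1, 0.5, 0.001)
av = -np.exp(-((times - 0.100) ** 2) / (2 * 0.01 ** 2))   # earlier N1
a  = -np.exp(-((times - 0.115) ** 2) / (2 * 0.01 ** 2))   # later N1
print("AV N1:", peak_latency(av, times, 0.08, 0.15),
      "A N1:", peak_latency(a, times, 0.08, 0.15))
```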
For successful communication, we need to understand the external world consistently with others. This task requires sufficiently similar cognitive schemas or psychological perspectives that act as filters to guide the selection, interpretation and storage of sensory information, perceptual objects and events. Here we show that when individuals adopt a similar psychological perspective during natural viewing, their brain activity becomes synchronized in specific brain regions. We measured brain activity with functional magnetic resonance imaging (fMRI) from 33 healthy participants who viewed a 10-min movie twice, assuming once a ‘social’ (detective) and once a ‘non-social’ (interior decorator) perspective on the movie events. Pearson's correlation coefficient was used to derive multisubject voxelwise similarity measures (inter-subject correlations; ISCs) of functional MRI data. We used k-nearest-neighbor and support vector machine classifiers as well as a Mantel test on the ISC matrices to reveal brain areas wherein ISC predicted the participants' current perspective. ISC was stronger in several brain regions (most robustly in the parahippocampal gyrus, posterior parietal cortex and lateral occipital cortex) when the participants viewed the movie with similar rather than different perspectives. Synchronization was not explained by differences in visual sampling of the movies, as estimated by eye gaze. We propose that synchronous brain activity across individuals adopting similar psychological perspectives could be an important neural mechanism supporting shared understanding of the environment.
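A minimal sketch of a Mantel-style permutation test relating a subject-by-subject ISC matrix to a model matrix coding shared versus different perspective, in the spirit of the analysis described above; the matrix construction, permutation count, and placeholder data are illustrative assumptions rather than the study's exact implementation.

```python
import numpy as np

def mantel_test(m1, m2, n_perm=5000, seed=0):
    """Simple Mantel test: permutation p-value for the correlation between the
    upper triangles of two symmetric subject-by-subject matrices (e.g., an ISC
    matrix and a model matrix coding shared vs. different viewing perspective).
    """
    rng = np.random.default_rng(seed)
    n = m1.shape[0]
    iu = np.triu_indices(n, k=1)
    observed = np.corrcoef(m1[iu], m2[iu])[0, 1]
    null = np.empty(n_perm)
    for i in range(n_perm):
        perm = rng.permutation(n)                       # relabel subjects in one matrix
        null[i] = np.corrcoef(m1[perm][:, perm][iu], m2[iu])[0, 1]
    p = (np.sum(null >= observed) + 1) / (n_perm + 1)   # one-tailed permutation p-value
    return observed, p

# Hypothetical example: 33 subjects; the model matrix is 1 for subject pairs
# who viewed the movie with the same perspective, 0 otherwise.
n = 33
isc = np.corrcoef(np.random.randn(n, 200))              # placeholder ISC matrix for one voxel
perspective = np.random.randint(0, 2, n)
model = (perspective[:, None] == perspective[None, :]).astype(float)
r, p = mantel_test(isc, model)
print(f"Mantel r = {r:.3f}, p = {p:.4f}")
```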