Our brains respond rapidly to human faces and can differentiate between many identities, retrieving rich semantic and emotional knowledge. Studies provide a mixed picture of how such information affects event-related potentials (ERPs). We systematically examined the effect of feature-based attention on ERP modulations to briefly presented faces of individuals associated with a crime. The tasks required participants (N = 40 adults) to discriminate the orientation of lines overlaid onto the face, the age of the face, or emotional information associated with the face. Negative faces amplified the N170 ERP component during all tasks, whereas the early posterior negativity (EPN) and late positive potential (LPP) components were increased only when the emotional information was attended to. These findings suggest that during early configural analyses (N170), evaluative information potentiates face processing regardless of feature-based attention. During intermediate, only partially resource-dependent, processing stages (EPN) and late stages of elaborate stimulus processing (LPP), attention to the acquired emotional information is necessary for amplified processing of negatively evaluated faces.
• The study of neural correlates of consciousness (NCC) has focused mostly on vision, while audition has been largely overlooked.
• We conducted an inattentional deafness study addressing the prevalent confound of conscious perception with reporting it.
• Auditory awareness is associated with an anterior early negativity, while later effects reflect the task relevance of stimuli.
In recent years, several ERP components have been identified as potential neural correlates of consciousness (NCC), including early negativities and late positivities. Based on experiments in the visual modality, it has recently been shown that awareness is often confounded with reporting it, possibly overestimating the NCC. It is unknown whether similar constraints also exist in the auditory modality. In order to address this gap, we presented spoken words in a sustained inattentional deafness paradigm. Electrophysiological responses were obtained in three physically identical experimental conditions that differed only with respect to the participants’ instructions. Participants were either left uninformed or informed about the presence of spoken words while confronted with an auditory distractor task (U/I condition), informed about the words while exposed to the same task as before (I condition), or requested to respond to the now task-relevant speech stimuli (TR condition). After completion of the U/I condition, only informed participants reported awareness of the words. In ERPs, awareness of words in the U/I and I condition was accompanied by an anterior auditory awareness negativity (AAN). Only when stimuli were task-relevant, i.e., during the TR condition, late positivities emerged. Taken together, these results indicate that early negativities but not late positivities index awareness across sensory modalities. Thus, they provide evidence for a recurrent processing framework, which highlights the importance of early sensory processing in conscious perception.
Abstract
In neuroscientific studies, the naturalness of face presentation differs: a third of published studies use close-up full-coloured faces, a third use close-up grey-scaled faces, and another third employ cutout grey-scaled faces. Whether and how these methodological choices affect emotion-sensitive components of the event-related brain potentials (ERPs) is as yet unclear. Therefore, this pre-registered study examined ERP modulations to close-up full-coloured and grey-scaled faces as well as cutout fearful and neutral facial expressions, while attention was directed to no-face oddballs. Results revealed no interaction of face naturalness and emotion for any ERP component, but large main effects for both factors. Specifically, fearful faces and decreasing face naturalness elicited substantially enlarged N170 and early posterior negativity amplitudes, and lower face naturalness also resulted in a larger P1. This pattern reversed for the LPP, whose amplitude increased linearly with increasing naturalness. The absence of an interaction between emotion and face naturalness suggests that the two are decoded in parallel at these early stages. Researchers interested in strong modulations of early components should use cutout grey-scaled faces, while those interested in a pronounced late positivity should use close-up coloured faces.
Emotional facial expressions lead to modulations of early event-related potentials (ERPs). However, it has so far remained unclear to what extent these modulations represent face-specific effects rather than differences in low-level visual features, and to what extent they depend on available processing resources. To examine these questions, we conducted two preregistered independent experiments (N = 40 in each experiment) using different variants of a novel task that manipulates peripheral perceptual load across levels while keeping overall visual stimulation constant. At the display center, we presented task-irrelevant angry, neutral, and happy faces and their Fourier phase-scrambled versions, which preserved low-level visual features. The results of both studies showed load-independent P1 and N170 emotional expression effects. Importantly, Bayesian analyses confirmed that these facial expression effects were face-independent for the P1 but not for the N170 component. We conclude that, first, ERP modulations during the P1 interval strongly depend on low-level visual information, whereas the N170 modulation requires the processing of figural facial expression features; second, both P1 and N170 modulations appear to be immune to a wide range of variations in perceptual load.
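As a rough illustration of the phase-scrambling control condition described above, the following sketch preserves an image's Fourier amplitude spectrum while destroying its phase, so low-level spatial-frequency content survives but the facial configuration does not. This is an assumed implementation for illustration only, not the authors' stimulus code; `phase_scramble` is a hypothetical helper.

```python
import numpy as np

def phase_scramble(image: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Return a phase-scrambled version of a 2-D grayscale image.

    The amplitude spectrum of the input is kept; the phase spectrum is
    replaced by that of a white-noise image. Borrowing the phase of a
    real-valued noise image keeps the scrambled spectrum conjugate-
    symmetric, so the inverse transform is (numerically) real.
    """
    amplitude = np.abs(np.fft.fft2(image))
    noise_phase = np.angle(np.fft.fft2(rng.standard_normal(image.shape)))
    return np.real(np.fft.ifft2(amplitude * np.exp(1j * noise_phase)))

rng = np.random.default_rng(0)
face = rng.random((128, 128))        # stand-in for a grayscale face image
scrambled = phase_scramble(face, rng)

# Amplitude spectra match; pixel content does not.
assert np.allclose(np.abs(np.fft.fft2(face)), np.abs(np.fft.fft2(scrambled)))
```

Because only the phase is altered, contrast energy at every spatial frequency and orientation is identical between a face and its scramble, which is what makes the scramble a low-level-matched control.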
Dyadic interactions are associated with the exchange of personality-related messages, which can be congruent or incongruent with one’s self-view. In the current preregistered study (N = 52), we investigated event-related potentials (ERPs) toward real social evaluations in order to uncover the neural mechanisms underlying the processing of congruent and incongruent evaluative feedback. Participants interacted first, and then, during an electroencephalogram (EEG) session, they received evaluations from their interaction partner that were either congruent or incongruent with their own ratings. Findings show potentiated processing of self-related incongruent negative evaluations at early time points (N1), followed by increased processing of both incongruent negative and positive evaluations at midlatency time windows (early posterior negativity) and prioritized processing of self-related incongruent positive evaluations at late time points (feedback-related P3, late positive potential). These findings reveal that, after real social interactions, evaluative feedback about oneself that violates one’s self-view modulates all processing stages, with an early negativity and a late positivity bias.
• The mechanisms contributing to mismatch negativity generation include both adaptation and prediction error computation as two major candidates.
• We used computational modeling and a novel experimental design to disentangle different mechanisms of mismatch processing depending on awareness and task relevance.
• Results suggest that mismatch processing relies neither on adaptation nor on prediction error computation alone; the relative contributions of the two mechanisms vary with task settings.
Detection of regularities and their violations in sensory input is key to perception. Violations are indexed by an early EEG component called the mismatch negativity (MMN) – even if participants are distracted or unaware of the stimuli. On a mechanistic level, two dominant models have been suggested to contribute to the MMN: adaptation and prediction. Whether and how context conditions, such as awareness and task relevance, modulate the mechanisms of MMN generation is unknown. We conducted an EEG study disentangling influences of task relevance and awareness on the visual MMN. Then, we estimated different computational models for the generation of single-trial amplitudes in the MMN time window. Amplitudes were best explained by a prediction error model when stimuli were task-relevant, but by an adaptation model when stimuli were task-irrelevant and participants were unaware of them. Thus, mismatch generation does not rely on one predominant mechanism; rather, the contributing mechanisms vary with the task relevance of the stimuli.
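The two candidate mechanisms can be caricatured as single-trial amplitude models. The sketch below is purely illustrative: the leaky habituation trace and the delta-rule probability update are assumptions made for this example, not the model equations estimated in the study. Both caricatures nonetheless predict a larger response to a rare deviant than to the preceding standards, i.e., an MMN-like mismatch response.

```python
import numpy as np

def adaptation_amplitudes(stimuli, decay=0.8):
    """Adaptation caricature: the response to a stimulus shrinks with a
    leaky 'habituation' trace that builds up over repetitions."""
    traces, amps = {}, []
    for s in stimuli:
        for k in traces:
            traces[k] *= decay                      # all traces leak each trial
        amps.append(1.0 - traces.get(s, 0.0))       # adapted response
        traces[s] = traces.get(s, 0.0) + 0.5 * (1 - traces.get(s, 0.0))
    return np.array(amps)

def prediction_error_amplitudes(stimuli, lr=0.1):
    """Prediction-error caricature: the response scales with surprise,
    -log p(s), under stimulus probabilities learned by a delta rule."""
    alphabet = sorted(set(stimuli))
    p = {s: 1.0 / len(alphabet) for s in alphabet}  # flat prior
    amps = []
    for s in stimuli:
        amps.append(-np.log(p[s]))                  # surprise on this trial
        for k in alphabet:                          # delta-rule update
            p[k] += lr * ((k == s) - p[k])
    return np.array(amps)

# A standard/deviant oddball sequence: frequent 'std', rare 'dev'.
seq = ['std'] * 9 + ['dev'] + ['std'] * 9 + ['dev']
ad = adaptation_amplitudes(seq)
pe = prediction_error_amplitudes(seq)
assert ad[9] > ad[8] and pe[9] > pe[8]  # both predict a mismatch response
```

The point of model comparison in the study is that, although both mechanisms predict a deviance response, they make different trial-by-trial predictions (e.g., recovery from adaptation versus probability learning), which is what allows single-trial amplitudes to arbitrate between them.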
Previous research on the neural correlates of consciousness (NCC) in visual perception revealed an early event-related potential (ERP), the visual awareness negativity (VAN), to be associated with stimulus awareness. However, due to the use of brief stimulus presentations in previous studies, it remains unclear whether awareness-related negativities represent a transient onset-related response or correspond to the duration of a conscious percept. Studies are required that allow prolonged stimulus presentation under aware and unaware conditions. The present ERP study aimed to tackle this challenge by using a novel stimulation design. Male and female human participants (n = 62) performed a visual task while task-irrelevant line stimuli were presented in the background for either 500 or 1000 ms. The line stimuli sometimes contained a face, which required so-called visual one-shot learning to be perceived. Half of the participants were informed about the presence of the face, resulting in faces being perceived by the informed but not by the uninformed participants. Comparing ERPs between the informed and uninformed groups revealed an enhanced negativity over occipitotemporal electrodes that persisted for the entire duration of stimulus presentation. Our results suggest that sustained visual awareness negativities (SVAN) are associated with the duration of stimulus presentation.
Prioritized processing of fearful compared to neutral faces has been proposed to result from evolutionary adaptation of the contrast sensitivity function (CSF) to the features of emotionally relevant faces and/or vice versa. However, it is unknown whether a stimulus merely has to feature the amplitude spectrum of a fearful face to be prioritized, or whether the relevant spatial frequencies have to occur with specific phases and orientations. Prioritized processing is indexed by specific increases of event-related potentials (ERPs) of the EEG and occurs throughout different early processing stages, reflected in emotion-related modulations of the P1, N170, and EPN. In this pre-registered study, we manipulated phase and amplitude properties of the Fourier spectra of neutral and fearful faces to test the effect of phase coherence (PC, face vs. scramble) and orientation coherence (OC, original vs. rotational average) and their interactions with differential emotion processing. We found that differential emotion processing was not present at the level of the P1 but strongly affected the N170 and EPN. In both cases, intact phase coherence was required for enhanced processing of fearful faces. OC did not interact with emotion. While faces produced the typical N170 effect, we observed a reversed effect for scrambles. Additional exploratory independent component analysis (ICA) suggests that this reversal could signal a mismatch between an early "perceptual hypothesis" and feedback of configural information. In line with our expectations, fearful-neutral differences for the N170 and EPN depend on configural information, i.e., recognizable faces.
Human contrast sensitivity is optimally tuned to the frequency spectrum of emotional faces, suggesting that a primary visual feature analysis contributes to emotional face processing. We show that electrophysiological indices of threat processing in fearful faces require intact phase information in their spectra. Orientation information does not show significant effects. In the absence of intact phase information, mid‐latency effects are reversed, possibly indicating a mismatch between an early “perceptual hypothesis” and feedback of configural information.
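The orientation-coherence manipulation described above, rotationally averaging the amplitude spectrum, can be sketched as follows. This is an assumed implementation for illustration, not the authors' stimulus code: the amplitude is averaged over all orientations at each spatial frequency (integer radial bins around the spectrum's center), while the phase spectrum is left untouched.

```python
import numpy as np

def rotational_average_amplitude(image: np.ndarray) -> np.ndarray:
    """Replace each Fourier amplitude by its mean over orientations at the
    same spatial frequency, keeping the original phase spectrum."""
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    amplitude, phase = np.abs(spectrum), np.angle(spectrum)
    h, w = image.shape
    y, x = np.indices((h, w))
    r = np.hypot(y - h // 2, x - w // 2).astype(int)   # radial frequency bins
    # Mean amplitude in each radial bin, broadcast back onto the full grid.
    sums = np.bincount(r.ravel(), weights=amplitude.ravel())
    counts = np.bincount(r.ravel())
    avg_amplitude = (sums / counts)[r]
    rotavg = avg_amplitude * np.exp(1j * phase)
    return np.real(np.fft.ifft2(np.fft.ifftshift(rotavg)))

rng = np.random.default_rng(1)
img = rng.random((128, 128))   # stand-in for a grayscale face image
out = rotational_average_amplitude(img)
```

Because the phase (and thus the configural content) is preserved, this manipulation removes orientation information from the amplitude spectrum only, which is what allows orientation coherence to be tested independently of phase coherence.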
The processing of fearful facial expressions is prioritized by the human brain. This priority is maintained across various information processing stages, as evident in early, intermediate and late components of event-related potentials (ERPs). However, emotional modulations are inconsistently reported for these different processing stages. In this pre-registered study, we investigated how feature-based attention differentially affects ERPs to fearful and neutral faces in 40 participants. The tasks required the participants to discriminate either the orientation of lines overlaid onto the face, the sex of the face, or the face’s emotional expression, the latter increasing attention to emotion-related features. We found main effects of emotion for the N170, early posterior negativity (EPN) and late positive potential (LPP). While N170 emotional modulations were task-independent, interactions of emotion and task were observed for the EPN and LPP: EPN emotion effects were found in the sex and emotion tasks, whereas the LPP emotion effect was mainly driven by the emotion task. This study shows that early responses to fearful faces are task-independent (N170) and likely based on low-level and configural information, while during later processing stages, attention to the face (EPN) or, more specifically, to the face’s emotional expression (LPP) is crucial for reliably amplified processing of emotional faces.
Prioritized processing of fearful compared to neutral faces is reflected in increased amplitudes of components of the event-related potential (ERP). It is unknown whether specific face parts drive these modulations. Here, we investigated the contributions of face parts to ERPs elicited by task-irrelevant fearful and neutral faces using an ERP-dependent facial decoding technique and a large sample of participants (N = 83). Classical ERP analyses showed typical and robust increases of N170 and EPN amplitudes by fearful relative to neutral faces. Facial decoding further showed that the absolute amplitude of these components, as well as of the P1, was driven by the low-frequency contrast of specific face parts. However, the difference between fearful and neutral faces was not driven by any specific face part, as supported by Bayesian statistics. Furthermore, there were no correlations between trait anxiety and main effects or interactions. These results suggest that increased N170 and EPN amplitudes to task-irrelevant fearful compared to neutral faces are not driven by specific facial regions but represent a holistic face processing effect.