Cortical responses to consciousness of schematic emotional facial expressions: A high-resolution EEG study
BUTTIGLIONE, Maura
2010-01-01
Abstract
Is conscious perception of emotional facial expressions related to enhanced cortical responses? Electroencephalographic data (112 channels) were recorded in 15 normal adults during the presentation of cue stimuli with neutral, happy, or sad schematic faces (duration: the "threshold time" inducing about 50% of correct recognitions), masking stimuli (2 s), and go stimuli with happy or sad schematic faces (0.5 s). The subjects clicked the left (right) mouse button in response to go stimuli with happy (sad) faces. After the response, they said "seen" or "not seen" with reference to the previous cue stimulus. The electroencephalographic data were averaged to form visual event-related potentials (ERPs). Cortical sources of the ERPs were estimated with the LORETA software. Reaction time to the go stimuli was generally shorter during "seen" than "not seen" trials, possibly due to covert attention and awareness. The cue stimuli evoked four ERP components (posterior N100, N170, P200, and P300), which had similar peak latencies in the "not seen" and "seen" ERPs. Only the N170 showed amplitude differences between the "seen" and "not seen" ERPs. Compared to the "not seen" ERPs, the "seen" ERPs showed prefrontal, premotor, and posterior parietal N170 sources that were higher in amplitude with the sad cue stimuli and lower in amplitude with the neutral and happy cue stimuli. These results suggest that nonconscious and conscious processing of schematic emotional facial expressions share a similar temporal evolution of cortical activity, and that conscious processing induces an early enhancement of bilateral cortical activity for schematic sad facial expressions (N170).
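For readers unfamiliar with the ERP pipeline the abstract describes (epoching EEG around cue onset, averaging by the subject's "seen"/"not seen" report, and comparing N170 amplitude), the following is a minimal sketch in Python using MNE-Python. It is not the authors' actual pipeline (the study used LORETA for source estimation, and its preprocessing details are not given here); the file name, event codes, channel name, filter band, and the 140-200 ms N170 window are all illustrative assumptions.

```python
# Hypothetical sketch of "seen" vs. "not seen" ERP averaging with MNE-Python.
# All file names, event codes, and parameters below are assumptions.
import mne

# Load continuous 112-channel EEG (hypothetical file name).
raw = mne.io.read_raw_fif("subject01_raw.fif", preload=True)
raw.filter(l_freq=0.1, h_freq=40.0)  # typical ERP band-pass (assumed)

# Events marking cue onsets; trials are labeled by the subject's
# post-response "seen"/"not seen" report (event codes are assumed).
events = mne.find_events(raw)
event_id = {"cue/seen": 1, "cue/not_seen": 2}

# Epoch from -200 ms to 500 ms around cue onset, baseline-corrected
# on the pre-stimulus interval.
epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.2, tmax=0.5, baseline=(None, 0.0),
                    preload=True)

# Average within each awareness condition to obtain the ERPs.
erp_seen = epochs["cue/seen"].average()
erp_not_seen = epochs["cue/not_seen"].average()

# Compare N170 peak amplitude at a posterior-temporal channel
# (channel "P8" and the 140-200 ms window are illustrative).
for label, erp in [("seen", erp_seen), ("not seen", erp_not_seen)]:
    ch, lat, amp = erp.copy().pick(["P8"]).get_peak(
        tmin=0.14, tmax=0.20, mode="neg", return_amplitude=True)
    print(f"{label}: N170 peak at {lat * 1e3:.0f} ms, {amp * 1e6:.2f} uV")
```

In the study itself, such sensor-level ERPs were then fed to LORETA to estimate the cortical sources whose N170 amplitude differed between awareness conditions; that inverse-solution step is not shown in the sketch above.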