Journal Article

Classification of complex emotions using EEG and virtual environment: Proof of concept and therapeutic implication


Deco,  Gustavo
Computational Neuroscience Group, Department of Information and Communication Technologies, Center for Brain and Cognition, University Pompeu Fabra, Barcelona, Spain;
Catalan Institution for Research and Advanced Studies (ICREA), University Pompeu Fabra, Barcelona, Spain;
Department of Neuropsychology, MPI for Human Cognitive and Brain Sciences, Max Planck Society;
Turner Institute for Brain and Mental Health, Monash University, Melbourne, Australia;

Fulltext (public): Publisher version, 3 MB


De Filippi, E., Wolter, M., Melo, B. R. P., Tierra-Criollo, C. J., Bortolini, T., Deco, G., et al. (2021). Classification of complex emotions using EEG and virtual environment: Proof of concept and therapeutic implication. Frontiers in Human Neuroscience, 15: 711279. doi:10.3389/fnhum.2021.711279.

Cite as: https://hdl.handle.net/21.11116/0000-0009-76E1-4
During the last decades, neurofeedback training for emotional self-regulation has received significant attention from the scientific and clinical communities. Most studies have investigated emotions using functional magnetic resonance imaging (fMRI), including its real-time application in neurofeedback training. However, the electroencephalogram (EEG) is a more suitable tool for therapeutic application. Our study aims to establish a method for classifying discrete complex emotions (e.g., tenderness and anguish) elicited through a near-immersive scenario that can later be used for EEG-neurofeedback. EEG-based affective computing studies have mainly focused on dimension-based emotion classification, commonly using passive elicitation through single-modality stimuli. Here, we integrated both passive and active elicitation methods. We recorded electrophysiological data during emotion-evoking trials, combining emotional self-induction with a multimodal virtual environment. We extracted correlational and time-frequency features, including frontal-alpha asymmetry (FAA), using complex Morlet wavelet convolution. With future real-time applications in mind, we performed within-subject classification using 1-s windows as samples and applied trial-specific cross-validation. We opted for a traditional machine-learning classifier with low computational complexity and sufficient validation in online settings, the support vector machine (SVM). Results of individual-based cross-validation using the whole feature sets showed considerable between-subject variability. Individual accuracies ranged from 59.2 to 92.9% using time-frequency/FAA features and from 62.4 to 92.4% using correlational features. We found that features of the temporal, occipital, and left-frontal channels were the most discriminative between the two emotions.
Our results show that the suggested pipeline is suitable for individual-based classification of discrete emotions, paving the way for future personalized EEG-neurofeedback training.
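The pipeline described in the abstract combines two concrete steps that can be sketched in code: (1) extracting alpha-band power via complex Morlet wavelet convolution and computing frontal-alpha asymmetry (FAA), and (2) classifying 1-s feature windows with an SVM under trial-specific cross-validation, i.e., never splitting windows from the same trial across train and test folds. The sketch below is illustrative only, not the authors' implementation: the function names, the `n_cycles=7` wavelet width, the synthetic data, and the SVM hyperparameters are all assumptions standing in for details given in the full paper.

```python
import numpy as np
from scipy.signal import fftconvolve
from sklearn.svm import SVC
from sklearn.model_selection import GroupKFold, cross_val_score

def morlet_power(signal, sfreq, freq, n_cycles=7):
    """Power over time at one frequency via complex Morlet wavelet convolution."""
    t = np.arange(-1.0, 1.0, 1.0 / sfreq)          # wavelet time support (s)
    sigma = n_cycles / (2 * np.pi * freq)           # Gaussian width in seconds
    wavelet = np.exp(2j * np.pi * freq * t) * np.exp(-t**2 / (2 * sigma**2))
    wavelet /= np.linalg.norm(wavelet)              # unit-energy normalization
    analytic = fftconvolve(signal, wavelet, mode="same")
    return np.abs(analytic) ** 2

def frontal_alpha_asymmetry(left, right, sfreq, band=(8, 13)):
    """FAA = log(right alpha power) - log(left alpha power), per time point."""
    freqs = np.arange(band[0], band[1] + 1)
    p_left = np.mean([morlet_power(left, sfreq, f) for f in freqs], axis=0)
    p_right = np.mean([morlet_power(right, sfreq, f) for f in freqs], axis=0)
    return np.log(p_right) - np.log(p_left)

# Synthetic stand-in for 1-s feature windows: each row is one window and
# `groups` marks the trial it came from, so GroupKFold keeps all windows of
# a trial in the same fold (trial-specific cross-validation).
rng = np.random.default_rng(0)
n_trials, windows_per_trial, n_features = 20, 10, 8
X = rng.normal(size=(n_trials * windows_per_trial, n_features))
y = np.repeat(rng.integers(0, 2, n_trials), windows_per_trial)  # e.g., tenderness vs. anguish
groups = np.repeat(np.arange(n_trials), windows_per_trial)

clf = SVC(kernel="rbf", C=1.0)
scores = cross_val_score(clf, X, y, groups=groups, cv=GroupKFold(n_splits=5))
```

Because the labels here are random noise, the cross-validated accuracy hovers near chance; with real discriminative features the same grouping scheme gives an honest estimate of within-subject, across-trial generalization.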