
A spatial and temporal transformer-based EEG emotion recognition in VR environment

Frontiers in Human Neuroscience

Abstract


With the rapid development of deep learning, electroencephalograph (EEG) emotion recognition has played a significant role in affective brain-computer interfaces, and many advanced emotion recognition models have achieved excellent results. However, current research mostly induces emotion in laboratory settings, which lacks sufficient ecological validity and differs significantly from real-world scenarios. Moreover, emotion recognition models are typically trained and tested on datasets collected in laboratory environments, with little validation of their effectiveness in real-world situations. VR, which provides a highly immersive and realistic experience, is an ideal tool for emotion research. In this paper, we collect EEG data from participants while they watch VR videos. We propose a purely Transformer-based method, EmoSTT, which uses two separate Transformer modules to comprehensively model the temporal and spatial information of EEG signals. We validate the effectiveness of EmoSTT on a passive-paradigm emotion dataset collected in a laboratory environment and an active-paradigm emotion dataset collected in a VR environment. Compared with state-of-the-art methods, our method achieves robust emotion classification performance and transfers well between different emotion elicitation paradigms.
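The two-module idea described in the abstract can be illustrated with a toy sketch. This is our own minimal illustration of spatial-then-temporal self-attention over an EEG segment, not the published EmoSTT implementation: one attention pass treats electrode channels as tokens, a second treats time steps as tokens. All names and shapes here are assumptions for demonstration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    # x: (tokens, d). Single-head scaled dot-product attention with
    # identity Q/K/V projections, for brevity only.
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)
    return softmax(scores) @ x

# Toy EEG segment: 32 channels x 128 time samples (hypothetical sizes).
rng = np.random.default_rng(0)
eeg = rng.standard_normal((32, 128))

# "Spatial" module: tokens are channels, so attention mixes electrodes.
spatial_out = self_attention(eeg)              # shape (32, 128)

# "Temporal" module: transpose so tokens are time steps, then transpose back.
temporal_out = self_attention(spatial_out.T).T  # shape (32, 128)

print(temporal_out.shape)
```

A real model would add learned projection matrices, multiple heads, positional encodings, and feed-forward layers per Transformer block; the sketch only shows how the two attention axes differ.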

Frontiers in Human Neuroscience Vol. 19 2025


Authors

Li, M., Yu, P., & Shen, Y.

  https://doi.org/10.3389/fnhum.2025.1517273
