arXiv:2511.13954

A Brain Wave Encodes a Thousand Tokens: Modeling Inter-Cortical Neural Interactions for Effective EEG-based Emotion Recognition

Published on Nov 17 · Submitted by Nilay K. Bhatnagar on Nov 19

AI-generated summary

RBTransformer, a Transformer-based model, enhances EEG-based emotion recognition by capturing inter-cortical neural dynamics in latent space, outperforming existing methods on multiple datasets and dimensions.

Abstract

Human emotions are difficult to convey through words and are often abstracted in the process; electroencephalogram (EEG) signals, by contrast, offer a more direct lens into emotional brain activity. Recent studies show that deep learning models can process these signals to perform emotion recognition with high accuracy. However, many existing approaches overlook the dynamic interplay between distinct brain regions, which can be crucial to understanding how emotions unfold and evolve over time, and which could therefore aid more accurate emotion recognition. To address this, we propose RBTransformer, a Transformer-based neural network architecture that models the brain's inter-cortical neural dynamics in latent space to better capture structured neural interactions for effective EEG-based emotion recognition. First, the EEG signals are converted into Band Differential Entropy (BDE) tokens, which are then augmented with Electrode Identity embeddings to retain spatial provenance. These tokens are processed through successive inter-cortical multi-head attention blocks that construct an electrode × electrode attention matrix, allowing the model to learn inter-cortical neural dependencies. The resulting features are passed through a classification head to obtain the final prediction. We conducted extensive experiments, specifically under subject-dependent settings, on the SEED, DEAP, and DREAMER datasets, over the Valence, Arousal, and Dominance dimensions (for DEAP and DREAMER), under both binary and multi-class classification settings. The results demonstrate that RBTransformer outperforms all previous state-of-the-art methods across all three datasets, over all three dimensions, under both classification settings. The source code is available at: https://github.com/nnilayy/RBTransformer.
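
To make the BDE tokenization step concrete, here is a minimal sketch of how Band Differential Entropy features are commonly computed in EEG emotion-recognition work. The band boundaries, sampling rate, filter order, and function name below are illustrative assumptions, not taken from the paper or its released code.

```python
# Hypothetical sketch of BDE tokenization (band edges, fs, and filter order are assumptions).
import numpy as np
from scipy.signal import butter, filtfilt

# Frequency bands commonly used in SEED/DEAP-style pipelines (an assumption here).
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}

def band_differential_entropy(eeg, fs=128.0):
    """eeg: (n_electrodes, n_samples) -> BDE tokens of shape (n_electrodes, n_bands).

    For a Gaussian signal with variance sigma^2, differential entropy is
    0.5 * log(2 * pi * e * sigma^2), computed per electrode and per band.
    """
    tokens = np.empty((eeg.shape[0], len(BANDS)))
    for j, (lo, hi) in enumerate(BANDS.values()):
        # Band-pass filter each electrode's signal to the current band.
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, eeg, axis=-1)
        tokens[:, j] = 0.5 * np.log(2 * np.pi * np.e * filtered.var(axis=-1))
    return tokens
```

Each electrode thus yields one token of per-band entropies, which the model can then tag with an Electrode Identity embedding to retain spatial provenance.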

Community

Paper author and submitter:

We release RBTransformer, a Transformer-based architecture for EEG-based emotion recognition that models inter-cortical neural interactions using our proposed Inter-Cortical Multi-Head Self-Attention (Inter-Cortical MHSA) mechanism, capturing structured communication among distributed cortical regions during affective processing. RBTransformer achieves state-of-the-art performance on all three major EEG benchmarks (SEED, DEAP, and DREAMER), across all dimensions (Valence, Arousal, and Dominance), and under both binary and multi-class settings, consistently surpassing prior deep learning and attention-based models.
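
As a concrete illustration of the Inter-Cortical MHSA idea, here is a minimal PyTorch sketch that treats each electrode's BDE vector as one token, so the attention weights form an electrode × electrode matrix per head. The dimensions, layer layout, and class name are illustrative assumptions rather than the paper's exact architecture; see the released code below for the real implementation.

```python
# Minimal sketch of one inter-cortical attention block (all sizes are assumptions).
import torch
import torch.nn as nn

class InterCorticalBlock(nn.Module):
    def __init__(self, n_electrodes=32, n_bands=5, d_model=64, n_heads=4):
        super().__init__()
        self.proj = nn.Linear(n_bands, d_model)                  # BDE token -> latent
        self.electrode_id = nn.Embedding(n_electrodes, d_model)  # spatial provenance
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm1, self.norm2 = nn.LayerNorm(d_model), nn.LayerNorm(d_model)
        self.ffn = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                                 nn.Linear(4 * d_model, d_model))

    def forward(self, bde):                    # bde: (batch, n_electrodes, n_bands)
        ids = torch.arange(bde.size(1), device=bde.device)
        x = self.proj(bde) + self.electrode_id(ids)   # add electrode identity embedding
        # Self-attention over electrode tokens: the weights form an
        # (n_electrodes x n_electrodes) inter-cortical matrix per head.
        out, attn_w = self.attn(x, x, x, average_attn_weights=False)
        x = self.norm1(x + out)
        x = self.norm2(x + self.ffn(x))
        return x, attn_w                       # attn_w: (batch, heads, E, E)
```

Stacking several such blocks and feeding the pooled electrode tokens to a classification head would mirror the pipeline the abstract describes.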

Alongside the paper, we provide a full open-source ecosystem for reproducibility:

GitHub Codebase: https://github.com/nnilayy/RBTransformer
Complete training pipeline, preprocessing framework, model architecture & ablation scripts.

Hugging Face 🤗 Model Weights Collection: https://huggingface.co/nnilayy/collections
All trained checkpoints, ablation variants, and dataset-specific model collections for SEED, DEAP, and DREAMER.

