OpenCodePapers

Multimodal Emotion Recognition on IEMOCAP (4-way)
Benchmark: multimodal-emotion-recognition-on-iemocap-4
Results over time
Leaderboard
Reported metrics: Weighted F1, Accuracy, F1, Weighted Recall. The extracted rows list each model's score(s) without preserving which metric column each value came from, so scores are shown together below.

| Paper | Code | Score(s) | Model | Release Date |
|---|---|---|---|---|
| Tracing Intricate Cues in Dialogue: Joint Graph Structure and Sentiment Dynamics for Multimodal Emotion Recognition | ✓ | 86.52 / 86.53 | GraphSmile | 2024-07-31 |
| Joyful: Joint Modality Fusion and Graph Contrastive Learning for Multimodal Emotion Recognition | ✓ | 85.70 / 85.60 | Joyful | 2023-11-18 |
| COGMEN: COntextualized GNN based Multimodal Emotion recognitioN | ✓ | 84.50 | COGMEN | 2022-05-05 |
| 0/1 Deep Neural Networks via Block Coordinate Descent | | 74.1 | bc-LSTM | 2022-06-19 |
| Context-Dependent Domain Adversarial Neural Network for Multimodal Emotion Recognition | | 82.7 | DANN | 2020-10-28 |
| MMER: Multimodal Multi-task Learning for Speech Emotion Recognition | ✓ | 81.7 | MMER | 2022-03-31 |
| Combining deep and unsupervised features for multilingual speech emotion recognition | ✓ | 80.478 | PATHOSnet v2 | 2021-01-10 |
| Speech Emotion Recognition Based on Self-Attention Weight Correction for Acoustic and Text Features | | 76.8 / 76.85 | Self-attention weight correction (A+T) | 2022-11-08 |
| Multimodal Sentiment Analysis using Hierarchical Fusion with Context Modeling | ✓ | 76.5 / 76.8 | CHFusion | 2018-06-16 |
| HCAM -- Hierarchical Cross Attention Model for Multi-modal Emotion Recognition | | 70.5 | Audio + Text (Stage III) | 2023-04-14 |
| MultiMAE-DER: Multimodal Masked Autoencoder for Dynamic Emotion Recognition | ✓ | 63.73 | MultiMAE-DER | 2024-04-28 |
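The leaderboard's headline metric, Weighted F1, averages per-class F1 scores weighted by each class's support, which matters on IEMOCAP's imbalanced four-class label set (angry, happy, neutral, sad). A minimal sketch of the computation, using plain Python rather than any specific paper's evaluation code (the label strings and example predictions are illustrative only):

```python
from collections import Counter

def weighted_f1(y_true, y_pred):
    """Per-class F1 averaged with weights proportional to true-class support."""
    labels = set(y_true) | set(y_pred)
    support = Counter(y_true)
    total = len(y_true)
    score = 0.0
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
        score += (support[c] / total) * f1  # weight by class frequency
    return score

# Hypothetical 4-way IEMOCAP-style labels: angry, happy, neutral, sad
y_true = ["ang", "hap", "neu", "sad", "neu", "hap"]
y_pred = ["ang", "neu", "neu", "sad", "neu", "hap"]
print(round(weighted_f1(y_true, y_pred), 4))  # → 0.8222
```

This is equivalent to scikit-learn's `f1_score(y_true, y_pred, average="weighted")`; unlike macro F1, frequent classes contribute proportionally more, which is why Weighted F1 and Accuracy tend to track each other closely in the table above.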