Exploring Transformers in Emotion Recognition: a comparison of BERT, DistilBERT, RoBERTa, XLNet and ELECTRA
This paper investigates how Natural Language Understanding (NLU) can be applied to Emotion Recognition, a specific task in affective computing. We fine-tuned several transformer language models (BERT, DistilBERT, RoBERTa, XLNet, and ELECTRA) on a fine-grained emotion dataset and evaluated them in terms of performance (F1-score) and time to complete fine-tuning.
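The fine-tune-and-evaluate loop the abstract describes can be sketched with the Hugging Face Transformers library. The version below is a minimal, illustrative one: the dataset name ("go_emotions"), the single-label reduction, macro averaging of the F1-score, and all hyperparameters are assumptions for the sketch, not the authors' exact setup.

```python
# Minimal fine-tuning sketch (assumed GoEmotions "simplified" config,
# single-label reduction, illustrative hyperparameters -- not the paper's setup).
import time
import numpy as np
from datasets import load_dataset
from sklearn.metrics import f1_score
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Swap in "distilbert-base-uncased", "roberta-base", "xlnet-base-cased",
# or "google/electra-base-discriminator" to compare the five models.
MODEL_NAME = "bert-base-uncased"

dataset = load_dataset("go_emotions", "simplified")
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True,
                     padding="max_length", max_length=128)

def first_label(batch):
    # GoEmotions is multi-label; keep only the first label to stay single-label here.
    return {"labels": [labels[0] for labels in batch["labels"]]}

encoded = dataset.map(tokenize, batched=True).map(first_label, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=28)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {"macro_f1": f1_score(labels, preds, average="macro")}

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=3,
                           per_device_train_batch_size=32),
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    compute_metrics=compute_metrics,
)

start = time.time()                       # time-to-complete measurement
trainer.train()
print(f"fine-tuning took {time.time() - start:.1f}s")
print(trainer.evaluate())                 # macro F1 on the validation split
```

Repeating this loop once per MODEL_NAME yields the performance-versus-time comparison the abstract describes.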
Methods
• Adam
• Attention Dropout
• BERT
• BPE
• Dense Connections
• DistilBERT
• Dropout
• GELU
• Layer Normalization
• Linear Layer
• Linear Warmup With Linear Decay
• Multi-Head Attention
• Residual Connection
• RoBERTa
• Scaled Dot-Product Attention (see the sketch after this list)
• SentencePiece
• Softmax
• Weight Decay
• WordPiece
• XLNet
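Several of the architectural methods listed above (Multi-Head Attention, Softmax, Scaled Dot-Product Attention) build on one core operation shared by all five models. As a reference, here is a small NumPy sketch of scaled dot-product attention; the toy shapes are illustrative only.

```python
# Scaled dot-product attention: Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)   # query-key similarity, scaled
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # attention-weighted values

# Toy usage: 4 tokens with 8-dimensional projections.
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((4, 8)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V).shape)   # (4, 8)
```

Multi-Head Attention runs this operation in parallel over several learned projections of Q, K, and V and concatenates the results.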