
Improved Text Emotion Prediction Using Combined Valence and Arousal Ordinal Classification (2404.01805v1)

Published 2 Apr 2024 in cs.LG

Abstract: Emotion detection in textual data has received growing interest in recent years, as it is pivotal for developing empathetic human-computer interaction systems. This paper introduces a method for categorizing emotions from text, which acknowledges and differentiates between the diversified similarities and distinctions of various emotions. Initially, we establish a baseline by training a transformer-based model for standard emotion classification, achieving state-of-the-art performance. We argue that not all misclassifications are of the same importance, as there are perceptual similarities among emotional classes. We thus redefine the emotion labeling problem by shifting it from a traditional classification model to an ordinal classification one, where discrete emotions are arranged in a sequential order according to their valence levels. Finally, we propose a method that performs ordinal classification in the two-dimensional emotion space, considering both valence and arousal scales. The results show that our approach not only preserves high accuracy in emotion prediction but also significantly reduces the magnitude of errors in cases of misclassification.
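The move from nominal to ordinal classification described in the abstract can be sketched with distance-weighted soft targets: classes near the true emotion on the valence scale receive more probability mass than distant ones, so confusing adjacent emotions is penalized less than confusing opposite ones. This is a common ordinal-classification encoding (SORD-style), shown here as an illustration only; the emotion list, valence ordering, and the `beta` temperature are assumptions, not the paper's exact configuration.

```python
import numpy as np

# Hypothetical ordering of discrete emotions by valence, from most negative
# to most positive. This list is an illustrative assumption, not the
# paper's actual label set.
EMOTIONS = ["sadness", "fear", "anger", "surprise", "joy"]
VALENCE_RANK = {e: i for i, e in enumerate(EMOTIONS)}


def soft_ordinal_targets(true_label: str, num_classes: int, beta: float = 1.0) -> np.ndarray:
    """Soft target distribution that decays with distance from the true
    valence rank (softmax over negative absolute rank distance)."""
    ranks = np.arange(num_classes)
    true_rank = VALENCE_RANK[true_label]
    logits = -beta * np.abs(ranks - true_rank)
    exp = np.exp(logits - logits.max())  # subtract max for numerical stability
    return exp / exp.sum()


def ordinal_cross_entropy(pred_probs: np.ndarray, true_label: str, beta: float = 1.0) -> float:
    """Cross-entropy against the soft ordinal targets instead of a one-hot
    label, so near-misses on the valence scale cost less than far misses."""
    targets = soft_ordinal_targets(true_label, len(pred_probs), beta)
    return -float(np.sum(targets * np.log(np.clip(pred_probs, 1e-12, 1.0))))
```

For example, a model that mistakes "joy" for the adjacent "surprise" incurs a smaller loss than one that predicts "sadness", which is the behavior the abstract's ordinal reformulation is after.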

References (19)
  1. BERT-CNN: A deep learning model for detecting emotions from text. Computers, Materials & Continua, 71(2).
  2. Comparative analyses of BERT, RoBERTa, DistilBERT, and XLNet for text-based emotion recognition. In 2020 17th International Computer Conference on Wavelet Active Media Technology and Information Processing (ICCWAMTIP), pages 117–121. IEEE.
  3. Emotion and sentiment analysis of tweets using BERT. In EDBT/ICDT Workshops, volume 3.
  4. Diogo Cortiz. 2021. Exploring transformers in emotion recognition: a comparison of BERT, DistilBERT, RoBERTa, XLNet and ELECTRA. arXiv preprint arXiv:2104.02041.
  5. GoEmotions: A dataset of fine-grained emotions. arXiv preprint arXiv:2005.00547.
  6. BERT: Pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805.
  7. Paul Ekman. 1992. An argument for basic emotions. Cognition & Emotion, 6(3-4):169–200.
  8. Lisa Feldman Barrett and James A Russell. 1998. Independence and bipolarity in the structure of current affect. Journal of Personality and Social Psychology, 74(4):967.
  9. Automatically classifying emotions based on text: A comparative exploration of different datasets. In 2022 IEEE 34th International Conference on Tools with Artificial Intelligence (ICTAI), pages 342–346. IEEE.
  10. Puneet Kumar and Balasubramanian Raman. 2022. A BERT-based dual-channel explainable text emotion recognition system. Neural Networks, 150:392–407.
  11. RoBERTa: A robustly optimized BERT pretraining approach. arXiv preprint arXiv:1907.11692.
  12. Georgios Paltoglou and Michael Thelwall. 2012. Seeing stars of valence and arousal in blog posts. IEEE Transactions on Affective Computing, 4(1):116–123.
  13. Dimensional emotion detection from categorical emotion. arXiv preprint arXiv:1911.02499.
  14. James A Russell. 1980. A circumplex model of affect. Journal of Personality and Social Psychology, 39(6):1161.
  15. KUISAIL at SemEval-2020 Task 12: BERT-CNN for offensive speech identification in social media. arXiv preprint arXiv:2007.13184.
  16. Klaus R Scherer. 2005. What are emotions? And how can they be measured? Social Science Information, 44(4):695–729.
  17. KR Scherer and H Wallbott. 1990. International survey on emotion antecedents and reactions (ISEAR).
  18. Varsha Suresh and Desmond C Ong. 2021. Not all negatives are equal: Label-aware contrastive loss for fine-grained text classification. arXiv preprint arXiv:2109.05427.
  19. XLNet: Generalized autoregressive pretraining for language understanding. Advances in Neural Information Processing Systems, 32.
Citations (1)
