
EmotionX-KU: BERT-Max based Contextual Emotion Classifier (1906.11565v2)

Published 27 Jun 2019 in cs.CL

Abstract: We propose a contextual emotion classifier based on a transferable language model and dynamic max pooling, which predicts the emotion of each utterance in a dialogue. A representative emotion analysis task, EmotionX, requires considering contextual information from colloquial dialogues and dealing with a class imbalance problem. To alleviate these problems, our model leverages the self-attention-based transferable language model and the weighted cross-entropy loss. Furthermore, we apply post-training and fine-tuning mechanisms to enhance the domain adaptability of our model and utilize several machine learning techniques to improve its performance. We conduct experiments on two emotion-labeled datasets named Friends and EmotionPush. As a result, our model outperforms the previous state-of-the-art model and also shows competitive performance in the EmotionX 2019 challenge. The code will be available on the GitHub page.
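The abstract mentions a weighted cross-entropy loss to counter class imbalance. As a minimal sketch of that idea (assuming inverse-frequency class weights, a common choice; the paper's exact weighting scheme is not given here), each class's loss contribution can be scaled by a weight derived from its training-set frequency:

```python
import numpy as np

# Sketch of weighted cross-entropy for an imbalanced label set.
# Assumption: weights are inversely proportional to class frequency,
# normalized to sum to the number of classes. This illustrates the
# general technique, not the paper's specific implementation.
def weighted_cross_entropy(logits, labels, class_counts):
    """Mean weighted cross-entropy over a batch.

    logits: (batch, classes) raw scores
    labels: (batch,) integer class ids
    class_counts: (classes,) training-set frequency of each class
    """
    counts = np.asarray(class_counts, dtype=float)
    weights = counts.sum() / counts              # rarer class -> larger weight
    weights = weights / weights.sum() * len(counts)

    # log-softmax, computed in a numerically stable way
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))

    nll = -log_probs[np.arange(len(labels)), labels]
    w = weights[labels]
    return float((w * nll).sum() / w.sum())
```

With uniform class counts the weights are all 1 and the function reduces to the standard (unweighted) cross-entropy; with skewed counts, errors on rare emotion classes are penalized more heavily.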

Authors (5)
  1. Kisu Yang
  2. Dongyub Lee
  3. Taesun Whang
  4. Seolhwa Lee
  5. Heuiseok Lim
Citations (29)
