TEII: Think, Explain, Interact and Iterate with Large Language Models to Solve Cross-lingual Emotion Detection (2405.17129v2)

Published 27 May 2024 in cs.CL and cs.AI

Abstract: Cross-lingual emotion detection allows us to analyze global trends, public opinion, and social phenomena at scale. We participated in the Explainability of Cross-lingual Emotion Detection (EXALT) shared task, achieving an F1-score of 0.6046 on the evaluation set for the emotion detection sub-task. Our system outperformed the baseline by more than 0.16 absolute F1-score and ranked second amongst competing systems. We conducted experiments using fine-tuning, zero-shot learning, and few-shot learning for LLM-based models, as well as embedding-based BiLSTM and KNN for non-LLM-based techniques. Additionally, we introduced two novel methods: the Multi-Iteration Agentic Workflow and the Multi-Binary-Classifier Agentic Workflow. We found that LLM-based approaches provided good performance on multilingual emotion detection. Furthermore, ensembles combining all of the models we experimented with yielded higher F1-scores than any single approach alone.
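
The abstract describes the techniques only at a high level, including zero-shot prompting of LLMs and ensembling over multiple models. As a rough illustration of how two of those pieces might fit together, the Python sketch below wires a zero-shot emotion classifier to a simple majority-vote ensemble. The prompt wording, model names, and six-way label set are illustrative assumptions, not the authors' actual configuration, and the agentic workflows introduced in the paper are not reproduced here.

# Minimal sketch: zero-shot emotion labelling with an LLM plus a majority-vote
# ensemble over several models' predictions. Prompt text, model names, and the
# label set are assumptions for illustration, not the paper's exact setup.
from collections import Counter

from openai import OpenAI  # any chat-completion client would work here

EMOTIONS = ["Love", "Joy", "Anger", "Fear", "Sadness", "Neutral"]  # assumed label set

client = OpenAI()  # expects OPENAI_API_KEY in the environment


def classify_zero_shot(text: str, model: str) -> str:
    """Ask one model to pick exactly one emotion label for a tweet."""
    prompt = (
        "Classify the emotion expressed in the following tweet.\n"
        f"Answer with exactly one of: {', '.join(EMOTIONS)}.\n\n"
        f"Tweet: {text}\nEmotion:"
    )
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        temperature=0,
    )
    answer = resp.choices[0].message.content.strip()
    # Fall back to a default label if the model answers outside the label set.
    return answer if answer in EMOTIONS else "Neutral"


def ensemble_vote(predictions: list[str]) -> str:
    """Majority vote across per-model predictions; ties resolve by first occurrence."""
    return Counter(predictions).most_common(1)[0][0]


if __name__ == "__main__":
    tweet = "No puedo creer que hayan cancelado el concierto otra vez..."
    models = ["gpt-4o-mini", "gpt-4o"]  # hypothetical ensemble members
    preds = [classify_zero_shot(tweet, m) for m in models]
    print(preds, "->", ensemble_vote(preds))

The paper's ensembles combine a wider set of heterogeneous systems (fine-tuned LLMs, BiLSTM, and KNN classifiers); the voting step here is just one plausible way to aggregate their labels.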

