Understanding Slang with LLMs: Modelling Cross-Cultural Nuances through Paraphrasing

Ifeoluwa Wuraola, Nina Dethlefs, Daniel Marciniak


Abstract
In the realm of social media discourse, the integration of slang enriches communication, reflecting the sociocultural identities of users. This study investigates the capability of large language models (LLMs) to paraphrase slang within climate-related tweets from Nigeria and the UK, with a focus on identifying emotional nuances. Using DistilRoBERTa as the baseline model, we observe its limited comprehension of slang. To improve cross-cultural understanding, we gauge the effectiveness of the leading LLMs ChatGPT 4, Gemini, and LLaMA3 in slang paraphrasing. While ChatGPT 4 and Gemini demonstrate comparable effectiveness in slang paraphrasing, LLaMA3 covers fewer slang terms, and all three models show gaps in coverage, especially of Nigerian slang. Our findings underscore the necessity for culturally sensitive LLM development in emotion classification, particularly in non-anglocentric regions.
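The abstract describes a two-step setup: an LLM first paraphrases slang into standard English, and a DistilRoBERTa classifier then assigns an emotion label. Below is a minimal sketch of such a pipeline, not the paper's actual implementation: the GPT-4 prompt, the checkpoint `j-hartmann/emotion-english-distilroberta-base` (a publicly available DistilRoBERTa emotion model), and the example tweet are all illustrative assumptions, since the abstract does not specify them.

```python
# Sketch of a paraphrase-then-classify pipeline (assumed setup, see above).
# The paper also evaluates Gemini and LLaMA3 as paraphrasers; an OpenAI
# chat model stands in for the LLM step here.
from openai import OpenAI
from transformers import pipeline

client = OpenAI()  # expects OPENAI_API_KEY in the environment
emotion_clf = pipeline(
    "text-classification",
    model="j-hartmann/emotion-english-distilroberta-base",  # assumed checkpoint
)

def paraphrase_slang(tweet: str) -> str:
    """Ask the LLM to rewrite slang into standard English, preserving meaning."""
    resp = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system",
             "content": "Paraphrase any slang in this tweet into standard "
                        "English while preserving its meaning and tone."},
            {"role": "user", "content": tweet},
        ],
    )
    return resp.choices[0].message.content

# Invented Nigerian Pidgin example ("no be beans" ~ "is no small thing").
tweet = "This heatwave no be beans o, Lagos is boiling!"
plain = paraphrase_slang(tweet)
print(emotion_clf(tweet))   # emotion prediction on the raw slang tweet
print(emotion_clf(plain))   # emotion prediction after paraphrasing
```

Comparing the two predictions mirrors the paper's question: whether the baseline classifier's emotion label changes once slang is rendered into standard English.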
Anthology ID: 2024.emnlp-main.869
Volume: Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month: November
Year: 2024
Address: Miami, Florida, USA
Editors: Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 15525–15531
URL: https://aclanthology.org/2024.emnlp-main.869/
DOI: 10.18653/v1/2024.emnlp-main.869
Cite (ACL): Ifeoluwa Wuraola, Nina Dethlefs, and Daniel Marciniak. 2024. Understanding Slang with LLMs: Modelling Cross-Cultural Nuances through Paraphrasing. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 15525–15531, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal): Understanding Slang with LLMs: Modelling Cross-Cultural Nuances through Paraphrasing (Wuraola et al., EMNLP 2024)
PDF: https://aclanthology.org/2024.emnlp-main.869.pdf