Interestingly, we find that simply introducing another non-English language helps📈, even by prepending context-irrelevant non-English sentences (not necessarily in the demonstrations). We show that strategic multilingual exposure bridges the gap for underrepresented languages💡.
06.03.2025 15:31
✅ Demos in mixed high-resource languages are more robust and effective than demos in any single high-resource language, outperforming English-only demos.
✅ This phenomenon is generalizable across models (Llama3, GPT-4o) & tasks (math, commonsense, verbal).
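The mixed-language demo strategy from the two findings above can be sketched as follows. This is a hedged illustration under assumed data: the demo texts, language mix, and Swahili query are invented placeholders, not the paper's benchmarks.

```python
# Hedged sketch: build an in-context prompt whose demonstrations are drawn
# from several high-resource languages (here Chinese, Japanese, French),
# followed by a query in a low-resource language. All strings are
# illustrative placeholders, not the paper's actual demos or tasks.

DEMOS = {
    "zh": "问题:2+2等于几?\n答案:4",
    "ja": "質問:3+5は?\n答え:8",
    "fr": "Question : 6+1 ?\nRéponse : 7",
}

def mixed_language_prompt(query: str, langs: list[str]) -> str:
    """Concatenate demos in the given languages, then the target query."""
    demos = [DEMOS[lang] for lang in langs]
    return "\n\n".join(demos + [query])

# Query in a low-resource language (Swahili, placeholder):
prompt = mixed_language_prompt("Swali: 9+4?\nJibu:", ["zh", "ja", "fr"])
```

The claim in the posts is that mixing languages across demos like this transfers better to low-resource-language queries than using English-only or single-language demos.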
06.03.2025 15:31
Key Takeaways
✅ In-context demonstrations in non-Latin high-resource languages (e.g., Chinese, Japanese) improve LLM performance across low-resource languages, more effectively than English-only demos.
06.03.2025 15:31
How can we bridge LLMs' multilingual performance gap for low-resource languages with prompting strategies 🌍? Our latest work (w/ @fredashi.bsky.social and Andrew Xue @uwaterloo.ca) systematically analyzes strategies for Multilingual In-Context Learning.
06.03.2025 15:31
🚀New Paper🚀 Systematic Analysis of Multilingual In-Context Learning
📄 arxiv.org/pdf/2502.11364
Multilingual LLMs like Llama3.1/Qwen2.5 have shown English-rivalling performance on high-resource languages, yet they often significantly underperform on low-resource languages.
#NLP #Multilingual #LLM
06.03.2025 15:31