When Being Unseen from mBERT Is Just the Beginning: Handling New Languages With Multilingual Language Models

Benjamin Muller, AI Researcher
ITU Copenhagen, Jan 15, 2021

Related
- First Align, then Predict: Understanding the Cross-Lingual Ability of Multilingual BERT
- When Being Unseen from mBERT Is Just the Beginning: Handling New Languages With Multilingual Language Models (Bar Ilan University, Tel-Aviv)
- Transfer Learning on an Unseen North-African Arabic Dialect (December 2019)
- CamemBERT: a Tasty French Language Model
- Can Multilingual Language Models Transfer to an Unseen Dialect? A Case Study on North African Arabizi