When Being Unseen from mBERT is just the Beginning: Handling New Languages With Multilingual Language Models
Benjamin Muller
ITU Copenhagen, Jan 15, 2021