News
Our paper was accepted at ACL
2022-03-08
Our paper was accepted at the Annual Meeting of the Association for Computational Linguistics (ACL), a top NLP conference to be held on May 22nd–27th. The paper demonstrates that multilingual pretrained language models can be effectively improved for downstream cross-lingual tasks by leveraging vector representations of entities.
mLUKE: The Power of Entity Representations in Multilingual Pretrained Language Models
Ryokan Ri (Studio Ousia / The University of Tokyo), Ikuya Yamada, and Yoshimasa Tsuruoka (The University of Tokyo). ACL 2022.