
LUKE is now in Hugging Face Transformers

2021-05-24

Our AI model LUKE has now been added to Hugging Face Transformers (version 4.6.0). Hugging Face Transformers is a standard library that provides general-purpose implementations of Natural Language Processing (NLP) models, enabling users to train, evaluate, and share state-of-the-art models with only a few lines of code. We are excited to have LUKE join this renowned library, and we look forward to seeing Transformers users apply LUKE to better solve NLP tasks.
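As a minimal sketch of what this integration looks like, the snippet below loads the pretrained studio-ousia/luke-base checkpoint from the Hugging Face Hub and computes contextualized embeddings for both word tokens and an entity span; the example sentence and span are illustrative.

```python
from transformers import LukeModel, LukeTokenizer

# Load the base LUKE checkpoint published on the Hugging Face Hub
tokenizer = LukeTokenizer.from_pretrained("studio-ousia/luke-base")
model = LukeModel.from_pretrained("studio-ousia/luke-base")

text = "Beyoncé lives in Los Angeles."
entity_spans = [(0, 7)]  # character span of "Beyoncé" in the text

# The tokenizer encodes both the word tokens and the entity span
inputs = tokenizer(text, entity_spans=entity_spans, return_tensors="pt")
outputs = model(**inputs)

word_embeddings = outputs.last_hidden_state            # embeddings of word tokens
entity_embeddings = outputs.entity_last_hidden_state   # embeddings of the entity span
```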

LUKE is pretrained with a novel masked entity prediction task and uses an entity-aware self-attention mechanism, allowing it to capture the detailed real-world knowledge written in Wikipedia. This enhanced knowledge of the world helps the model perform well on practical NLP problems. In particular, LUKE achieved state-of-the-art results on five well-known NLP benchmarks (a usage sketch for one of these tasks follows the list):

  • CoNLL-2003 (named entity recognition)
  • SQuAD 1.1 (extractive question answering)
  • TACRED (relation classification)
  • Open Entity (entity typing)
  • ReCoRD (cloze-style question answering)
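Transformers also ships task-specific LUKE heads for several of these tasks. As an illustration, the sketch below uses the checkpoint fine-tuned on Open Entity (studio-ousia/luke-large-finetuned-open-entity) to predict a type for a single entity mention; the input sentence is an assumption chosen for demonstration.

```python
from transformers import LukeForEntityClassification, LukeTokenizer

# Checkpoint fine-tuned for entity typing on the Open Entity dataset
checkpoint = "studio-ousia/luke-large-finetuned-open-entity"
tokenizer = LukeTokenizer.from_pretrained(checkpoint)
model = LukeForEntityClassification.from_pretrained(checkpoint)

text = "Tim Cook is the CEO of Apple."
entity_spans = [(0, 8)]  # character span of the mention "Tim Cook"

inputs = tokenizer(text, entity_spans=entity_spans, return_tensors="pt")
outputs = model(**inputs)

# Pick the highest-scoring type for the mention
predicted_class_idx = outputs.logits.argmax(-1).item()
print("Predicted class:", model.config.id2label[predicted_class_idx])
```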

We have also provided three Colab notebook examples so that anyone can try LUKE easily.

We are planning to integrate LUKE into our newest AI model, Sōseki, to further improve its accuracy. We will also continue to improve our NLP technology, including multilingual support for LUKE. We hope that our contribution will help developers around the world apply cutting-edge natural language processing technology to their own problems.
