Press "Enter" to skip to content

Fine-Tuning Hugging Face for Named Entity Recognition in Japanese

Tsuyoshi Matsuzaki tries out a named entity recognition project with the Hugging Face library:

Today a lot of AI companies (such as OpenAI, NLP Cloud, Google, and NVIDIA) provide pre-trained large language models, along with methods for fine-tuning them. Among these tools and frameworks, Hugging Face is widely used and provides over 20,000 transformer-based models.

In this post, I’ll show you a brief example of fine-tuning a transformer model in Hugging Face to get you started.
In the last part of this post, I’ll also optimize training with DeepSpeed, which is well integrated with Hugging Face transformers.

Click through for the results of this analysis.
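To give a sense of what the linked post covers, here is a minimal sketch of fine-tuning a Hugging Face model for token classification (NER) with the Trainer API. This is not code from the post: the checkpoint, label set, and toy training example below are placeholders, and the post itself works with a Japanese corpus and adds DeepSpeed in its final section (hinted at in a comment).

```python
# Minimal sketch of NER fine-tuning with Hugging Face transformers.
# Assumptions: the checkpoint, label list, and toy example are placeholders,
# not the ones used in the linked post.
from datasets import Dataset
from transformers import (
    AutoTokenizer,
    AutoModelForTokenClassification,
    DataCollatorForTokenClassification,
    TrainingArguments,
    Trainer,
)

label_list = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]

# Placeholder multilingual checkpoint; the post fine-tunes a Japanese model.
model_name = "bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(
    model_name, num_labels=len(label_list)
)

# Toy placeholder data: pre-split tokens with per-word NER tag indices.
examples = {
    "tokens": [["松崎", "さん", "は", "東京", "に", "住ん", "で", "いる"]],
    "ner_tags": [[1, 0, 0, 5, 0, 0, 0, 0]],
}

def tokenize_and_align(batch):
    # Tokenize pre-split words and propagate each word's label to its subword
    # pieces, masking special tokens with -100 so they are ignored in the loss.
    tokenized = tokenizer(batch["tokens"], is_split_into_words=True, truncation=True)
    labels = []
    for i, tags in enumerate(batch["ner_tags"]):
        word_ids = tokenized.word_ids(batch_index=i)
        labels.append([-100 if w is None else tags[w] for w in word_ids])
    tokenized["labels"] = labels
    return tokenized

train_dataset = Dataset.from_dict(examples).map(tokenize_and_align, batched=True)

training_args = TrainingArguments(
    output_dir="./ner-output",
    per_device_train_batch_size=16,
    num_train_epochs=3,
    # deepspeed="ds_config.json",  # the post's last part enables DeepSpeed via a config file
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    tokenizer=tokenizer,
    data_collator=DataCollatorForTokenClassification(tokenizer),
)
trainer.train()
```

The same Trainer setup scales to a real NER corpus by swapping in a properly labeled dataset and, for the Japanese case, a Japanese-pretrained checkpoint.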