The AI community building the future.
Build, train, and deploy state-of-the-art models powered by the reference open source libraries in machine learning.
Hub
Home of Machine Learning
Create, discover and collaborate on ML better.
Join the community to start your ML journey.

Tasks
Problem solvers
Thousands of creators work as a community to solve Audio, Vision, and Language with AI.
Explore tasks

Open Source
Transformers
Transformers is our natural language processing library and our hub is now open to all ML models, with support from libraries like Flair, Asteroid, ESPnet, Pyannote, and more to come.
Read documentation

from transformers import AutoTokenizer, AutoModelForMaskedLM
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
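Once loaded, a masked-language model like this one can fill in a `[MASK]` token in a sentence. A minimal sketch (assuming `transformers` and `torch` are installed; the checkpoint is downloaded on first use):

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Load the same checkpoint as above.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Ask the model to fill in the [MASK] token.
inputs = tokenizer("Paris is the [MASK] of France.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the mask position and take the highest-scoring vocabulary token.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
predicted_word = tokenizer.decode(predicted_id).strip()
print(predicted_word)  # a single word, e.g. "capital"
```

The same result can be obtained in one line with `pipeline("fill-mask", model="bert-base-uncased")`, which wraps the tokenize/forward/decode steps shown here.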
Our Research contributions
We're on a journey to advance and democratize NLP for everyone. Along the way, we contribute to the development of technology for the better.
T0
Multitask Prompted Training Enables Zero-Shot Task Generalization
Open source state-of-the-art zero-shot language model out of BigScience.
Read more
DistilBERT
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
A smaller, faster, lighter, cheaper version of BERT obtained via model distillation.
Read more
HMTL
Hierarchical Multi-Task Learning
Learning embeddings from semantic tasks for multi-task learning. We have open-sourced code and a demo.
Read more
Dynamical Language Models
Meta-learning for language modeling
A meta-learner is trained via gradient descent to continuously and dynamically update language model weights.
Read more
State of the art
Neuralcoref
Our open source coreference resolution library. You can train it on your own dataset and language.
Read more
Auto-complete your thoughts
Write with Transformers
This web app is the official demo of the Transformers repository's text generation capabilities.
Start writing