DistilBERT release
The original DistilBERT models: checkpoints obtained via teacher-student (knowledge distillation) training from the original BERT checkpoints.

- distilbert/distilbert-base-cased — Fill-Mask • 0.1B params • Updated May 6, 2024 • 125k downloads • 51 likes
- distilbert/distilbert-base-uncased — Fill-Mask • 0.1B params • Updated May 6, 2024 • 12.5M downloads • 750 likes
- distilbert/distilbert-base-multilingual-cased — Fill-Mask • 0.1B params • Updated May 6, 2024 • 1.17M downloads • 212 likes
- distilbert/distilbert-base-uncased-finetuned-sst-2-english — Text Classification • 0.1B params • Updated Dec 19, 2023 • 2.93M downloads • 821 likes
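A minimal sketch of how these checkpoints are typically loaded with the Hugging Face `transformers` `pipeline` API, using the fill-mask base model and the SST-2 fine-tuned classifier listed above (the example sentences are illustrative, not from the release notes):

```python
from transformers import pipeline

# Fill in a masked token with the distilbert-base-uncased checkpoint listed above.
unmasker = pipeline("fill-mask", model="distilbert/distilbert-base-uncased")
preds = unmasker("Hello, I'm a [MASK] model.")
print(preds[0]["token_str"])  # highest-scoring completion for the [MASK] slot

# Sentiment analysis with the SST-2 fine-tuned checkpoint.
classifier = pipeline(
    "text-classification",
    model="distilbert/distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("This film was wonderful!")[0]["label"])
```

Note that the uncased checkpoints lowercase their input, while the cased and multilingual-cased variants preserve case; pick the variant that matches how your text was tokenized.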