DistilBERT Emotion Classifier 🎭

This model classifies English text into one of six emotions: sadness, joy, love, anger, fear, and surprise (the label id to emotion name mapping is sketched below the model details).

  • Base model: distilbert-base-uncased
  • Framework: Hugging Face Transformers
  • Dataset: Kaggle Emotions Dataset
  • Task: Multi-class emotion detection
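
For reference, the integer class ids used in the evaluation below correspond to the emotion names as follows. This is a minimal sketch; the id2label / label2id names mirror the usual Transformers config fields, but whether this repo's config populates them this way is an assumption.

id2label = {
    0: "sadness",
    1: "joy",
    2: "love",
    3: "anger",
    4: "fear",
    5: "surprise",
}
label2id = {name: idx for idx, name in id2label.items()}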

📊 Evaluation

| Class        | Precision | Recall | F1-score | Support |
|--------------|-----------|--------|----------|---------|
| 0 (sadness)  | 0.99      | 0.96   | 0.98     | 24,121  |
| 1 (joy)      | 0.93      | 0.99   | 0.96     | 28,220  |
| 2 (love)     | 1.00      | 0.71   | 0.83     | 6,824   |
| 3 (anger)    | 0.95      | 0.94   | 0.95     | 11,448  |
| 4 (fear)     | 0.90      | 0.91   | 0.91     | 9,574   |
| 5 (surprise) | 0.74      | 0.99   | 0.85     | 3,038   |

Overall Performance:

  • Accuracy: 94%
  • Macro F1: 0.91
  • Weighted F1: 0.94
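
The per-class table above is a standard classification report. Below is a minimal sketch of how such numbers can be reproduced with scikit-learn on a held-out test split; the example texts, the to_id helper, and the zero_division setting are illustrative assumptions, not the original evaluation code.

from sklearn.metrics import classification_report
from transformers import pipeline

clf = pipeline("text-classification", model="YamenRM/distilbert-emotion-classifier")

# Tiny illustrative inputs; swap in the real held-out test split.
texts = ["I miss my old friends so much", "What a wonderful surprise party!"]
y_true = [0, 5]  # gold label ids (0 = sadness, 5 = surprise)

label_names = ["sadness", "joy", "love", "anger", "fear", "surprise"]

def to_id(label):
    # Handle either emotion-name labels or generic "LABEL_<n>" outputs.
    return label_names.index(label) if label in label_names else int(label.split("_")[-1])

y_pred = [to_id(out["label"]) for out in clf(texts)]
print(classification_report(y_true, y_pred, labels=list(range(6)),
                            target_names=label_names, zero_division=0))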

πŸ§‘β€πŸ’» Usage

from transformers import AutoTokenizer, AutoModelForSequenceClassification, pipeline

model_name = "YamenRM/distilbert-emotion-classifier"

# Load the fine-tuned tokenizer and classification model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

# Wrap them in a text-classification pipeline for one-line inference
nlp = pipeline("text-classification", model=model, tokenizer=tokenizer)

print(nlp("I feel so happy and excited today!"))
# [{'label': 'joy', 'score': 0.98}]
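
To see scores for all six emotions instead of only the top label, the pipeline can return every class score. A minimal sketch; on older transformers releases the top_k=None argument may need to be replaced with return_all_scores=True.

# Return a score for each of the six emotion classes
print(nlp("I can't believe you did that!", top_k=None))
# e.g. [{'label': 'surprise', 'score': ...}, {'label': 'fear', 'score': ...}, ...]

# The pipeline also accepts a list of texts for batched inference
print(nlp(["I am terrified of the dark", "You mean everything to me"]))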