bedrock-sentiment-tiny

Tiny binary sentiment classifier (positive / negative) fine-tuned from prajjwal1/bert-tiny on a 3,000-example subset of SST-2.

  • Parameters: ~4.4 M (bert-tiny backbone)
  • Training hardware: CPU only (2 vCPU)
  • Wall-clock training time: a few minutes
  • Val accuracy: 0.790 on a 400-example slice of the SST-2 validation set

Usage

from transformers import pipeline

clf = pipeline("text-classification", model="Facececersek/bedrock-sentiment-tiny")

# Each call returns a list of dicts of the form {"label": ..., "score": ...}
print(clf("this mod is sooo good"))
print(clf("it wouldn't import, it's broken"))

Training config

  • epochs: 2
  • batch size: 32
  • max length: 64
  • learning rate: 5e-5
  • seed: 42
  • base model: google/bert_uncased_L-2_H-128_A-2
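
The configuration above corresponds roughly to a script like the following. This is a sketch, not the original training code: the output directory, column names, and exact data-loading calls are assumptions based on the standard GLUE/SST-2 layout in the datasets library.

```python
# Sketch of a fine-tuning run matching the config above (assumed, not the
# author's original script). Requires: transformers, datasets, torch.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

BASE = "prajjwal1/bert-tiny"  # same weights as google/bert_uncased_L-2_H-128_A-2

tokenizer = AutoTokenizer.from_pretrained(BASE)
model = AutoModelForSequenceClassification.from_pretrained(BASE, num_labels=2)

# SST-2 via GLUE; subset sizes match the card (3,000 train / 400 validation).
ds = load_dataset("glue", "sst2")
train = ds["train"].shuffle(seed=42).select(range(3000))
val = ds["validation"].select(range(400))

def tokenize(batch):
    # max length 64, as listed in the training config
    return tokenizer(batch["sentence"], truncation=True, max_length=64)

train = train.map(tokenize, batched=True)
val = val.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="bedrock-sentiment-tiny",  # assumed name
    num_train_epochs=2,
    per_device_train_batch_size=32,
    learning_rate=5e-5,
    seed=42,
)

trainer = Trainer(model=model, args=args, train_dataset=train, eval_dataset=val)
trainer.train()
```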

Notes

Trained as a fast demonstration on a CPU-only VM. For production-quality sentiment analysis, fine-tune on the full SST-2 training set or start from a larger base model (e.g. distilbert-base-uncased).
