🌱 TinyFabulist-TF2-1B · Gemma 3 1B EN→RO Fable Translator

tf2-1b is a parameter-efficient fine-tune of Google's Gemma 3 1B (LoRA adapters merged into the base weights) that specialises in translating moral fables from English into Romanian.
It was produced as part of the TinyFabulist-TF2 project and is intended as a lightweight, cost-efficient alternative to GPT-class APIs for literary translation in low-resource settings.


📰 Model Summary

| Field | Value |
|---|---|
| Base model | `google/gemma-3-1b` |
| Architecture | Decoder-only Transformer, 1 B parameters |
| Fine-tuning method | Supervised fine-tuning (full-sequence), then instruction tuning, then LoRA adapters (rank = 16); adapters merged for this release (see the sketch below) |
| Training data | 12 000 EN–RO fable pairs (train split, TinyFabulist-TF2) |
| Validation | 1 500 pairs |
| Eval set | 1 500 pairs (held-out) |
| Objective | Next-token cross-entropy on Romanian targets |
| Hardware / budget | 1× A6000 GPU (48 GB) · ~4 h · ≈ $32 |
| Intended use | Offline literary translation of short stories / fables |
| Out-of-scope | News, legal, medical, or very long documents; languages other than EN ↔ RO |
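
Only the LoRA rank (16) is stated in the summary above; the sketch below shows what such a PEFT setup might look like. The base-model variant, alpha, dropout, and target modules are illustrative assumptions, not values from the actual training run.

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("google/gemma-3-1b-pt")  # base variant assumed

lora_config = LoraConfig(
    r=16,                                                     # rank = 16 (from the card)
    lora_alpha=32,                                            # assumption
    lora_dropout=0.05,                                        # assumption
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumption
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable

# After SFT + instruction tuning, the adapters were merged for this release, e.g.:
# merged = model.merge_and_unload()
```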

✨ How It Works

This model translates short English fables and moral stories into fluent, natural Romanian, capturing not only the literal meaning but also the narrative style and ethical lesson. Provide a short story in English and the model generates a Romanian version that preserves the storytelling tone and clarity, making it suitable for children's literature, educational content, or creative writing.

Designed to be lightweight, it runs well on modest hardware and is intended as a free, accessible alternative to large proprietary translation services, making it a good fit for teachers, students, and researchers who need literary translations in low-resource or offline settings.
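
A minimal inference sketch with 🤗 Transformers is shown below. The repo id `klusai/tf2-1b` is a placeholder, and the prompt format is an assumption, since the card does not document the exact template.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id; substitute the actual checkpoint path.
MODEL_ID = "klusai/tf2-1b"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

fable = (
    "A hungry fox saw ripe grapes hanging high on a vine. After many failed "
    "jumps he walked away, muttering that they were surely sour."
)

# Assumed instruction format; adjust to the checkpoint's actual prompt template.
prompt = f"Translate the following fable from English into Romanian:\n\n{fable}\n\nRomanian:"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512, do_sample=False)

# Decode only the newly generated tokens (the Romanian translation).
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```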

🚧 Limitations & Biases

- Trained exclusively on synthetic data, so it may reproduce GPT-style phrasing.
- Domain-specific: excels on short, moralistic narratives; underperforms on technical or colloquial prose.
- No guard-rails: users must filter harmful content downstream.
- Context window of 2 048 tokens (≈ 1 500 Romanian words); a simple pre-flight length check is sketched below.
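
Because the 2 048-token window must hold both the English prompt and the generated Romanian text, a check like the following (a sketch; the repo id is a placeholder) can catch inputs that are too long before generation:

```python
from transformers import AutoTokenizer

MAX_CONTEXT = 2048  # context window, shared by the prompt and generated tokens

tokenizer = AutoTokenizer.from_pretrained("klusai/tf2-1b")  # placeholder repo id

def fits_context(prompt: str, max_new_tokens: int = 512) -> bool:
    """True if the prompt leaves enough room for the generation budget."""
    prompt_tokens = len(tokenizer(prompt)["input_ids"])
    return prompt_tokens + max_new_tokens <= MAX_CONTEXT
```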

✅ License

Released under Apache 2.0. The dataset (TinyFabulist-TF2 EN–RO, 15 k pairs) is licensed CC-BY-4.0.

Questions or feedback? Open an issue or DM @klusai. Happy translating! 🚀


📊 Evaluation Results

| Metric | Dataset | Score | Source |
|---|---|---|---|
| BLEU | TinyFabulist-TF2 (15 k EN–RO fables) | 21.8 | self-reported |
| LLM-Eval (5-dim average) | TinyFabulist-TF2 (15 k EN–RO fables) | 3.75 / 5 | self-reported |
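
The card does not state the exact BLEU configuration; the snippet below is a minimal sacrebleu sketch of how corpus BLEU is typically computed over held-out pairs (the example strings are placeholders, not data from the eval set):

```python
import sacrebleu

# Placeholder data: model outputs and one stream of gold reference translations.
hypotheses = ["O vulpe flămândă a văzut struguri copți atârnând sus pe o viță."]
references = [["O vulpe înfometată a zărit struguri copți atârnând sus pe viță."]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.1f}")  # the card reports 21.8 on the held-out eval pairs
```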