---
license: apache-2.0
tags:
  - translation
  - en-ro
  - literary
  - fables
  - low-resource
  - lora
  - gemma
  - tinyfabulist
model-index:
  - name: tf2-1b (Gemma 3 1B, EN→RO fable translator)
    results:
      - task:
          type: translation
          name: English → Romanian
        dataset:
          name: TinyFabulist-TF2 (15 k EN–RO fables)
          type: klusai/tf2-en-ro-15k
        metrics:
          - name: BLEU
            type: bleu
            value: 21.8
            verified: false
          - name: LLM-Eval (5-dimension average, 1–5 scale)
            type: custom
            value: 3.75
            verified: false
language:
  - en
  - ro
---

# 🌱 TinyFabulist-TF2-1B · Gemma 3 1B EN→RO Fable Translator

**tf2-1b** is a parameter-efficient fine-tune of Google Gemma 3 1B (LoRA adapters merged into the base weights) that specialises in translating moral fables from English into Romanian.
It was produced as part of the TinyFabulist-TF2 project and is intended as a lightweight, cost-efficient alternative to GPT-class APIs for literary translation in low-resource settings.


## 📰 Model Summary

| | |
|---|---|
| **Base model** | `google/gemma-3-1b` |
| **Architecture** | Decoder-only Transformer, 1 B parameters |
| **Fine-tuning method** | Supervised fine-tuning (SFT, full-sequence), then instruction tuning, then LoRA adapters (rank = 16) |
| **Adapters** | Merged into the base weights for this release |
| **Training data** | 12 000 EN–RO fable pairs (train split, TinyFabulist-TF2) |
| **Validation** | 1 500 pairs |
| **Eval set** | 1 500 pairs (held-out) |
| **Objective** | Next-token cross-entropy on Romanian targets |
| **Hardware / budget** | 1× A6000 GPU (48 GB) · ~4 h · ≈ $32 |
| **Intended use** | Offline literary translation of short stories / fables |
| **Out-of-scope** | News, legal, medical, or very long documents; languages other than EN ↔ RO |
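
For reference, here is a minimal PEFT sketch of the rank-16 LoRA setup described above. Only the rank comes from this card; the base checkpoint variant, target modules, and remaining hyperparameters are illustrative assumptions, not the exact training recipe:

```python
# Sketch of a rank-16 LoRA setup with PEFT. Only r=16 is taken from
# this card; every other value here is an illustrative assumption.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("google/gemma-3-1b-it")  # assumed variant

lora_config = LoraConfig(
    r=16,                     # rank = 16, as in the summary above
    lora_alpha=32,            # assumption: common 2x-rank scaling
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumption
    lora_dropout=0.05,        # assumption
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # only adapter weights are trainable
```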

## ✨ How It Works

This model translates short English fables or moral stories into fluent, natural Romanian, capturing not just the literal meaning but also the narrative style and ethical lesson. Provide a short story in English and the model generates a Romanian version that preserves the storytelling tone and clarity, making it suitable for children’s literature, educational content, or creative writing.

Designed to be lightweight, it runs well even on modest hardware and is intended as a free, accessible alternative to large proprietary translation services. It is aimed at teachers, students, and researchers who need high-quality literary translations in low-resource or offline settings.
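
A minimal inference sketch with llama-cpp-python, suitable for the GGUF release. The filename, sampling settings, and prompt template below are assumptions rather than a documented interface; adjust them to the actual files in this repo:

```python
# Translate a fable EN→RO with llama-cpp-python (sketch; the GGUF
# filename and the prompt format are assumptions).
from llama_cpp import Llama

llm = Llama(model_path="tf2-1b.gguf", n_ctx=2048)  # 2 048-token context

fable = (
    "A thirsty crow dropped pebbles into a pitcher until the water rose "
    "high enough to drink. Moral: little by little does the trick."
)

prompt = f"Translate the following fable from English to Romanian:\n\n{fable}\n\nRomanian:"

out = llm(prompt, max_tokens=512, temperature=0.2)
print(out["choices"][0]["text"].strip())
```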

## 🚧 Limitations & Biases

- Trained exclusively on synthetic data, so it may reproduce GPT-style phrasing.
- Domain-specific: excels at short, moralistic narratives; underperforms on technical or colloquial prose.
- No guard-rails: users must filter harmful content downstream.
- Context window = 2 048 tokens (≈ 1 500 Romanian words).
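
Because of the 2 048-token window, it helps to check input length before generating. A quick pre-flight check, reusing the `llm` and `prompt` objects from the sketch above (the 512-token generation budget is an assumption):

```python
# Pre-flight check: make sure prompt + generation budget fit in the
# 2 048-token window (sketch; max_new is an assumed budget).
max_new = 512
n_input = len(llm.tokenize(prompt.encode("utf-8")))
if n_input + max_new > 2048:
    raise ValueError(f"Prompt uses {n_input} tokens; shorten the fable.")
```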

## ✅ License

The model is released under Apache 2.0. The dataset (TinyFabulist-TF2 EN–RO 15 k) is CC-BY-4.0.

Questions or feedback? Open an issue or DM @klusai. Happy translating! 🚀