---
library_name: transformers
license: mit
language:
  - en
metrics:
  - rouge
pipeline_tag: summarization
datasets:
  - EdinburghNLP/xsum
---

# Model Card for BrevityBot: T5-Based Text Summarization Model

BrevityBot is an abstractive summarization model fine-tuned from Google's `t5-small` checkpoint on the XSum dataset. It generates concise, high-quality summaries of long English documents and news articles. The model uses T5's encoder-decoder architecture and was fine-tuned with Hugging Face's `Seq2SeqTrainer`, making it well suited for applications that need fast, accurate content summarization.


## Model Details

**Key Features:**

- Abstractive summarization trained on real-world, high-quality data (XSum)
- Based on `google-t5/t5-small`, a reliable and well-researched architecture
- Deployed in a Flutter mobile app via a FastAPI backend for real-time use
- Evaluated with ROUGE metrics to ensure output quality
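
ROUGE scores n-gram overlap between a generated summary and a reference. As a toy illustration of what ROUGE-1 measures (the reported scores come from a standard ROUGE implementation, not this sketch):

```python
# Toy illustration of ROUGE-1 F1 (unigram overlap). This is NOT the
# implementation used to evaluate the model; it only shows the idea.
from collections import Counter

def rouge1_f1(prediction: str, reference: str) -> float:
    """Unigram-overlap ROUGE-1 F1 between two strings."""
    pred_tokens = prediction.lower().split()
    ref_tokens = reference.lower().split()
    if not pred_tokens or not ref_tokens:
        return 0.0
    # Clipped unigram overlap between prediction and reference.
    overlap = sum((Counter(pred_tokens) & Counter(ref_tokens)).values())
    precision = overlap / len(pred_tokens)
    recall = overlap / len(ref_tokens)
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

print(rouge1_f1("the cat sat", "the cat sat on the mat"))  # → 0.6666666666666666
```

ROUGE-2 and ROUGE-L extend the same idea to bigrams and longest common subsequences.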

**Skills & Technologies Used:**

- Hugging Face Transformers and Datasets
- Fine-tuning with `Seq2SeqTrainer`
- Google Colab for training (GPU acceleration)
- FastAPI for backend API integration

- **Developed by:** Rawan Alwadeya
- **Model type:** Sequence-to-sequence (encoder-decoder)
- **Language(s):** English (en)
- **License:** MIT
- **Finetuned from:** `google-t5/t5-small`
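
Fine-tuning with `Seq2SeqTrainer` starts by mapping each dataset record to a source/target text pair. A hedged sketch of that preprocessing step, assuming the standard XSum field names (`document`, `summary`) and T5's `summarize: ` task prefix; the card does not document the exact training configuration:

```python
# Hedged sketch: mapping an XSum record to a seq2seq training pair.
# Field names ("document", "summary") match EdinburghNLP/xsum; the
# "summarize: " prefix follows T5's text-to-text convention.

def make_model_inputs(example: dict) -> dict:
    """Map one XSum record to source/target strings for fine-tuning."""
    return {
        "source": "summarize: " + example["document"].strip(),
        "target": example["summary"].strip(),
    }

record = {
    "document": "The British prime minister said today that new policies "
                "will help boost economic growth.",
    "summary": "PM announces growth policies.",
}
pair = make_model_inputs(record)
print(pair["source"])
```

In a typical `Seq2SeqTrainer` setup, these strings are then tokenized and batched with a `DataCollatorForSeq2Seq` before training.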

## Uses

BrevityBot generates short abstractive summaries of long English documents and news articles. It works well for personal productivity, education, media apps, and similar use cases.

### Example Usage


```python
from transformers import pipeline

# Load the fine-tuned summarization model from the Hugging Face Hub.
summarizer = pipeline("summarization", model="RawanAlwadeya/t5-summarization-brevitybot")

text = """
The British prime minister said today that the new policies will help boost economic growth over the next five years.
"""

# Greedy decoding (do_sample=False) keeps the output deterministic.
summary = summarizer(text, max_length=64, min_length=30, do_sample=False)
print(summary[0]["summary_text"])
```