---
tags:
  - language-model
  - gpt-2
  - fine-tuned
  - tiny-shakespeare
license: mit
datasets:
  - tiny_shakespeare
---

# GPT-2 Tiny Shakespeare Model

This is a small autoregressive language model based on the Transformer architecture, trained on the Tiny Shakespeare dataset.

## Model Description

The model is a custom implementation of a `TransformerDecoderModel`, a decoder-only architecture similar to GPT-2. It was trained on the Tiny Shakespeare dataset to generate text in the style of William Shakespeare.
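For readers unfamiliar with the decoder-only design, here is a minimal PyTorch sketch of what such a model might look like. This is an illustration only, not the actual checkpoint architecture: the class name `TinyDecoderLM` and all hyperparameters (`d_model`, `nhead`, `num_layers`, `max_len`) are hypothetical, and a causal mask over a standard `nn.TransformerEncoder` stack is used as the common idiom for GPT-style attention.

```python
import torch
import torch.nn as nn

class TinyDecoderLM(nn.Module):
    """Hypothetical sketch of a GPT-style decoder-only LM (not the released model)."""

    def __init__(self, vocab_size, d_model=128, nhead=4, num_layers=2, max_len=256):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)   # token embeddings
        self.pos_emb = nn.Embedding(max_len, d_model)      # learned positional embeddings
        layer = nn.TransformerEncoderLayer(
            d_model, nhead, dim_feedforward=4 * d_model, batch_first=True
        )
        self.blocks = nn.TransformerEncoder(layer, num_layers)
        self.lm_head = nn.Linear(d_model, vocab_size)      # project back to vocabulary

    def forward(self, idx):
        # idx: (batch, seq_len) token ids
        b, t = idx.shape
        pos = torch.arange(t, device=idx.device)
        x = self.tok_emb(idx) + self.pos_emb(pos)
        # Causal mask: each position may attend only to itself and earlier positions
        mask = nn.Transformer.generate_square_subsequent_mask(t)
        x = self.blocks(x, mask=mask)
        return self.lm_head(x)  # (batch, seq_len, vocab_size) logits

model = TinyDecoderLM(vocab_size=50257)
logits = model(torch.randint(0, 50257, (1, 16)))
print(logits.shape)  # torch.Size([1, 16, 50257])
```

At generation time, the logits at the last position are sampled to pick the next token, which is appended to the input and the model is run again; that loop is what `generate()` in the usage example below performs.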

## How to Use

To generate text with this model, load it and its tokenizer as follows:

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# Load the fine-tuned model and its tokenizer from the Hub
model = GPT2LMHeadModel.from_pretrained('NataliaH/gpt2-tiny-shakespeare')
tokenizer = GPT2Tokenizer.from_pretrained('NataliaH/gpt2-tiny-shakespeare')

# Encode a prompt and generate a continuation
input_text = 'To be or not to be'
inputs = tokenizer(input_text, return_tensors='pt')
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

## Tags

- Transformer
- GPT-2
- Tiny Shakespeare
- Language Model
- Text Generation
- Autoregressive

## Training Details

- Epochs: 3
- Batch size: 4
- Learning rate: 5e-5
- Loss function: cross-entropy
- Optimizer: AdamW
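The hyperparameters above can be wired together as follows. This is a hedged sketch of a single next-token-prediction step, not the actual training script: the tiny `nn.Sequential` stand-in model and the random batch are placeholders, while the AdamW optimizer, learning rate 5e-5, batch size 4, and cross-entropy loss come from the list above.

```python
import torch
import torch.nn as nn

vocab_size = 50257  # GPT-2 vocabulary size

# Placeholder model standing in for the real network (illustration only)
model = nn.Sequential(nn.Embedding(vocab_size, 64), nn.Linear(64, vocab_size))

# Optimizer and loss as listed in the training details
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
loss_fn = nn.CrossEntropyLoss()

# One illustrative step on a fake batch of 4 sequences of 32 tokens
tokens = torch.randint(0, vocab_size, (4, 32))
inputs, targets = tokens[:, :-1], tokens[:, 1:]   # shift by one for next-token prediction
logits = model(inputs)                            # (4, 31, vocab_size)
loss = loss_fn(logits.reshape(-1, vocab_size), targets.reshape(-1))

loss.backward()
optimizer.step()
optimizer.zero_grad()
print(loss.item())
```

Over 3 epochs this loop would simply repeat for every batch in the Tiny Shakespeare dataset.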

## License

This model is licensed under the MIT license.