# 🚀 Deployment Guide

## Deploy to Hugging Face Spaces

### Prerequisites

1. Install the Hugging Face CLI:

       pip install huggingface_hub

2. Log in to Hugging Face:

       huggingface-cli login
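
If you prefer to stay in Python, the same authentication is available through the `login` helper in `huggingface_hub` (a minimal sketch; it prompts for the same access token the CLI asks for):

```python
from huggingface_hub import login

# Prompts for a Hugging Face access token, or accepts one via the
# `token` argument (e.g. read from an environment variable).
login()
```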

### Create and Deploy Space

1. Create a new Space on Hugging Face Hub (a Python alternative is sketched after these steps):

       huggingface-cli repo create --type space --space_sdk gradio your-username/one-pager-generator

2. Clone and set up the repository:

       git clone https://huggingface.co/spaces/your-username/one-pager-generator
       cd one-pager-generator

3. Copy the project files into the Space repository:

       cp ../one-pager/* .

4. Add, commit, and push:

       git add .
       git commit -m "Initial commit: AI One-Pager Generator"
       git push
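
If the `repo create` flags above do not match your installed CLI version, the Space can also be created programmatically (a minimal sketch using the `huggingface_hub` Python API; the repo name is the example one used throughout this guide):

```python
from huggingface_hub import create_repo

# Creates a new Gradio Space under your account (requires a prior login).
create_repo(
    "your-username/one-pager-generator",
    repo_type="space",
    space_sdk="gradio",
)
```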

### Alternative: Direct CLI Upload

You can also use the HF CLI to upload files directly:

    huggingface-cli upload your-username/one-pager-generator . --repo-type=space
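
The same upload can be done from Python with `HfApi.upload_folder` (a minimal sketch assuming it is run from the project directory):

```python
from huggingface_hub import HfApi

api = HfApi()

# Uploads the current directory to the Space in a single commit.
api.upload_folder(
    folder_path=".",
    repo_id="your-username/one-pager-generator",
    repo_type="space",
)
```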

## Files Required for Deployment

- `app.py` - Main application file
- `requirements.txt` - Python dependencies
- `config.yaml` - Space configuration
- `README.md` - Documentation
- `.gitignore` - Git ignore patterns
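
For reference, a minimal `requirements.txt` for this setup could look like the following (the exact package list and versions are an assumption based on the app described here, not the actual file):

```text
gradio
transformers
torch
```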

## Configuration Notes

- The app uses the small `distilgpt2` model for better compatibility with limited hardware
- CPU-only inference keeps the Space within the free tier
- A fallback template system ensures reliable output when the model cannot be loaded (see the sketch after this list)
- The Gradio interface is optimized for Spaces
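
A minimal sketch of that load-with-fallback pattern, assuming the app uses the `transformers` pipeline API (the function and template below are illustrative, not the actual `app.py` code):

```python
from transformers import pipeline

def load_generator():
    """Try to load distilgpt2 on CPU; return None if loading fails."""
    try:
        # device=-1 forces CPU inference (no GPU on the free tier).
        return pipeline("text-generation", model="distilgpt2", device=-1)
    except Exception:
        return None

generator = load_generator()

def generate_one_pager(prompt: str) -> str:
    if generator is not None:
        result = generator(prompt, max_new_tokens=200, num_return_sequences=1)
        return result[0]["generated_text"]
    # Fallback: fill a structured template when the model is unavailable.
    return f"# One-Pager\n\n## Overview\n{prompt}\n\n## Key Points\n- (add details here)"
```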

## Post-Deployment

After deployment, your Space will be available at: https://huggingface.co/spaces/your-username/one-pager-generator

The app will automatically:

  1. Install dependencies from requirements.txt
  2. Load the AI model
  3. Launch the Gradio interface
  4. Be accessible via the web
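
Once it is running, the Space can also be called programmatically with `gradio_client` (installed via `pip install gradio_client`). This is a minimal sketch; the endpoint name and argument list depend on how `app.py` defines its Gradio interface:

```python
from gradio_client import Client

# Connect to the deployed Space by its repo id.
client = Client("your-username/one-pager-generator")

# A single text prompt and a "/predict" endpoint are assumptions here;
# check the Space's "Use via API" panel for the real signature.
result = client.predict("AI-powered project tracker", api_name="/predict")
print(result)
```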

## Troubleshooting

- Model loading issues: the app falls back to structured templates if the model cannot be loaded
- Memory issues: the smaller DistilGPT2 model is used to keep memory usage low
- Timeout issues: CPU inference may be slower, but it is more reliable