Deploying GAIA on Hugging Face Spaces
This guide provides step-by-step instructions for deploying the GAIA agent on Hugging Face Spaces, making it accessible as a web application to users worldwide.
Why Deploy on Hugging Face Spaces?
Hugging Face Spaces offers several advantages for deploying GAIA:
- Free Hosting: Basic deployment is free with reasonable usage limits
- Easy Sharing: Public URL that can be shared with anyone
- Version Control: Built-in Git integration
- Secrets Management: Secure storage for API keys
- Community: Integration with the broader AI community
- Customization: Support for custom domains and branding
Prerequisites
Before deploying to Hugging Face Spaces, you'll need:
- Hugging Face Account: Create an account at huggingface.co
- API Keys: Gather all necessary API keys (OpenAI, Serper, etc.)
- GAIA Repository: A local copy of the GAIA repository
- Git: For pushing your code to Hugging Face
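Before pushing anything, it can save a failed build to check locally that the API keys you plan to use are actually set. A minimal sketch (`check_required_keys` and the exact variable names are illustrative, not part of GAIA):

```python
import os

def check_required_keys(names):
    """Return the environment variable names that are missing or empty."""
    return [name for name in names if not os.environ.get(name)]

# OPENAI_API_KEY is required; the others only if you enable those providers.
missing = check_required_keys(["OPENAI_API_KEY", "SERPER_API_KEY"])
if missing:
    print(f"Missing keys: {', '.join(missing)}")
```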
Deployment Steps
Step 1: Prepare Your Repository
- Clone the GAIA repository if you haven't already:
```bash
git clone https://github.com/your-organization/gaia.git
cd gaia
```
- Create a specific `requirements.txt` file for Hugging Face deployment:

```bash
# Copy the main requirements
cp requirements.txt requirements-hf.txt

# Edit to add Gradio (if not already included)
echo "gradio>=4.0.0" >> requirements-hf.txt

# Remove any development-specific packages
# Edit requirements-hf.txt to remove unnecessary packages
```
- Create a Hugging Face-specific `app.py` or modify the existing one:
```python
import os
import gradio as gr

from src.gaia.agent import GaiaAgent
from src.gaia.config import Configuration

# Initialize configuration
config = Configuration()

# Set default values for Hugging Face deployment
config.set("demo_mode", True)
config.set("models.default", "gpt-3.5-turbo")  # Use cheaper model by default

# Initialize the agent
agent = GaiaAgent(config=config)

# Define the Gradio interface
def process_query(query, history):
    try:
        response = agent.run(query)
        return response
    except Exception as e:
        return f"Error: {str(e)}"

# Create the Gradio app
demo = gr.ChatInterface(
    fn=process_query,
    title="GAIA - Grounded AI Alignment Agent",
    description="Ask any question and GAIA will search for information to provide a grounded answer.",
    examples=[
        "What is quantum computing?",
        "Explain the theory of relativity in simple terms.",
        "What are the latest developments in AI safety?",
    ],
    theme="huggingface",
)

# Launch the app
if __name__ == "__main__":
    demo.launch()
```
Step 2: Create a Space on Hugging Face
Log in to Hugging Face and go to huggingface.co/spaces
Click on "Create new Space"
Fill in the details:
- Owner: Your username or organization
- Space name: Choose a unique name (e.g., "gaia-agent")
- License: Choose an appropriate license (e.g., MIT)
- SDK: Choose "Gradio"
- Space hardware: Start with "CPU basic" (free tier)
- Make this Space private: Optional, if you want to restrict access
Click "Create Space"
Step 3: Configure the Repository
- In your local GAIA directory, add the Hugging Face Space as a remote:
```bash
git remote add space https://huggingface.co/spaces/your-username/gaia-agent
```
- Create a `.gitignore` file to exclude unnecessary files:
```bash
cat > .gitignore << 'EOF'
# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg

# Virtual Environment
venv/
ENV/
env/

# Local configuration
.env
config.json

# Logs
logs/
*.log

# Results
results/

# IDE files
.idea/
.vscode/
*.swp
*.swo
EOF
```

Note the quoted `'EOF'`: without it, the shell would expand `$py` in `*$py.class` and write the wrong pattern.
- Create a `README.md` file for your Space (if Hugging Face generated a README with a YAML metadata block at the top, keep that block in place; it configures the Space):
```bash
cat > README.md << EOF
# GAIA Agent

GAIA (Grounded AI Alignment) is an AI agent designed to answer questions with grounded, factual information.

## Features

- Web search capabilities using multiple providers
- Academic research integration
- Reasoning tools for complex problems
- Memory system for context-aware responses

## Usage

Simply type your question in the input box and click "Submit" to get a response from GAIA.

## Examples

- "What is quantum computing?"
- "Explain the theory of relativity in simple terms."
- "What are the latest developments in AI safety?"

## About

GAIA is built with Python using LangChain, LangGraph, and GPT-4. It leverages various APIs to provide accurate and up-to-date information.
EOF
```
Step 4: Create a Requirements File for Hugging Face
Create or modify the `requirements.txt` file to include only the necessary packages:
```bash
cat > requirements.txt << EOF
gradio>=4.0.0
langchain>=0.0.267
langgraph>=0.0.15
openai>=1.1.1
tiktoken>=0.5.1
supabase>=2.0.3
requests>=2.31.0
python-dotenv>=1.0.0
EOF
```
Step 5: Set Up Environment Variables
In your Hugging Face Space:
- Go to the Settings tab of your Space
- Scroll down to the "Repository secrets" section
- Add your API keys and configuration as secrets:
- `OPENAI_API_KEY`: Your OpenAI API key
- `SERPER_API_KEY`: Your Serper API key (if used)
- `PERPLEXITY_API_KEY`: Your Perplexity API key (if used)
- Any other necessary API keys or configuration values
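Repository secrets are exposed to the running Space as ordinary environment variables, so `app.py` can read them with `os.environ`. A small hedged helper that fails loudly when a required secret is absent (`get_secret` is an illustrative name, not a GAIA API):

```python
import os

def get_secret(name, default=None):
    """Read a secret injected by the Space; raise if a required one is absent."""
    value = os.environ.get(name, default)
    if value is None:
        raise RuntimeError(f"Missing secret: set {name} in the Space settings")
    return value

# Example wiring (names match the secrets above):
# openai_key = get_secret("OPENAI_API_KEY")
# serper_key = get_secret("SERPER_API_KEY", default="")  # optional provider
```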
Step 6: Push Your Code to Hugging Face
Commit and push your code to the Hugging Face Space:
```bash
# Add your files
git add app.py requirements.txt README.md .gitignore src/

# Commit the changes
git commit -m "Initial GAIA deployment on Hugging Face"

# Push to Hugging Face
git push space main
```
After pushing, Hugging Face will automatically build and deploy your application. This may take a few minutes.
Step 7: Test Your Deployment
- Once the build is complete, navigate to your Space's URL:
https://huggingface.co/spaces/your-username/gaia-agent
- Test the application by asking a few questions
- Check the Space logs for any errors or issues (open the "Logs" panel from the top of your Space page)
Advanced Configuration
Custom Domain
To use a custom domain with your Space:
- Go to the "Settings" tab of your Space
- Scroll down to the "Custom domain" section
- Enter your domain name (e.g., `gaia.yourdomain.com`)
- Follow the instructions to set up DNS records
Upgrading Hardware
For better performance or to handle more traffic:
- Go to the "Settings" tab of your Space
- Scroll down to the "Space hardware" section
- Choose a higher tier (note: this will incur costs)
- CPU Upgrade: For faster processing
- GPU: For model hosting or intensive processing
- Memory Boost: For handling larger datasets
Persistent Storage
To enable persistent storage for your Space:
- Go to the "Settings" tab of your Space
- Scroll down to the "Persistent storage" section
- Enable persistent storage (up to 10GB for free)
This allows you to store data that persists between restarts, such as logs or cached results.
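Persistent storage is mounted at `/data` inside the Space container. A sketch that prefers the mount and falls back to a temporary directory when it is absent, so the same code runs locally (the `gaia_cache` directory name is an arbitrary example):

```python
import os
import tempfile

def storage_dir():
    """Prefer the Space's persistent /data mount; otherwise use a temp directory."""
    if os.path.isdir("/data"):
        return "/data"
    return tempfile.gettempdir()

cache_path = os.path.join(storage_dir(), "gaia_cache")
os.makedirs(cache_path, exist_ok=True)
```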
Authentication
To restrict access to authenticated users:
- In your `app.py`, modify the launch parameters:

```python
demo.launch(auth=("username", "password"))
```

- Or for more flexible authentication:

```python
demo.launch(
    auth_message="Enter GAIA password",
    auth=lambda u, p: p == os.environ.get("GAIA_PASSWORD", "default_password"),
)
```
- Add the `GAIA_PASSWORD` environment variable in your Space settings
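The lambda above compares passwords with `==`, which leaks timing information and silently accepts the hard-coded fallback. The standard library offers a constant-time comparison; a sketch reusing the same assumed `GAIA_PASSWORD` variable:

```python
import hmac
import os

def check_auth(username, password):
    """Timing-safe password check against the GAIA_PASSWORD secret."""
    expected = os.environ.get("GAIA_PASSWORD", "")
    # Refuse all logins if the secret is unset, rather than accepting a default.
    return bool(expected) and hmac.compare_digest(password, expected)

# demo.launch(auth=check_auth, auth_message="Enter GAIA password")
```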
Scheduled Restarts
For long-running deployments, scheduled restarts can help maintain stability:
- Go to the "Settings" tab of your Space
- Scroll down to the "Factory reboot" section
- Set a schedule for automatic reboots (e.g., daily or weekly)
Optimizing for Hugging Face Spaces
Reducing Startup Time
To reduce startup time and improve user experience:
- Lazy-load components when possible:
```python
def load_agent():
    if not hasattr(load_agent, "agent"):
        load_agent.agent = GaiaAgent(config=config)
    return load_agent.agent

def process_query(query, history):
    agent = load_agent()
    return agent.run(query)
```
- Use caching for expensive operations:
```python
import functools

@functools.lru_cache(maxsize=100)
def get_cached_result(query):
    # Expensive operation
    return result
```
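One caveat: `functools.lru_cache` never expires entries, so a long-running Space can keep serving stale search results. A minimal time-based alternative (`ttl_cache` is a hand-rolled sketch, not a library function; the 300-second default is arbitrary):

```python
import time

def ttl_cache(ttl_seconds=300, maxsize=100):
    """Cache positional-arg calls for ttl_seconds, evicting the oldest past maxsize."""
    def decorator(fn):
        cache = {}  # args -> (timestamp, result)
        def wrapper(*args):
            now = time.monotonic()
            hit = cache.get(args)
            if hit is not None and now - hit[0] < ttl_seconds:
                return hit[1]
            result = fn(*args)
            cache[args] = (now, result)
            if len(cache) > maxsize:
                cache.pop(next(iter(cache)))  # dicts preserve insertion order
            return result
        return wrapper
    return decorator

@ttl_cache(ttl_seconds=60)
def get_cached_result(query):
    return query.upper()  # stand-in for the expensive search call
```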
Memory Usage Optimization
To stay within Hugging Face's memory limits:
- Limit model usage:
```python
config.set("models.default", "gpt-3.5-turbo")  # Uses less memory than GPT-4
```
- Implement efficient memory management:
```python
# Clear working memory after each query
def process_query(query, history):
    agent = load_agent()
    result = agent.run(query)
    agent.reset(clear_memory=False)  # Clear working memory but keep conversation history
    return result
```
- Disable memory-intensive features in the configuration:
```python
config.set("memory.supabase.enabled", False)  # Use simpler memory system
config.set("tools.image_analysis.enabled", False)  # Disable memory-intensive tools
```
Monitoring and Maintenance
Monitoring Usage
Monitor your Space's usage and performance:
- Go to the "Settings" tab of your Space
- Scroll down to the "Metrics" section
- View CPU, memory, and disk usage over time
Updating Your Deployment
To update your GAIA deployment:
- Make changes to your local repository
- Commit and push to your Hugging Face Space:
```bash
git add .
git commit -m "Update GAIA deployment"
git push space main
```
Handling Errors
If you encounter errors in your deployment:
- Check the Space logs for error messages
- Implement better error handling in your code:
```python
def process_query(query, history):
    try:
        agent = load_agent()
        return agent.run(query)
    except Exception as e:
        # Log the error
        print(f"Error processing query: {str(e)}")
        # Return a user-friendly message
        return "I'm having trouble processing your request. Please try again or ask a different question."
```
- Set up monitoring to be notified of errors
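`print()` output does reach the Space logs, but the `logging` module records full tracebacks, which makes failures far easier to diagnose. A sketch of the same handler with structured logging (`safe_process` is an illustrative wrapper, not part of GAIA):

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("gaia")

def safe_process(fn, query):
    """Run fn(query); log the full traceback on failure and return a fallback reply."""
    try:
        return fn(query)
    except Exception:
        logger.exception("Error processing query: %r", query)
        return ("I'm having trouble processing your request. "
                "Please try again or ask a different question.")
```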
Conclusion
You now have GAIA deployed on Hugging Face Spaces, making it accessible as a web application. This allows you to share your agent with others, collaborate with the community, and benefit from Hugging Face's infrastructure.
For more advanced deployments or custom integrations, consider exploring:
- Local Deployment Guide for self-hosting options
- API Documentation for programmatic integrations
- Hugging Face Spaces Documentation for more Spaces features
If you encounter any issues or have questions, refer to the troubleshooting section or create an issue on the project's GitHub repository.