---
license: apache-2.0
title: Model
sdk: docker
colorFrom: green
colorTo: green
pinned: true
---

# DeepCoder Docker Deployment

Complete Docker setup for deploying the DeepCoder-14B AI code generation model.

## Quick Start

1. Setup and deploy:

   ```bash
   chmod +x setup.sh
   ./setup.sh
   ```

2. Test the API:

   ```bash
   curl -X POST http://localhost:8000/generate \
     -H 'Content-Type: application/json' \
     -d '{"prompt": "def fibonacci(n):", "max_tokens": 200}'
   ```

## Deployment Options

### Local Docker

- Run `./setup.sh` for automatic setup (a manual sketch follows this list)
- Supports both GPU and CPU deployment
- Includes an Nginx reverse proxy with rate limiting
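
If you prefer to drive the stack by hand, the commands below are a minimal sketch assuming `setup.sh` ships or generates a standard `docker-compose.yml`; the `deepcoder` service name is a placeholder, not confirmed by this README.

```bash
# Minimal sketch: manage the stack directly with Docker Compose.
# Assumes setup.sh provides a docker-compose.yml; "deepcoder" is a
# placeholder service name.
docker compose up -d               # start the model server and Nginx proxy
docker compose logs -f deepcoder   # follow model download / loading progress
docker compose down                # stop and remove the containers
```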

### Hugging Face Spaces

- Run `./deploy-hf.sh [space-name] [username]` (example below)
- Requires the `HF_TOKEN` environment variable
- Automatically configures the app for HF Spaces (port 7860)
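
An example invocation; the Space name, username, and token value are placeholders you would replace with your own.

```bash
# Placeholder values: substitute your own Space name, username, and token.
export HF_TOKEN=hf_xxxxxxxxxxxxxxxx   # Hugging Face token with write access (assumed)
./deploy-hf.sh deepcoder-demo your-username
```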

## API Endpoints

- `POST /generate` - Generate code from prompts
- `POST /chat` - Chat-style code assistance (see the example calls below)
- `GET /model/info` - Model benchmarks and info
- `GET /health` - Health check
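
Example calls for the remaining endpoints. The `/chat` request body shown here (a `messages` array) is an assumption modeled on common chat APIs, not taken from this README; check the server's actual schema before relying on it.

```bash
# Chat-style assistance -- request schema is assumed, adjust to the real API
curl -X POST http://localhost:8000/chat \
  -H 'Content-Type: application/json' \
  -d '{"messages": [{"role": "user", "content": "Write a unit test for fibonacci(n)"}]}'

# Model benchmarks and metadata
curl http://localhost:8000/model/info

# Health check, suitable for container liveness probes
curl http://localhost:8000/health
```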

## Requirements

  • Docker & Docker Compose
  • 16GB+ RAM (32GB recommended)
  • NVIDIA GPU with 8GB+ VRAM (optional, falls back to CPU)
  • 50GB+ disk space for model cache
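
A quick way to verify these requirements before running `setup.sh`; all commands are standard, print-only tools.

```bash
# Preflight checks before running setup.sh
docker --version && docker compose version   # Docker and Compose installed?
free -h                                      # 16GB+ RAM available?
nvidia-smi                                   # GPU visible? (skip for CPU-only runs)
df -h .                                      # 50GB+ free disk for the model cache?
```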