---
title: Medini
emoji: 💬
colorFrom: yellow
colorTo: purple
sdk: gradio
sdk_version: "5.42.0"
app_file: app.py
pinned: false
---

# Medini - Multi-role AI Assistant

Chat with Medini as Tutor, PM, Designer, Analyst, Innovator, Friend, and more.

## Setup

1. Set the following **Secrets** in your Hugging Face Space settings:
   - `HF_API_URL` – your Hugging Face Inference API URL.
   - `HF_TOKEN` – your Hugging Face access token (required for gated/private models).
   - `EMB_MODEL` – optional embedding model (default: `sentence-transformers/all-MiniLM-L6-v2`).
   - `CHROMA_DIR` – optional directory for ChromaDB storage (default: `/workspace/chroma`).
   - `USE_LOCAL_MODEL` – optional; set to "True" to run a local model, otherwise "False".

2. Ensure `requirements.txt` is included in the repo.

3. Perform a **Factory rebuild** of the Space after updating files.
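Spaces expose Secrets to the app as environment variables. A minimal sketch of how `app.py` might read the values above, with the defaults from this README (the `load_config` helper itself is illustrative, not Medini's actual code):

```python
import os

def load_config() -> dict:
    """Read Medini's settings from environment variables, falling back to README defaults."""
    return {
        "hf_api_url": os.environ.get("HF_API_URL", ""),
        "hf_token": os.environ.get("HF_TOKEN", ""),
        "emb_model": os.environ.get("EMB_MODEL", "sentence-transformers/all-MiniLM-L6-v2"),
        "chroma_dir": os.environ.get("CHROMA_DIR", "/workspace/chroma"),
        # Secrets are strings, so "True"/"False" must be parsed explicitly.
        "use_local_model": os.environ.get("USE_LOCAL_MODEL", "False").strip().lower() == "true",
    }
```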

## Features

- Multi-role AI assistant: Tutor, PM, Designer, Analyst, Innovator, Friend, Video Generator.
- Persistent memory using ChromaDB.
- Supports both local and API-based model inference.
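The local-vs-API switch in the last bullet can be sketched as a small dispatcher keyed on `USE_LOCAL_MODEL`. The helper names and bodies below are assumptions for illustration only; in the real app the local branch would call a loaded model and the API branch would POST to `HF_API_URL` with `HF_TOKEN`:

```python
import os

def generate(prompt: str) -> str:
    """Route a prompt to local or API inference based on USE_LOCAL_MODEL."""
    if os.environ.get("USE_LOCAL_MODEL", "False").strip().lower() == "true":
        return _generate_local(prompt)
    return _generate_api(prompt)

def _generate_local(prompt: str) -> str:
    # Hypothetical placeholder: swap in a real local pipeline (e.g. transformers).
    return f"[local] {prompt}"

def _generate_api(prompt: str) -> str:
    # Hypothetical placeholder: swap in a real HTTP call to HF_API_URL.
    return f"[api] {prompt}"
```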