---
title: EUDR Retriever
emoji: 🐠
colorFrom: yellow
colorTo: pink
sdk: docker
pinned: false
---
# ChatFed Retriever - MCP Server
A semantic document retrieval and reranking service designed for ChatFed RAG (Retrieval-Augmented Generation) pipelines. This module serves as an MCP (Model Context Protocol) server that retrieves semantically similar documents from vector databases with optional cross-encoder reranking.
## MCP Endpoint
The main MCP function is `retrieve_mcp`, which performs top-k retrieval with optional reranking when connected to an external vector database.
Parameters:

- `query` (str, required): The search query text
- `reports_filter` (str, optional): Comma-separated list of specific report filenames
- `sources_filter` (str, optional): Filter by document source type
- `subtype_filter` (str, optional): Filter by document subtype
- `year_filter` (str, optional): Comma-separated list of years to filter by
Returns: A list of dictionaries, each containing:

- `answer`: Document content
- `answer_metadata`: Document metadata
- `score`: Relevance score (disabled when the reranker is used)
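
As a rough illustration of that return shape (the metadata keys and values shown here are assumptions and will depend on how the underlying documents were indexed):

```python
# Illustrative return shape only; the metadata keys and values are assumptions.
[
    {
        "answer": "...retrieved document passage...",
        "answer_metadata": {"filename": "example_report.pdf", "year": "2023"},
        "score": 0.87,  # not populated when the reranker is used
    },
    # ...one dictionary per retrieved document
]
```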
Example usage:

```python
from gradio_client import Client

client = Client("ENTER CONTAINER URL / SPACE ID")
result = client.predict(
    query="...",
    reports_filter="",
    sources_filter="",
    subtype_filter="",
    year_filter="",
    api_name="/retrieve_mcp"
)
print(result)
```
## Configuration
### Vector Store Configuration
- Set your data source according to the provider
- Set the embedding model to match the one used to index the data source
- Set the retriever parameters
- [Optional] Set the reranker parameters
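
The README does not fix a concrete configuration format, so the sketch below simply groups the settings listed above into a single Python mapping as an illustration. Every key and value here is an assumption; replace them with whatever mechanism your deployment actually uses (config file, environment variables, etc.).

```python
# Illustrative grouping of the settings listed above. These keys and values
# are assumptions, not the actual configuration schema shipped with this Space.
RETRIEVER_CONFIG = {
    "vectorstore": {
        "provider": "qdrant",            # hypothetical vector DB provider
        "collection": "eudr_documents",  # hypothetical collection name
    },
    "embeddings": {
        # Must match the embedding model used when the documents were indexed
        "model": "sentence-transformers/all-MiniLM-L6-v2",
    },
    "retriever": {
        "top_k": 10,  # number of candidate documents to retrieve
    },
    "reranker": {  # optional
        "enabled": True,
        "model": "cross-encoder/ms-marco-MiniLM-L-6-v2",
    },
}
```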
Run the app:

```bash
docker build -t chatfed-retriever .
docker run -p 7860:7860 chatfed-retriever
```
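
Once the container is up, the MCP endpoint can be exercised with the same client pattern as in the example above. This is a minimal sketch assuming the default port mapping from the `docker run` command; the query text is purely hypothetical.

```python
from gradio_client import Client

# Connect to the locally running container (default port mapping shown above).
client = Client("http://localhost:7860")

result = client.predict(
    query="What documents discuss EUDR due diligence requirements?",  # hypothetical query
    reports_filter="",
    sources_filter="",
    subtype_filter="",
    year_filter="",
    api_name="/retrieve_mcp",
)
print(result)
```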