# 🧩 Unit 2 (Part 2): Building MCP Clients

Now that we've built and deployed an MCP server using Gradio, it's time to build **MCP clients** that can interact with it — both in the browser (JavaScript) and on the backend (Python). We'll also show how to build a UI-driven MCP client using Gradio itself.

---

## 📁 Configuring MCP Clients

### MCP Configuration Files

MCP clients typically read an `mcp.json` file to know how to connect to servers. Here's a basic example:

```json
{
  "servers": [
    {
      "name": "MCP Server",
      "transport": {
        "type": "sse",
        "url": "http://localhost:7860/gradio_api/mcp/sse"
      }
    }
  ]
}
```

- **`type: sse`** – Use this for remote servers (like Hugging Face Spaces)
- **`type: stdio`** – Use this for local scripts or CLI tools

### Example for Remote (Hugging Face Space):

```json
{
  "servers": [
    {
      "name": "Sentiment Tool",
      "transport": {
        "type": "sse",
        "url": "https://japhari-mcp-sentiment.hf.space/gradio_api/mcp/sse"
      }
    }
  ]
}
```

---

## 🧠 Python Client using `smolagents`

### Install dependencies:

```bash
pip install "smolagents[mcp]" "gradio[mcp]"
```

### Simple Python Client:

```python
from smolagents import ToolCollection, CodeAgent, InferenceClientModel

# SSE servers are described by a dict containing their endpoint URL
server = {"url": "http://localhost:7860/gradio_api/mcp/sse"}

with ToolCollection.from_mcp(server, trust_remote_code=True) as tool_collection:
    agent = CodeAgent(tools=[*tool_collection.tools], model=InferenceClientModel())
    agent.run("What is the sentiment of 'I love MCP'?")
```

This client connects to your server, loads the available tools, and lets the LLM decide when to call them.

---

## 🌐 JavaScript Client using HuggingFace.js (Example)

If you're using a UI front end (e.g., React or Vue), you can call your MCP server with any SSE-capable HTTP client. A typical client configuration looks like this:

```json
{
  "mcpServers": {
    "mcp": {
      "url": "http://localhost:7860/gradio_api/mcp/sse"
    }
  }
}
```

This config can be used with tools like `mcp-remote` or integrated into a custom JS agent.

---

## 🖥 Gradio as an MCP Client (UI Agent)

### Concept

Gradio isn't just for building servers — it can also be used to build **UI wrappers around MCP clients**. This is powerful when you want a browser interface for tools provided by another MCP server.

---

### Step-by-Step Example

Install the necessary libraries:

```bash
pip install "smolagents[mcp]" "gradio[mcp]" mcp
```

Create a new file: `app.py`

```python
import gradio as gr
from smolagents import InferenceClientModel, CodeAgent
from smolagents.mcp_client import MCPClient

try:
    # Connect to a remote MCP server
    mcp_client = MCPClient({
        "url": "http://localhost:7860/gradio_api/mcp/sse"
        # or test this working remote one:
        # "url": "https://abidlabs-mcp-tools.hf.space/gradio_api/mcp/sse"
    })
    tools = mcp_client.get_tools()

    model = InferenceClientModel()
    agent = CodeAgent(tools=[*tools], model=model)

    # Gradio UI
    demo = gr.ChatInterface(
        fn=lambda msg, history: str(agent.run(msg)),
        type="messages",
        title="Agent with MCP Tools",
        examples=["What is the sentiment of 'That’s awesome!'"],
        description="A UI MCP client that connects to a remote tool server"
    )

    demo.launch()
finally:
    mcp_client.disconnect()
```
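
Before deploying, it can help to confirm that the client actually discovers your server's tools, without launching the chat UI. Here is a minimal sketch that reuses the same `MCPClient` calls as above; the localhost URL assumes the sentiment server from Part 1 is still running on your machine.

```python
from smolagents.mcp_client import MCPClient

# Assumes the sentiment server from Part 1 is running locally on port 7860.
mcp_client = MCPClient({"url": "http://localhost:7860/gradio_api/mcp/sse"})
try:
    # Print each tool the server exposes to confirm the connection works
    # before wiring the tools into an agent.
    for tool in mcp_client.get_tools():
        print(f"{tool.name}: {tool.description}")
finally:
    mcp_client.disconnect()
```

If the list comes back empty or the connection hangs, double-check that the URL ends in `/gradio_api/mcp/sse` and that the server was launched with MCP support enabled.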
---

### Deploy to Hugging Face Spaces

1. Go to [https://huggingface.co/spaces](https://huggingface.co/spaces)
2. Click **“Create new Space”**
3. Choose the **Gradio** SDK
4. Name it something like: `mcp-client-ui`
5. Create `requirements.txt`:

   ```
   gradio[mcp]
   smolagents[mcp]
   ```

6. Push it:

   ```bash
   git init
   git add app.py requirements.txt
   git commit -m "MCP client UI"
   git remote add origin https://huggingface.co/spaces/YOUR_USERNAME/mcp-client-ui
   git push -u origin main
   ```
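
Once the Space builds, one way to sanity-check the whole pipeline is to point the earlier `smolagents` client at the deployed server instead of localhost. This is a sketch under the assumption that the `japhari-mcp-sentiment` Space from the configuration example is running and that you are logged in to Hugging Face so `InferenceClientModel` can reach an inference provider.

```python
from smolagents import ToolCollection, CodeAgent, InferenceClientModel

# Remote SSE endpoint of the deployed sentiment server (same URL as in the
# mcp.json example earlier in this unit).
server = {"url": "https://japhari-mcp-sentiment.hf.space/gradio_api/mcp/sse"}

with ToolCollection.from_mcp(server, trust_remote_code=True) as tool_collection:
    agent = CodeAgent(tools=[*tool_collection.tools], model=InferenceClientModel())
    print(agent.run("What is the sentiment of 'Deployment worked on the first try'?"))
```

If this returns a sentiment answer, the server, the MCP transport, and the agent loop are all working end to end.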