{
"cells": [
{
"cell_type": "markdown",
"id": "528b6777",
"metadata": {},
"source": [
"# Complete Chatbot Application Code Explanation\n",
"\n",
"## Overview: What This Application Does\n",
"\n",
"This code creates a sophisticated web-based chatbot interface using Gradio (a Python library for building user interfaces) that connects to OpenAI's language models. Think of it as building your own ChatGPT-like interface with customizable features.\n",
"\n",
"## Import Section: Setting Up the Tools\n",
"\n",
"```python\n",
"import gradio as gr\n",
"import openai\n",
"import json\n",
"import os\n",
"import time\n",
"from typing import List, Tuple, Optional\n",
"import requests\n",
"from datetime import datetime\n",
"```\n",
"\n",
"Let me explain what each import does:\n",
"- `gradio as gr`: This is the main library that creates the web interface with buttons, text boxes, and chat windows\n",
"- `openai`: The official OpenAI library to communicate with their AI models like GPT-3.5 and GPT-4\n",
"- `json`: Handles saving and loading conversation data in a structured format\n",
"- `os`: Interacts with the operating system (like creating folders for saved files)\n",
"- `time`: Could be used for adding delays or timestamps (though not heavily used in this code)\n",
"- `typing`: Provides type hints to make the code clearer about what kind of data functions expect\n",
"- `requests`: Makes HTTP requests to web services (imported but not actively used here)\n",
"- `datetime`: Handles dates and times for timestamping conversations\n",
"\n",
"## The ChatbotManager Class: The Brain of the Operation\n",
"\n",
"The `ChatbotManager` class is like the control center that manages everything about the chatbot. Let's break it down:\n",
"\n",
"### Initialization Method\n",
"\n",
"```python\n",
"def __init__(self):\n",
" self.conversation_history = []\n",
" self.current_api_key = None\n",
" self.current_model = \"gpt-3.5-turbo\"\n",
" self.system_prompt = \"You are a helpful AI assistant. Respond in a friendly and informative manner.\"\n",
" self.max_tokens = 150\n",
" self.temperature = 0.7\n",
"```\n",
"\n",
"When a `ChatbotManager` object is created, it sets up default values:\n",
"- `conversation_history`: An empty list that will store all the chat messages\n",
"- `current_api_key`: Starts as None until the user provides their OpenAI API key\n",
"- `current_model`: Defaults to GPT-3.5-turbo (a fast, cost-effective model)\n",
"- `system_prompt`: The instructions that tell the AI how to behave\n",
"- `max_tokens`: Limits response length to 150 tokens (roughly 100-120 words)\n",
"- `temperature`: Controls creativity (0.7 means moderately creative responses)\n",
"\n",
"### API Key Validation Method\n",
"\n",
"```python\n",
"def set_api_key(self, api_key: str) -> str:\n",
" if not api_key.strip():\n",
" return \"β Please enter a valid API key\"\n",
" \n",
" self.current_api_key = api_key.strip()\n",
" openai.api_key = self.current_api_key\n",
" \n",
" try:\n",
" openai.Model.list()\n",
" return \"β
API key validated successfully!\"\n",
" except Exception as e:\n",
" return f\"β Invalid API key: {str(e)}\"\n",
"```\n",
"\n",
"This method is like a security guard checking if someone has the right credentials:\n",
"1. First, it checks if the API key isn't empty (using `strip()` to remove spaces)\n",
"2. If valid, it stores the key and tells the OpenAI library to use it\n",
"3. It then tests the key by trying to list available models\n",
"4. If the test succeeds, it returns a success message\n",
"5. If it fails, it catches the error and returns a failure message\n",
"\n",
"### Settings Update Method\n",
"\n",
"```python\n",
"def update_settings(self, model: str, system_prompt: str, max_tokens: int, temperature: float) -> str:\n",
" self.current_model = model\n",
" self.system_prompt = system_prompt\n",
" self.max_tokens = max_tokens\n",
" self.temperature = temperature\n",
" return f\"β
Settings updated: Model={model}, Max Tokens={max_tokens}, Temperature={temperature}\"\n",
"```\n",
"\n",
"This is straightforward - it updates all the chatbot's configuration settings and returns a confirmation message showing what was changed.\n",
"\n",
"### Custom Data Integration Method\n",
"\n",
"```python\n",
"def preprocess_data(self, data_text: str) -> str:\n",
" if not data_text.strip():\n",
" return \"No custom data provided\"\n",
" \n",
" base_prompt = \"You are a helpful AI assistant. Respond in a friendly and informative manner.\"\n",
" self.system_prompt = base_prompt + f\"\\n\\nAdditional Context:\\n{data_text}\"\n",
" return f\"β
Custom data integrated ({len(data_text)} characters)\"\n",
"```\n",
"\n",
"This method allows users to add custom information to the chatbot's knowledge. It takes the default system prompt and appends the custom data, creating a more specialized AI assistant that knows about specific topics or company information.\n",
"\n",
"### The Core Response Generation Method\n",
"\n",
"```python\n",
"def generate_response(self, user_input: str, history: List[Tuple[str, str]]) -> Tuple[str, List[Tuple[str, str]]]:\n",
" if not self.current_api_key:\n",
" return \"β Please set your API key first!\", history\n",
" \n",
" if not user_input.strip():\n",
" return \"Please enter a message.\", history\n",
" \n",
" try:\n",
" messages = [{\"role\": \"system\", \"content\": self.system_prompt}]\n",
" \n",
" for user_msg, assistant_msg in history:\n",
" messages.append({\"role\": \"user\", \"content\": user_msg})\n",
" messages.append({\"role\": \"assistant\", \"content\": assistant_msg})\n",
" \n",
" messages.append({\"role\": \"user\", \"content\": user_input})\n",
" \n",
" response = openai.ChatCompletion.create(\n",
" model=self.current_model,\n",
" messages=messages,\n",
" max_tokens=self.max_tokens,\n",
" temperature=self.temperature,\n",
" n=1,\n",
" stop=None,\n",
" )\n",
" \n",
" assistant_response = response.choices[0].message.content.strip()\n",
" history.append((user_input, assistant_response))\n",
" \n",
" return assistant_response, history\n",
" \n",
" except Exception as e:\n",
" error_msg = f\"β Error generating response: {str(e)}\"\n",
" return error_msg, history\n",
"```\n",
"\n",
"This is the heart of the chatbot. Here's what happens step by step:\n",
"\n",
"1. **Safety Checks**: First, it verifies that an API key exists and the user actually typed something\n",
"2. **Message Formatting**: It creates a list of messages in the format OpenAI expects:\n",
" - Starts with the system prompt (tells the AI how to behave)\n",
" - Adds all previous conversation history in order\n",
" - Adds the new user message\n",
"3. **API Call**: It sends all this information to OpenAI's servers using the specified model and settings\n",
"4. **Response Processing**: It extracts the AI's response and adds both the user input and AI response to the conversation history\n",
"5. **Error Handling**: If anything goes wrong, it catches the error and returns a helpful message\n",
"\n",
"### Utility Methods\n",
"\n",
"```python\n",
"def clear_conversation(self) -> Tuple[str, List[Tuple[str, str]]]:\n",
" self.conversation_history = []\n",
" return \"\", []\n",
"\n",
"def export_conversation(self, history: List[Tuple[str, str]]) -> Tuple[str, Optional[str]]:\n",
" # Creates a JSON file with the conversation data\n",
"```\n",
"\n",
"These methods handle housekeeping tasks like clearing the chat and saving conversations to files.\n",
"\n",
"## Model Configuration\n",
"\n",
"```python\n",
"AVAILABLE_MODELS = [\n",
" \"gpt-3.5-turbo\",\n",
" \"gpt-3.5-turbo-16k\",\n",
" \"gpt-4\",\n",
" \"gpt-4-32k\",\n",
" \"gpt-4-0613\",\n",
" \"gpt-4-32k-0613\"\n",
"]\n",
"```\n",
"\n",
"This list defines which AI models users can choose from. Each has different capabilities:\n",
"- GPT-3.5 models are faster and cheaper\n",
"- GPT-4 models are more capable but slower and more expensive\n",
"- The numbers (16k, 32k) refer to how much text they can process at once\n",
"\n",
"## The User Interface Creation\n",
"\n",
"The `create_interface()` function builds the entire web interface using Gradio. Think of it as designing a website layout:\n",
"\n",
"### Main Structure\n",
"\n",
"```python\n",
"with gr.Blocks(title=\"LLM-Based Chatbot\", theme=gr.themes.Ocean()) as demo:\n",
"```\n",
"\n",
"This creates the main container with a title and applies a blue ocean theme for visual appeal.\n",
"\n",
"### Chat Interface Tab\n",
"\n",
"The first tab contains the main chat functionality:\n",
"\n",
"```python\n",
"chatbot_interface = gr.Chatbot(\n",
" label=\"Conversation\",\n",
" height=400,\n",
" show_label=True,\n",
" avatar_images=(\"user.png\", \"assistant.png\"),\n",
" show_copy_button=True,\n",
" bubble_full_width=False,\n",
")\n",
"```\n",
"\n",
"This creates the chat window where conversations appear, with customizable height, avatars for user and AI, and a copy button for responses.\n",
"\n",
"### Input Components\n",
"\n",
"```python\n",
"user_input = gr.Textbox(\n",
" placeholder=\"Type your message here...\",\n",
" scale=4,\n",
" show_label=False,\n",
" container=False\n",
")\n",
"send_btn = gr.Button(\"π€ Send\", variant=\"primary\", scale=1)\n",
"```\n",
"\n",
"These create the text input box where users type messages and the send button to submit them.\n",
"\n",
"### Settings Panel\n",
"\n",
"The right side contains controls for configuring the chatbot:\n",
"- API key input (password type for security)\n",
"- Model selection dropdown\n",
"- Token limit slider\n",
"- Temperature slider for creativity control\n",
"- Status displays showing current settings\n",
"\n",
"### Advanced Settings Tab\n",
"\n",
"This tab provides more sophisticated configuration options:\n",
"- Custom system prompt editing\n",
"- Custom data integration\n",
"- Preset prompt buttons for different use cases\n",
"- Settings management buttons\n",
"\n",
"## Event Handling: Making It Interactive\n",
"\n",
"The code then connects user actions to functions using event handlers:\n",
"\n",
"```python\n",
"send_btn.click(\n",
" handle_chat,\n",
" inputs=[user_input, chatbot_interface],\n",
" outputs=[chatbot_interface, user_input]\n",
")\n",
"```\n",
"\n",
"This tells Gradio: \"When the send button is clicked, run the `handle_chat` function with the current user input and chat history, then update the chat display and clear the input box.\"\n",
"\n",
"### Handler Functions\n",
"\n",
"The handler functions act as translators between the user interface and the ChatbotManager:\n",
"\n",
"```python\n",
"def handle_chat(user_input, history):\n",
" if not user_input.strip():\n",
" return history or [], \"\"\n",
" \n",
" response, updated_history = chatbot.generate_response(user_input, history or [])\n",
" return updated_history, \"\"\n",
"```\n",
"\n",
"This function takes what the user typed, passes it to the chatbot manager, and returns the updated conversation to display in the interface.\n",
"\n",
"## Advanced Features\n",
"\n",
"### Preset System Prompts\n",
"\n",
"The application includes predefined prompts for different use cases:\n",
"- Customer Support: Professional, solution-focused responses\n",
"- Educational Tutor: Patient, explanatory teaching style\n",
"- Creative Assistant: Imaginative, inspiring writing help\n",
"- Technical Writer: Clear, precise documentation style\n",
"\n",
"### Data Integration\n",
"\n",
"Users can add custom data to make the chatbot an expert on specific topics, like company FAQs or technical documentation.\n",
"\n",
"### Conversation Export\n",
"\n",
"The system can save conversations as JSON files with timestamps and metadata for later analysis or record-keeping.\n",
"\n",
"## Launch Configuration\n",
"\n",
"```python\n",
"if __name__ == \"__main__\":\n",
" demo = create_interface()\n",
" demo.launch(share=True)\n",
"```\n",
"\n",
"This final section starts the application:\n",
"- Creates the interface\n",
"- Launches it with `share=True`, which creates a public URL that others can access\n",
"- The application runs on your computer but can be accessed from anywhere via the generated link\n",
"\n",
"## How It All Works Together\n",
"\n",
"When you run this code, here's the complete flow:\n",
"\n",
"1. **Startup**: The application creates a ChatbotManager instance and builds the web interface\n",
"2. **User Setup**: User enters their OpenAI API key, which gets validated\n",
"3. **Configuration**: User can adjust model settings, system prompts, and add custom data\n",
"4. **Conversation**: User types a message, clicks send\n",
"5. **Processing**: The system formats the message with conversation history and sends it to OpenAI\n",
"6. **Response**: OpenAI returns a response, which gets displayed in the chat interface\n",
"7. **Continuation**: The conversation continues with full context maintained\n",
"\n",
"This creates a fully functional, customizable chatbot interface that rivals commercial applications while giving users complete control over the AI's behavior and capabilities."
]
},
{
"cell_type": "markdown",
"id": "07498f29",
"metadata": {},
"source": []
}
],
"metadata": {
"language_info": {
"name": "python"
}
},
"nbformat": 4,
"nbformat_minor": 5
}