This interactive web application lets you experiment with AI agents in your browser, using the OpenAI Agents Python SDK and Pyodide. You can customize agent behavior, test different prompts, and see responses in real time. Check the mozilla-ai/wasm-agents-blueprint GitHub repository for more information.
How it works:
This application runs a local agent which can make use of the following tools:
- count_character_occurrences: counts the occurrences of a given character inside a word
- visit_webpage: visits a webpage at the provided URL and reads its content as a markdown string
- search_tavily: performs web searches using the Tavily API (requires TAVILY_API_KEY in config.js)
While the first tool is quite trivial and mainly serves to show how to address the "r in strawberry" issue, the other two give the LLM the ability to access up-to-date information on the Web.
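For reference, here is a minimal sketch of how tools like these can be declared with the Agents SDK's function_tool decorator; the function bodies below are simplified assumptions for illustration, not the blueprint's actual implementation.

```python
from agents import Agent, function_tool
import urllib.request

@function_tool
def count_character_occurrences(word: str, character: str) -> int:
    """Count how many times a character appears in a word."""
    return word.lower().count(character.lower())

@function_tool
def visit_webpage(url: str) -> str:
    """Fetch a webpage and return its raw content.

    Plain-Python fetch for illustration only; inside Pyodide, requests
    typically have to go through the browser's fetch API instead, and the
    markdown conversion done by the real tool is omitted here.
    """
    with urllib.request.urlopen(url) as response:
        return response.read().decode("utf-8", errors="replace")

agent = Agent(
    name="demo-agent",
    instructions="Use the available tools when they help answer the question.",
    tools=[count_character_occurrences, visit_webpage],
)
```

The docstrings and type hints are what the SDK uses to build the tool descriptions and argument schemas shown to the model.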
Configure:
Make sure the Local LLM Server Configuration parameters match your setup. By default, the app expects Ollama to be running on your system with the qwen3:8b model installed. If you are using LM Studio instead, click the LM Studio preset button and make sure to update the model name accordingly.
Optionally, add your TAVILY_API_KEY to config.js to enable web search functionality.
NOTE: if you are using LM Studio with a thinking model and are getting tool
calls directly in the model's response, disable thinking in the "Edit model default parameters" section.
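As an illustration, the sketch below shows one way a local OpenAI-compatible endpoint can be wired into the Agents SDK; the base URL, placeholder API key, and model name are assumptions matching the Ollama default described above, not the app's exact code.

```python
from openai import AsyncOpenAI
from agents import Agent, OpenAIChatCompletionsModel, set_tracing_disabled

# Assumed Ollama defaults; LM Studio users would typically use
# http://localhost:1234/v1 and their own model name instead.
client = AsyncOpenAI(base_url="http://localhost:11434/v1", api_key="ollama")
set_tracing_disabled(True)  # no OpenAI platform key, so disable SDK tracing

agent = Agent(
    name="local-agent",
    instructions="You are a helpful assistant.",
    model=OpenAIChatCompletionsModel(model="qwen3:8b", openai_client=client),
)
```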
Initialize:
Set up the Python environment with Pyodide and the OpenAI Agents framework by clicking the Initialize Pyodide Environment button.
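Under the hood, initialization amounts to loading Pyodide and installing the agents package into the in-browser Python environment. The sketch below shows the Python side of that step; the package name and its installability via micropip are assumptions.

```python
# Runs inside Pyodide, where top-level await is available via runPythonAsync.
import micropip

# Install the OpenAI Agents SDK (plus its dependencies) into the browser's
# Python environment; the exact set of packages the app installs may differ.
await micropip.install("openai-agents")

from agents import Agent  # now importable inside the Pyodide runtime
```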
Customize:
Choose one of the suggested prompts or create new ones in the text fields below. (Hint: you can also explicitly set or unset qwen3's "think mode" by prepending /think or /no_think to the prompt.)
Run:
Click the Run Agent button to send your prompt to the agent and see what happens.
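Putting it together, running the agent boils down to something like the sketch below; the prompt is illustrative and `agent` refers to an agent configured as in the earlier sketches, with Runner.run being the SDK's asynchronous entry point.

```python
from agents import Runner

# Prepending /no_think is a qwen3-specific hint that disables its thinking phase.
prompt = "/no_think How many occurrences of the letter r are in the word strawberry?"

# Top-level await works here because the code runs inside Pyodide's async runner.
result = await Runner.run(agent, prompt)
print(result.final_output)
```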