---
language: code
license: mit
tags:
  - machine-learning
  - ai
  - structured-planning
  - llamaindex
model_name: Structured Planning AI Agent
model_type: agent
library_name: llama-index
---

# Implementing a Structured Planning AI Agent with LlamaIndex

  1. Set up the environment

    • Skip this step if you have already set up the environment
    python -m venv .venv
    source .venv/bin/activate
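
    On Windows (Command Prompt), activate the virtual environment with:

    .venv\Scripts\activate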
    
  2. Install LlamaIndex

    pip install llama-index
    
  3. Create a Python file

    touch worker.py
    

    Or, on Windows:

    echo. > worker.py
    
  4. Open the file in VS Code

    code worker.py
    
  5. Add the needed imports

    from llama_index.core.tools import FunctionTool
    from llama_index.llms.openai import OpenAI
    from llama_index.core.agent import (
        StructuredPlannerAgent,
        FunctionCallingAgentWorker,
    )
    
  6. Define the function

    def multiply(a: int, b: int) -> int:
        """Multiply two integers and return the result."""
        return a * b
    
  7. Define and configure the worker agent

    multiply_tool = FunctionTool.from_defaults(fn=multiply)
    llm = OpenAI(model="gpt-4o-mini")
    worker = FunctionCallingAgentWorker.from_tools([multiply_tool], llm=llm, verbose=True)
    worker_agent = StructuredPlannerAgent(worker, [multiply_tool], verbose=True)
    
  8. Test the worker agent

    response = worker_agent.chat("Solve the equation x = 123 * (x + 2y + 3)")
    print(response)
    
  9. Create a .env file and add your API key

    OPENAI_API_KEY="<your_api_key>"
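
    Note that worker.py does not read .env automatically. A minimal sketch of one way to load it, assuming the python-dotenv package (pip install python-dotenv), added near the top of worker.py before the LLM is created:

    from dotenv import load_dotenv

    # Read OPENAI_API_KEY from .env into the process environment
    load_dotenv()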
    
  10. Run the agent

    python worker.py
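
For reference, here is a minimal sketch of the complete worker.py assembled from the steps above (it assumes the python-dotenv package is installed to load the .env file):

    # worker.py
    from dotenv import load_dotenv

    from llama_index.core.agent import (
        FunctionCallingAgentWorker,
        StructuredPlannerAgent,
    )
    from llama_index.core.tools import FunctionTool
    from llama_index.llms.openai import OpenAI

    # Read OPENAI_API_KEY from .env into the environment
    load_dotenv()


    def multiply(a: int, b: int) -> int:
        """Multiply two integers and return the result."""
        return a * b


    # Wrap the function as a tool, then configure the worker and the planner agent
    multiply_tool = FunctionTool.from_defaults(fn=multiply)
    llm = OpenAI(model="gpt-4o-mini")
    worker = FunctionCallingAgentWorker.from_tools([multiply_tool], llm=llm, verbose=True)
    worker_agent = StructuredPlannerAgent(worker, [multiply_tool], verbose=True)

    if __name__ == "__main__":
        response = worker_agent.chat("Solve the equation x = 123 * (x + 2y + 3)")
        print(response)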