OpenEnv: Agentic Execution Environments

An end-to-end framework for creating, deploying, and using isolated execution environments for agentic RL training, built on simple, Gymnasium-style APIs.


A community-driven collection of OpenEnv-spec environments, consisting of a Hub of environments and a standardized spec that ensures compatibility between them.
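
To make the "Gymnasium-style API" concrete, the sketch below shows the kind of reset/step contract an environment exposes. The class names, fields, and reward formula here are illustrative assumptions for this snippet only, not the actual OpenEnv spec; see the docs for the real interfaces.

from dataclasses import dataclass

# Illustrative sketch only -- names and reward logic are assumptions, not the OpenEnv spec.

@dataclass
class Action:
    message: str

@dataclass
class Observation:
    echoed_message: str

@dataclass
class StepResult:
    observation: Observation
    reward: float
    done: bool

class ToyEchoEnvironment:
    """A toy environment following a Gymnasium-style reset/step contract."""

    def reset(self) -> StepResult:
        # Return the initial observation; no reward has been earned yet.
        return StepResult(Observation("Echo environment ready!"), reward=0.0, done=False)

    def step(self, action: Action) -> StepResult:
        # Echo the message back; the reward scales with message length (illustrative only).
        return StepResult(Observation(action.message), reward=len(action.message) / 10, done=False)

env = ToyEchoEnvironment()
print(env.reset().observation.echoed_message)             # "Echo environment ready!"
print(env.step(Action(message="Hello, World!")).reward)   # 1.3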

Join the Hackathon!

Additionally, we're thrilled to announce a new AgentBeats custom track: the OpenEnv Challenge: SOTA Environments to Drive General Intelligence, sponsored by the PyTorch team at Meta, Hugging Face, and Unsloth. Participants will compete to develop innovative, open-source RL environments that push the frontiers of agent learning, with a prize pool of $10K in Hugging Face credits and the chance to be featured on the PyTorch blog.

Sign up here

Quick Start

You can install the client code for an environment directly from its Hugging Face Space:

pip install git+https://huggingface.co/spaces/openenv/echo_env

Then use the environment hosted on Spaces:

import asyncio
from echo_env import EchoAction, EchoEnv

async def main():
    # Connect to a running Space (async context manager)
    async with EchoEnv(base_url="https://openenv-echo-env.hf.space") as client:
        # Reset the environment
        result = await client.reset()
        print(result.observation.echoed_message)  # "Echo environment ready!"

        # Send messages
        result = await client.step(EchoAction(message="Hello, World!"))
        print(result.observation.echoed_message)  # "Hello, World!"
        print(result.reward)  # 1.3 (based on message length)

asyncio.run(main())

To pull an environment from Spaces and run it locally as a Docker container:

import asyncio
from echo_env import EchoEnv

async def main():
    # Pulls from Hugging Face and starts a container
    client = await EchoEnv.from_env("openenv/echo_env")
    async with client:
        result = await client.reset()
        print(result.observation)

asyncio.run(main())
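
For RL training, the same client calls can be composed into a rollout loop. The helper below is an illustrative sketch (the function name, message list, and reward bookkeeping are not part of OpenEnv); it only reuses the reset and step calls shown above.

import asyncio
from echo_env import EchoAction, EchoEnv

async def collect_rollout(messages):
    # Run one episode against the hosted Space and sum the rewards.
    total_reward = 0.0
    async with EchoEnv(base_url="https://openenv-echo-env.hf.space") as client:
        await client.reset()
        for message in messages:
            result = await client.step(EchoAction(message=message))
            total_reward += result.reward
    return total_reward

async def main():
    reward = await collect_rollout(["Hello, World!", "OpenEnv", "Agentic RL"])
    print(f"Episode reward: {reward}")

asyncio.run(main())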

Hugging Face x Meta-PyTorch

Hugging Face, Meta-PyTorch, and many other supporters are committed to democratizing RL post-training with environments.

Sponsor Logos
