OpenShell is the runtime environment for autonomous agents -- the infrastructure where they live, work, and verify. It provides a programmable factory where agents can generate synthetic data to fix edge cases and safely iterate through thousands of failures in isolated sandboxes. The core engine includes the sandbox runtime, policy engine, gateway (with k3s harness), privacy router, and CLI.
This repo is the community ecosystem around OpenShell -- a hub for contributed skills, sandbox images, launchables, and integrations that extend its capabilities. For the core engine, docs, and published artifacts (PyPI, containers, binaries), see the OpenShell repo.
Alpha software -- single-player mode. OpenShell is proof-of-life: one developer, one environment, one gateway. We are building toward multi-tenant enterprise deployments, but the starting point is getting your own environment up and running. Expect rough edges. Bring your agent.
| Directory | Description |
|---|---|
| `brev/` | Brev launchable for one-click cloud deployment of OpenShell |
| `sandboxes/` | Pre-built sandbox images for domain-specific workloads (each with its own skills) |
| Sandbox | Description |
|---|---|
| `sandboxes/base/` | Foundational image with system tools, users, and dev environment |
| `sandboxes/ollama/` | Ollama for local and cloud LLMs with Claude Code, Codex, OpenCode pre-installed |
| `sandboxes/sdg/` | Synthetic data generation workflows |
| `sandboxes/openclaw/` | OpenClaw -- open agent manipulation and control |
- OpenShell CLI installed (`uv pip install openshell`)
- Docker or a compatible container runtime
- NVIDIA GPU with appropriate drivers (for GPU-accelerated images)
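Before creating sandboxes, it can help to confirm the required tools are on `PATH`. A minimal sketch; the helper name and tool lists below are illustrative, not part of OpenShell:

```python
import shutil

# Tools assumed by the prerequisites above; adjust for your setup.
REQUIRED_TOOLS = ["openshell", "docker"]
GPU_TOOLS = ["nvidia-smi"]  # only needed for GPU-accelerated images


def missing_tools(required, which=shutil.which):
    """Return the subset of `required` not found on PATH.

    The lookup function is injectable so the check can be exercised
    without touching the real environment.
    """
    return [tool for tool in required if which(tool) is None]


if __name__ == "__main__":
    gaps = missing_tools(REQUIRED_TOOLS)
    if gaps:
        print("missing:", ", ".join(gaps))
    else:
        print("all prerequisites found")
```

Run it before your first `openshell sandbox create` to catch a missing runtime early.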
Skip local setup and launch OpenShell Community on a fully configured Brev instance -- use Brev as a remote OpenShell gateway (with or without GPU accelerators), or as an all-in-one playground for sandboxes, inference, and UI workflows.
Once the Brev instance is ready, open the Welcome UI to add provider keys and access your OpenClaw sandbox.
```shell
openshell sandbox create --from openclaw
```

The `--from` flag accepts any sandbox defined under `sandboxes/` (e.g., `openclaw`, `ollama`, `sdg`), a local path, or a container image reference.
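As a sketch of the dispatch implied above -- this classifier is purely illustrative, not OpenShell's actual resolution logic:

```python
import os


def classify_from(value):
    """Guess how a --from value would be resolved.

    Illustrative assumption: a path-like or existing-directory value
    resolves to a local path, anything with a registry/tag shape to a
    container image reference, and a bare name to a sandbox under
    sandboxes/.
    """
    if value.startswith(("./", "../", "/")) or os.path.isdir(value):
        return "local path"
    if ":" in value or "/" in value:
        return "container image"
    return "named sandbox"
```

For example, `classify_from("openclaw")` yields `"named sandbox"`, while a reference like `ghcr.io/acme/base:latest` would be treated as a container image.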
The Ollama sandbox provides Ollama for running local LLMs and routing to cloud models, with Claude Code and Codex pre-installed.
Quick start:

```shell
openshell sandbox create --from ollama
curl http://127.0.0.1:11434/api/tags
```

See the Ollama sandbox README for full details.
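After the `curl` check above, the same endpoint can be queried programmatically. The response shape (`{"models": [{"name": ...}, ...]}`) follows Ollama's documented `/api/tags` API; the helper functions below are illustrative:

```python
import json
from urllib.request import urlopen


def model_names(payload):
    """Extract model names from an /api/tags response payload."""
    return [m["name"] for m in payload.get("models", [])]


def list_models(base_url="http://127.0.0.1:11434"):
    """Return the model names reported by a running Ollama instance."""
    with urlopen(base_url + "/api/tags") as resp:
        payload = json.load(resp)
    return model_names(payload)
```

With the Ollama sandbox running, `list_models()` returns the locally available models, which your agent can use to pick a backend.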
See CONTRIBUTING.md.
See SECURITY.md. Do not file public issues for security vulnerabilities.
This project is licensed under the Apache 2.0 License -- see the LICENSE file for details.