OpenShell Community


OpenShell is the runtime environment for autonomous agents -- the infrastructure where they live, work, and verify. It provides a programmable factory where agents can generate synthetic data to fix edge cases and safely iterate through thousands of failures in isolated sandboxes. The core engine includes the sandbox runtime, policy engine, gateway (with k3s harness), privacy router, and CLI.

This repo is the community ecosystem around OpenShell -- a hub for contributed skills, sandbox images, launchables, and integrations that extend its capabilities. For the core engine, docs, and published artifacts (PyPI, containers, binaries), see the OpenShell repo.

Alpha software — single-player mode. OpenShell is proof-of-life: one developer, one environment, one gateway. We are building toward multi-tenant enterprise deployments, but the starting point is getting your own environment up and running. Expect rough edges. Bring your agent.

What's Here

| Directory | Description |
| --- | --- |
| brev/ | Brev launchable for one-click cloud deployment of OpenShell |
| sandboxes/ | Pre-built sandbox images for domain-specific workloads (each with its own skills) |

Sandboxes

| Sandbox | Description |
| --- | --- |
| sandboxes/base/ | Foundational image with system tools, users, and dev environment |
| sandboxes/ollama/ | Ollama for local and cloud LLMs with Claude Code, Codex, OpenCode pre-installed |
| sandboxes/sdg/ | Synthetic data generation workflows |
| sandboxes/openclaw/ | OpenClaw -- open agent manipulation and control |

Getting Started

Prerequisites

  • OpenShell CLI installed (uv pip install openshell)
  • Docker or a compatible container runtime
  • NVIDIA GPU with appropriate drivers (for GPU-accelerated images)

Quick Start with Brev

Skip the local setup and launch OpenShell Community on a fully configured Brev instance. Brev can serve as a remote OpenShell gateway (with or without GPU accelerators) or as an all-in-one playground for sandboxes, inference, and UI workflows.

| Instance | Best For | Deploy |
| --- | --- | --- |
| CPU-only | Remote OpenShell gateway deployments, external inference endpoints, remote APIs, and lighter-weight sandbox workflows | Deploy on Brev |
| NVIDIA H100 | All-in-one OpenShell playgrounds, locally hosted LLM endpoints, GPU-heavy sandboxes, and higher-throughput agent workloads | Deploy on Brev |

After the Brev instance is ready, open the Welcome UI to add your provider keys and access your OpenClaw sandbox.

Using Sandboxes

openshell sandbox create --from openclaw

The --from flag accepts any sandbox defined under sandboxes/ (e.g., openclaw, ollama, sdg), a local path, or a container image reference.
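A minimal sketch of the three accepted forms, using sandbox names from this repo; the container image reference is a placeholder, not a published image:

```shell
# Named sandbox defined under sandboxes/ in this repo
openshell sandbox create --from ollama

# Local path to a sandbox definition
openshell sandbox create --from ./sandboxes/base

# Container image reference (placeholder -- substitute a real image)
openshell sandbox create --from ghcr.io/example/image:tag
```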

Ollama Sandbox

The Ollama sandbox provides Ollama for running local LLMs and routing to cloud models, with Claude Code and Codex pre-installed.

Quick start:

openshell sandbox create --from ollama

# Confirm the Ollama API is responding (lists installed models)
curl http://127.0.0.1:11434/api/tags

See the Ollama sandbox README for full details.

Contributing

See CONTRIBUTING.md.

Security

See SECURITY.md. Do not file public issues for security vulnerabilities.

License

This project is licensed under the Apache 2.0 License -- see the LICENSE file for details.
