From 88bc9696b39f895434c0de097c1cc07451cee188 Mon Sep 17 00:00:00 2001
From: ParthSareen
Date: Mon, 16 Mar 2026 11:03:17 -0700
Subject: [PATCH 1/2] docs(ollama): add ollama to community sandboxes catalog and supported agents

---
 docs/about/supported-agents.md        | 1 +
 docs/inference/configure.md           | 4 ++--
 docs/sandboxes/community-sandboxes.md | 1 +
 docs/tutorials/index.md               | 6 +++---
 4 files changed, 7 insertions(+), 5 deletions(-)

diff --git a/docs/about/supported-agents.md b/docs/about/supported-agents.md
index c21335a8..6fd313dc 100644
--- a/docs/about/supported-agents.md
+++ b/docs/about/supported-agents.md
@@ -8,6 +8,7 @@ The following table summarizes the agents that run in OpenShell sandboxes. All a
 | [OpenCode](https://opencode.ai/) | [`base`](https://github.com/NVIDIA/OpenShell-Community/tree/main/sandboxes/base) | Partial coverage | Pre-installed. Add `opencode.ai` endpoint and OpenCode binary paths to the policy for full functionality. |
 | [Codex](https://developers.openai.com/codex) | [`base`](https://github.com/NVIDIA/OpenShell-Community/tree/main/sandboxes/base) | No coverage | Pre-installed. Requires a custom policy with OpenAI endpoints and Codex binary paths. Requires `OPENAI_API_KEY`. |
 | [OpenClaw](https://openclaw.ai/) | [`openclaw`](https://github.com/NVIDIA/OpenShell-Community/tree/main/sandboxes/openclaw) | Bundled | Agent orchestration layer. Launch with `openshell sandbox create --from openclaw`. |
+| [Ollama](https://ollama.com/) | [`ollama`](https://github.com/NVIDIA/OpenShell-Community/tree/main/sandboxes/ollama) | Bundled | Run cloud and local models. Includes Claude Code, Codex, and OpenClaw. Launch with `openshell sandbox create --from ollama`. |
 
 More community agent sandboxes are available in the {doc}`../sandboxes/community-sandboxes` catalog.
diff --git a/docs/inference/configure.md b/docs/inference/configure.md
index fb048dc9..2370f7b4 100644
--- a/docs/inference/configure.md
+++ b/docs/inference/configure.md
@@ -137,7 +137,7 @@ Use this endpoint when inference should stay local to the host for privacy and s
 
 When the upstream runs on the same machine as the gateway, bind it to `0.0.0.0` and point the provider at `host.openshell.internal` or the host's LAN IP. `127.0.0.1` and `localhost` usually fail because the request originates from the gateway or sandbox runtime, not from your shell.
 
-If the gateway runs on a remote host or behind a cloud deployment, `host.openshell.internal` points to that remote machine, not to your laptop. A laptop-local Ollama or vLLM process is not reachable from a remote gateway unless you add your own tunnel or shared network path.
+If the gateway runs on a remote host or behind a cloud deployment, `host.openshell.internal` points to that remote machine, not to your laptop. A locally running Ollama or vLLM process is not reachable from a remote gateway unless you add your own tunnel or shared network path. Ollama's cloud-hosted models avoid this limitation, since they do not require local hardware.
 
 ### Verify the Endpoint from a Sandbox
@@ -165,6 +165,6 @@ A successful response confirms the privacy router can reach the configured backe
 Explore related topics:
 
 - To understand the inference routing flow and supported API patterns, refer to {doc}`index`.
-- To follow a complete Ollama-based local setup, refer to {doc}`/tutorials/local-inference-ollama`.
+- To follow a complete Ollama setup (cloud or local), refer to {doc}`/tutorials/local-inference-ollama`.
 - To control external endpoints, refer to [Policies](/sandboxes/policies.md).
 - To manage provider records, refer to {doc}`../sandboxes/manage-providers`.
diff --git a/docs/sandboxes/community-sandboxes.md b/docs/sandboxes/community-sandboxes.md
index 3bcb2d27..d2924657 100644
--- a/docs/sandboxes/community-sandboxes.md
+++ b/docs/sandboxes/community-sandboxes.md
@@ -43,6 +43,7 @@ The following community sandboxes are available in the catalog.
 
 | Sandbox | Description |
 |---|---|
 | `base` | Foundational image with system tools and dev environment |
+| `ollama` | Ollama with cloud and local model support, Claude Code, Codex, and OpenClaw pre-installed |
 | `openclaw` | Open agent manipulation and control |
 | `sdg` | Synthetic data generation workflows |

diff --git a/docs/tutorials/index.md b/docs/tutorials/index.md
index fcfa968d..7d845f1b 100644
--- a/docs/tutorials/index.md
+++ b/docs/tutorials/index.md
@@ -44,11 +44,11 @@ Launch Claude Code in a sandbox, diagnose a policy denial, and iterate on a cust
 {bdg-secondary}`Tutorial`
 :::
 
-:::{grid-item-card} Local Inference with Ollama
+:::{grid-item-card} Inference with Ollama
 :link: local-inference-ollama
 :link-type: doc
 
-Route inference to a local Ollama server, verify it from a sandbox, and reuse the same pattern for other OpenAI-compatible engines.
+Route inference through Ollama using cloud-hosted or local models, and verify it from a sandbox.
 +++
 
 {bdg-secondary}`Tutorial`
 :::
@@ -59,5 +59,5 @@ Route inference to a local Ollama server, verify it from a sandbox, and reuse th
 
 First Network Policy
 GitHub Push Access
-Local Inference with Ollama
+Inference with Ollama
 ```

From e3f85f499a4fbe71a5d17ab300236b8dd6f1f68e Mon Sep 17 00:00:00 2001
From: ParthSareen
Date: Mon, 16 Mar 2026 14:04:21 -0700
Subject: [PATCH 2/2] update readme

---
 README.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index d0a16e1e..ef1dbfc1 100644
--- a/README.md
+++ b/README.md
@@ -35,7 +35,7 @@ uv tool install -U openshell
 ### Create a sandbox
 
 ```bash
-openshell sandbox create -- claude # or opencode, codex
+openshell sandbox create -- claude # or opencode, codex (use --from ollama for Ollama)
 ```
 
 A gateway is created automatically on first use. To deploy on a remote host instead, pass `--remote user@host` to the create command.
@@ -136,6 +136,7 @@ The CLI auto-bootstraps a GPU-enabled gateway on first use. GPU intent is also i
 | [OpenCode](https://opencode.ai/) | [`base`](https://github.com/NVIDIA/OpenShell-Community/tree/main/sandboxes/base) | Works out of the box. Provider uses `OPENAI_API_KEY` or `OPENROUTER_API_KEY`. |
 | [Codex](https://developers.openai.com/codex) | [`base`](https://github.com/NVIDIA/OpenShell-Community/tree/main/sandboxes/base) | Works out of the box. Provider uses `OPENAI_API_KEY`. |
 | [OpenClaw](https://openclaw.ai/) | [Community](https://github.com/NVIDIA/OpenShell-Community) | Launch with `openshell sandbox create --from openclaw`. |
+| [Ollama](https://ollama.com/) | [Community](https://github.com/NVIDIA/OpenShell-Community) | Launch with `openshell sandbox create --from ollama`. |
 
 ## Key Commands