3 changes: 2 additions & 1 deletion README.md
@@ -36,7 +36,7 @@ uv tool install -U openshell
### Create a sandbox

```bash
-openshell sandbox create -- claude # or opencode, codex
+openshell sandbox create -- claude # or opencode, codex, ollama
```

A gateway is created automatically on first use. To deploy on a remote host instead, pass `--remote user@host` to the create command.
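Putting the two flags above together, a remote sandbox creation might look like this (a sketch; `user@host` is a placeholder for your own SSH target):

```bash
# Create the sandbox on a remote host instead of the local machine;
# the gateway is bootstrapped on that host on first use.
openshell sandbox create --remote user@host -- claude
```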
@@ -137,6 +137,7 @@ The CLI auto-bootstraps a GPU-enabled gateway on first use. GPU intent is also i
| [OpenCode](https://opencode.ai/) | [`base`](https://github.com/NVIDIA/OpenShell-Community/tree/main/sandboxes/base) | Works out of the box. Provider uses `OPENAI_API_KEY` or `OPENROUTER_API_KEY`. |
| [Codex](https://developers.openai.com/codex) | [`base`](https://github.com/NVIDIA/OpenShell-Community/tree/main/sandboxes/base) | Works out of the box. Provider uses `OPENAI_API_KEY`. |
| [OpenClaw](https://openclaw.ai/) | [Community](https://github.com/NVIDIA/OpenShell-Community) | Launch with `openshell sandbox create --from openclaw`. |
+| [Ollama](https://ollama.com/) | [Community](https://github.com/NVIDIA/OpenShell-Community) | Launch with `openshell sandbox create --from ollama`. |

## Key Commands

1 change: 1 addition & 0 deletions docs/about/supported-agents.md
@@ -8,6 +8,7 @@ The following table summarizes the agents that run in OpenShell sandboxes. All a
| [OpenCode](https://opencode.ai/) | [`base`](https://github.com/NVIDIA/OpenShell-Community/tree/main/sandboxes/base) | Partial coverage | Pre-installed. Add `opencode.ai` endpoint and OpenCode binary paths to the policy for full functionality. |
| [Codex](https://developers.openai.com/codex) | [`base`](https://github.com/NVIDIA/OpenShell-Community/tree/main/sandboxes/base) | No coverage | Pre-installed. Requires a custom policy with OpenAI endpoints and Codex binary paths. Requires `OPENAI_API_KEY`. |
| [OpenClaw](https://openclaw.ai/) | [`openclaw`](https://github.com/NVIDIA/OpenShell-Community/tree/main/sandboxes/openclaw) | Bundled | Agent orchestration layer. Launch with `openshell sandbox create --from openclaw`. |
+| [Ollama](https://ollama.com/) | [`ollama`](https://github.com/NVIDIA/OpenShell-Community/tree/main/sandboxes/ollama) | Bundled | Run cloud and local models. Includes Claude Code, Codex, and OpenClaw. Launch with `openshell sandbox create --from ollama`. |

More community agent sandboxes are available in the {doc}`../sandboxes/community-sandboxes` catalog.

2 changes: 1 addition & 1 deletion docs/inference/configure.md
@@ -137,7 +137,7 @@ Use this endpoint when inference should stay local to the host for privacy and s

When the upstream runs on the same machine as the gateway, bind it to `0.0.0.0` and point the provider at `host.openshell.internal` or the host's LAN IP. `127.0.0.1` and `localhost` usually fail because the request originates from the gateway or sandbox runtime, not from your shell.
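For an Ollama upstream, for example, this typically amounts to the following (a sketch assuming Ollama's default port `11434`; vLLM and other engines have their own bind options):

```bash
# Bind Ollama to all interfaces so the gateway or sandbox runtime
# can reach it, rather than the loopback-only default.
OLLAMA_HOST=0.0.0.0 ollama serve

# Then point the provider at the host alias, not localhost, e.g.:
#   http://host.openshell.internal:11434/v1
```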

-If the gateway runs on a remote host or behind a cloud deployment, `host.openshell.internal` points to that remote machine, not to your laptop. A laptop-local Ollama or vLLM process is not reachable from a remote gateway unless you add your own tunnel or shared network path.
+If the gateway runs on a remote host or behind a cloud deployment, `host.openshell.internal` points to that remote machine, not to your laptop. A locally running Ollama or vLLM process is not reachable from a remote gateway unless you add your own tunnel or shared network path. Ollama also supports cloud-hosted models that do not require local hardware.
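One way to build such a tunnel is an SSH reverse forward that exposes the laptop-local port on the remote gateway host (shown here for Ollama's default port `11434`; `user@remote-host` and the port are placeholders for your setup):

```bash
# Forward remote port 11434 back to the laptop's local Ollama,
# so the remote gateway can reach it via its own loopback.
ssh -N -R 11434:127.0.0.1:11434 user@remote-host
```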

### Verify the Endpoint from a Sandbox

1 change: 1 addition & 0 deletions docs/sandboxes/community-sandboxes.md
@@ -43,6 +43,7 @@ The following community sandboxes are available in the catalog.
| Sandbox | Description |
|---|---|
| `base` | Foundational image with system tools and dev environment |
+| `ollama` | Ollama with cloud and local model support, Claude Code, Codex, and OpenClaw pre-installed |
| `openclaw` | Open agent manipulation and control |
| `sdg` | Synthetic data generation workflows |

6 changes: 3 additions & 3 deletions docs/tutorials/index.md
@@ -44,11 +44,11 @@ Launch Claude Code in a sandbox, diagnose a policy denial, and iterate on a cust
{bdg-secondary}`Tutorial`
:::

-:::{grid-item-card} Local Inference with Ollama
+:::{grid-item-card} Inference with Ollama
:link: local-inference-ollama
:link-type: doc

-Route inference to a local Ollama server, verify it from a sandbox, and reuse the same pattern for other OpenAI-compatible engines.
+Route inference through Ollama using cloud-hosted or local models, and verify it from a sandbox.
+++
{bdg-secondary}`Tutorial`
:::
@@ -68,6 +68,6 @@ Route inference to a local LM Studio server via the OpenAI or Anthropic compatib

First Network Policy <first-network-policy>
GitHub Push Access <github-sandbox>
-Local Inference with Ollama <local-inference-ollama>
+Inference with Ollama <local-inference-ollama>
Local Inference with LM Studio <local-inference-lmstudio>
```