Merged
14 changes: 9 additions & 5 deletions brev/launch.sh
@@ -566,19 +566,23 @@
}

set_inference_route() {
# Set the default inference route to one model (nvidia-endpoints + kimi-k2.5).
# Canonical CLI per brev/welcome-ui/SERVER_ARCHITECTURE.md: cluster inference set
# (CLI_BIN is either "openshell" or "nemoclaw"; both accept the same subcommands.)
# Try canonical first, then "inference set" as fallback for other CLI versions.
log "Configuring inference route..."

if "$CLI_BIN" inference set --provider nvidia-endpoints --model minimaxai/minimax-m2.5 >/dev/null 2>&1; then
log "Configured inference via '$CLI_BIN inference set'."
if "$CLI_BIN" cluster inference set --provider nvidia-endpoints --model moonshotai/kimi-k2.5 --no-verify >/dev/null 2>&1; then
log "Configured inference via '$CLI_BIN cluster inference set'."
return
fi

if "$CLI_BIN" cluster inference set --provider nvidia-endpoints --model minimaxai/minimax-m2.5 >/dev/null 2>&1; then
log "Configured inference via legacy '$CLI_BIN cluster inference set'."
if "$CLI_BIN" inference set --provider nvidia-endpoints --model moonshotai/kimi-k2.5 --no-verify >/dev/null 2>&1; then
log "Configured inference via '$CLI_BIN inference set'."
return
fi

log "Unable to configure inference route with either current or legacy CLI commands."
log "Unable to configure inference route with either 'cluster inference set' or 'inference set'."
exit 1
}

19 changes: 19 additions & 0 deletions brev/welcome-ui/SERVER_ARCHITECTURE.md
@@ -126,6 +126,8 @@ The server operates in **two distinct modes** depending on sandbox readiness:
| `LOG_FILE` | `/tmp/nemoclaw-sandbox-create.log` | Sandbox creation log (written by subprocess) |
| `PROVIDER_CONFIG_CACHE` | `/tmp/nemoclaw-provider-config-cache.json` | Provider config values cache |
| `OTHER_AGENTS_YAML` | `ROOT/other-agents.yaml` | YAML modal definition file |
| `INFERENCE_PROVIDERS_YAML` | `ROOT/inference-providers.yaml` | Inference provider picker and per-partner instructions |
| `NCP_LOGOS_DIR` | `SANDBOX_DIR/ncp-logos` | Partner and NVIDIA logos served at `/ncp-logos/*` |
| `NEMOCLAW_IMAGE` | `ghcr.io/nvidia/openshell-community/sandboxes/openclaw-nvidia:local` | Optional image override |
| `SANDBOX_PORT` | `18789` | Port the sandbox listens on (localhost) |

@@ -845,6 +847,21 @@ steps: # Array of instruction sections

4. **Fallback:** If YAML fails to parse or PyYAML is not installed, the placeholder is replaced with an HTML comment: `<!-- other-agents.yaml not available -->`

### Inference Provider Picker (`inference-providers.yaml`)

The server also renders `inference-providers.yaml` into HTML and injects it into `index.html`, replacing the `{{INFERENCE_PROVIDER_PICKER}}` placeholder. This provides:

- **Picker screen:** "Choose your inference provider" with NVIDIA (free) in its own row and paid partners in a 5×2 grid. Each tile has `data-provider-id` for JS.
- **Partner instruction blocks:** For each partner, a `div#provider-instructions-{id}` with title, intro, and steps (same schema as other-agents steps). Shown when the user clicks a partner.

**YAML schema:** Top-level `nvidia: { displayName, logoFile }` and `partners: [ { id, name, logoFile, instructions: { title, intro?, steps[] } } ]`. Logo filenames refer to files under `NCP_LOGOS_DIR`, served at `GET /ncp-logos/<filename>`.
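Following that schema, a minimal `inference-providers.yaml` might look like the sketch below. The partner id, names, and logo filenames are illustrative only, and since the step objects reuse the other-agents schema, the `title`/`body` fields shown per step are assumptions:

```yaml
nvidia:
  displayName: NVIDIA          # shown in the picker's standalone free row
  logoFile: nvidia.webp        # resolved under NCP_LOGOS_DIR, served at /ncp-logos/nvidia.webp
partners:
  - id: example-partner        # becomes data-provider-id and div#provider-instructions-example-partner
    name: Example Partner
    logoFile: example-partner.webp
    instructions:
      title: Get started with Example Partner
      intro: Optional one-line introduction (intro may be omitted).
      steps:
        - title: Create an API key          # step fields assumed to mirror other-agents.yaml
          body: Sign in to the partner console and generate a key.
```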

**Fallback:** If the file is missing or invalid, the placeholder is replaced with `<!-- inference-providers.yaml not available -->` and the Install OpenClaw modal shows only the NVIDIA API key view (no picker).

### NCP Logos Route (`GET /ncp-logos/*`)

In welcome-ui mode (before sandbox is ready), `GET /ncp-logos/<file>` serves static files from `SANDBOX_DIR/ncp-logos/`. Path traversal is rejected; only files under that directory are served. Used for NVIDIA and partner logos in the provider picker. MIME type for `.webp` is `image/webp`.
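The traversal check described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the server's actual code; the function and variable names are invented for the example:

```python
import mimetypes
import os

def resolve_logo_path(logos_dir: str, requested: str):
    """Resolve a /ncp-logos/<file> request against the logos directory.

    Returns (absolute_path, mime_type), or None when the requested path
    escapes the logos directory (path traversal).
    """
    # .webp is missing from some mimetypes registries, so register it.
    mimetypes.add_type("image/webp", ".webp")
    base = os.path.realpath(logos_dir)
    target = os.path.realpath(os.path.join(base, requested))
    # Reject anything that resolves outside the logos directory.
    if os.path.commonpath([base, target]) != base:
        return None
    mime, _ = mimetypes.guess_type(target)
    return target, mime or "application/octet-stream"
```

Using `realpath` before the `commonpath` comparison means symlink- and `..`-based escapes are both caught by the same check.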

---

## 9. Policy Management Pipeline
@@ -1146,6 +1163,8 @@ All CLI commands are executed via `subprocess.run()` or `subprocess.Popen()`. Ev
|------|------|----------|
| `ROOT/index.html` | First request to `/` | Yes |
| `ROOT/other-agents.yaml` | First request to `/` | No (graceful fallback) |
| `ROOT/inference-providers.yaml` | First request to `/` | No (graceful fallback) |
| `SANDBOX_DIR/ncp-logos/*` | `GET /ncp-logos/<file>` | No (optional logos) |
| `ROOT/styles.css` | Static file serving | Yes (for UI) |
| `ROOT/app.js` | Static file serving | Yes (for UI) |
| `SANDBOX_DIR/policy.yaml` | Sandbox creation | No (graceful fallback) |
5 changes: 3 additions & 2 deletions brev/welcome-ui/__tests__/cluster-inference.test.js
@@ -151,7 +151,7 @@ describe("POST /api/cluster-inference", () => {
expect(res.status).toBe(400);
});

it("TC-CI10: calls nemoclaw cluster inference set with --provider and --model", async () => {
it("TC-CI10: calls CLI (openshell or nemoclaw) with cluster inference set or inference set, --provider, --model, --no-verify", async () => {
execFile.mockImplementation((cmd, args, opts, cb) => {
if (typeof opts === "function") { cb = opts; opts = {}; }
cb(null, "", "");
@@ -162,13 +162,14 @@
.send({ providerName: "test-prov", modelId: "test-model" });

const setCall = execFile.mock.calls.find(
(c) => c[0] === "nemoclaw" && c[1]?.includes("inference") && c[1]?.includes("set")
(c) => (c[0] === "openshell" || c[0] === "nemoclaw") && c[1]?.includes("inference") && c[1]?.includes("set")
);
expect(setCall).toBeDefined();
const args = setCall[1];
expect(args).toContain("--provider");
expect(args).toContain("test-prov");
expect(args).toContain("--model");
expect(args).toContain("test-model");
expect(args).toContain("--no-verify");
});
});
40 changes: 39 additions & 1 deletion brev/welcome-ui/__tests__/template-render.test.js
@@ -3,7 +3,13 @@

import { describe, it, expect, beforeEach } from 'vitest';
import serverModule from '../server.js';
const { renderOtherAgentsModal, getRenderedIndex, escapeHtml, _resetForTesting } = serverModule;
const {
renderOtherAgentsModal,
renderInferenceProviderPickerAndInstructions,
getRenderedIndex,
escapeHtml,
_resetForTesting,
} = serverModule;

// === TC-T01 through TC-T14: YAML-to-HTML template rendering ===

@@ -88,6 +94,26 @@ describe("renderOtherAgentsModal", () => {
});
});

describe("renderInferenceProviderPickerAndInstructions", () => {
it("TC-T15: picker contains NVIDIA row and heading when YAML present", () => {
const html = renderInferenceProviderPickerAndInstructions();
if (!html) return;
expect(html).toContain("Choose your inference provider");
expect(html).toContain('data-provider-id="nvidia"');
expect(html).toContain("Free endpoint provider");
});

it("TC-T16: picker contains partner grid and provider-instructions blocks", () => {
const html = renderInferenceProviderPickerAndInstructions();
if (!html) return;
expect(html).toContain("provider-picker__grid");
expect(html).toContain("install-partner-instructions");
expect(html).toContain("partner-instructions-content");
expect(html).toContain("provider-instructions-");
expect(html).toContain("Back to providers");
});
});

describe("getRenderedIndex", () => {
beforeEach(() => {
_resetForTesting();
@@ -112,4 +138,16 @@
const hasComment = html.includes("<!-- other-agents.yaml not available -->");
expect(hasModal || hasComment).toBe(true);
});

it("TC-T17: {{INFERENCE_PROVIDER_PICKER}} is replaced in index.html", () => {
const html = getRenderedIndex();
expect(html).not.toContain("{{INFERENCE_PROVIDER_PICKER}}");
});

it("TC-T18: inference picker replaced with content or fallback comment", () => {
const html = getRenderedIndex();
const hasPicker = html.includes("install-provider-picker");
const hasComment = html.includes("<!-- inference-providers.yaml not available -->");
expect(hasPicker || hasComment).toBe(true);
});
});
87 changes: 83 additions & 4 deletions brev/welcome-ui/app.js
@@ -14,6 +14,9 @@

// Install modal elements
const installMain = $("#install-main");
const installNvidiaKey = $("#install-nvidia-key");
const installProviderPicker = $("#install-provider-picker");
const installPartnerInstructions = $("#install-partner-instructions");
const stepError = $("#install-step-error");
const apiKeyInput = $("#api-key-input");
const toggleKeyVis = $("#toggle-key-vis");
@@ -233,6 +236,24 @@
stepError.hidden = true;
}

// -- Install modal screens (picker / NVIDIA / partner) -----------------

function showInstallScreen(screen, providerId) {
if (installProviderPicker) installProviderPicker.hidden = screen !== "picker";
if (installNvidiaKey) installNvidiaKey.hidden = screen !== "nvidia";
if (installPartnerInstructions) {
installPartnerInstructions.hidden = screen !== "partner";
if (screen === "partner" && providerId) {
const content = $("#partner-instructions-content");
if (content) {
content.querySelectorAll("[id^=\"provider-instructions-\"]").forEach((el) => {
el.hidden = el.id !== "provider-instructions-" + providerId;
});
}
}
}
}

function showError(msg) {
stopPolling();
installMain.hidden = true;
@@ -387,6 +408,7 @@
updateButtonState();

showOverlay(overlayInstall);
showInstallScreen("nvidia");
if (!keyInjected) {
startPolling();
}
@@ -398,6 +420,7 @@
updateButtonState();

showOverlay(overlayInstall);
showInstallScreen("nvidia");
startPolling();
}
} catch {
@@ -425,13 +448,19 @@
showOverlay(overlayInstall);
if (installFailed) {
stepError.hidden = false;
installMain.hidden = true;
if (installProviderPicker) installProviderPicker.hidden = true;
if (installNvidiaKey) installNvidiaKey.hidden = true;
if (installPartnerInstructions) installPartnerInstructions.hidden = true;
} else {
showMainView();
if (installProviderPicker) {
showInstallScreen("picker");
} else {
showInstallScreen("nvidia");
apiKeyInput.focus();
if (!installTriggered) triggerInstall();
}
}
apiKeyInput.focus();
updateButtonState();
if (!installTriggered && !installFailed) triggerInstall();
});

cardOther.addEventListener("click", () => {
@@ -442,6 +471,56 @@
closeInstall.addEventListener("click", () => hideOverlay(overlayInstall));
closeInstr.addEventListener("click", () => hideOverlay(overlayInstr));

const backFromNvidia = $("#back-from-nvidia");
if (backFromNvidia) {
backFromNvidia.addEventListener("click", () => showInstallScreen("picker"));
}
document.addEventListener("click", (e) => {
const backPartner = e.target.closest("#back-from-partner");
if (backPartner) showInstallScreen("picker");
});

function handleProviderChoice(id) {
if (id === "nvidia") {
showInstallScreen("nvidia");
apiKeyInput.focus();
if (!installTriggered && !installFailed) triggerInstall();
} else {
showInstallScreen("partner", id);
}
}

function onProviderPickerClick(e) {
const tile = e.target.closest("[data-provider-id]");
if (!tile) return;
const id = tile.getAttribute("data-provider-id");
if (!id) return;
e.preventDefault();
e.stopPropagation();
handleProviderChoice(id);
}

function onProviderPickerKeydown(e) {
if (e.key !== "Enter" && e.key !== " ") return;
const tile = e.target.closest("[data-provider-id]");
if (!tile) return;
const id = tile.getAttribute("data-provider-id");
if (!id) return;
e.preventDefault();
handleProviderChoice(id);
}

if (installProviderPicker) {
installProviderPicker.addEventListener("click", onProviderPickerClick);
installProviderPicker.addEventListener("keydown", onProviderPickerKeydown);
} else {
const installBody = $("#install-body");
if (installBody) {
installBody.addEventListener("click", onProviderPickerClick);
installBody.addEventListener("keydown", onProviderPickerKeydown);
}
}

closeOnBackdrop(overlayInstall);
closeOnBackdrop(overlayInstr);

11 changes: 8 additions & 3 deletions brev/welcome-ui/index.html
@@ -6,7 +6,7 @@
<title>OpenShell — Agent Sandbox</title>
<link rel="icon" type="image/svg+xml" href="/OpenShell-Icon.svg">
<link rel="alternate icon" href="/favicon.ico" sizes="any">
<link rel="stylesheet" href="styles.css?v=7">
<link rel="stylesheet" href="styles.css?v=8">
<link rel="preconnect" href="https://fonts.googleapis.com">
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
<link href="https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600;700;800&display=swap" rel="stylesheet">
@@ -82,7 +82,11 @@ <h3 class="modal__title">Install OpenClaw</h3>
</div>
<div class="modal__body" id="install-body">

<div id="install-main">
{{INFERENCE_PROVIDER_PICKER}}

<div id="install-nvidia-key" class="install-nvidia-key" hidden>
<button type="button" class="provider-picker__back" id="back-from-nvidia">Back to providers</button>
<div id="install-main">
<!-- Zone A: Active user input — interactive from second zero -->
<div class="zone-input">
<p class="zone-input__desc">
@@ -139,6 +143,7 @@ <h3 class="modal__title">Install OpenClaw</h3>
<span id="btn-launch-label">Waiting for API key...</span>
<svg class="btn__external-icon" viewBox="0 0 24 24" fill="none"><path d="M18 13v6a2 2 0 0 1-2 2H5a2 2 0 0 1-2-2V8a2 2 0 0 1 2-2h6"/><polyline points="15 3 21 3 21 9"/><line x1="10" x2="21" y1="14" y2="3"/></svg>
</button>
</div>
</div>

<!-- Error state -->
@@ -164,6 +169,6 @@ <h4 class="error-card__title">Something went wrong</h4>
OpenShell Sandbox &middot; NVIDIA
</footer>

<script src="app.js?v=12"></script>
<script src="app.js?v=13"></script>
</body>
</html>