```shell
cd rust
./target/debug/claw --model "openai/gpt-4.1-mini" prompt "summarize this repository in one sentence"
```
## Supported Providers & Models
`claw` has three built-in provider backends. The provider is selected automatically based on the model name, falling back to whichever credential is present in the environment.
### Provider matrix
| Provider | Protocol | Auth env var(s) | Base URL env var | Default base URL |
|---|---|---|---|---|
| **Anthropic** (direct) | Anthropic Messages API | `ANTHROPIC_API_KEY` or `ANTHROPIC_AUTH_TOKEN` or OAuth (`claw login`) | `ANTHROPIC_BASE_URL` | `https://api.anthropic.com` |

The OpenAI-compatible backend also serves as the gateway for **OpenRouter**, **Ollama**, and any other service that speaks the OpenAI `/v1/chat/completions` wire format — just point `OPENAI_BASE_URL` at the service.
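For example, a minimal setup for each (the API key and proxy-free environment below are placeholders, not working credentials):

```shell
# Point the OpenAI-compatible backend at OpenRouter (key is a placeholder):
export OPENAI_BASE_URL="https://openrouter.ai/api/v1"
export OPENAI_API_KEY="sk-or-..."
claw --model "openai/gpt-4.1-mini" prompt "summarize this repository in one sentence"

# Or at a local Ollama server, which speaks the same wire format:
export OPENAI_BASE_URL="http://localhost:11434/v1"
claw --model "llama3.2" prompt "hello"
```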
### Tested models and aliases
These are the models registered in the built-in alias table with known token limits:
| Alias | Resolved model name | Provider | Max output tokens | Context window |
|---|---|---|---|---|

Any model name that does not match an alias is passed through verbatim. This is how you use OpenRouter model slugs (`openai/gpt-4.1-mini`), Ollama tags (`llama3.2`), or full Anthropic model IDs (`claude-sonnet-4-20250514`).
### User-defined aliases
You can add custom aliases in any settings file (`~/.claw/settings.json`, `.claw/settings.json`, or `.claw/settings.local.json`):
```json
{
  "aliases": {
    "fast": "claude-haiku-4-5-20251213",
    "smart": "claude-opus-4-6",
    "cheap": "grok-3-mini"
  }
}
```

Local project settings override user-level settings. Aliases resolve through the built-in table, so `"fast": "haiku"` also works.
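As a sketch, here is how a project-local override might be set up (the alias name and value are just examples):

```shell
# Create a project-local settings file; its aliases shadow user-level ones.
mkdir -p .claw
cat > .claw/settings.local.json <<'EOF'
{
  "aliases": {
    "fast": "haiku"
  }
}
EOF
# `claw --model fast` now resolves fast -> haiku -> the full Claude Haiku model name.
```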
### How provider detection works
1. If the resolved model name starts with `claude` → Anthropic.
2. If it starts with `grok` → xAI.
3. Otherwise, `claw` checks which credential is set: `ANTHROPIC_API_KEY`/`ANTHROPIC_AUTH_TOKEN` first, then `OPENAI_API_KEY`, then `XAI_API_KEY`.
4. If nothing matches, it defaults to Anthropic.

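The same order can be sketched as a shell function. This only mirrors the documented rules; it is not `claw`'s actual implementation:

```shell
# Illustrative only: mirrors the documented provider-detection order.
detect_provider() {
  case "$1" in
    claude*) echo "anthropic" ;;   # rule 1: model-name prefix
    grok*)   echo "xai" ;;         # rule 2: model-name prefix
    *)
      # rule 3: fall back to whichever credential is set, in order
      if [ -n "${ANTHROPIC_API_KEY:-}${ANTHROPIC_AUTH_TOKEN:-}" ]; then
        echo "anthropic"
      elif [ -n "${OPENAI_API_KEY:-}" ]; then
        echo "openai"
      elif [ -n "${XAI_API_KEY:-}" ]; then
        echo "xai"
      else
        echo "anthropic"           # rule 4: default
      fi ;;
  esac
}
```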
## FAQ
### What about Codex?
The name "codex" appears in the Claw Code ecosystem, but it does **not** refer to OpenAI Codex (the code-generation model). Here is what it means in this project:
- **`oh-my-codex` (OmX)** is the workflow and plugin layer that sits on top of `claw`. It provides planning modes, parallel multi-agent execution, notification routing, and other automation features. See [PHILOSOPHY.md](./PHILOSOPHY.md) and the [oh-my-codex repo](https://github.com/Yeachan-Heo/oh-my-codex).
- **`.codex/` directories** (e.g. `.codex/skills`, `.codex/agents`, `.codex/commands`) are legacy lookup paths that `claw` still scans alongside the primary `.claw/` directories.
- **`CODEX_HOME`** is an optional environment variable that points to a custom root for user-level skill and command lookups.

`claw` does **not** support OpenAI Codex sessions, the Codex CLI, or Codex session import/export. If you need to use OpenAI models (like GPT-4.1), configure the OpenAI-compatible provider as shown above in the [OpenAI-compatible endpoint](#openai-compatible-endpoint) and [OpenRouter](#openrouter) sections.
## HTTP proxy support
`claw` honours the standard `HTTP_PROXY`, `HTTPS_PROXY`, and `NO_PROXY` environment variables (both upper- and lower-case spellings are accepted) when issuing outbound requests to Anthropic, OpenAI-, and xAI-compatible endpoints. Set them before launching the CLI and the underlying `reqwest` client will be configured automatically.
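For example (the proxy address below is a placeholder for your own):

```shell
# Route all outbound requests through a proxy; keep local traffic direct.
export HTTPS_PROXY="http://proxy.example.com:8080"
export NO_PROXY="localhost,127.0.0.1"
# Then launch the CLI as usual, e.g.:
# claw prompt "hello"
```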