{
"customModels": [
{
"model": "gpt-5.4",
"id": "custom:GPT-5.4-0",
"index": 0,
"baseUrl": "https://myresource.cognitiveservices.azure.com/openai/v1/",
"apiKey": "mykey",
"displayName": "GPT-5.4",
"maxOutputTokens": 16384,
"noImageSupport": false,
"provider": "openai"
},
{
"model": "gpt-5.3-codex",
"id": "custom:GPT-5.3-codex-1",
"index": 1,
"baseUrl": "https://myresource.cognitiveservices.azure.com/openai/v1/",
"apiKey": "mykey",
"displayName": "GPT-5.3-codex",
"maxOutputTokens": 16384,
"noImageSupport": false,
"provider": "openai"
},
{
"model": "gpt-5.2",
"id": "custom:GPT-5.2-2",
"index": 2,
"baseUrl": "https://myresource.cognitiveservices.azure.com/openai/v1/",
"apiKey": "mykey",
"displayName": "GPT-5.2",
"maxOutputTokens": 16384,
"noImageSupport": false,
"provider": "openai"
}
]
}
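Before suspecting the client, it can help to rule out a malformed settings file. The sketch below is an assumption-laden sanity check, not Factory's documented schema: it only verifies that each `customModels` entry carries the keys seen in the configuration above, and that `index` and `baseUrl` look consistent with that example.

```python
import json

# Keys every "customModels" entry carries in the example config above.
# This key list is inferred from that config, not from an official schema.
REQUIRED_KEYS = {
    "model", "id", "index", "baseUrl", "apiKey",
    "displayName", "maxOutputTokens", "noImageSupport", "provider",
}

def validate_settings(text: str) -> list[str]:
    """Return a list of problems found in a settings.json payload."""
    problems = []
    data = json.loads(text)
    for i, entry in enumerate(data.get("customModels", [])):
        missing = REQUIRED_KEYS - entry.keys()
        if missing:
            problems.append(f"entry {i} missing keys: {sorted(missing)}")
        # The example entries number themselves 0, 1, 2 in order; whether
        # this is actually required is an assumption.
        if entry.get("index") != i:
            problems.append(f"entry {i} has index {entry.get('index')}")
        if not str(entry.get("baseUrl", "")).endswith("/"):
            problems.append(f"entry {i} baseUrl does not end with '/'")
    return problems
```

Running `validate_settings(open("settings.json").read())` and getting an empty list at least rules out the obvious structural mistakes.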
Hi,

I am encountering an issue where only gpt-5.2 works correctly, while the other models (gpt-5.3-codex and gpt-5.4) fail in my current setup. Notably, all of these models work without issues when using codex-cli.

Error Message:
BYOK Error: OpenAI response failed: OpenAI response failed

Environment:

Configuration (~/.factory/settings.json): see the JSON above.
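Since the same models respond through codex-cli, one way to isolate the client is to send a raw request to the configured baseUrl per model and compare. This is a hedged sketch using only the Python standard library: it assumes the endpoint exposes an OpenAI-compatible chat/completions route under .../openai/v1/, and it uses a Bearer Authorization header, which some Azure deployments replace with an api-key header.

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, model: str) -> urllib.request.Request:
    """Build a minimal chat/completions request for one custom model.

    Assumes the OpenAI-compatible route under the Azure .../openai/v1/
    base URL; the Bearer header is an assumption, and some deployments
    expect an "api-key" header instead.
    """
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": "ping"}],
    }).encode()
    return urllib.request.Request(
        base_url.rstrip("/") + "/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )
```

Sending the built request with `urllib.request.urlopen(...)` for each of gpt-5.2, gpt-5.3-codex, and gpt-5.4 would show whether the failures come from the endpoint itself or only from the BYOK client path.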