Add support for forcing GPT-5 Codex model configuration

- Introduced a new `ForceGPT5Codex` configuration option in settings.
- Added relevant API endpoints for managing `ForceGPT5Codex`.
- Enhanced Codex client to handle GPT-5 Codex-specific logic and model mapping (sketched below).
- Updated example configuration file to include the new option.

Add GPT-5 Codex model support and configuration options in documentation
Luis Pater
2025-09-16 04:27:36 +08:00
parent fa762e69a4
commit b84cbee77a
9 changed files with 127 additions and 4 deletions

@@ -229,6 +229,7 @@ console.log(await claudeResponse.json());
- gemini-2.5-flash
- gemini-2.5-flash-lite
- gpt-5
- gpt-5-codex
- claude-opus-4-1-20250805
- claude-opus-4-20250514
- claude-sonnet-4-20250514
@@ -262,6 +263,7 @@ The server uses a YAML configuration file (`config.yaml`) located in the project
| `debug` | boolean | false | Enable debug mode for verbose logging. |
| `api-keys` | string[] | [] | List of API keys that can be used to authenticate requests. |
| `generative-language-api-key` | string[] | [] | List of Generative Language API keys. |
| `force-gpt-5-codex` | boolean | false | Force the conversion of GPT-5 calls to GPT-5 Codex. |
| `codex-api-key` | object | {} | List of Codex API keys. |
| `codex-api-key.api-key` | string | "" | Codex API key. |
| `codex-api-key.base-url` | string | "" | Custom Codex API endpoint, if you use a third-party API endpoint. |
@@ -322,6 +324,9 @@ generative-language-api-key:
- "AIzaSy...03"
- "AIzaSy...04"
# Force the conversion of GPT-5 calls to GPT-5 Codex.
force-gpt-5-codex: true
# Codex API keys
codex-api-key:
- api-key: "sk-atSM..."
@@ -423,7 +428,7 @@ export ANTHROPIC_MODEL=gemini-2.5-pro
export ANTHROPIC_SMALL_FAST_MODEL=gemini-2.5-flash
```
Using OpenAI models:
Using OpenAI GPT-5 models:
```bash
export ANTHROPIC_BASE_URL=http://127.0.0.1:8317
export ANTHROPIC_AUTH_TOKEN=sk-dummy
@@ -431,6 +436,14 @@ export ANTHROPIC_MODEL=gpt-5
export ANTHROPIC_SMALL_FAST_MODEL=gpt-5-minimal
```
Using OpenAI GPT-5 Codex models:
```bash
export ANTHROPIC_BASE_URL=http://127.0.0.1:8317
export ANTHROPIC_AUTH_TOKEN=sk-dummy
export ANTHROPIC_MODEL=gpt-5-codex
export ANTHROPIC_SMALL_FAST_MODEL=gpt-5-codex-low
```
Using Claude models:
```bash
export ANTHROPIC_BASE_URL=http://127.0.0.1:8317
@@ -454,7 +467,7 @@ Start CLI Proxy API server, and then edit the `~/.codex/config.toml` and `~/.cod
config.toml:
```toml
model_provider = "cliproxyapi"
model = "gpt-5" # You can use any of the models that we support.
model = "gpt-5-codex" # Or gpt-5, you can also use any of the models that we support.
model_reasoning_effort = "high"
[model_providers.cliproxyapi]