feat: introduce custom provider example and remove redundant debug logs

- Added `examples/custom-provider/main.go` showcasing custom executor and translator integration using the SDK.
- Removed redundant debug logs from translator modules to enhance code cleanliness.
- Updated SDK documentation with new usage and advanced examples.
- Expanded the Management API with new endpoints, including the request-log and force-GPT-5-Codex toggles and OAuth status polling.
Author: Luis Pater
Date: 2025-09-22 03:37:53 +08:00
Parent: d5ad5fab87
Commit: f81898c906
54 changed files with 907 additions and 101 deletions

View File

@@ -16,6 +16,10 @@ Note: The following options cannot be modified via API and must be set in the co
- `Authorization: Bearer <plaintext-key>`
- `X-Management-Key: <plaintext-key>`
Additional notes:
- If `remote-management.secret-key` is empty, the entire Management API is disabled (all `/v0/management` routes return 404).
- For remote IPs, 5 consecutive authentication failures trigger a temporary ban (~30 minutes) before further attempts are allowed.
If a plaintext key is detected in the config at startup, it will be bcrypt-hashed and written back to the config file automatically.
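Conceptually this is plain bcrypt; a minimal Go sketch of the hash-and-verify round trip (using `golang.org/x/crypto/bcrypt`, not a CLIProxyAPI API, shown only to illustrate what ends up in the config file):
```go
package main

import (
	"fmt"

	"golang.org/x/crypto/bcrypt"
)

func main() {
	// Hash a plaintext management key (roughly what the server does at startup).
	hash, err := bcrypt.GenerateFromPassword([]byte("my-plaintext-key"), bcrypt.DefaultCost)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(hash)) // the hashed value is what gets written back to the config

	// Verify an incoming key against the stored hash (roughly what authentication does).
	ok := bcrypt.CompareHashAndPassword(hash, []byte("my-plaintext-key")) == nil
	fmt.Println(ok) // true
}
```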
## Request/Response Conventions
@@ -62,6 +66,29 @@ If a plaintext key is detected in the config at startup, it will be bcrypthas
{ "status": "ok" }
```
### Force GPT-5 Codex
- GET `/force-gpt-5-codex` — Get current flag
- Request:
```bash
curl -H 'Authorization: Bearer <MANAGEMENT_KEY>' http://localhost:8317/v0/management/force-gpt-5-codex
```
- Response:
```json
{ "gpt-5-codex": false }
```
- PUT/PATCH `/force-gpt-5-codex` — Set boolean
- Request:
```bash
curl -X PUT -H 'Content-Type: application/json' \
-H 'Authorization: Bearer <MANAGEMENT_KEY>' \
-d '{"value":true}' \
http://localhost:8317/v0/management/force-gpt-5-codex
```
- Response:
```json
{ "status": "ok" }
```
### Proxy Server URL
- GET `/proxy-url` — Get the proxy URL string
- Request:
@@ -322,6 +349,29 @@ If a plaintext key is detected in the config at startup, it will be bcrypthas
{ "status": "ok" }
```
### Request Log
- GET `/request-log` — Get boolean
- Request:
```bash
curl -H 'Authorization: Bearer <MANAGEMENT_KEY>' http://localhost:8317/v0/management/request-log
```
- Response:
```json
{ "request-log": false }
```
- PUT/PATCH `/request-log` — Set boolean
- Request:
```bash
curl -X PATCH -H 'Content-Type: application/json' \
-H 'Authorization: Bearer <MANAGEMENT_KEY>' \
-d '{"value":true}' \
http://localhost:8317/v0/management/request-log
```
- Response:
```json
{ "status": "ok" }
```
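The boolean flag endpoints (`/request-log`, `/force-gpt-5-codex`, etc.) all share this shape, so they can also be toggled from code; a minimal Go sketch using only the endpoint and headers documented above (the key is a placeholder):
```go
package main

import (
	"bytes"
	"fmt"
	"net/http"
)

func main() {
	body := bytes.NewBufferString(`{"value":true}`)
	req, err := http.NewRequest(http.MethodPut, "http://localhost:8317/v0/management/request-log", body)
	if err != nil {
		panic(err)
	}
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Authorization", "Bearer <MANAGEMENT_KEY>") // placeholder key

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println(resp.Status) // expect 200 OK with {"status":"ok"}
}
```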
### Allow Localhost Unauthenticated
- GET `/allow-localhost-unauthenticated` — Get boolean
- Request:
@@ -564,6 +614,19 @@ These endpoints initiate provider login flows and return a URL to open in a brow
{ "status": "ok", "url": "https://..." }
```
- GET `/get-auth-status?state=<state>` — Poll OAuth flow status
- Request:
```bash
curl -H 'Authorization: Bearer <MANAGEMENT_KEY>' \
'http://localhost:8317/v0/management/get-auth-status?state=<STATE_FROM_AUTH_URL>'
```
- Response examples:
```json
{ "status": "wait" }
{ "status": "ok" }
{ "status": "error", "error": "Authentication failed" }
```
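Since the OAuth flow completes asynchronously, callers typically poll until the status leaves `wait`; a minimal Go sketch (the state and key values are placeholders):
```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"net/url"
	"time"
)

func main() {
	state := "<STATE_FROM_AUTH_URL>"
	endpoint := "http://localhost:8317/v0/management/get-auth-status?state=" + url.QueryEscape(state)

	for {
		req, err := http.NewRequest(http.MethodGet, endpoint, nil)
		if err != nil {
			panic(err)
		}
		req.Header.Set("Authorization", "Bearer <MANAGEMENT_KEY>") // placeholder key

		resp, err := http.DefaultClient.Do(req)
		if err != nil {
			panic(err)
		}

		var out struct {
			Status string `json:"status"`
			Error  string `json:"error"`
		}
		err = json.NewDecoder(resp.Body).Decode(&out)
		resp.Body.Close()
		if err != nil {
			panic(err)
		}

		switch out.Status {
		case "ok":
			fmt.Println("authenticated")
			return
		case "error":
			fmt.Println("auth failed:", out.Error)
			return
		default: // "wait" — keep polling
			time.Sleep(2 * time.Second)
		}
	}
}
```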
## Error Responses
Generic error format:

View File

@@ -18,6 +18,10 @@
若在启动时检测到配置中的管理密钥为明文,会自动使用 bcrypt 加密并回写到配置文件中。
其它说明:
- 若 `remote-management.secret-key` 为空,则管理 API 整体被禁用(所有 `/v0/management` 路由均返回 404)。
- 对于远程 IP,连续 5 次认证失败会触发临时封禁(约 30 分钟)。
## 请求/响应约定
- Content-Type:`application/json`(除非另有说明)。
@@ -62,6 +66,29 @@
{ "status": "ok" }
```
### 强制 GPT-5 Codex
- GET `/force-gpt-5-codex` — 获取当前标志
- 请求:
```bash
curl -H 'Authorization: Bearer <MANAGEMENT_KEY>' http://localhost:8317/v0/management/force-gpt-5-codex
```
- 响应:
```json
{ "gpt-5-codex": false }
```
- PUT/PATCH `/force-gpt-5-codex` — 设置布尔值
- 请求:
```bash
curl -X PUT -H 'Content-Type: application/json' \
-H 'Authorization: Bearer <MANAGEMENT_KEY>' \
-d '{"value":true}' \
http://localhost:8317/v0/management/force-gpt-5-codex
```
- 响应:
```json
{ "status": "ok" }
```
### 代理服务器 URL
- GET `/proxy-url` — 获取代理 URL 字符串
- 请求:
@@ -322,6 +349,29 @@
{ "status": "ok" }
```
### 请求日志开关
- GET `/request-log` — 获取布尔值
- 请求:
```bash
curl -H 'Authorization: Bearer <MANAGEMENT_KEY>' http://localhost:8317/v0/management/request-log
```
- 响应:
```json
{ "request-log": false }
```
- PUT/PATCH `/request-log` — 设置布尔值
- 请求:
```bash
curl -X PATCH -H 'Content-Type: application/json' \
-H 'Authorization: Bearer <MANAGEMENT_KEY>' \
-d '{"value":true}' \
http://localhost:8317/v0/management/request-log
```
- 响应:
```json
{ "status": "ok" }
```
### 允许本地未认证访问
- GET `/allow-localhost-unauthenticated` — 获取布尔值
- 请求:
@@ -564,6 +614,19 @@
{ "status": "ok", "url": "https://..." }
```
- GET `/get-auth-status?state=<state>` — 轮询 OAuth 流程状态
- 请求:
```bash
curl -H 'Authorization: Bearer <MANAGEMENT_KEY>' \
'http://localhost:8317/v0/management/get-auth-status?state=<STATE_FROM_AUTH_URL>'
```
- 响应示例:
```json
{ "status": "wait" }
{ "status": "ok" }
{ "status": "error", "error": "Authentication failed" }
```
## 错误响应
通用错误格式:

View File

@@ -28,7 +28,7 @@ The first Chinese provider has now been added: [Qwen Code](https://github.com/Qw
- Qwen Code multi-account load balancing
- OpenAI Codex multi-account load balancing
- OpenAI-compatible upstream providers via config (e.g., OpenRouter)
- Reusable Go SDK for embedding the proxy (see `docs/sdk-usage.md`)
- Reusable Go SDK for embedding the proxy (see `docs/sdk-usage.md`, 中文: `docs/sdk-usage_CN.md`)
## Installation
@@ -614,6 +614,11 @@ docker run --rm -p 8317:8317 -v /path/to/your/config.yaml:/CLIProxyAPI/config.ya
see [MANAGEMENT_API.md](MANAGEMENT_API.md)
## SDK Docs
- Usage: `docs/sdk-usage.md` (中文: `docs/sdk-usage_CN.md`)
- Advanced (executors & translators): `docs/sdk-advanced.md` (中文: `docs/sdk-advanced_CN.md`)
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.

View File

@@ -48,6 +48,7 @@
- 支持 Qwen Code 多账户轮询
- 支持 OpenAI Codex 多账户轮询
- 通过配置接入上游 OpenAI 兼容提供商(例如 OpenRouter)
- 可复用的 Go SDK(见 `docs/sdk-usage.md`)
## 安装
@@ -622,6 +623,12 @@ docker run --rm -p 8317:8317 -v /path/to/your/config.yaml:/CLIProxyAPI/config.ya
请参见 [MANAGEMENT_API_CN.md](MANAGEMENT_API_CN.md)
## SDK 文档
- 使用文档:`docs/sdk-usage_CN.md`(English: `docs/sdk-usage.md`)
- 高级(执行器与翻译器):`docs/sdk-advanced_CN.md`(English: `docs/sdk-advanced.md`)
- 自定义 Provider 示例:`examples/custom-provider`
## 贡献
欢迎贡献!请随时提交 Pull Request。

docs/sdk-advanced.md (new file, 138 lines)
View File

@@ -0,0 +1,138 @@
# SDK Advanced: Executors & Translators
This guide explains how to extend the embedded proxy with custom providers and schemas using the SDK. You will:
- Implement a provider executor that talks to your upstream API
- Register request/response translators for schema conversion
- Register models so they appear in `/v1/models`
The examples use Go 1.24+ and the v6 module path.
## Concepts
- Provider executor: a runtime component implementing `auth.ProviderExecutor` that performs outbound calls for a given provider key (e.g., `gemini`, `claude`, `codex`). Executors can also implement `RequestPreparer` to inject credentials on raw HTTP requests.
- Translator registry: schema conversion functions routed by `sdk/translator`. The builtin handlers translate between OpenAI/Gemini/Claude/Codex formats; you can register new ones.
- Model registry: publishes the list of available models per client/provider to power `/v1/models` and routing hints.
## 1) Implement a Provider Executor
Create a type that satisfies `auth.ProviderExecutor`.
```go
package myprov
import (
"context"
"net/http"
coreauth "github.com/router-for-me/CLIProxyAPI/v6/sdk/cliproxy/auth"
clipexec "github.com/router-for-me/CLIProxyAPI/v6/sdk/cliproxy/executor"
)
type Executor struct{}
func (Executor) Identifier() string { return "myprov" }
// Optional: mutate outbound HTTP requests with credentials
func (Executor) PrepareRequest(req *http.Request, a *coreauth.Auth) error {
// Example: req.Header.Set("Authorization", "Bearer "+a.APIKey)
return nil
}
func (Executor) Execute(ctx context.Context, a *coreauth.Auth, req clipexec.Request, opts clipexec.Options) (clipexec.Response, error) {
// Build HTTP request based on req.Payload (already translated into provider format)
// Use per-auth transport if provided: transport := a.RoundTripper // via RoundTripperProvider
// Perform call and return provider JSON payload
return clipexec.Response{Payload: []byte(`{"ok":true}`)}, nil
}
func (Executor) ExecuteStream(ctx context.Context, a *coreauth.Auth, req clipexec.Request, opts clipexec.Options) (<-chan clipexec.StreamChunk, error) {
ch := make(chan clipexec.StreamChunk, 1)
go func() { defer close(ch); ch <- clipexec.StreamChunk{Payload: []byte("data: {\"done\":true}\n\n")} }()
return ch, nil
}
func (Executor) Refresh(ctx context.Context, a *coreauth.Auth) (*coreauth.Auth, error) {
// Optionally refresh tokens and return updated auth
return a, nil
}
```
Register the executor with the core manager before starting the service:
```go
core := coreauth.NewManager(coreauth.NewFileStore(cfg.AuthDir), nil, nil)
core.RegisterExecutor(myprov.Executor{})
svc, _ := cliproxy.NewBuilder().WithConfig(cfg).WithConfigPath(cfgPath).WithCoreAuthManager(core).Build()
```
If your auth entries use provider `"myprov"`, the manager routes requests to your executor.
## 2) Register Translators
The handlers accept OpenAI/Gemini/Claude/Codex inputs. To support a new provider format, register translation functions in `sdk/translator`'s default registry.
Direction matters:
- Request: register from inbound schema to provider schema
- Response: register from provider schema back to inbound schema
Example: Convert OpenAI Chat → MyProv Chat and back.
```go
package myprov
import (
"context"
sdktr "github.com/router-for-me/CLIProxyAPI/v6/sdk/translator"
)
const (
FOpenAI = sdktr.Format("openai.chat")
FMyProv = sdktr.Format("myprov.chat")
)
func init() {
sdktr.Register(FOpenAI, FMyProv,
// Request transform (model, rawJSON, stream)
func(model string, raw []byte, stream bool) []byte { return convertOpenAIToMyProv(model, raw, stream) },
// Response transform (stream & nonstream)
sdktr.ResponseTransform{
Stream: func(ctx context.Context, model string, originalReq, translatedReq, raw []byte, param *any) []string {
return convertStreamMyProvToOpenAI(model, originalReq, translatedReq, raw)
},
NonStream: func(ctx context.Context, model string, originalReq, translatedReq, raw []byte, param *any) string {
return convertMyProvToOpenAI(model, originalReq, translatedReq, raw)
},
},
)
}
```
When the OpenAI handler receives a request that should route to `myprov`, the pipeline uses the registered transforms automatically.
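The `convert*` helpers above are yours to implement. Below is a minimal, illustrative sketch that assumes a hypothetical MyProv schema (`model`/`prompt` request, `text` response); the builtin translators in this repository use `gjson`/`sjson` in a similar way, but the exact field mapping here is invented for the example:
```go
package myprov

import (
	"strings"

	"github.com/tidwall/gjson"
	"github.com/tidwall/sjson"
)

// convertOpenAIToMyProv maps an OpenAI Chat Completions request onto the
// hypothetical MyProv request shape {"model":"","prompt":"","stream":false}.
func convertOpenAIToMyProv(model string, raw []byte, stream bool) []byte {
	out := []byte(`{"model":"","prompt":"","stream":false}`)
	out, _ = sjson.SetBytes(out, "model", model)
	out, _ = sjson.SetBytes(out, "stream", stream)
	// Concatenate message contents into a single prompt (demo only).
	var prompt strings.Builder
	for _, msg := range gjson.GetBytes(raw, "messages").Array() {
		prompt.WriteString(msg.Get("content").String())
		prompt.WriteString("\n")
	}
	out, _ = sjson.SetBytes(out, "prompt", prompt.String())
	return out
}

// convertMyProvToOpenAI wraps a hypothetical MyProv response {"text":"..."}
// into a minimal OpenAI Chat Completions-style response.
func convertMyProvToOpenAI(model string, originalReq, translatedReq, raw []byte) string {
	out := `{"object":"chat.completion","choices":[{"index":0,"message":{"role":"assistant","content":""},"finish_reason":"stop"}]}`
	out, _ = sjson.Set(out, "model", model)
	out, _ = sjson.Set(out, "choices.0.message.content", gjson.GetBytes(raw, "text").String())
	return out
}

// convertStreamMyProvToOpenAI emits one chunk per upstream payload (demo only).
func convertStreamMyProvToOpenAI(model string, originalReq, translatedReq, raw []byte) []string {
	return []string{convertMyProvToOpenAI(model, originalReq, translatedReq, raw)}
}
```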
## 3) Register Models
Expose models under `/v1/models` by registering them in the global model registry using the auth ID (client ID) and provider name.
```go
models := []*cliproxy.ModelInfo{
{ ID: "myprov-pro-1", Object: "model", Type: "myprov", DisplayName: "MyProv Pro 1" },
}
cliproxy.GlobalModelRegistry().RegisterClient(authID, "myprov", models)
```
The embedded server calls this automatically for builtin providers; for custom providers, register during startup (e.g., after auths are loaded) or from auth registration hooks, as sketched below.
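A minimal sketch of that startup registration, mirroring `examples/custom-provider/main.go` (names such as `cfg`, `cfgPath`, and `core` come from the snippets above):
```go
hooks := cliproxy.Hooks{
	OnAfterStart: func(s *cliproxy.Service) {
		models := []*cliproxy.ModelInfo{
			{ID: "myprov-pro-1", Object: "model", Type: "myprov", DisplayName: "MyProv Pro 1"},
		}
		// Register the model list for every loaded auth that belongs to "myprov".
		for _, a := range core.List() {
			if strings.EqualFold(a.Provider, "myprov") {
				cliproxy.GlobalModelRegistry().RegisterClient(a.ID, "myprov", models)
			}
		}
	},
}
svc, _ := cliproxy.NewBuilder().
	WithConfig(cfg).
	WithConfigPath(cfgPath).
	WithCoreAuthManager(core).
	WithHooks(hooks).
	Build()
```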
## Credentials & Transports
- Use `Manager.SetRoundTripperProvider` to inject per-auth `*http.Transport` (e.g., proxy):
```go
core.SetRoundTripperProvider(myProvider) // returns transport per auth
```
- For raw HTTP flows, implement `PrepareRequest` and/or call `Manager.InjectCredentials(req, authID)` to set headers.
## Testing Tips
- Enable request logging: Management API GET/PUT `/v0/management/request-log`
- Toggle debug logs: Management API GET/PUT `/v0/management/debug`
- Hot reload: changes in `config.yaml` and `auths/` are picked up automatically by the watcher

docs/sdk-advanced_CN.md (new file, 131 lines)
View File

@@ -0,0 +1,131 @@
# SDK 高级指南:执行器与翻译器
本文介绍如何使用 SDK 扩展内嵌代理:
- 实现自定义 Provider 执行器以调用你的上游 API
- 注册请求/响应翻译器进行协议转换
- 注册模型以出现在 `/v1/models`
示例基于 Go 1.24+ 与 v6 模块路径。
## 概念
- Provider 执行器:实现 `auth.ProviderExecutor` 的运行时组件,负责某个 provider key(如 `gemini`、`claude`、`codex`)的真正出站调用。若实现 `RequestPreparer` 接口,可在原始 HTTP 请求上注入凭据。
- 翻译器注册表:由 `sdk/translator` 驱动的协议转换函数。内置了 OpenAI/Gemini/Claude/Codex 的互转;你也可以注册新的格式转换。
- 模型注册表:对外发布可用模型列表,供 `/v1/models` 与路由参考。
## 1) 实现 Provider 执行器
创建类型满足 `auth.ProviderExecutor` 接口。
```go
package myprov
import (
"context"
"net/http"
coreauth "github.com/router-for-me/CLIProxyAPI/v6/sdk/cliproxy/auth"
clipexec "github.com/router-for-me/CLIProxyAPI/v6/sdk/cliproxy/executor"
)
type Executor struct{}
func (Executor) Identifier() string { return "myprov" }
// 可选:在原始 HTTP 请求上注入凭据
func (Executor) PrepareRequest(req *http.Request, a *coreauth.Auth) error {
// 例如:req.Header.Set("Authorization", "Bearer "+a.Attributes["api_key"])
return nil
}
func (Executor) Execute(ctx context.Context, a *coreauth.Auth, req clipexec.Request, opts clipexec.Options) (clipexec.Response, error) {
// 基于 req.Payload 构造上游请求,返回上游 JSON 负载
return clipexec.Response{Payload: []byte(`{"ok":true}`)}, nil
}
func (Executor) ExecuteStream(ctx context.Context, a *coreauth.Auth, req clipexec.Request, opts clipexec.Options) (<-chan clipexec.StreamChunk, error) {
ch := make(chan clipexec.StreamChunk, 1)
go func() { defer close(ch); ch <- clipexec.StreamChunk{Payload: []byte("data: {\"done\":true}\n\n")} }()
return ch, nil
}
func (Executor) Refresh(ctx context.Context, a *coreauth.Auth) (*coreauth.Auth, error) { return a, nil }
```
在启动服务前将执行器注册到核心管理器:
```go
core := coreauth.NewManager(coreauth.NewFileStore(cfg.AuthDir), nil, nil)
core.RegisterExecutor(myprov.Executor{})
svc, _ := cliproxy.NewBuilder().WithConfig(cfg).WithConfigPath(cfgPath).WithCoreAuthManager(core).Build()
```
当凭据的 `Provider` 为 `"myprov"` 时,管理器会将请求路由到你的执行器。
## 2) 注册翻译器
内置处理器接受 OpenAI/Gemini/Claude/Codex 的入站格式。要支持新的 provider 协议,需要在 `sdk/translator` 的默认注册表中注册转换函数。
方向很重要:
- 请求:从“入站格式”转换为“provider 格式”
- 响应:从“provider 格式”转换回“入站格式”
示例:OpenAI Chat → MyProv Chat 及其反向。
```go
package myprov
import (
"context"
sdktr "github.com/router-for-me/CLIProxyAPI/v6/sdk/translator"
)
const (
FOpenAI = sdktr.Format("openai.chat")
FMyProv = sdktr.Format("myprov.chat")
)
func init() {
sdktr.Register(FOpenAI, FMyProv,
func(model string, raw []byte, stream bool) []byte { return convertOpenAIToMyProv(model, raw, stream) },
sdktr.ResponseTransform{
Stream: func(ctx context.Context, model string, originalReq, translatedReq, raw []byte, param *any) []string {
return convertStreamMyProvToOpenAI(model, originalReq, translatedReq, raw)
},
NonStream: func(ctx context.Context, model string, originalReq, translatedReq, raw []byte, param *any) string {
return convertMyProvToOpenAI(model, originalReq, translatedReq, raw)
},
},
)
}
```
当 OpenAI 处理器接到需要路由到 `myprov` 的请求时,流水线会自动应用已注册的转换。
## 3) 注册模型
通过全局模型注册表将模型暴露到 `/v1/models`
```go
models := []*cliproxy.ModelInfo{
{ ID: "myprov-pro-1", Object: "model", Type: "myprov", DisplayName: "MyProv Pro 1" },
}
cliproxy.GlobalModelRegistry().RegisterClient(authID, "myprov", models)
```
内置 Provider 会自动注册;自定义 Provider 建议在启动时(例如加载到 Auth 后)或在 Auth 注册钩子中调用。
## 凭据与传输
- 使用 `Manager.SetRoundTripperProvider` 注入按账户的 `*http.Transport`(例如代理):
```go
core.SetRoundTripperProvider(myProvider) // 按账户返回 transport
```
- 对于原始 HTTP 请求,可实现 `PrepareRequest`,或调用 `Manager.InjectCredentials(req, authID)` 注入请求头。
## 测试建议
- 启用请求日志:管理 API GET/PUT `/v0/management/request-log`
- 切换调试日志:管理 API GET/PUT `/v0/management/debug`
- 热更新:`config.yaml` 与 `auths/` 变化会自动被侦测并应用

docs/sdk-usage.md (new file, 163 lines)
View File

@@ -0,0 +1,163 @@
# CLI Proxy SDK Guide
The `sdk/cliproxy` module exposes the proxy as a reusable Go library so external programs can embed the routing, authentication, hot-reload, and translation layers without depending on the CLI binary.
## Install & Import
```bash
go get github.com/router-for-me/CLIProxyAPI/v6/sdk/cliproxy
```
```go
import (
"context"
"errors"
"time"
"github.com/router-for-me/CLIProxyAPI/v6/internal/config"
"github.com/router-for-me/CLIProxyAPI/v6/sdk/cliproxy"
)
```
Note the `/v6` module path.
## Minimal Embed
```go
cfg, err := config.LoadConfig("config.yaml")
if err != nil { panic(err) }
svc, err := cliproxy.NewBuilder().
WithConfig(cfg).
WithConfigPath("config.yaml"). // absolute or working-dir relative
Build()
if err != nil { panic(err) }
ctx, cancel := context.WithCancel(context.Background())
defer cancel()
if err := svc.Run(ctx); err != nil && !errors.Is(err, context.Canceled) {
panic(err)
}
```
The service manages config/auth watching, background token refresh, and graceful shutdown. Cancel the context to stop it.
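To tie that cancellation to OS signals, a minimal sketch using only the standard library (`os/signal`, `syscall`), not a cliproxy API:
```go
// Additional imports: "os", "os/signal", "syscall".
ctx, stop := signal.NotifyContext(context.Background(), os.Interrupt, syscall.SIGTERM)
defer stop()

if err := svc.Run(ctx); err != nil && !errors.Is(err, context.Canceled) {
	panic(err)
}
```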
## Server Options (middleware, routes, logs)
The server accepts options via `WithServerOptions`:
```go
svc, _ := cliproxy.NewBuilder().
WithConfig(cfg).
WithConfigPath("config.yaml").
WithServerOptions(
// Add global middleware
cliproxy.WithMiddleware(func(c *gin.Context) { c.Header("X-Embed", "1"); c.Next() }),
// Tweak gin engine early (CORS, trusted proxies, etc.)
cliproxy.WithEngineConfigurator(func(e *gin.Engine) { e.ForwardedByClientIP = true }),
// Add your own routes after defaults
cliproxy.WithRouterConfigurator(func(e *gin.Engine, _ *handlers.BaseAPIHandler, _ *config.Config) {
e.GET("/healthz", func(c *gin.Context) { c.String(200, "ok") })
}),
// Override request log writer/dir
cliproxy.WithRequestLoggerFactory(func(cfg *config.Config, cfgPath string) logging.RequestLogger {
return logging.NewFileRequestLogger(true, "logs", filepath.Dir(cfgPath))
}),
).
Build()
```
These options mirror the internals used by the CLI server.
## Management API (when embedded)
- Management endpoints are mounted only when `remote-management.secret-key` is set in `config.yaml`.
- Remote access additionally requires `remote-management.allow-remote: true`.
- See MANAGEMENT_API.md for endpoints. Your embedded server exposes them under `/v0/management` on the configured port.
## Using the Core Auth Manager
The service uses a core `auth.Manager` for selection, execution, and auto-refresh. When embedding, you can provide your own manager to customize transports or hooks:
```go
core := coreauth.NewManager(coreauth.NewFileStore(cfg.AuthDir), nil, nil)
core.SetRoundTripperProvider(myRTProvider) // per-auth *http.Transport
svc, _ := cliproxy.NewBuilder().
WithConfig(cfg).
WithConfigPath("config.yaml").
WithCoreAuthManager(core).
Build()
```
Implement a custom per-auth transport:
```go
type myRTProvider struct{}
func (myRTProvider) RoundTripperFor(a *coreauth.Auth) http.RoundTripper {
if a == nil || a.ProxyURL == "" { return nil }
u, _ := url.Parse(a.ProxyURL)
return &http.Transport{ Proxy: http.ProxyURL(u) }
}
```
Programmatic execution is available on the manager:
```go
// Nonstreaming
resp, err := core.Execute(ctx, []string{"gemini"}, req, opts)
// Streaming
chunks, err := core.ExecuteStream(ctx, []string{"gemini"}, req, opts)
for ch := range chunks { /* ... */ }
```
Note: Builtin provider executors are wired automatically when you run the `Service`. If you want to use `Manager` standalone without the HTTP server, you must register your own executors that implement `auth.ProviderExecutor`.
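For example, a standalone setup might look like the following sketch. It assumes your executor accepts the zero value of `clipexec.Options` and that the payload is already in your provider's format; adjust to the actual `Request`/`Options` fields in your version of the SDK:
```go
core := coreauth.NewManager(coreauth.NewFileStore(cfg.AuthDir), nil, nil)
core.RegisterExecutor(myprov.Executor{}) // your own auth.ProviderExecutor

// Hypothetical payload shape; whatever your executor expects after translation.
req := clipexec.Request{Payload: []byte(`{"model":"myprov-pro-1","prompt":"hello"}`)}
resp, err := core.Execute(ctx, []string{"myprov"}, req, clipexec.Options{})
if err != nil {
	panic(err)
}
fmt.Println(string(resp.Payload))
```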
## Custom Client Sources
Replace the default loaders if your creds live outside the local filesystem:
```go
type memoryTokenProvider struct{}
func (p *memoryTokenProvider) Load(ctx context.Context, cfg *config.Config) (*cliproxy.TokenClientResult, error) {
// Populate from memory/remote store and return counts
return &cliproxy.TokenClientResult{}, nil
}
svc, _ := cliproxy.NewBuilder().
WithConfig(cfg).
WithConfigPath("config.yaml").
WithTokenClientProvider(&memoryTokenProvider{}).
WithAPIKeyClientProvider(cliproxy.NewAPIKeyClientProvider()).
Build()
```
## Hooks
Observe lifecycle without patching internals:
```go
hooks := cliproxy.Hooks{
OnBeforeStart: func(cfg *config.Config) { log.Infof("starting on :%d", cfg.Port) },
OnAfterStart: func(s *cliproxy.Service) { log.Info("ready") },
}
svc, _ := cliproxy.NewBuilder().WithConfig(cfg).WithConfigPath("config.yaml").WithHooks(hooks).Build()
```
## Shutdown
`Run` defers `Shutdown`, so cancelling the parent context is enough. To stop manually:
```go
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
defer cancel()
_ = svc.Shutdown(ctx)
```
## Notes
- Hot reload: changes to `config.yaml` and `auths/` are picked up automatically.
- Request logging can be toggled at runtime via the Management API.
- Gemini Web features (`gemini-web.*`) are honored in the embedded server.

docs/sdk-usage_CN.md (new file, 164 lines)
View File

@@ -0,0 +1,164 @@
# CLI Proxy SDK 使用指南
`sdk/cliproxy` 模块将代理能力以 Go 库的形式对外暴露,方便在其它服务中内嵌路由、鉴权、热更新与翻译层,而无需依赖可执行的 CLI 程序。
## 安装与导入
```bash
go get github.com/router-for-me/CLIProxyAPI/v6/sdk/cliproxy
```
```go
import (
"context"
"errors"
"time"
"github.com/router-for-me/CLIProxyAPI/v6/internal/config"
"github.com/router-for-me/CLIProxyAPI/v6/sdk/cliproxy"
)
```
注意模块路径包含 `/v6`
## 最小可用示例
```go
cfg, err := config.LoadConfig("config.yaml")
if err != nil { panic(err) }
svc, err := cliproxy.NewBuilder().
WithConfig(cfg).
WithConfigPath("config.yaml"). // 绝对路径或工作目录相对路径
Build()
if err != nil { panic(err) }
ctx, cancel := context.WithCancel(context.Background())
defer cancel()
if err := svc.Run(ctx); err != nil && !errors.Is(err, context.Canceled) {
panic(err)
}
```
服务内部会管理配置与认证文件的监听、后台令牌刷新与优雅关闭。取消上下文即可停止服务。
## 服务器可选项(中间件、路由、日志)
通过 `WithServerOptions` 自定义:
```go
svc, _ := cliproxy.NewBuilder().
WithConfig(cfg).
WithConfigPath("config.yaml").
WithServerOptions(
// 追加全局中间件
cliproxy.WithMiddleware(func(c *gin.Context) { c.Header("X-Embed", "1"); c.Next() }),
// 提前调整 gin 引擎(如 CORS、trusted proxies 等)
cliproxy.WithEngineConfigurator(func(e *gin.Engine) { e.ForwardedByClientIP = true }),
// 在默认路由之后追加自定义路由
cliproxy.WithRouterConfigurator(func(e *gin.Engine, _ *handlers.BaseAPIHandler, _ *config.Config) {
e.GET("/healthz", func(c *gin.Context) { c.String(200, "ok") })
}),
// 覆盖请求日志的创建(启用/目录)
cliproxy.WithRequestLoggerFactory(func(cfg *config.Config, cfgPath string) logging.RequestLogger {
return logging.NewFileRequestLogger(true, "logs", filepath.Dir(cfgPath))
}),
).
Build()
```
这些选项与 CLI 服务器内部用法保持一致。
## 管理 API(内嵌时)
- 仅当 `config.yaml` 中设置了 `remote-management.secret-key` 时才会挂载管理端点。
- 远程访问还需要 `remote-management.allow-remote: true`
- 具体端点见 MANAGEMENT_API_CN.md。内嵌服务器会在配置端口下暴露 `/v0/management`
## 使用核心鉴权管理器
服务内部使用核心 `auth.Manager` 负责选择、执行、自动刷新。内嵌时可自定义其传输或钩子:
```go
core := coreauth.NewManager(coreauth.NewFileStore(cfg.AuthDir), nil, nil)
core.SetRoundTripperProvider(myRTProvider) // 按账户返回 *http.Transport
svc, _ := cliproxy.NewBuilder().
WithConfig(cfg).
WithConfigPath("config.yaml").
WithCoreAuthManager(core).
Build()
```
实现每个账户的自定义传输:
```go
type myRTProvider struct{}
func (myRTProvider) RoundTripperFor(a *coreauth.Auth) http.RoundTripper {
if a == nil || a.ProxyURL == "" { return nil }
u, _ := url.Parse(a.ProxyURL)
return &http.Transport{ Proxy: http.ProxyURL(u) }
}
```
管理器提供编程式执行接口:
```go
// 非流式
resp, err := core.Execute(ctx, []string{"gemini"}, req, opts)
// 流式
chunks, err := core.ExecuteStream(ctx, []string{"gemini"}, req, opts)
for ch := range chunks { /* ... */ }
```
说明:运行 `Service` 时会自动注册内置的提供商执行器;若仅单独使用 `Manager` 而不启动 HTTP 服务器,则需要自行实现并注册满足 `auth.ProviderExecutor` 的执行器。
## 自定义凭据来源
当凭据不在本地文件系统时,替换默认加载器:
```go
type memoryTokenProvider struct{}
func (p *memoryTokenProvider) Load(ctx context.Context, cfg *config.Config) (*cliproxy.TokenClientResult, error) {
// 从内存/远端加载并返回数量统计
return &cliproxy.TokenClientResult{}, nil
}
svc, _ := cliproxy.NewBuilder().
WithConfig(cfg).
WithConfigPath("config.yaml").
WithTokenClientProvider(&memoryTokenProvider{}).
WithAPIKeyClientProvider(cliproxy.NewAPIKeyClientProvider()).
Build()
```
## 启动钩子
无需修改内部代码即可观察生命周期:
```go
hooks := cliproxy.Hooks{
OnBeforeStart: func(cfg *config.Config) { log.Infof("starting on :%d", cfg.Port) },
OnAfterStart: func(s *cliproxy.Service) { log.Info("ready") },
}
svc, _ := cliproxy.NewBuilder().WithConfig(cfg).WithConfigPath("config.yaml").WithHooks(hooks).Build()
```
## 关闭
`Run` 内部会延迟调用 `Shutdown`,因此只需取消父上下文即可。若需手动停止:
```go
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
defer cancel()
_ = svc.Shutdown(ctx)
```
## 说明
- 热更新:`config.yaml` 与 `auths/` 变化会被自动侦测并应用。
- 请求日志可通过管理 API 在运行时开关。
- `gemini-web.*` 相关配置在内嵌服务器中会被遵循。

View File

@@ -0,0 +1,172 @@
package main
// Example: Custom provider executor + translators embedded in the SDK server.
import (
"bytes"
"context"
"errors"
"io"
"net/http"
"net/url"
"os"
"path/filepath"
"strings"
"time"
"github.com/gin-gonic/gin"
api "github.com/router-for-me/CLIProxyAPI/v6/internal/api"
"github.com/router-for-me/CLIProxyAPI/v6/internal/config"
"github.com/router-for-me/CLIProxyAPI/v6/internal/logging"
"github.com/router-for-me/CLIProxyAPI/v6/sdk/cliproxy"
coreauth "github.com/router-for-me/CLIProxyAPI/v6/sdk/cliproxy/auth"
clipexec "github.com/router-for-me/CLIProxyAPI/v6/sdk/cliproxy/executor"
sdktr "github.com/router-for-me/CLIProxyAPI/v6/sdk/translator"
)
const (
providerKey = "myprov"
fOpenAI = sdktr.Format("openai.chat")
fMyProv = sdktr.Format("myprov.chat")
)
// Register trivial translators (pass-through demo).
func init() {
sdktr.Register(fOpenAI, fMyProv,
func(model string, raw []byte, stream bool) []byte { return raw },
sdktr.ResponseTransform{
Stream: func(ctx context.Context, model string, originalReq, translatedReq, raw []byte, param *any) []string {
return []string{string(raw)}
},
NonStream: func(ctx context.Context, model string, originalReq, translatedReq, raw []byte, param *any) string {
return string(raw)
},
},
)
}
// MyExecutor is a minimal provider implementation for demo purposes.
type MyExecutor struct{}
func (MyExecutor) Identifier() string { return providerKey }
// PrepareRequest optionally injects credentials to raw HTTP requests.
func (MyExecutor) PrepareRequest(req *http.Request, a *coreauth.Auth) error {
if req == nil || a == nil {
return nil
}
if a.Attributes != nil {
if ak := strings.TrimSpace(a.Attributes["api_key"]); ak != "" {
req.Header.Set("Authorization", "Bearer "+ak)
}
}
return nil
}
func buildHTTPClient(a *coreauth.Auth) *http.Client {
if a == nil || strings.TrimSpace(a.ProxyURL) == "" {
return http.DefaultClient
}
u, err := url.Parse(a.ProxyURL)
if err != nil || (u.Scheme != "http" && u.Scheme != "https") {
return http.DefaultClient
}
return &http.Client{Transport: &http.Transport{Proxy: http.ProxyURL(u)}}
}
func upstreamEndpoint(a *coreauth.Auth) string {
if a != nil && a.Attributes != nil {
if ep := strings.TrimSpace(a.Attributes["endpoint"]); ep != "" {
return ep
}
}
// Demo echo endpoint; replace with your upstream.
return "https://httpbin.org/post"
}
func (MyExecutor) Execute(ctx context.Context, a *coreauth.Auth, req clipexec.Request, opts clipexec.Options) (clipexec.Response, error) {
client := buildHTTPClient(a)
endpoint := upstreamEndpoint(a)
httpReq, errNew := http.NewRequestWithContext(ctx, http.MethodPost, endpoint, bytes.NewReader(req.Payload))
if errNew != nil {
return clipexec.Response{}, errNew
}
httpReq.Header.Set("Content-Type", "application/json")
// Inject credentials via PrepareRequest hook.
_ = (MyExecutor{}).PrepareRequest(httpReq, a)
resp, errDo := client.Do(httpReq)
if errDo != nil {
return clipexec.Response{}, errDo
}
defer func() {
if errClose := resp.Body.Close(); errClose != nil {
// Best-effort close; log if needed in real projects.
}
}()
body, _ := io.ReadAll(resp.Body)
return clipexec.Response{Payload: body}, nil
}
func (MyExecutor) ExecuteStream(ctx context.Context, a *coreauth.Auth, req clipexec.Request, opts clipexec.Options) (<-chan clipexec.StreamChunk, error) {
ch := make(chan clipexec.StreamChunk, 1)
go func() {
defer close(ch)
ch <- clipexec.StreamChunk{Payload: []byte("data: {\"ok\":true}\n\n")}
}()
return ch, nil
}
func (MyExecutor) Refresh(ctx context.Context, a *coreauth.Auth) (*coreauth.Auth, error) {
return a, nil
}
func main() {
cfg, err := config.LoadConfig("config.yaml")
if err != nil {
panic(err)
}
core := coreauth.NewManager(coreauth.NewFileStore(cfg.AuthDir), nil, nil)
core.RegisterExecutor(MyExecutor{})
hooks := cliproxy.Hooks{
OnAfterStart: func(s *cliproxy.Service) {
// Register demo models for the custom provider so they appear in /v1/models.
models := []*cliproxy.ModelInfo{{ID: "myprov-pro-1", Object: "model", Type: providerKey, DisplayName: "MyProv Pro 1"}}
for _, a := range core.List() {
if strings.EqualFold(a.Provider, providerKey) {
cliproxy.GlobalModelRegistry().RegisterClient(a.ID, providerKey, models)
}
}
},
}
svc, err := cliproxy.NewBuilder().
WithConfig(cfg).
WithConfigPath("config.yaml").
WithCoreAuthManager(core).
WithServerOptions(
// Optional: add a simple middleware + custom request logger
api.WithMiddleware(func(c *gin.Context) { c.Header("X-Example", "custom-provider"); c.Next() }),
api.WithRequestLoggerFactory(func(cfg *config.Config, cfgPath string) logging.RequestLogger {
return logging.NewFileRequestLogger(true, "logs", filepath.Dir(cfgPath))
}),
).
WithHooks(hooks).
Build()
if err != nil {
panic(err)
}
ctx, cancel := context.WithCancel(context.Background())
defer cancel()
if err := svc.Run(ctx); err != nil && !errors.Is(err, context.Canceled) {
panic(err)
}
_ = os.Stderr // keep os import used (demo only)
_ = time.Second // keep time import used (demo only)
}

View File

@@ -9,7 +9,6 @@ import (
"bytes"
. "github.com/router-for-me/CLIProxyAPI/v6/internal/translator/claude/gemini"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -31,7 +30,6 @@ import (
// Returns:
// - []byte: The transformed request data in Claude Code API format
func ConvertGeminiCLIRequestToClaude(modelName string, inputRawJSON []byte, stream bool) []byte {
log.Debug("ConvertGeminiCLIRequestToClaude")
rawJSON := bytes.Clone(inputRawJSON)
modelResult := gjson.GetBytes(rawJSON, "model")

View File

@@ -8,7 +8,6 @@ import (
"context"
. "github.com/router-for-me/CLIProxyAPI/v6/internal/translator/claude/gemini"
log "github.com/sirupsen/logrus"
"github.com/tidwall/sjson"
)
@@ -26,7 +25,6 @@ import (
// Returns:
// - []string: A slice of strings, each containing a Gemini-compatible JSON response wrapped in a response object
func ConvertClaudeResponseToGeminiCLI(ctx context.Context, modelName string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) []string {
log.Debug("ConvertClaudeResponseToGeminiCLI")
outputs := ConvertClaudeResponseToGemini(ctx, modelName, originalRequestRawJSON, requestRawJSON, rawJSON, param)
// Wrap each converted response in a "response" object to match Gemini CLI API structure
newOutputs := make([]string, 0)
@@ -51,7 +49,6 @@ func ConvertClaudeResponseToGeminiCLI(ctx context.Context, modelName string, ori
// Returns:
// - string: A Gemini-compatible JSON response wrapped in a response object
func ConvertClaudeResponseToGeminiCLINonStream(ctx context.Context, modelName string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) string {
log.Debug("ConvertClaudeResponseToGeminiCLINonStream")
strJSON := ConvertClaudeResponseToGeminiNonStream(ctx, modelName, originalRequestRawJSON, requestRawJSON, rawJSON, param)
// Wrap the converted response in a "response" object to match Gemini CLI API structure
json := `{"response": {}}`

View File

@@ -13,7 +13,6 @@ import (
"strings"
"github.com/router-for-me/CLIProxyAPI/v6/internal/util"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -37,7 +36,6 @@ import (
// Returns:
// - []byte: The transformed request data in Claude Code API format
func ConvertGeminiRequestToClaude(modelName string, inputRawJSON []byte, stream bool) []byte {
log.Debug("ConvertGeminiRequestToClaude")
rawJSON := bytes.Clone(inputRawJSON)
// Base Claude Code API template with default max_tokens value
out := `{"model":"","max_tokens":32000,"messages":[]}`

View File

@@ -12,7 +12,6 @@ import (
"strings"
"time"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -54,7 +53,6 @@ type ConvertAnthropicResponseToGeminiParams struct {
// Returns:
// - []string: A slice of strings, each containing a Gemini-compatible JSON response
func ConvertClaudeResponseToGemini(_ context.Context, modelName string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) []string {
log.Debug("ConvertClaudeResponseToGemini")
if *param == nil {
*param = &ConvertAnthropicResponseToGeminiParams{
Model: modelName,
@@ -323,7 +321,6 @@ func convertMapToJSON(m map[string]interface{}) string {
// Returns:
// - string: A Gemini-compatible JSON response containing all message content and metadata
func ConvertClaudeResponseToGeminiNonStream(_ context.Context, modelName string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, _ *any) string {
log.Debug("ConvertClaudeResponseToGeminiNonStream")
// Base Gemini response template for non-streaming with default values
template := `{"candidates":[{"content":{"role":"model","parts":[]},"finishReason":"STOP"}],"usageMetadata":{"trafficType":"PROVISIONED_THROUGHPUT"},"modelVersion":"","createTime":"","responseId":""}`

View File

@@ -12,7 +12,6 @@ import (
"math/big"
"strings"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -35,7 +34,6 @@ import (
// Returns:
// - []byte: The transformed request data in Claude Code API format
func ConvertOpenAIRequestToClaude(modelName string, inputRawJSON []byte, stream bool) []byte {
log.Debug("ConvertOpenAIRequestToClaude")
rawJSON := bytes.Clone(inputRawJSON)
// Base Claude Code API template with default max_tokens value

View File

@@ -13,7 +13,6 @@ import (
"strings"
"time"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -52,7 +51,6 @@ type ToolCallAccumulator struct {
// Returns:
// - []string: A slice of strings, each containing an OpenAI-compatible JSON response
func ConvertClaudeResponseToOpenAI(_ context.Context, modelName string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) []string {
log.Debug("ConvertClaudeResponseToOpenAI")
if *param == nil {
*param = &ConvertAnthropicResponseToOpenAIParams{
CreatedAt: 0,
@@ -280,7 +278,6 @@ func mapAnthropicStopReasonToOpenAI(anthropicReason string) string {
// Returns:
// - string: An OpenAI-compatible JSON response containing all message content and metadata
func ConvertClaudeResponseToOpenAINonStream(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, _ *any) string {
log.Debug("ConvertClaudeResponseToOpenAINonStream")
chunks := make([][]byte, 0)

View File

@@ -6,7 +6,6 @@ import (
"math/big"
"strings"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -22,7 +21,6 @@ import (
// - max_output_tokens -> max_tokens
// - stream passthrough via parameter
func ConvertOpenAIResponsesRequestToClaude(modelName string, inputRawJSON []byte, stream bool) []byte {
log.Debug("ConvertOpenAIResponsesRequestToClaude")
rawJSON := bytes.Clone(inputRawJSON)
// Base Claude message payload

View File

@@ -8,7 +8,6 @@ import (
"strings"
"time"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -43,7 +42,6 @@ func emitEvent(event string, payload string) string {
// ConvertClaudeResponseToOpenAIResponses converts Claude SSE to OpenAI Responses SSE events.
func ConvertClaudeResponseToOpenAIResponses(ctx context.Context, modelName string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) []string {
log.Debug("ConvertClaudeResponseToOpenAIResponses")
if *param == nil {
*param = &claudeToResponsesState{FuncArgsBuf: make(map[int]*strings.Builder), FuncNames: make(map[int]string), FuncCallIDs: make(map[int]string)}
}
@@ -391,7 +389,6 @@ func ConvertClaudeResponseToOpenAIResponses(ctx context.Context, modelName strin
// ConvertClaudeResponseToOpenAIResponsesNonStream aggregates Claude SSE into a single OpenAI Responses JSON.
func ConvertClaudeResponseToOpenAIResponsesNonStream(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, _ *any) string {
log.Debug("ConvertClaudeResponseToOpenAIResponsesNonStream")
// Aggregate Claude SSE lines into a single OpenAI Responses JSON (non-stream)
// We follow the same aggregation logic as the streaming variant but produce
// one final object matching docs/out.json structure.

View File

@@ -12,7 +12,6 @@ import (
"strings"
"github.com/router-for-me/CLIProxyAPI/v6/internal/misc"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -36,7 +35,6 @@ import (
// Returns:
// - []byte: The transformed request data in internal client format
func ConvertClaudeRequestToCodex(modelName string, inputRawJSON []byte, _ bool) []byte {
log.Debug("ConvertClaudeRequestToCodex")
rawJSON := bytes.Clone(inputRawJSON)
template := `{"model":"","instructions":"","input":[]}`

View File

@@ -11,7 +11,6 @@ import (
"context"
"fmt"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -37,7 +36,6 @@ var (
// Returns:
// - []string: A slice of strings, each containing a Claude Code-compatible JSON response
func ConvertCodexResponseToClaude(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) []string {
log.Debug("ConvertCodexResponseToClaude")
if *param == nil {
hasToolCall := false
*param = &hasToolCall
@@ -179,7 +177,6 @@ func ConvertCodexResponseToClaude(_ context.Context, _ string, originalRequestRa
// Returns:
// - string: A Claude Code-compatible JSON response containing all message content and metadata
func ConvertCodexResponseToClaudeNonStream(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, _ []byte, _ *any) string {
log.Debug("ConvertCodexResponseToClaudeNonStream")
return ""
}

View File

@@ -9,7 +9,6 @@ import (
"bytes"
. "github.com/router-for-me/CLIProxyAPI/v6/internal/translator/codex/gemini"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -31,7 +30,6 @@ import (
// Returns:
// - []byte: The transformed request data in Codex API format
func ConvertGeminiCLIRequestToCodex(modelName string, inputRawJSON []byte, stream bool) []byte {
log.Debug("ConvertGeminiRequestToCodex")
rawJSON := bytes.Clone(inputRawJSON)
rawJSON = []byte(gjson.GetBytes(rawJSON, "request").Raw)

View File

@@ -8,7 +8,6 @@ import (
"context"
. "github.com/router-for-me/CLIProxyAPI/v6/internal/translator/codex/gemini"
log "github.com/sirupsen/logrus"
"github.com/tidwall/sjson"
)
@@ -26,7 +25,6 @@ import (
// Returns:
// - []string: A slice of strings, each containing a Gemini-compatible JSON response wrapped in a response object
func ConvertCodexResponseToGeminiCLI(ctx context.Context, modelName string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) []string {
log.Debug("ConvertCodexResponseToGeminiCLI")
outputs := ConvertCodexResponseToGemini(ctx, modelName, originalRequestRawJSON, requestRawJSON, rawJSON, param)
newOutputs := make([]string, 0)
for i := 0; i < len(outputs); i++ {
@@ -50,7 +48,6 @@ func ConvertCodexResponseToGeminiCLI(ctx context.Context, modelName string, orig
// Returns:
// - string: A Gemini-compatible JSON response wrapped in a response object
func ConvertCodexResponseToGeminiCLINonStream(ctx context.Context, modelName string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) string {
log.Debug("ConvertCodexResponseToGeminiCLINonStream")
// log.Debug(string(rawJSON))
strJSON := ConvertCodexResponseToGeminiNonStream(ctx, modelName, originalRequestRawJSON, requestRawJSON, rawJSON, param)
json := `{"response": {}}`

View File

@@ -15,7 +15,6 @@ import (
"github.com/router-for-me/CLIProxyAPI/v6/internal/misc"
"github.com/router-for-me/CLIProxyAPI/v6/internal/util"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -38,7 +37,6 @@ import (
// Returns:
// - []byte: The transformed request data in Codex API format
func ConvertGeminiRequestToCodex(modelName string, inputRawJSON []byte, _ bool) []byte {
log.Debug("ConvertGeminiRequestToCodex")
rawJSON := bytes.Clone(inputRawJSON)
// Base template
out := `{"model":"","instructions":"","input":[]}`

View File

@@ -11,7 +11,6 @@ import (
"encoding/json"
"time"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -42,7 +41,6 @@ type ConvertCodexResponseToGeminiParams struct {
// Returns:
// - []string: A slice of strings, each containing a Gemini-compatible JSON response
func ConvertCodexResponseToGemini(_ context.Context, modelName string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) []string {
log.Debug("ConvertCodexResponseToGemini")
if *param == nil {
*param = &ConvertCodexResponseToGeminiParams{
Model: modelName,
@@ -154,7 +152,6 @@ func ConvertCodexResponseToGemini(_ context.Context, modelName string, originalR
// Returns:
// - string: A Gemini-compatible JSON response containing all message content and metadata
func ConvertCodexResponseToGeminiNonStream(_ context.Context, modelName string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, _ *any) string {
log.Debug("ConvertCodexResponseToGeminiNonStream")
scanner := bufio.NewScanner(bytes.NewReader(rawJSON))
buffer := make([]byte, 10240*1024)
scanner.Buffer(buffer, 10240*1024)

View File

@@ -13,7 +13,6 @@ import (
"strings"
"github.com/router-for-me/CLIProxyAPI/v6/internal/misc"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -31,7 +30,6 @@ import (
// Returns:
// - []byte: The transformed request data in OpenAI Responses API format
func ConvertOpenAIRequestToCodex(modelName string, inputRawJSON []byte, stream bool) []byte {
log.Debug("ConvertOpenAIRequestToCodex")
rawJSON := bytes.Clone(inputRawJSON)
// Start with empty JSON object
out := `{}`

View File

@@ -11,7 +11,6 @@ import (
"context"
"time"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -43,7 +42,6 @@ type ConvertCliToOpenAIParams struct {
// Returns:
// - []string: A slice of strings, each containing an OpenAI-compatible JSON response
func ConvertCodexResponseToOpenAI(_ context.Context, modelName string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) []string {
log.Debug("ConvertCodexResponseToOpenAI")
if *param == nil {
*param = &ConvertCliToOpenAIParams{
Model: modelName,
@@ -168,7 +166,6 @@ func ConvertCodexResponseToOpenAI(_ context.Context, modelName string, originalR
// Returns:
// - string: An OpenAI-compatible JSON response containing all message content and metadata
func ConvertCodexResponseToOpenAINonStream(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, _ *any) string {
log.Debug("ConvertCodexResponseToOpenAINonStream")
scanner := bufio.NewScanner(bytes.NewReader(rawJSON))
buffer := make([]byte, 10240*1024)
scanner.Buffer(buffer, 10240*1024)

View File

@@ -4,13 +4,11 @@ import (
"bytes"
"github.com/router-for-me/CLIProxyAPI/v6/internal/misc"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
func ConvertOpenAIResponsesRequestToCodex(modelName string, inputRawJSON []byte, _ bool) []byte {
log.Debug("ConvertOpenAIResponsesRequestToCodex")
rawJSON := bytes.Clone(inputRawJSON)
rawJSON, _ = sjson.SetBytes(rawJSON, "stream", true)

View File

@@ -6,7 +6,6 @@ import (
"context"
"fmt"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -14,7 +13,6 @@ import (
// ConvertCodexResponseToOpenAIResponses converts OpenAI Chat Completions streaming chunks
// to OpenAI Responses SSE events (response.*).
func ConvertCodexResponseToOpenAIResponses(ctx context.Context, modelName string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) []string {
log.Debug("ConvertCodexResponseToOpenAIResponses")
if bytes.HasPrefix(rawJSON, []byte("data:")) {
rawJSON = bytes.TrimSpace(rawJSON[5:])
if typeResult := gjson.GetBytes(rawJSON, "type"); typeResult.Exists() {
@@ -31,7 +29,6 @@ func ConvertCodexResponseToOpenAIResponses(ctx context.Context, modelName string
// ConvertCodexResponseToOpenAIResponsesNonStream builds a single Responses JSON
// from a non-streaming OpenAI Chat Completions response.
func ConvertCodexResponseToOpenAIResponsesNonStream(_ context.Context, modelName string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, _ *any) string {
log.Debug("ConvertCodexResponseToOpenAIResponsesNonStream")
scanner := bufio.NewScanner(bytes.NewReader(rawJSON))
buffer := make([]byte, 10240*1024)
scanner.Buffer(buffer, 10240*1024)

View File

@@ -12,7 +12,6 @@ import (
client "github.com/router-for-me/CLIProxyAPI/v6/internal/interfaces"
"github.com/router-for-me/CLIProxyAPI/v6/internal/util"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -36,7 +35,6 @@ import (
// Returns:
// - []byte: The transformed request data in Gemini CLI API format
func ConvertClaudeRequestToCLI(modelName string, inputRawJSON []byte, _ bool) []byte {
log.Debug("ConvertClaudeRequestToCLI")
rawJSON := bytes.Clone(inputRawJSON)
var pathsToDelete []string
root := gjson.ParseBytes(rawJSON)

View File

@@ -12,7 +12,6 @@ import (
"fmt"
"time"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -43,7 +42,6 @@ type Params struct {
// Returns:
// - []string: A slice of strings, each containing a Claude Code-compatible JSON response
func ConvertGeminiCLIResponseToClaude(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) []string {
log.Debug("ConvertGeminiCLIResponseToClaude")
if *param == nil {
*param = &Params{
HasFirstResponse: false,
@@ -254,6 +252,5 @@ func ConvertGeminiCLIResponseToClaude(_ context.Context, _ string, originalReque
// Returns:
// - string: A Claude-compatible JSON response.
func ConvertGeminiCLIResponseToClaudeNonStream(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, _ []byte, _ *any) string {
log.Debug("ConvertGeminiCLIResponseToClaudeNonStream")
return ""
}

View File

@@ -32,7 +32,6 @@ import (
// Returns:
// - []byte: The transformed request data in Gemini API format
func ConvertGeminiRequestToGeminiCLI(_ string, inputRawJSON []byte, _ bool) []byte {
log.Debug("ConvertGeminiRequestToGeminiCLI")
rawJSON := bytes.Clone(inputRawJSON)
template := ""
template = `{"project":"","request":{},"model":""}`

View File

@@ -8,7 +8,6 @@ package gemini
import (
"context"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -30,7 +29,6 @@ import (
// Returns:
// - []string: The transformed request data in Gemini API format
func ConvertGeminiCliRequestToGemini(ctx context.Context, _ string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, _ *any) []string {
log.Debug("ConvertGeminiCliRequestToGemini")
if alt, ok := ctx.Value("alt").(string); ok {
var chunk []byte
if alt == "" {
@@ -70,7 +68,6 @@ func ConvertGeminiCliRequestToGemini(ctx context.Context, _ string, originalRequ
// Returns:
// - string: A Gemini-compatible JSON response containing the response data
func ConvertGeminiCliRequestToGeminiNonStream(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, _ *any) string {
log.Debug("ConvertGeminiCliRequestToGeminiNonStream")
responseResult := gjson.GetBytes(rawJSON, "response")
if responseResult.Exists() {
return responseResult.Raw

View File

@@ -12,7 +12,6 @@ import (
"time"
. "github.com/router-for-me/CLIProxyAPI/v6/internal/translator/gemini/openai/chat-completions"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -37,7 +36,6 @@ type convertCliResponseToOpenAIChatParams struct {
// Returns:
// - []string: A slice of strings, each containing an OpenAI-compatible JSON response
func ConvertCliResponseToOpenAI(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) []string {
log.Debug("ConvertCliResponseToOpenAI")
if *param == nil {
*param = &convertCliResponseToOpenAIChatParams{
UnixTimestamp: 0,
@@ -148,7 +146,6 @@ func ConvertCliResponseToOpenAI(_ context.Context, _ string, originalRequestRawJ
// Returns:
// - string: An OpenAI-compatible JSON response containing all message content and metadata
func ConvertCliResponseToOpenAINonStream(ctx context.Context, modelName string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) string {
log.Debug("ConvertCliResponseToOpenAINonStream")
responseResult := gjson.GetBytes(rawJSON, "response")
if responseResult.Exists() {
return ConvertGeminiResponseToOpenAINonStream(ctx, modelName, originalRequestRawJSON, requestRawJSON, []byte(responseResult.Raw), param)

View File

@@ -5,11 +5,9 @@ import (
. "github.com/router-for-me/CLIProxyAPI/v6/internal/translator/gemini-cli/gemini"
. "github.com/router-for-me/CLIProxyAPI/v6/internal/translator/gemini/openai/responses"
log "github.com/sirupsen/logrus"
)
func ConvertOpenAIResponsesRequestToGeminiCLI(modelName string, inputRawJSON []byte, stream bool) []byte {
log.Debug("ConvertOpenAIResponsesRequestToGeminiCLI")
rawJSON := bytes.Clone(inputRawJSON)
rawJSON = ConvertOpenAIResponsesRequestToGemini(modelName, rawJSON, stream)
return ConvertGeminiRequestToGeminiCLI(modelName, rawJSON, stream)

View File

@@ -9,7 +9,6 @@ import (
)
func ConvertGeminiCLIResponseToOpenAIResponses(ctx context.Context, modelName string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) []string {
log.Debug("ConvertGeminiCLIResponseToOpenAIResponses")
responseResult := gjson.GetBytes(rawJSON, "response")
if responseResult.Exists() {
rawJSON = []byte(responseResult.Raw)
@@ -18,7 +17,6 @@ func ConvertGeminiCLIResponseToOpenAIResponses(ctx context.Context, modelName st
}
func ConvertGeminiCLIResponseToOpenAIResponsesNonStream(ctx context.Context, modelName string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) string {
log.Debug("ConvertGeminiCLIResponseToOpenAIResponsesNonStream")
responseResult := gjson.GetBytes(rawJSON, "response")
if responseResult.Exists() {
rawJSON = []byte(responseResult.Raw)

View File

@@ -12,7 +12,6 @@ import (
client "github.com/router-for-me/CLIProxyAPI/v6/internal/interfaces"
"github.com/router-for-me/CLIProxyAPI/v6/internal/util"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -29,7 +28,6 @@ import (
// Returns:
// - []byte: The transformed request in Gemini CLI format.
func ConvertClaudeRequestToGemini(modelName string, inputRawJSON []byte, _ bool) []byte {
log.Debug("ConvertClaudeRequestToGemini")
rawJSON := bytes.Clone(inputRawJSON)
var pathsToDelete []string
root := gjson.ParseBytes(rawJSON)

View File

@@ -12,7 +12,6 @@ import (
"fmt"
"time"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -42,7 +41,6 @@ type Params struct {
// Returns:
// - []string: A slice of strings, each containing a Claude-compatible JSON response.
func ConvertGeminiResponseToClaude(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) []string {
log.Debug("ConvertGeminiResponseToClaude")
if *param == nil {
*param = &Params{
IsGlAPIKey: false,
@@ -248,6 +246,5 @@ func ConvertGeminiResponseToClaude(_ context.Context, _ string, originalRequestR
// Returns:
// - string: A Claude-compatible JSON response.
func ConvertGeminiResponseToClaudeNonStream(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, _ []byte, _ *any) string {
log.Debug("ConvertGeminiResponseToClaudeNonStream")
return ""
}

View File

@@ -17,7 +17,6 @@ import (
// It extracts the model name, system instruction, message contents, and tool declarations
// from the raw JSON request and returns them in the format expected by the internal client.
func ConvertGeminiCLIRequestToGemini(_ string, inputRawJSON []byte, _ bool) []byte {
log.Debug("ConvertGeminiCLIRequestToGemini")
rawJSON := bytes.Clone(inputRawJSON)
modelResult := gjson.GetBytes(rawJSON, "model")
rawJSON = []byte(gjson.GetBytes(rawJSON, "request").Raw)

View File

@@ -26,7 +26,6 @@ import (
// Returns:
// - []string: A slice of strings, each containing a Gemini CLI-compatible JSON response.
func ConvertGeminiResponseToGeminiCLI(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, _ *any) []string {
log.Debug("ConvertGeminiResponseToGeminiCLI")
if bytes.Equal(rawJSON, []byte("[DONE]")) {
return []string{}
}
@@ -46,7 +45,6 @@ func ConvertGeminiResponseToGeminiCLI(_ context.Context, _ string, originalReque
// Returns:
// - string: A Gemini CLI-compatible JSON response.
func ConvertGeminiResponseToGeminiCLINonStream(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, _ *any) string {
log.Debug("ConvertGeminiResponseToGeminiCLINonStream")
json := `{"response": {}}`
rawJSON, _ = sjson.SetRawBytes([]byte(json), "response", rawJSON)
return string(rawJSON)

View File

@@ -18,7 +18,6 @@ import (
//
// It keeps the payload otherwise unchanged.
func ConvertGeminiRequestToGemini(_ string, inputRawJSON []byte, _ bool) []byte {
log.Debug("ConvertClaudeRequestToGemini")
rawJSON := bytes.Clone(inputRawJSON)
// Fast path: if no contents field, return as-is
contents := gjson.GetBytes(rawJSON, "contents")

View File

@@ -9,8 +9,6 @@ import (
// PassthroughGeminiResponseStream forwards Gemini responses unchanged.
func PassthroughGeminiResponseStream(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, _ *any) []string {
log.Debug("PassthroughGeminiResponseStream")
if bytes.HasPrefix(rawJSON, []byte("data:")) {
rawJSON = bytes.TrimSpace(rawJSON[5:])
}
@@ -24,6 +22,5 @@ func PassthroughGeminiResponseStream(_ context.Context, _ string, originalReques
// PassthroughGeminiResponseNonStream forwards Gemini responses unchanged.
func PassthroughGeminiResponseNonStream(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, _ *any) string {
log.Debug("PassthroughGeminiResponseNonStream")
return string(rawJSON)
}

View File

@@ -25,7 +25,6 @@ import (
// Returns:
// - []byte: The transformed request data in Gemini API format
func ConvertOpenAIRequestToGemini(modelName string, inputRawJSON []byte, _ bool) []byte {
log.Debug("ConvertOpenAIRequestToGemini")
rawJSON := bytes.Clone(inputRawJSON)
// Base envelope
out := []byte(`{"contents":[],"generationConfig":{"thinkingConfig":{"include_thoughts":true}}}`)

View File

@@ -37,7 +37,6 @@ type convertGeminiResponseToOpenAIChatParams struct {
// Returns:
// - []string: A slice of strings, each containing an OpenAI-compatible JSON response
func ConvertGeminiResponseToOpenAI(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) []string {
log.Debug("ConvertGeminiResponseToOpenAI")
if *param == nil {
*param = &convertGeminiResponseToOpenAIChatParams{
UnixTimestamp: 0,
@@ -180,7 +179,6 @@ func ConvertGeminiResponseToOpenAI(_ context.Context, _ string, originalRequestR
// Returns:
// - string: An OpenAI-compatible JSON response containing all message content and metadata
func ConvertGeminiResponseToOpenAINonStream(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, _ *any) string {
log.Debug("ConvertGeminiResponseToOpenAINonStream")
var unixTimestamp int64
template := `{"id":"","object":"chat.completion","created":123456,"model":"model","choices":[{"index":0,"message":{"role":"assistant","content":null,"reasoning_content":null,"tool_calls":null},"finish_reason":null,"native_finish_reason":null}]}`
if modelVersionResult := gjson.GetBytes(rawJSON, "modelVersion"); modelVersionResult.Exists() {

View File

@@ -10,7 +10,6 @@ import (
)
func ConvertOpenAIResponsesRequestToGemini(modelName string, inputRawJSON []byte, stream bool) []byte {
log.Debug("ConvertOpenAIResponsesRequestToGemini")
rawJSON := bytes.Clone(inputRawJSON)
// Note: modelName and stream parameters are part of the fixed method signature

View File

@@ -7,7 +7,6 @@ import (
"strings"
"time"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -44,7 +43,6 @@ func emitEvent(event string, payload string) string {
// ConvertGeminiResponseToOpenAIResponses converts Gemini SSE chunks into OpenAI Responses SSE events.
func ConvertGeminiResponseToOpenAIResponses(_ context.Context, modelName string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) []string {
log.Debug("ConvertGeminiResponseToOpenAIResponses")
if *param == nil {
*param = &geminiToResponsesState{
FuncArgsBuf: make(map[int]*strings.Builder),
@@ -424,7 +422,6 @@ func ConvertGeminiResponseToOpenAIResponses(_ context.Context, modelName string,
// ConvertGeminiResponseToOpenAIResponsesNonStream aggregates Gemini response JSON into a single OpenAI Responses JSON object.
func ConvertGeminiResponseToOpenAIResponsesNonStream(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, _ *any) string {
log.Debug("ConvertGeminiResponseToOpenAIResponsesNonStream")
root := gjson.ParseBytes(rawJSON)
// Base response scaffold

View File

@@ -10,7 +10,6 @@ import (
"encoding/json"
"strings"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -19,7 +18,6 @@ import (
// It extracts the model name, system instruction, message contents, and tool declarations
// from the raw JSON request and returns them in the format expected by the OpenAI API.
func ConvertClaudeRequestToOpenAI(modelName string, inputRawJSON []byte, stream bool) []byte {
log.Debug("ConvertClaudeRequestToOpenAI")
rawJSON := bytes.Clone(inputRawJSON)
// Base OpenAI Chat Completions API template
out := `{"model":"","messages":[]}`

View File

@@ -59,7 +59,6 @@ type ToolCallAccumulator struct {
// Returns:
// - []string: A slice of strings, each containing an Anthropic-compatible JSON response.
func ConvertOpenAIResponseToClaude(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) []string {
log.Debug("ConvertOpenAIResponseToClaude")
if *param == nil {
*param = &ConvertOpenAIResponseToAnthropicParams{
MessageID: "",
@@ -453,6 +452,5 @@ func mapOpenAIFinishReasonToAnthropic(openAIReason string) string {
// Returns:
// - string: An Anthropic-compatible JSON response.
func ConvertOpenAIResponseToClaudeNonStream(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, _ []byte, _ *any) string {
log.Debug("ConvertOpenAIResponseToClaudeNonStream")
return ""
}
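Several streaming converters in this commit share the lazily initialized `param` state seen in the `if *param == nil` blocks: the first chunk of a response allocates an accumulator struct and later chunks reuse it. A minimal sketch of that pattern with a made-up state type (the real accumulators hold message IDs, tool-call buffers, and similar fields not shown in these hunks):

```go
package main

import "fmt"

// streamState is a stand-in for the accumulator structs used above; the
// actual field sets are not visible in this diff.
type streamState struct {
	Chunks int
}

// handleChunk mirrors the per-stream initialization pattern: param is shared
// across all chunks of one response, and the first call allocates the state.
func handleChunk(chunk []byte, param *any) {
	if *param == nil {
		*param = &streamState{}
	}
	state := (*param).(*streamState)
	state.Chunks++
	fmt.Printf("chunk %d: %d bytes\n", state.Chunks, len(chunk))
}

func main() {
	var param any
	handleChunk([]byte(`{"choices":[]}`), &param)
	handleChunk([]byte(`{"choices":[]}`), &param)
}
```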

View File

@@ -18,7 +18,6 @@ import (
// It extracts the model name, generation config, message contents, and tool declarations
// from the raw JSON request and returns them in the format expected by the OpenAI API.
func ConvertGeminiCLIRequestToOpenAI(modelName string, inputRawJSON []byte, stream bool) []byte {
log.Debug("ConvertGeminiCLIRequestToOpenAI")
rawJSON := bytes.Clone(inputRawJSON)
rawJSON = []byte(gjson.GetBytes(rawJSON, "request").Raw)
rawJSON, _ = sjson.SetBytes(rawJSON, "model", modelName)
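The two lines above unwrap the Gemini CLI envelope: `gjson.GetBytes(..., "request").Raw` returns the nested `request` object verbatim, and `sjson.SetBytes` then overwrites its `model` field. The same steps in isolation (wrapper fields other than `request`, and the model names, are placeholders):

```go
package main

import (
	"fmt"

	"github.com/tidwall/gjson"
	"github.com/tidwall/sjson"
)

func main() {
	// Wrapper shape loosely modelled on a Gemini CLI request; "project" is a placeholder.
	body := []byte(`{"project":"demo","request":{"model":"old-model","contents":[]}}`)

	// Extract the inner request object as raw JSON, as in the hunk above.
	inner := []byte(gjson.GetBytes(body, "request").Raw)

	// Override the model name on the extracted object.
	inner, _ = sjson.SetBytes(inner, "model", "new-model")

	fmt.Println(string(inner)) // {"model":"new-model","contents":[]}
}
```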

View File

@@ -26,7 +26,6 @@ import (
// Returns:
// - []string: A slice of strings, each containing a Gemini-compatible JSON response.
func ConvertOpenAIResponseToGeminiCLI(ctx context.Context, modelName string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) []string {
log.Debug("ConvertOpenAIResponseToGeminiCLI")
outputs := ConvertOpenAIResponseToGemini(ctx, modelName, originalRequestRawJSON, requestRawJSON, rawJSON, param)
newOutputs := make([]string, 0)
for i := 0; i < len(outputs); i++ {
@@ -48,7 +47,6 @@ func ConvertOpenAIResponseToGeminiCLI(ctx context.Context, modelName string, ori
// Returns:
// - string: A Gemini-compatible JSON response.
func ConvertOpenAIResponseToGeminiCLINonStream(ctx context.Context, modelName string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) string {
log.Debug("ConvertOpenAIResponseToGeminiCLINonStream")
strJSON := ConvertOpenAIResponseToGeminiNonStream(ctx, modelName, originalRequestRawJSON, requestRawJSON, rawJSON, param)
json := `{"response": {}}`
strJSON, _ = sjson.SetRaw(json, "response", strJSON)
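The non-stream CLI variant above wraps the already-converted Gemini JSON under a `response` key with `sjson.SetRaw`. Reduced to its essentials:

```go
package main

import (
	"fmt"

	"github.com/tidwall/sjson"
)

func main() {
	// strJSON stands in for the converted Gemini response produced upstream.
	strJSON := `{"candidates":[{"content":{"parts":[{"text":"hi"}]}}]}`

	// Wrap it unchanged under a "response" key, as the function above does.
	wrapped, _ := sjson.SetRaw(`{"response": {}}`, "response", strJSON)

	fmt.Println(wrapped)
}
```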

View File

@@ -21,7 +21,6 @@ import (
// It extracts the model name, generation config, message contents, and tool declarations
// from the raw JSON request and returns them in the format expected by the OpenAI API.
func ConvertGeminiRequestToOpenAI(modelName string, inputRawJSON []byte, stream bool) []byte {
log.Debug("ConvertGeminiRequestToOpenAI")
rawJSON := bytes.Clone(inputRawJSON)
// Base OpenAI Chat Completions API template
out := `{"model":"","messages":[]}`

View File

@@ -12,7 +12,6 @@ import (
"strconv"
"strings"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -47,7 +46,6 @@ type ToolCallAccumulator struct {
// Returns:
// - []string: A slice of strings, each containing a Gemini-compatible JSON response.
func ConvertOpenAIResponseToGemini(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) []string {
log.Debug("ConvertOpenAIResponseToGemini")
if *param == nil {
*param = &ConvertOpenAIResponseToGeminiParams{
ToolCallsAccumulator: nil,
@@ -514,7 +512,6 @@ func tryParseNumber(s string) (interface{}, bool) {
// Returns:
// - string: A Gemini-compatible JSON response.
func ConvertOpenAIResponseToGeminiNonStream(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, _ *any) string {
log.Debug("ConvertOpenAIResponseToGeminiNonStream")
root := gjson.ParseBytes(rawJSON)
// Base Gemini response template

View File

@@ -4,8 +4,6 @@ package chat_completions
import (
"bytes"
log "github.com/sirupsen/logrus"
)
// ConvertOpenAIRequestToOpenAI converts an OpenAI Chat Completions request (raw JSON)
@@ -19,6 +17,5 @@ import (
// Returns:
// - []byte: The request data passed through unchanged, in OpenAI Chat Completions format
func ConvertOpenAIRequestToOpenAI(modelName string, inputRawJSON []byte, _ bool) []byte {
log.Debug("ConvertOpenAIRequestToOpenAI")
return bytes.Clone(inputRawJSON)
}

View File

@@ -27,7 +27,6 @@ import (
// Returns:
// - []string: A slice of strings, each containing an OpenAI-compatible JSON response
func ConvertOpenAIResponseToOpenAI(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) []string {
log.Debug("ConvertOpenAIResponseToOpenAI")
if bytes.HasPrefix(rawJSON, []byte("data:")) {
rawJSON = bytes.TrimSpace(rawJSON[5:])
}
@@ -51,6 +50,5 @@ func ConvertOpenAIResponseToOpenAI(_ context.Context, _ string, originalRequestR
// Returns:
// - string: An OpenAI-compatible JSON response containing all message content and metadata
func ConvertOpenAIResponseToOpenAINonStream(ctx context.Context, modelName string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) string {
log.Debug("ConvertOpenAIResponseToOpenAINonStream")
return string(rawJSON)
}

View File

@@ -28,7 +28,6 @@ import (
// Returns:
// - []byte: The transformed request data in OpenAI chat completions format
func ConvertOpenAIResponsesRequestToOpenAIChatCompletions(modelName string, inputRawJSON []byte, stream bool) []byte {
log.Debug("ConvertOpenAIResponsesRequestToOpenAIChatCompletions")
rawJSON := bytes.Clone(inputRawJSON)
// Base OpenAI chat completions template with default values
out := `{"model":"","messages":[],"stream":false}`

View File

@@ -7,7 +7,6 @@ import (
"strings"
"time"
log "github.com/sirupsen/logrus"
"github.com/tidwall/gjson"
"github.com/tidwall/sjson"
)
@@ -42,7 +41,6 @@ func emitRespEvent(event string, payload string) string {
// ConvertOpenAIChatCompletionsResponseToOpenAIResponses converts OpenAI Chat Completions streaming chunks
// to OpenAI Responses SSE events (response.*).
func ConvertOpenAIChatCompletionsResponseToOpenAIResponses(ctx context.Context, modelName string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, param *any) []string {
log.Debug("ConvertOpenAIChatCompletionsResponseToOpenAIResponses")
if *param == nil {
*param = &oaiToResponsesState{
FuncArgsBuf: make(map[int]*strings.Builder),
@@ -518,7 +516,6 @@ func ConvertOpenAIChatCompletionsResponseToOpenAIResponses(ctx context.Context,
// ConvertOpenAIChatCompletionsResponseToOpenAIResponsesNonStream builds a single Responses JSON
// from a non-streaming OpenAI Chat Completions response.
func ConvertOpenAIChatCompletionsResponseToOpenAIResponsesNonStream(_ context.Context, _ string, originalRequestRawJSON, requestRawJSON, rawJSON []byte, _ *any) string {
log.Debug("ConvertOpenAIChatCompletionsResponseToOpenAIResponsesNonStream")
root := gjson.ParseBytes(rawJSON)
// Basic response scaffold