Compare commits

...

32 Commits

Author SHA1 Message Date
Supra4E8C
b5fe78eb70 Merge pull request #1597 from router-for-me/kimi-fix
feat(registry): add support for 'kimi' channel in model definitions
2026-02-15 15:35:17 +08:00
Supra4E8C
d1f667cf8d feat(registry): add support for 'kimi' channel in model definitions 2026-02-15 15:21:33 +08:00
Luis Pater
55789df275 chore(docker): update Go base image to 1.26-alpine 2026-02-15 14:26:44 +08:00
Luis Pater
46a6782065 refactor(all): replace manual pointer assignments with new to enhance code readability and maintainability 2026-02-15 14:10:10 +08:00
Luis Pater
c359f61859 fix(auth): normalize Gemini credential file prefix for consistency 2026-02-15 13:59:33 +08:00
Luis Pater
908c8eab5b Merge pull request #1543 from sususu98/feat/gemini-cli-google-one
feat(gemini-cli): add Google One login and improve auto-discovery
2026-02-15 13:58:21 +08:00
Luis Pater
f5f2c69233 Merge pull request #1595 from alexey-yanchenko/feature/cache-usage-from-codex-to-chat-completions
Pass cache usage from codex to openai chat completions
2026-02-15 13:56:46 +08:00
Alexey Yanchenko
63d4de5eea Pass cache usage from codex to openai chat completions 2026-02-15 12:04:15 +07:00
Luis Pater
ae1e8a5191 chore(runtime, registry): update Codex client version and GPT-5.3 model creation date 2026-02-13 12:47:48 +08:00
Luis Pater
b3ccc55f09 Merge pull request #1574 from fbettag/feat/gpt-5.3-codex-spark
feat(registry): add gpt-5.3-codex-spark model definition
2026-02-13 12:46:08 +08:00
Franz Bettag
1ce56d7413 Update internal/registry/model_definitions_static_data.go
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2026-02-12 23:37:27 +01:00
Franz Bettag
41a78be3a2 feat(registry): add gpt-5.3-codex-spark model definition 2026-02-12 23:24:08 +01:00
Luis Pater
1ff5de9a31 docs(readme): add CLIProxyAPI Dashboard to project list 2026-02-13 00:40:39 +08:00
Luis Pater
46a6853046 Merge pull request #1568 from itsmylife44/add-cliproxyapi-dashboard
Add CLIProxyAPI Dashboard to 'Who is with us?' section
2026-02-13 00:37:41 +08:00
xSpaM
4b2d40bd67 Add CLIProxyAPI Dashboard to 'Who is with us?' section 2026-02-12 17:15:46 +01:00
Luis Pater
575881cb59 feat(registry): add new model definition for MiniMax-M2.5 2026-02-12 22:43:01 +08:00
hkfires
f361b2716d feat(registry): add glm-5 model to iflow 2026-02-12 11:13:28 +08:00
Luis Pater
58e09f8e5f Merge pull request #1542 from APE-147/fix/gemini-antigravity-schema-sanitization
fix(schema): sanitize Gemini-incompatible tool metadata fields
2026-02-11 21:34:04 +08:00
Luis Pater
a146c6c0aa Merge pull request #1523 from xxddff/feature/removeUserField
fix(codex): remove unsupported 'user' field from /v1/responses payload
2026-02-11 20:38:16 +08:00
Luis Pater
4c133d3ea9 test(sdk/watcher): add tests for excluded models merging and priority parsing logic
- Added unit tests for combining OAuth excluded models across global and attribute-specific scopes.
- Implemented priority attribute parsing with support for different formats and trimming.
2026-02-11 20:35:13 +08:00
sususu98
f3ccd85ba1 feat(gemini-cli): add Google One login and improve auto-discovery
Add Google One personal account login to Gemini CLI OAuth flow:
- CLI --login shows mode menu (Code Assist vs Google One)
- Web management API accepts project_id=GOOGLE_ONE sentinel
- Auto-discover project via onboardUser without cloudaicompanionProject when project is unresolved

Improve robustness of auto-discovery and token handling:
- Add context-aware auto-discovery polling (30s timeout, 2s interval)
- Distinguish network errors from project-selection-required errors
- Refresh expired access tokens in readAuthFile before project lookup
- Extend project_id auto-fill to gemini auth type (was antigravity-only)

Unify credential file naming to geminicli- prefix for both CLI and web.

Add extractAccessToken unit tests (9 cases).
2026-02-11 17:53:03 +08:00
RGBadmin
dc279de443 refactor: reduce code duplication in extractExcludedModelsFromMetadata 2026-02-11 15:57:16 +08:00
RGBadmin
bf1634bda0 refactor: simplify per-account excluded_models merge in routing 2026-02-11 15:57:15 +08:00
Nathan
166d2d24d9 fix(schema): remove Gemini-incompatible tool metadata fields
Sanitize tool schemas by stripping prefill, enumTitles, $id, and patternProperties to prevent Gemini INVALID_ARGUMENT 400 errors, and add unit and executor-level tests to lock in the behavior.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-11 18:29:17 +11:00
RGBadmin
4cbcc835d1 feat: read per-account excluded_models at routing time 2026-02-11 15:21:19 +08:00
RGBadmin
b93026d83a feat: merge per-account excluded_models with global config 2026-02-11 15:21:15 +08:00
RGBadmin
5ed2133ff9 feat: add per-account excluded_models and priority parsing 2026-02-11 15:21:12 +08:00
Luis Pater
1510bfcb6f fix(translator): improve content handling for system and user messages
- Added support for single and array-based `content` cases.
- Enhanced `system_instruction` structure population logic.
- Improved handling of user role assignment for string-based `content`.
2026-02-11 15:04:01 +08:00
xxddff
bb9fe52f1e Update internal/translator/codex/openai/responses/codex_openai-responses_request_test.go
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2026-02-10 18:24:58 +09:00
xxddff
afe4c1bfb7 Update internal/translator/codex/openai/responses/codex_openai-responses_request.go
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2026-02-10 18:24:26 +09:00
xxddff
865af9f19e Implement test for user field deletion
Add test to verify deletion of user field in response
2026-02-10 17:38:49 +09:00
xxddff
2b97cb98b5 Delete 'user' field from raw JSON
Remove the 'user' field from the raw JSON as requested.
2026-02-10 17:35:54 +09:00
41 changed files with 940 additions and 109 deletions

View File

@@ -19,7 +19,7 @@ jobs:
- run: git fetch --force --tags
- uses: actions/setup-go@v4
with:
go-version: '>=1.24.0'
go-version: '>=1.26.0'
cache: true
- name: Generate Build Metadata
run: |

View File

@@ -1,4 +1,4 @@
FROM golang:1.24-alpine AS builder
FROM golang:1.26-alpine AS builder
WORKDIR /app

View File

@@ -146,6 +146,10 @@ A Windows tray application implemented using PowerShell scripts, without relying
霖君 is a cross-platform desktop application for managing AI programming assistants, supporting macOS, Windows, and Linux systems. Unified management of Claude Code, Gemini CLI, OpenAI Codex, Qwen Code, and other AI coding tools, with local proxy for multi-account quota tracking and one-click configuration.
### [CLIProxyAPI Dashboard](https://github.com/itsmylife44/cliproxyapi-dashboard)
A modern web-based management dashboard for CLIProxyAPI built with Next.js, React, and PostgreSQL. Features real-time log streaming, structured configuration editing, API key management, OAuth provider integration for Claude/Gemini/Codex, usage analytics, container management, and config sync with OpenCode via companion plugin - no manual YAML editing needed.
> [!NOTE]
> If you developed a project based on CLIProxyAPI, please open a PR to add it to this list.

View File

@@ -145,6 +145,10 @@ Windows 托盘应用,基于 PowerShell 脚本实现,不依赖任何第三方
霖君是一款用于管理AI编程助手的跨平台桌面应用,支持macOS、Windows、Linux系统。统一管理Claude Code、Gemini CLI、OpenAI Codex、Qwen Code等AI编程工具,本地代理实现多账户配额跟踪和一键配置。
### [CLIProxyAPI Dashboard](https://github.com/itsmylife44/cliproxyapi-dashboard)
一个面向 CLIProxyAPI 的现代化 Web 管理仪表盘,基于 Next.js、React 和 PostgreSQL 构建。支持实时日志流、结构化配置编辑、API Key 管理、Claude/Gemini/Codex 的 OAuth 提供方集成、使用量分析、容器管理,并可通过配套插件与 OpenCode 同步配置,无需手动编辑 YAML。
> [!NOTE]
> 如果你开发了基于 CLIProxyAPI 的项目,请提交一个 PR(拉取请求)将其添加到此列表中。

go.mod
View File

@@ -1,6 +1,6 @@
module github.com/router-for-me/CLIProxyAPI/v6
go 1.24.0
go 1.26.0
require (
github.com/andybalholm/brotli v1.0.6

View File

@@ -1188,6 +1188,30 @@ func (h *Handler) RequestGeminiCLIToken(c *gin.Context) {
}
ts.ProjectID = strings.Join(projects, ",")
ts.Checked = true
} else if strings.EqualFold(requestedProjectID, "GOOGLE_ONE") {
ts.Auto = false
if errSetup := performGeminiCLISetup(ctx, gemClient, &ts, ""); errSetup != nil {
log.Errorf("Google One auto-discovery failed: %v", errSetup)
SetOAuthSessionError(state, "Google One auto-discovery failed")
return
}
if strings.TrimSpace(ts.ProjectID) == "" {
log.Error("Google One auto-discovery returned empty project ID")
SetOAuthSessionError(state, "Google One auto-discovery returned empty project ID")
return
}
isChecked, errCheck := checkCloudAPIIsEnabled(ctx, gemClient, ts.ProjectID)
if errCheck != nil {
log.Errorf("Failed to verify Cloud AI API status: %v", errCheck)
SetOAuthSessionError(state, "Failed to verify Cloud AI API status")
return
}
ts.Checked = isChecked
if !isChecked {
log.Error("Cloud AI API is not enabled for the auto-discovered project")
SetOAuthSessionError(state, "Cloud AI API not enabled")
return
}
} else {
if errEnsure := ensureGeminiProjectAndOnboard(ctx, gemClient, &ts, requestedProjectID); errEnsure != nil {
log.Errorf("Failed to complete Gemini CLI onboarding: %v", errEnsure)
@@ -2036,7 +2060,48 @@ func performGeminiCLISetup(ctx context.Context, httpClient *http.Client, storage
}
}
if projectID == "" {
return &projectSelectionRequiredError{}
// Auto-discovery: try onboardUser without specifying a project
// to let Google auto-provision one (matches Gemini CLI headless behavior
// and Antigravity's FetchProjectID pattern).
autoOnboardReq := map[string]any{
"tierId": tierID,
"metadata": metadata,
}
autoCtx, autoCancel := context.WithTimeout(ctx, 30*time.Second)
defer autoCancel()
for attempt := 1; ; attempt++ {
var onboardResp map[string]any
if errOnboard := callGeminiCLI(autoCtx, httpClient, "onboardUser", autoOnboardReq, &onboardResp); errOnboard != nil {
return fmt.Errorf("auto-discovery onboardUser: %w", errOnboard)
}
if done, okDone := onboardResp["done"].(bool); okDone && done {
if resp, okResp := onboardResp["response"].(map[string]any); okResp {
switch v := resp["cloudaicompanionProject"].(type) {
case string:
projectID = strings.TrimSpace(v)
case map[string]any:
if id, okID := v["id"].(string); okID {
projectID = strings.TrimSpace(id)
}
}
}
break
}
log.Debugf("Auto-discovery: onboarding in progress, attempt %d...", attempt)
select {
case <-autoCtx.Done():
return &projectSelectionRequiredError{}
case <-time.After(2 * time.Second):
}
}
if projectID == "" {
return &projectSelectionRequiredError{}
}
log.Infof("Auto-discovered project ID via onboarding: %s", projectID)
}
onboardReqBody := map[string]any{

View File

@@ -28,8 +28,7 @@ func (h *Handler) GetConfig(c *gin.Context) {
c.JSON(200, gin.H{})
return
}
cfgCopy := *h.cfg
c.JSON(200, &cfgCopy)
c.JSON(200, new(*h.cfg))
}
type releaseInfo struct {
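
The hunk above is the first of many applying the "replace manual pointer assignments with new" refactor (commit 46a6782065). A minimal standalone sketch of the before/after pattern, assuming a toolchain (per the Go 1.26 bump in this changeset) where new accepts an expression argument:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Old pattern: declare a named temporary solely to take its address.
	lead := 5 * time.Minute
	oldPtr := &lead

	// New pattern used across this changeset: new(expr) allocates a fresh
	// variable initialized to the expression and returns its address.
	// (Assumes a Go spec/toolchain that permits an expression operand to new.)
	newPtr := new(5 * time.Minute)

	fmt.Println(*oldPtr == *newPtr) // true
}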

View File

@@ -127,8 +127,7 @@ func (m *AmpModule) Register(ctx modules.Context) error {
m.modelMapper = NewModelMapper(settings.ModelMappings)
// Store initial config for partial reload comparison
settingsCopy := settings
m.lastConfig = &settingsCopy
m.lastConfig = new(settings)
// Initialize localhost restriction setting (hot-reloadable)
m.setRestrictToLocalhost(settings.RestrictManagementToLocalhost)

View File

@@ -40,8 +40,7 @@ func DoClaudeLogin(cfg *config.Config, options *LoginOptions) {
_, savedPath, err := manager.Login(context.Background(), "claude", cfg, authOpts)
if err != nil {
var authErr *claude.AuthenticationError
if errors.As(err, &authErr) {
if authErr, ok := errors.AsType[*claude.AuthenticationError](err); ok {
log.Error(claude.GetUserFriendlyMessage(authErr))
if authErr.Type == claude.ErrPortInUse.Type {
os.Exit(claude.ErrPortInUse.Code)
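
The login command hunks (Claude above, and the iFlow/Codex/Qwen ones below) swap errors.As with a declared target variable for the generic errors.AsType form. A small self-contained sketch of the two styles, assuming a standard library that provides errors.AsType[T] returning the typed error and an ok flag; the error type here is a stand-in, not taken from the repo:

package main

import (
	"errors"
	"fmt"
)

// AuthenticationError stands in for the provider-specific error types
// (claude.AuthenticationError, codex.AuthenticationError, ...) used above.
type AuthenticationError struct{ Type string }

func (e *AuthenticationError) Error() string { return "authentication failed: " + e.Type }

func main() {
	err := fmt.Errorf("login: %w", &AuthenticationError{Type: "port_in_use"})

	// Old style: declare the target, then pass its address to errors.As.
	var target *AuthenticationError
	if errors.As(err, &target) {
		fmt.Println("errors.As:", target.Type)
	}

	// New style used in this changeset (assumes errors.AsType[T] is available).
	if authErr, ok := errors.AsType[*AuthenticationError](err); ok {
		fmt.Println("errors.AsType:", authErr.Type)
	}
}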

View File

@@ -32,8 +32,7 @@ func DoIFlowLogin(cfg *config.Config, options *LoginOptions) {
_, savedPath, err := manager.Login(context.Background(), "iflow", cfg, authOpts)
if err != nil {
var emailErr *sdkAuth.EmailRequiredError
if errors.As(err, &emailErr) {
if emailErr, ok := errors.AsType[*sdkAuth.EmailRequiredError](err); ok {
log.Error(emailErr.Error())
return
}

View File

@@ -100,49 +100,74 @@ func DoLogin(cfg *config.Config, projectID string, options *LoginOptions) {
log.Info("Authentication successful.")
projects, errProjects := fetchGCPProjects(ctx, httpClient)
if errProjects != nil {
log.Errorf("Failed to get project list: %v", errProjects)
return
var activatedProjects []string
useGoogleOne := false
if trimmedProjectID == "" && promptFn != nil {
fmt.Println("\nSelect login mode:")
fmt.Println(" 1. Code Assist (GCP project, manual selection)")
fmt.Println(" 2. Google One (personal account, auto-discover project)")
choice, errPrompt := promptFn("Enter choice [1/2] (default: 1): ")
if errPrompt == nil && strings.TrimSpace(choice) == "2" {
useGoogleOne = true
}
}
selectedProjectID := promptForProjectSelection(projects, trimmedProjectID, promptFn)
projectSelections, errSelection := resolveProjectSelections(selectedProjectID, projects)
if errSelection != nil {
log.Errorf("Invalid project selection: %v", errSelection)
return
}
if len(projectSelections) == 0 {
log.Error("No project selected; aborting login.")
return
}
activatedProjects := make([]string, 0, len(projectSelections))
seenProjects := make(map[string]bool)
for _, candidateID := range projectSelections {
log.Infof("Activating project %s", candidateID)
if errSetup := performGeminiCLISetup(ctx, httpClient, storage, candidateID); errSetup != nil {
var projectErr *projectSelectionRequiredError
if errors.As(errSetup, &projectErr) {
log.Error("Failed to start user onboarding: A project ID is required.")
showProjectSelectionHelp(storage.Email, projects)
return
}
log.Errorf("Failed to complete user setup: %v", errSetup)
if useGoogleOne {
log.Info("Google One mode: auto-discovering project...")
if errSetup := performGeminiCLISetup(ctx, httpClient, storage, ""); errSetup != nil {
log.Errorf("Google One auto-discovery failed: %v", errSetup)
return
}
finalID := strings.TrimSpace(storage.ProjectID)
if finalID == "" {
finalID = candidateID
autoProject := strings.TrimSpace(storage.ProjectID)
if autoProject == "" {
log.Error("Google One auto-discovery returned empty project ID")
return
}
log.Infof("Auto-discovered project: %s", autoProject)
activatedProjects = []string{autoProject}
} else {
projects, errProjects := fetchGCPProjects(ctx, httpClient)
if errProjects != nil {
log.Errorf("Failed to get project list: %v", errProjects)
return
}
// Skip duplicates
if seenProjects[finalID] {
log.Infof("Project %s already activated, skipping", finalID)
continue
selectedProjectID := promptForProjectSelection(projects, trimmedProjectID, promptFn)
projectSelections, errSelection := resolveProjectSelections(selectedProjectID, projects)
if errSelection != nil {
log.Errorf("Invalid project selection: %v", errSelection)
return
}
if len(projectSelections) == 0 {
log.Error("No project selected; aborting login.")
return
}
seenProjects := make(map[string]bool)
for _, candidateID := range projectSelections {
log.Infof("Activating project %s", candidateID)
if errSetup := performGeminiCLISetup(ctx, httpClient, storage, candidateID); errSetup != nil {
if _, ok := errors.AsType[*projectSelectionRequiredError](errSetup); ok {
log.Error("Failed to start user onboarding: A project ID is required.")
showProjectSelectionHelp(storage.Email, projects)
return
}
log.Errorf("Failed to complete user setup: %v", errSetup)
return
}
finalID := strings.TrimSpace(storage.ProjectID)
if finalID == "" {
finalID = candidateID
}
if seenProjects[finalID] {
log.Infof("Project %s already activated, skipping", finalID)
continue
}
seenProjects[finalID] = true
activatedProjects = append(activatedProjects, finalID)
}
seenProjects[finalID] = true
activatedProjects = append(activatedProjects, finalID)
}
storage.Auto = false
@@ -235,7 +260,48 @@ func performGeminiCLISetup(ctx context.Context, httpClient *http.Client, storage
}
}
if projectID == "" {
return &projectSelectionRequiredError{}
// Auto-discovery: try onboardUser without specifying a project
// to let Google auto-provision one (matches Gemini CLI headless behavior
// and Antigravity's FetchProjectID pattern).
autoOnboardReq := map[string]any{
"tierId": tierID,
"metadata": metadata,
}
autoCtx, autoCancel := context.WithTimeout(ctx, 30*time.Second)
defer autoCancel()
for attempt := 1; ; attempt++ {
var onboardResp map[string]any
if errOnboard := callGeminiCLI(autoCtx, httpClient, "onboardUser", autoOnboardReq, &onboardResp); errOnboard != nil {
return fmt.Errorf("auto-discovery onboardUser: %w", errOnboard)
}
if done, okDone := onboardResp["done"].(bool); okDone && done {
if resp, okResp := onboardResp["response"].(map[string]any); okResp {
switch v := resp["cloudaicompanionProject"].(type) {
case string:
projectID = strings.TrimSpace(v)
case map[string]any:
if id, okID := v["id"].(string); okID {
projectID = strings.TrimSpace(id)
}
}
}
break
}
log.Debugf("Auto-discovery: onboarding in progress, attempt %d...", attempt)
select {
case <-autoCtx.Done():
return &projectSelectionRequiredError{}
case <-time.After(2 * time.Second):
}
}
if projectID == "" {
return &projectSelectionRequiredError{}
}
log.Infof("Auto-discovered project ID via onboarding: %s", projectID)
}
onboardReqBody := map[string]any{
@@ -617,7 +683,7 @@ func updateAuthRecord(record *cliproxyauth.Auth, storage *gemini.GeminiTokenStor
return
}
finalName := gemini.CredentialFileName(storage.Email, storage.ProjectID, false)
finalName := gemini.CredentialFileName(storage.Email, storage.ProjectID, true)
if record.Metadata == nil {
record.Metadata = make(map[string]any)

View File

@@ -54,8 +54,7 @@ func DoCodexLogin(cfg *config.Config, options *LoginOptions) {
_, savedPath, err := manager.Login(context.Background(), "codex", cfg, authOpts)
if err != nil {
var authErr *codex.AuthenticationError
if errors.As(err, &authErr) {
if authErr, ok := errors.AsType[*codex.AuthenticationError](err); ok {
log.Error(codex.GetUserFriendlyMessage(authErr))
if authErr.Type == codex.ErrPortInUse.Type {
os.Exit(codex.ErrPortInUse.Code)

View File

@@ -44,8 +44,7 @@ func DoQwenLogin(cfg *config.Config, options *LoginOptions) {
_, savedPath, err := manager.Login(context.Background(), "qwen", cfg, authOpts)
if err != nil {
var emailErr *sdkAuth.EmailRequiredError
if errors.As(err, &emailErr) {
if emailErr, ok := errors.AsType[*sdkAuth.EmailRequiredError](err); ok {
log.Error(emailErr.Error())
return
}

View File

@@ -19,6 +19,7 @@ import (
// - codex
// - qwen
// - iflow
// - kimi
// - antigravity (returns static overrides only)
func GetStaticModelDefinitionsByChannel(channel string) []*ModelInfo {
key := strings.ToLower(strings.TrimSpace(channel))
@@ -39,6 +40,8 @@ func GetStaticModelDefinitionsByChannel(channel string) []*ModelInfo {
return GetQwenModels()
case "iflow":
return GetIFlowModels()
case "kimi":
return GetKimiModels()
case "antigravity":
cfg := GetAntigravityModelConfig()
if len(cfg) == 0 {
@@ -83,6 +86,7 @@ func LookupStaticModelInfo(modelID string) *ModelInfo {
GetOpenAIModels(),
GetQwenModels(),
GetIFlowModels(),
GetKimiModels(),
}
for _, models := range allModels {
for _, m := range models {

View File

@@ -742,6 +742,20 @@ func GetOpenAIModels() []*ModelInfo {
SupportedParameters: []string{"tools"},
Thinking: &ThinkingSupport{Levels: []string{"low", "medium", "high", "xhigh"}},
},
{
ID: "gpt-5.3-codex-spark",
Object: "model",
Created: 1770912000,
OwnedBy: "openai",
Type: "openai",
Version: "gpt-5.3",
DisplayName: "GPT 5.3 Codex Spark",
Description: "Ultra-fast coding model.",
ContextLength: 128000,
MaxCompletionTokens: 128000,
SupportedParameters: []string{"tools"},
Thinking: &ThinkingSupport{Levels: []string{"low", "medium", "high", "xhigh"}},
},
}
}
@@ -814,6 +828,7 @@ func GetIFlowModels() []*ModelInfo {
{ID: "kimi-k2-0905", DisplayName: "Kimi-K2-Instruct-0905", Description: "Moonshot Kimi K2 instruct 0905", Created: 1757030400},
{ID: "glm-4.6", DisplayName: "GLM-4.6", Description: "Zhipu GLM 4.6 general model", Created: 1759190400, Thinking: iFlowThinkingSupport},
{ID: "glm-4.7", DisplayName: "GLM-4.7", Description: "Zhipu GLM 4.7 general model", Created: 1766448000, Thinking: iFlowThinkingSupport},
{ID: "glm-5", DisplayName: "GLM-5", Description: "Zhipu GLM 5 general model", Created: 1770768000, Thinking: iFlowThinkingSupport},
{ID: "kimi-k2", DisplayName: "Kimi-K2", Description: "Moonshot Kimi K2 general model", Created: 1752192000},
{ID: "kimi-k2-thinking", DisplayName: "Kimi-K2-Thinking", Description: "Moonshot Kimi K2 thinking model", Created: 1762387200},
{ID: "deepseek-v3.2-chat", DisplayName: "DeepSeek-V3.2", Description: "DeepSeek V3.2 Chat", Created: 1764576000},
@@ -828,6 +843,7 @@ func GetIFlowModels() []*ModelInfo {
{ID: "qwen3-235b", DisplayName: "Qwen3-235B-A22B", Description: "Qwen3 235B A22B", Created: 1753401600},
{ID: "minimax-m2", DisplayName: "MiniMax-M2", Description: "MiniMax M2", Created: 1758672000, Thinking: iFlowThinkingSupport},
{ID: "minimax-m2.1", DisplayName: "MiniMax-M2.1", Description: "MiniMax M2.1", Created: 1766448000, Thinking: iFlowThinkingSupport},
{ID: "minimax-m2.5", DisplayName: "MiniMax-M2.5", Description: "MiniMax M2.5", Created: 1770825600, Thinking: iFlowThinkingSupport},
{ID: "iflow-rome-30ba3b", DisplayName: "iFlow-ROME", Description: "iFlow Rome 30BA3B model", Created: 1736899200},
{ID: "kimi-k2.5", DisplayName: "Kimi-K2.5", Description: "Moonshot Kimi K2.5", Created: 1769443200, Thinking: iFlowThinkingSupport},
}

View File

@@ -596,8 +596,7 @@ func (r *ModelRegistry) SetModelQuotaExceeded(clientID, modelID string) {
defer r.mutex.Unlock()
if registration, exists := r.models[modelID]; exists {
now := time.Now()
registration.QuotaExceededClients[clientID] = &now
registration.QuotaExceededClients[clientID] = new(time.Now())
log.Debugf("Marked model %s as quota exceeded for client %s", modelID, clientID)
}
}

View File

@@ -0,0 +1,159 @@
package executor
import (
"context"
"encoding/json"
"io"
"testing"
cliproxyauth "github.com/router-for-me/CLIProxyAPI/v6/sdk/cliproxy/auth"
)
func TestAntigravityBuildRequest_SanitizesGeminiToolSchema(t *testing.T) {
body := buildRequestBodyFromPayload(t, "gemini-2.5-pro")
decl := extractFirstFunctionDeclaration(t, body)
if _, ok := decl["parametersJsonSchema"]; ok {
t.Fatalf("parametersJsonSchema should be renamed to parameters")
}
params, ok := decl["parameters"].(map[string]any)
if !ok {
t.Fatalf("parameters missing or invalid type")
}
assertSchemaSanitizedAndPropertyPreserved(t, params)
}
func TestAntigravityBuildRequest_SanitizesAntigravityToolSchema(t *testing.T) {
body := buildRequestBodyFromPayload(t, "claude-opus-4-6")
decl := extractFirstFunctionDeclaration(t, body)
params, ok := decl["parameters"].(map[string]any)
if !ok {
t.Fatalf("parameters missing or invalid type")
}
assertSchemaSanitizedAndPropertyPreserved(t, params)
}
func buildRequestBodyFromPayload(t *testing.T, modelName string) map[string]any {
t.Helper()
executor := &AntigravityExecutor{}
auth := &cliproxyauth.Auth{}
payload := []byte(`{
"request": {
"tools": [
{
"function_declarations": [
{
"name": "tool_1",
"parametersJsonSchema": {
"$schema": "http://json-schema.org/draft-07/schema#",
"$id": "root-schema",
"type": "object",
"properties": {
"$id": {"type": "string"},
"arg": {
"type": "object",
"prefill": "hello",
"properties": {
"mode": {
"type": "string",
"enum": ["a", "b"],
"enumTitles": ["A", "B"]
}
}
}
},
"patternProperties": {
"^x-": {"type": "string"}
}
}
}
]
}
]
}
}`)
req, err := executor.buildRequest(context.Background(), auth, "token", modelName, payload, false, "", "https://example.com")
if err != nil {
t.Fatalf("buildRequest error: %v", err)
}
raw, err := io.ReadAll(req.Body)
if err != nil {
t.Fatalf("read request body error: %v", err)
}
var body map[string]any
if err := json.Unmarshal(raw, &body); err != nil {
t.Fatalf("unmarshal request body error: %v, body=%s", err, string(raw))
}
return body
}
func extractFirstFunctionDeclaration(t *testing.T, body map[string]any) map[string]any {
t.Helper()
request, ok := body["request"].(map[string]any)
if !ok {
t.Fatalf("request missing or invalid type")
}
tools, ok := request["tools"].([]any)
if !ok || len(tools) == 0 {
t.Fatalf("tools missing or empty")
}
tool, ok := tools[0].(map[string]any)
if !ok {
t.Fatalf("first tool invalid type")
}
decls, ok := tool["function_declarations"].([]any)
if !ok || len(decls) == 0 {
t.Fatalf("function_declarations missing or empty")
}
decl, ok := decls[0].(map[string]any)
if !ok {
t.Fatalf("first function declaration invalid type")
}
return decl
}
func assertSchemaSanitizedAndPropertyPreserved(t *testing.T, params map[string]any) {
t.Helper()
if _, ok := params["$id"]; ok {
t.Fatalf("root $id should be removed from schema")
}
if _, ok := params["patternProperties"]; ok {
t.Fatalf("patternProperties should be removed from schema")
}
props, ok := params["properties"].(map[string]any)
if !ok {
t.Fatalf("properties missing or invalid type")
}
if _, ok := props["$id"]; !ok {
t.Fatalf("property named $id should be preserved")
}
arg, ok := props["arg"].(map[string]any)
if !ok {
t.Fatalf("arg property missing or invalid type")
}
if _, ok := arg["prefill"]; ok {
t.Fatalf("prefill should be removed from nested schema")
}
argProps, ok := arg["properties"].(map[string]any)
if !ok {
t.Fatalf("arg.properties missing or invalid type")
}
mode, ok := argProps["mode"].(map[string]any)
if !ok {
t.Fatalf("mode property missing or invalid type")
}
if _, ok := mode["enumTitles"]; ok {
t.Fatalf("enumTitles should be removed from nested schema")
}
}

View File

@@ -28,8 +28,8 @@ import (
)
const (
codexClientVersion = "0.98.0"
codexUserAgent = "codex_cli_rs/0.98.0 (Mac OS 26.0.1; arm64) Apple_Terminal/464"
codexClientVersion = "0.101.0"
codexUserAgent = "codex_cli_rs/0.101.0 (Mac OS 26.0.1; arm64) Apple_Terminal/464"
)
var dataTag = []byte("data:")

View File

@@ -899,8 +899,7 @@ func parseRetryDelay(errorBody []byte) (*time.Duration, error) {
if matches := re.FindStringSubmatch(message); len(matches) > 1 {
seconds, err := strconv.Atoi(matches[1])
if err == nil {
duration := time.Duration(seconds) * time.Second
return &duration, nil
return new(time.Duration(seconds) * time.Second), nil
}
}
}

View File

@@ -90,6 +90,9 @@ func ConvertCodexResponseToOpenAI(_ context.Context, modelName string, originalR
if inputTokensResult := usageResult.Get("input_tokens"); inputTokensResult.Exists() {
template, _ = sjson.Set(template, "usage.prompt_tokens", inputTokensResult.Int())
}
if cachedTokensResult := usageResult.Get("input_tokens_details.cached_tokens"); cachedTokensResult.Exists() {
template, _ = sjson.Set(template, "usage.prompt_tokens_details.cached_tokens", cachedTokensResult.Int())
}
if reasoningTokensResult := usageResult.Get("output_tokens_details.reasoning_tokens"); reasoningTokensResult.Exists() {
template, _ = sjson.Set(template, "usage.completion_tokens_details.reasoning_tokens", reasoningTokensResult.Int())
}
@@ -205,6 +208,9 @@ func ConvertCodexResponseToOpenAINonStream(_ context.Context, _ string, original
if inputTokensResult := usageResult.Get("input_tokens"); inputTokensResult.Exists() {
template, _ = sjson.Set(template, "usage.prompt_tokens", inputTokensResult.Int())
}
if cachedTokensResult := usageResult.Get("input_tokens_details.cached_tokens"); cachedTokensResult.Exists() {
template, _ = sjson.Set(template, "usage.prompt_tokens_details.cached_tokens", cachedTokensResult.Int())
}
if reasoningTokensResult := usageResult.Get("output_tokens_details.reasoning_tokens"); reasoningTokensResult.Exists() {
template, _ = sjson.Set(template, "usage.completion_tokens_details.reasoning_tokens", reasoningTokensResult.Int())
}
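
For illustration, the hunks above copy Codex's cached-token count into the Chat Completions usage block. A simplified, self-contained sketch of that mapping with gjson/sjson (the real templates carry more fields; the payload here is made up):

package main

import (
	"fmt"

	"github.com/tidwall/gjson"
	"github.com/tidwall/sjson"
)

func main() {
	// Made-up Codex usage block with a cached-token detail.
	codexResp := `{"usage":{"input_tokens":120,"input_tokens_details":{"cached_tokens":100},"output_tokens":42}}`

	// Simplified OpenAI chat completions usage template.
	out := `{"usage":{"prompt_tokens":0,"completion_tokens":0}}`

	usage := gjson.Get(codexResp, "usage")
	if v := usage.Get("input_tokens"); v.Exists() {
		out, _ = sjson.Set(out, "usage.prompt_tokens", v.Int())
	}
	if v := usage.Get("output_tokens"); v.Exists() {
		out, _ = sjson.Set(out, "usage.completion_tokens", v.Int())
	}
	// The new mapping: surface cached prompt tokens to OpenAI-style clients.
	if v := usage.Get("input_tokens_details.cached_tokens"); v.Exists() {
		out, _ = sjson.Set(out, "usage.prompt_tokens_details.cached_tokens", v.Int())
	}

	fmt.Println(out)
}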

View File

@@ -27,6 +27,9 @@ func ConvertOpenAIResponsesRequestToCodex(modelName string, inputRawJSON []byte,
rawJSON, _ = sjson.DeleteBytes(rawJSON, "top_p")
rawJSON, _ = sjson.DeleteBytes(rawJSON, "service_tier")
// Delete the user field as it is not supported by the Codex upstream.
rawJSON, _ = sjson.DeleteBytes(rawJSON, "user")
// Convert role "system" to "developer" in input array to comply with Codex API requirements.
rawJSON = convertSystemRoleToDeveloper(rawJSON)

View File

@@ -263,3 +263,20 @@ func TestConvertSystemRoleToDeveloper_AssistantRole(t *testing.T) {
t.Errorf("Expected third role 'assistant', got '%s'", thirdRole.String())
}
}
func TestUserFieldDeletion(t *testing.T) {
inputJSON := []byte(`{
"model": "gpt-5.2",
"user": "test-user",
"input": [{"role": "user", "content": "Hello"}]
}`)
output := ConvertOpenAIResponsesRequestToCodex("gpt-5.2", inputJSON, false)
outputStr := string(output)
// Verify user field is deleted
userField := gjson.Get(outputStr, "user")
if userField.Exists() {
t.Errorf("user field should be deleted, but it was found with value: %s", userField.Raw)
}
}

View File

@@ -117,19 +117,29 @@ func ConvertOpenAIResponsesRequestToGemini(modelName string, inputRawJSON []byte
switch itemType {
case "message":
if strings.EqualFold(itemRole, "system") {
if contentArray := item.Get("content"); contentArray.Exists() && contentArray.IsArray() {
var builder strings.Builder
contentArray.ForEach(func(_, contentItem gjson.Result) bool {
text := contentItem.Get("text").String()
if builder.Len() > 0 && text != "" {
builder.WriteByte('\n')
}
builder.WriteString(text)
return true
})
if !gjson.Get(out, "system_instruction").Exists() {
systemInstr := `{"parts":[{"text":""}]}`
systemInstr, _ = sjson.Set(systemInstr, "parts.0.text", builder.String())
if contentArray := item.Get("content"); contentArray.Exists() {
systemInstr := ""
if systemInstructionResult := gjson.Get(out, "system_instruction"); systemInstructionResult.Exists() {
systemInstr = systemInstructionResult.Raw
} else {
systemInstr = `{"parts":[]}`
}
if contentArray.IsArray() {
contentArray.ForEach(func(_, contentItem gjson.Result) bool {
part := `{"text":""}`
text := contentItem.Get("text").String()
part, _ = sjson.Set(part, "text", text)
systemInstr, _ = sjson.SetRaw(systemInstr, "parts.-1", part)
return true
})
} else if contentArray.Type == gjson.String {
part := `{"text":""}`
part, _ = sjson.Set(part, "text", contentArray.String())
systemInstr, _ = sjson.SetRaw(systemInstr, "parts.-1", part)
}
if systemInstr != `{"parts":[]}` {
out, _ = sjson.SetRaw(out, "system_instruction", systemInstr)
}
}
@@ -236,8 +246,22 @@ func ConvertOpenAIResponsesRequestToGemini(modelName string, inputRawJSON []byte
})
flush()
}
} else if contentArray.Type == gjson.String {
effRole := "user"
if itemRole != "" {
switch strings.ToLower(itemRole) {
case "assistant", "model":
effRole = "model"
default:
effRole = strings.ToLower(itemRole)
}
}
one := `{"role":"","parts":[{"text":""}]}`
one, _ = sjson.Set(one, "role", effRole)
one, _ = sjson.Set(one, "parts.0.text", contentArray.String())
out, _ = sjson.SetRaw(out, "contents.-1", one)
}
case "function_call":
// Handle function calls - convert to model message with functionCall
name := item.Get("name").String()

View File

@@ -428,8 +428,9 @@ func flattenTypeArrays(jsonStr string) string {
func removeUnsupportedKeywords(jsonStr string) string {
keywords := append(unsupportedConstraints,
"$schema", "$defs", "definitions", "const", "$ref", "additionalProperties",
"propertyNames", // Gemini doesn't support property name validation
"$schema", "$defs", "definitions", "const", "$ref", "$id", "additionalProperties",
"propertyNames", "patternProperties", // Gemini doesn't support these schema keywords
"enumTitles", "prefill", // Claude/OpenCode schema metadata fields unsupported by Gemini
)
deletePaths := make([]string, 0)

View File

@@ -870,6 +870,57 @@ func TestCleanJSONSchemaForAntigravity_BooleanEnumToString(t *testing.T) {
}
}
func TestCleanJSONSchemaForGemini_RemovesGeminiUnsupportedMetadataFields(t *testing.T) {
input := `{
"$schema": "http://json-schema.org/draft-07/schema#",
"$id": "root-schema",
"type": "object",
"properties": {
"payload": {
"type": "object",
"prefill": "hello",
"properties": {
"mode": {
"type": "string",
"enum": ["a", "b"],
"enumTitles": ["A", "B"]
}
},
"patternProperties": {
"^x-": {"type": "string"}
}
},
"$id": {
"type": "string",
"description": "property name should not be removed"
}
}
}`
expected := `{
"type": "object",
"properties": {
"payload": {
"type": "object",
"properties": {
"mode": {
"type": "string",
"enum": ["a", "b"],
"description": "Allowed: a, b"
}
}
},
"$id": {
"type": "string",
"description": "property name should not be removed"
}
}
}`
result := CleanJSONSchemaForGemini(input)
compareJSON(t, expected, result)
}
func TestRemoveExtensionFields(t *testing.T) {
tests := []struct {
name string

View File

@@ -5,6 +5,7 @@ import (
"fmt"
"os"
"path/filepath"
"strconv"
"strings"
"time"
@@ -92,6 +93,9 @@ func (s *FileSynthesizer) Synthesize(ctx *SynthesisContext) ([]*coreauth.Auth, e
status = coreauth.StatusDisabled
}
// Read per-account excluded models from the OAuth JSON file
perAccountExcluded := extractExcludedModelsFromMetadata(metadata)
a := &coreauth.Auth{
ID: id,
Provider: provider,
@@ -108,11 +112,23 @@ func (s *FileSynthesizer) Synthesize(ctx *SynthesisContext) ([]*coreauth.Auth, e
CreatedAt: now,
UpdatedAt: now,
}
ApplyAuthExcludedModelsMeta(a, cfg, nil, "oauth")
// Read priority from auth file
if rawPriority, ok := metadata["priority"]; ok {
switch v := rawPriority.(type) {
case float64:
a.Attributes["priority"] = strconv.Itoa(int(v))
case string:
priority := strings.TrimSpace(v)
if _, errAtoi := strconv.Atoi(priority); errAtoi == nil {
a.Attributes["priority"] = priority
}
}
}
ApplyAuthExcludedModelsMeta(a, cfg, perAccountExcluded, "oauth")
if provider == "gemini-cli" {
if virtuals := SynthesizeGeminiVirtualAuths(a, metadata, now); len(virtuals) > 0 {
for _, v := range virtuals {
ApplyAuthExcludedModelsMeta(v, cfg, nil, "oauth")
ApplyAuthExcludedModelsMeta(v, cfg, perAccountExcluded, "oauth")
}
out = append(out, a)
out = append(out, virtuals...)
@@ -167,6 +183,10 @@ func SynthesizeGeminiVirtualAuths(primary *coreauth.Auth, metadata map[string]an
if authPath != "" {
attrs["path"] = authPath
}
// Propagate priority from primary auth to virtual auths
if priorityVal, hasPriority := primary.Attributes["priority"]; hasPriority && priorityVal != "" {
attrs["priority"] = priorityVal
}
metadataCopy := map[string]any{
"email": email,
"project_id": projectID,
@@ -239,3 +259,40 @@ func buildGeminiVirtualID(baseID, projectID string) string {
replacer := strings.NewReplacer("/", "_", "\\", "_", " ", "_")
return fmt.Sprintf("%s::%s", baseID, replacer.Replace(project))
}
// extractExcludedModelsFromMetadata reads per-account excluded models from the OAuth JSON metadata.
// Supports both "excluded_models" and "excluded-models" keys, and accepts both []string and []interface{}.
func extractExcludedModelsFromMetadata(metadata map[string]any) []string {
if metadata == nil {
return nil
}
// Try both key formats
raw, ok := metadata["excluded_models"]
if !ok {
raw, ok = metadata["excluded-models"]
}
if !ok || raw == nil {
return nil
}
var stringSlice []string
switch v := raw.(type) {
case []string:
stringSlice = v
case []interface{}:
stringSlice = make([]string, 0, len(v))
for _, item := range v {
if s, ok := item.(string); ok {
stringSlice = append(stringSlice, s)
}
}
default:
return nil
}
result := make([]string, 0, len(stringSlice))
for _, s := range stringSlice {
if trimmed := strings.TrimSpace(s); trimmed != "" {
result = append(result, trimmed)
}
}
return result
}
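
To make the new per-account fields concrete, a hypothetical OAuth credential file might carry them as below; this mirrors the payloads the tests in the next file construct and is illustrative only (file name, email, and values are placeholders):

package main

import (
	"encoding/json"
	"os"
	"path/filepath"
)

func main() {
	// Illustrative auth JSON with the per-account fields the synthesizer now reads.
	authData := map[string]any{
		"type":  "claude",
		"email": "user@example.com",
		// Trimmed and kept only if it parses as an integer.
		"priority": " 10 ",
		// Lower-cased, de-duplicated, and merged with the global
		// oauth-excluded-models list for the provider.
		"excluded_models": []string{"model-a", "MODEL-B"},
	}
	data, _ := json.MarshalIndent(authData, "", "  ")
	_ = os.WriteFile(filepath.Join("auths", "claude-user.json"), data, 0o644)
}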

View File

@@ -297,6 +297,117 @@ func TestFileSynthesizer_Synthesize_PrefixValidation(t *testing.T) {
}
}
func TestFileSynthesizer_Synthesize_PriorityParsing(t *testing.T) {
tests := []struct {
name string
priority any
want string
hasValue bool
}{
{
name: "string with spaces",
priority: " 10 ",
want: "10",
hasValue: true,
},
{
name: "number",
priority: 8,
want: "8",
hasValue: true,
},
{
name: "invalid string",
priority: "1x",
hasValue: false,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
tempDir := t.TempDir()
authData := map[string]any{
"type": "claude",
"priority": tt.priority,
}
data, _ := json.Marshal(authData)
errWriteFile := os.WriteFile(filepath.Join(tempDir, "auth.json"), data, 0644)
if errWriteFile != nil {
t.Fatalf("failed to write auth file: %v", errWriteFile)
}
synth := NewFileSynthesizer()
ctx := &SynthesisContext{
Config: &config.Config{},
AuthDir: tempDir,
Now: time.Now(),
IDGenerator: NewStableIDGenerator(),
}
auths, errSynthesize := synth.Synthesize(ctx)
if errSynthesize != nil {
t.Fatalf("unexpected error: %v", errSynthesize)
}
if len(auths) != 1 {
t.Fatalf("expected 1 auth, got %d", len(auths))
}
value, ok := auths[0].Attributes["priority"]
if tt.hasValue {
if !ok {
t.Fatal("expected priority attribute to be set")
}
if value != tt.want {
t.Fatalf("expected priority %q, got %q", tt.want, value)
}
return
}
if ok {
t.Fatalf("expected priority attribute to be absent, got %q", value)
}
})
}
}
func TestFileSynthesizer_Synthesize_OAuthExcludedModelsMerged(t *testing.T) {
tempDir := t.TempDir()
authData := map[string]any{
"type": "claude",
"excluded_models": []string{"custom-model", "MODEL-B"},
}
data, _ := json.Marshal(authData)
errWriteFile := os.WriteFile(filepath.Join(tempDir, "auth.json"), data, 0644)
if errWriteFile != nil {
t.Fatalf("failed to write auth file: %v", errWriteFile)
}
synth := NewFileSynthesizer()
ctx := &SynthesisContext{
Config: &config.Config{
OAuthExcludedModels: map[string][]string{
"claude": {"shared", "model-b"},
},
},
AuthDir: tempDir,
Now: time.Now(),
IDGenerator: NewStableIDGenerator(),
}
auths, errSynthesize := synth.Synthesize(ctx)
if errSynthesize != nil {
t.Fatalf("unexpected error: %v", errSynthesize)
}
if len(auths) != 1 {
t.Fatalf("expected 1 auth, got %d", len(auths))
}
got := auths[0].Attributes["excluded_models"]
want := "custom-model,model-b,shared"
if got != want {
t.Fatalf("expected excluded_models %q, got %q", want, got)
}
}
func TestSynthesizeGeminiVirtualAuths_NilInputs(t *testing.T) {
now := time.Now()
@@ -533,6 +644,7 @@ func TestFileSynthesizer_Synthesize_MultiProjectGemini(t *testing.T) {
"type": "gemini",
"email": "multi@example.com",
"project_id": "project-a, project-b, project-c",
"priority": " 10 ",
}
data, _ := json.Marshal(authData)
err := os.WriteFile(filepath.Join(tempDir, "gemini-multi.json"), data, 0644)
@@ -565,6 +677,9 @@ func TestFileSynthesizer_Synthesize_MultiProjectGemini(t *testing.T) {
if primary.Status != coreauth.StatusDisabled {
t.Errorf("expected primary status disabled, got %s", primary.Status)
}
if gotPriority := primary.Attributes["priority"]; gotPriority != "10" {
t.Errorf("expected primary priority 10, got %q", gotPriority)
}
// Remaining auths should be virtuals
for i := 1; i < 4; i++ {
@@ -575,6 +690,9 @@ func TestFileSynthesizer_Synthesize_MultiProjectGemini(t *testing.T) {
if v.Attributes["gemini_virtual_parent"] != primary.ID {
t.Errorf("expected virtual %d parent to be %s, got %s", i, primary.ID, v.Attributes["gemini_virtual_parent"])
}
if gotPriority := v.Attributes["priority"]; gotPriority != "10" {
t.Errorf("expected virtual %d priority 10, got %q", i, gotPriority)
}
}
}

View File

@@ -53,6 +53,8 @@ func (g *StableIDGenerator) Next(kind string, parts ...string) (string, string)
// ApplyAuthExcludedModelsMeta applies excluded models metadata to an auth entry.
// It computes a hash of excluded models and sets the auth_kind attribute.
// For OAuth entries, perKey (from the JSON file's excluded-models field) is merged
// with the global oauth-excluded-models config for the provider.
func ApplyAuthExcludedModelsMeta(auth *coreauth.Auth, cfg *config.Config, perKey []string, authKind string) {
if auth == nil || cfg == nil {
return
@@ -72,9 +74,13 @@ func ApplyAuthExcludedModelsMeta(auth *coreauth.Auth, cfg *config.Config, perKey
}
if authKindKey == "apikey" {
add(perKey)
} else if cfg.OAuthExcludedModels != nil {
providerKey := strings.ToLower(strings.TrimSpace(auth.Provider))
add(cfg.OAuthExcludedModels[providerKey])
} else {
// For OAuth: merge per-account excluded models with global provider-level exclusions
add(perKey)
if cfg.OAuthExcludedModels != nil {
providerKey := strings.ToLower(strings.TrimSpace(auth.Provider))
add(cfg.OAuthExcludedModels[providerKey])
}
}
combined := make([]string, 0, len(seen))
for k := range seen {
@@ -88,6 +94,10 @@ func ApplyAuthExcludedModelsMeta(auth *coreauth.Auth, cfg *config.Config, perKey
if hash != "" {
auth.Attributes["excluded_models_hash"] = hash
}
// Store the combined excluded models list so that routing can read it at runtime
if len(combined) > 0 {
auth.Attributes["excluded_models"] = strings.Join(combined, ",")
}
if authKind != "" {
auth.Attributes["auth_kind"] = authKind
}

View File

@@ -6,6 +6,7 @@ import (
"testing"
"github.com/router-for-me/CLIProxyAPI/v6/internal/config"
"github.com/router-for-me/CLIProxyAPI/v6/internal/watcher/diff"
coreauth "github.com/router-for-me/CLIProxyAPI/v6/sdk/cliproxy/auth"
)
@@ -200,6 +201,30 @@ func TestApplyAuthExcludedModelsMeta(t *testing.T) {
}
}
func TestApplyAuthExcludedModelsMeta_OAuthMergeWritesCombinedModels(t *testing.T) {
auth := &coreauth.Auth{
Provider: "claude",
Attributes: make(map[string]string),
}
cfg := &config.Config{
OAuthExcludedModels: map[string][]string{
"claude": {"global-a", "shared"},
},
}
ApplyAuthExcludedModelsMeta(auth, cfg, []string{"per", "SHARED"}, "oauth")
const wantCombined = "global-a,per,shared"
if gotCombined := auth.Attributes["excluded_models"]; gotCombined != wantCombined {
t.Fatalf("expected excluded_models=%q, got %q", wantCombined, gotCombined)
}
expectedHash := diff.ComputeExcludedModelsHash([]string{"global-a", "per", "shared"})
if gotHash := auth.Attributes["excluded_models_hash"]; gotHash != expectedHash {
t.Fatalf("expected excluded_models_hash=%q, got %q", expectedHash, gotHash)
}
}
func TestAddConfigHeadersToAttrs(t *testing.T) {
tests := []struct {
name string

View File

@@ -185,8 +185,7 @@ func (h *GeminiCLIAPIHandler) handleInternalGenerateContent(c *gin.Context, rawJ
func (h *GeminiCLIAPIHandler) forwardCLIStream(c *gin.Context, flusher http.Flusher, alt string, cancel func(error), data <-chan []byte, errs <-chan *interfaces.ErrorMessage) {
var keepAliveInterval *time.Duration
if alt != "" {
disabled := time.Duration(0)
keepAliveInterval = &disabled
keepAliveInterval = new(time.Duration(0))
}
h.ForwardStream(c, flusher, cancel, data, errs, handlers.StreamForwardOptions{

View File

@@ -300,8 +300,7 @@ func (h *GeminiAPIHandler) handleGenerateContent(c *gin.Context, modelName strin
func (h *GeminiAPIHandler) forwardGeminiStream(c *gin.Context, flusher http.Flusher, alt string, cancel func(error), data <-chan []byte, errs <-chan *interfaces.ErrorMessage) {
var keepAliveInterval *time.Duration
if alt != "" {
disabled := time.Duration(0)
keepAliveInterval = &disabled
keepAliveInterval = new(time.Duration(0))
}
h.ForwardStream(c, flusher, cancel, data, errs, handlers.StreamForwardOptions{

View File

@@ -28,8 +28,7 @@ func (AntigravityAuthenticator) Provider() string { return "antigravity" }
// RefreshLead instructs the manager to refresh five minutes before expiry.
func (AntigravityAuthenticator) RefreshLead() *time.Duration {
lead := 5 * time.Minute
return &lead
return new(5 * time.Minute)
}
// Login launches a local OAuth flow to obtain antigravity tokens and persists them.

View File

@@ -32,8 +32,7 @@ func (a *ClaudeAuthenticator) Provider() string {
}
func (a *ClaudeAuthenticator) RefreshLead() *time.Duration {
d := 4 * time.Hour
return &d
return new(4 * time.Hour)
}
func (a *ClaudeAuthenticator) Login(ctx context.Context, cfg *config.Config, opts *LoginOptions) (*coreauth.Auth, error) {

View File

@@ -34,8 +34,7 @@ func (a *CodexAuthenticator) Provider() string {
}
func (a *CodexAuthenticator) RefreshLead() *time.Duration {
d := 5 * 24 * time.Hour
return &d
return new(5 * 24 * time.Hour)
}
func (a *CodexAuthenticator) Login(ctx context.Context, cfg *config.Config, opts *LoginOptions) (*coreauth.Auth, error) {

View File

@@ -4,8 +4,10 @@ import (
"context"
"encoding/json"
"fmt"
"io"
"io/fs"
"net/http"
"net/url"
"os"
"path/filepath"
"strings"
@@ -186,15 +188,21 @@ func (s *FileTokenStore) readAuthFile(path, baseDir string) (*cliproxyauth.Auth,
if provider == "" {
provider = "unknown"
}
if provider == "antigravity" {
if provider == "antigravity" || provider == "gemini" {
projectID := ""
if pid, ok := metadata["project_id"].(string); ok {
projectID = strings.TrimSpace(pid)
}
if projectID == "" {
accessToken := ""
if token, ok := metadata["access_token"].(string); ok {
accessToken = strings.TrimSpace(token)
accessToken := extractAccessToken(metadata)
// For gemini type, the stored access_token is likely expired (~1h lifetime).
// Refresh it using the long-lived refresh_token before querying.
if provider == "gemini" {
if tokenMap, ok := metadata["token"].(map[string]any); ok {
if refreshed, errRefresh := refreshGeminiAccessToken(tokenMap, http.DefaultClient); errRefresh == nil {
accessToken = refreshed
}
}
}
if accessToken != "" {
fetchedProjectID, errFetch := FetchAntigravityProjectID(context.Background(), accessToken, http.DefaultClient)
@@ -304,6 +312,67 @@ func (s *FileTokenStore) baseDirSnapshot() string {
return s.baseDir
}
func extractAccessToken(metadata map[string]any) string {
if at, ok := metadata["access_token"].(string); ok {
if v := strings.TrimSpace(at); v != "" {
return v
}
}
if tokenMap, ok := metadata["token"].(map[string]any); ok {
if at, ok := tokenMap["access_token"].(string); ok {
if v := strings.TrimSpace(at); v != "" {
return v
}
}
}
return ""
}
func refreshGeminiAccessToken(tokenMap map[string]any, httpClient *http.Client) (string, error) {
refreshToken, _ := tokenMap["refresh_token"].(string)
clientID, _ := tokenMap["client_id"].(string)
clientSecret, _ := tokenMap["client_secret"].(string)
tokenURI, _ := tokenMap["token_uri"].(string)
if refreshToken == "" || clientID == "" || clientSecret == "" {
return "", fmt.Errorf("missing refresh credentials")
}
if tokenURI == "" {
tokenURI = "https://oauth2.googleapis.com/token"
}
data := url.Values{
"grant_type": {"refresh_token"},
"refresh_token": {refreshToken},
"client_id": {clientID},
"client_secret": {clientSecret},
}
resp, err := httpClient.PostForm(tokenURI, data)
if err != nil {
return "", fmt.Errorf("refresh request: %w", err)
}
defer func() { _ = resp.Body.Close() }()
body, _ := io.ReadAll(resp.Body)
if resp.StatusCode != http.StatusOK {
return "", fmt.Errorf("refresh failed: status %d", resp.StatusCode)
}
var result map[string]any
if errUnmarshal := json.Unmarshal(body, &result); errUnmarshal != nil {
return "", fmt.Errorf("decode refresh response: %w", errUnmarshal)
}
newAccessToken, _ := result["access_token"].(string)
if newAccessToken == "" {
return "", fmt.Errorf("no access_token in refresh response")
}
tokenMap["access_token"] = newAccessToken
return newAccessToken, nil
}
// jsonEqual compares two JSON blobs by parsing them into Go objects and deep comparing.
func jsonEqual(a, b []byte) bool {
var objA any

View File

@@ -0,0 +1,80 @@
package auth
import "testing"
func TestExtractAccessToken(t *testing.T) {
t.Parallel()
tests := []struct {
name string
metadata map[string]any
expected string
}{
{
"antigravity top-level access_token",
map[string]any{"access_token": "tok-abc"},
"tok-abc",
},
{
"gemini nested token.access_token",
map[string]any{
"token": map[string]any{"access_token": "tok-nested"},
},
"tok-nested",
},
{
"top-level takes precedence over nested",
map[string]any{
"access_token": "tok-top",
"token": map[string]any{"access_token": "tok-nested"},
},
"tok-top",
},
{
"empty metadata",
map[string]any{},
"",
},
{
"whitespace-only access_token",
map[string]any{"access_token": " "},
"",
},
{
"wrong type access_token",
map[string]any{"access_token": 12345},
"",
},
{
"token is not a map",
map[string]any{"token": "not-a-map"},
"",
},
{
"nested whitespace-only",
map[string]any{
"token": map[string]any{"access_token": " "},
},
"",
},
{
"fallback to nested when top-level empty",
map[string]any{
"access_token": "",
"token": map[string]any{"access_token": "tok-fallback"},
},
"tok-fallback",
},
}
for _, tt := range tests {
tt := tt
t.Run(tt.name, func(t *testing.T) {
t.Parallel()
got := extractAccessToken(tt.metadata)
if got != tt.expected {
t.Errorf("extractAccessToken() = %q, want %q", got, tt.expected)
}
})
}
}

View File

@@ -26,8 +26,7 @@ func (a *IFlowAuthenticator) Provider() string { return "iflow" }
// RefreshLead indicates how soon before expiry a refresh should be attempted.
func (a *IFlowAuthenticator) RefreshLead() *time.Duration {
d := 24 * time.Hour
return &d
return new(24 * time.Hour)
}
// Login performs the OAuth code flow using a local callback server.

View File

@@ -27,8 +27,7 @@ func (a *QwenAuthenticator) Provider() string {
}
func (a *QwenAuthenticator) RefreshLead() *time.Duration {
d := 3 * time.Hour
return &d
return new(3 * time.Hour)
}
func (a *QwenAuthenticator) Login(ctx context.Context, cfg *config.Config, opts *LoginOptions) (*coreauth.Auth, error) {

View File

@@ -599,8 +599,7 @@ func (m *Manager) executeMixedOnce(ctx context.Context, providers []string, req
return cliproxyexecutor.Response{}, errCtx
}
result.Error = &Error{Message: errExec.Error()}
var se cliproxyexecutor.StatusError
if errors.As(errExec, &se) && se != nil {
if se, ok := errors.AsType[cliproxyexecutor.StatusError](errExec); ok && se != nil {
result.Error.HTTPStatus = se.StatusCode()
}
if ra := retryAfterFromError(errExec); ra != nil {
@@ -655,8 +654,7 @@ func (m *Manager) executeCountMixedOnce(ctx context.Context, providers []string,
return cliproxyexecutor.Response{}, errCtx
}
result.Error = &Error{Message: errExec.Error()}
var se cliproxyexecutor.StatusError
if errors.As(errExec, &se) && se != nil {
if se, ok := errors.AsType[cliproxyexecutor.StatusError](errExec); ok && se != nil {
result.Error.HTTPStatus = se.StatusCode()
}
if ra := retryAfterFromError(errExec); ra != nil {
@@ -710,8 +708,7 @@ func (m *Manager) executeStreamMixedOnce(ctx context.Context, providers []string
return nil, errCtx
}
rerr := &Error{Message: errStream.Error()}
var se cliproxyexecutor.StatusError
if errors.As(errStream, &se) && se != nil {
if se, ok := errors.AsType[cliproxyexecutor.StatusError](errStream); ok && se != nil {
rerr.HTTPStatus = se.StatusCode()
}
result := Result{AuthID: auth.ID, Provider: provider, Model: routeModel, Success: false, Error: rerr}
@@ -732,8 +729,7 @@ func (m *Manager) executeStreamMixedOnce(ctx context.Context, providers []string
if chunk.Err != nil && !failed {
failed = true
rerr := &Error{Message: chunk.Err.Error()}
var se cliproxyexecutor.StatusError
if errors.As(chunk.Err, &se) && se != nil {
if se, ok := errors.AsType[cliproxyexecutor.StatusError](chunk.Err); ok && se != nil {
rerr.HTTPStatus = se.StatusCode()
}
m.MarkResult(streamCtx, Result{AuthID: streamAuth.ID, Provider: streamProvider, Model: routeModel, Success: false, Error: rerr})
@@ -1431,8 +1427,7 @@ func retryAfterFromError(err error) *time.Duration {
if retryAfter == nil {
return nil
}
val := *retryAfter
return &val
return new(*retryAfter)
}
func statusCodeFromResult(err *Error) int {

View File

@@ -740,6 +740,13 @@ func (s *Service) registerModelsForAuth(a *coreauth.Auth) {
provider = "openai-compatibility"
}
excluded := s.oauthExcludedModels(provider, authKind)
// The synthesizer pre-merges per-account and global exclusions into the "excluded_models" attribute.
// If this attribute is present, it represents the complete list of exclusions and overrides the global config.
if a.Attributes != nil {
if val, ok := a.Attributes["excluded_models"]; ok && strings.TrimSpace(val) != "" {
excluded = strings.Split(val, ",")
}
}
var models []*ModelInfo
switch provider {
case "gemini":

View File

@@ -0,0 +1,65 @@
package cliproxy
import (
"strings"
"testing"
coreauth "github.com/router-for-me/CLIProxyAPI/v6/sdk/cliproxy/auth"
"github.com/router-for-me/CLIProxyAPI/v6/sdk/config"
)
func TestRegisterModelsForAuth_UsesPreMergedExcludedModelsAttribute(t *testing.T) {
service := &Service{
cfg: &config.Config{
OAuthExcludedModels: map[string][]string{
"gemini-cli": {"gemini-2.5-pro"},
},
},
}
auth := &coreauth.Auth{
ID: "auth-gemini-cli",
Provider: "gemini-cli",
Status: coreauth.StatusActive,
Attributes: map[string]string{
"auth_kind": "oauth",
"excluded_models": "gemini-2.5-flash",
},
}
registry := GlobalModelRegistry()
registry.UnregisterClient(auth.ID)
t.Cleanup(func() {
registry.UnregisterClient(auth.ID)
})
service.registerModelsForAuth(auth)
models := registry.GetAvailableModelsByProvider("gemini-cli")
if len(models) == 0 {
t.Fatal("expected gemini-cli models to be registered")
}
for _, model := range models {
if model == nil {
continue
}
modelID := strings.TrimSpace(model.ID)
if strings.EqualFold(modelID, "gemini-2.5-flash") {
t.Fatalf("expected model %q to be excluded by auth attribute", modelID)
}
}
seenGlobalExcluded := false
for _, model := range models {
if model == nil {
continue
}
if strings.EqualFold(strings.TrimSpace(model.ID), "gemini-2.5-pro") {
seenGlobalExcluded = true
break
}
}
if !seenGlobalExcluded {
t.Fatal("expected global excluded model to be present when attribute override is set")
}
}