- Updated `README_CN.md` to introduce Amp CLI and IDE support.
- Added detailed integration guide in `docs/amp-cli-integration_CN.md`.
- Covered setup, configuration, OAuth, security, and usage of Amp CLI with Google/ChatGPT/Claude subscriptions.
- Deleted `MANAGEMENT_API.md` and `MANAGEMENT_API_CN.md` as they are no longer necessary.
- Streamlined project documentation by removing redundant API details already covered elsewhere.
- Streamlined Chinese README by reducing redundancy and unnecessary sections.
- Added a concise link to the CLIProxyAPI user manual for detailed instructions.
- Reorganized the original README with a simplified overview.
- Updated `README.md` and `README_CN.md` with steps to install via the Linux installer.
- Acknowledged [brokechubb](https://github.com/brokechubb) for building the installer.
- Updated `README.md` and `README_CN.md` with instructions for requesting models not predefined in the registry via the `openrouter://` prefix.
- Added an example using `moonshotai/kimi-k2:free`.
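For illustration, a chat completion request through the proxy's OpenAI-compatible endpoint might look like the following; the listen address, port, and API key are assumptions, not values taken from the documentation:

```bash
# Hypothetical example: host, port, and key are placeholders for your own setup.
curl http://localhost:8317/v1/chat/completions \
  -H "Authorization: Bearer sk-local-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openrouter://moonshotai/kimi-k2:free",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```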
- Introduced model alias mapping for Claude configurations, allowing client-facing model names to be mapped to upstream model names.
- Added `computeClaudeModelsHash` to generate a consistent hash for model aliases.
- Implemented the `normalizeClaudeKey` function to normalize Claude API key configuration entries, including their model lists.
- Enhanced executor to resolve model aliases to upstream names dynamically.
- Updated documentation and configuration examples to reflect new model alias support.
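A minimal sketch of how such an alias mapping could appear in `config.yaml`; the key names (`claude-api-key`, `models`, `name`, `alias`) are assumptions rather than the authoritative schema:

```yaml
# Hypothetical sketch -- key names are illustrative; check the reference config for the real schema.
claude-api-key:
  - api-key: "sk-ant-placeholder"
    models:
      - name: "claude-sonnet-4-5-20250929"    # upstream model name
        alias: "claude-3-5-sonnet-20241022"   # name exposed to clients
```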
feat(translator): add token counting functionality for Gemini, Claude, and CLI
- Introduced `TokenCount` handling across the Codex translators (Gemini, Claude, Gemini CLI).
- Added utility methods for token counting and formatting responses.
- Integrated the `tiktoken-go/tokenizer` library for tokenization.
- Updated CodexExecutor with token counting logic to support multiple models including GPT-5 variants.
- Updated `go.mod` and `go.sum` to include the new dependencies.
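A minimal counting helper in the spirit of this change, built on the public `tiktoken-go/tokenizer` API; the choice of `Cl100kBase` as the encoding is a placeholder assumption, since the changelog does not state which encoding is used for each model:

```go
package main

import (
	"fmt"

	"github.com/tiktoken-go/tokenizer"
)

// countTokens returns the number of tokens in text for a fixed encoding.
// Cl100kBase is used here only as a placeholder; the encoding actually
// chosen per model (e.g. for GPT-5 variants) is not specified in this changelog.
func countTokens(text string) (int, error) {
	codec, err := tokenizer.Get(tokenizer.Cl100kBase)
	if err != nil {
		return 0, err
	}
	ids, _, err := codec.Encode(text)
	if err != nil {
		return 0, err
	}
	return len(ids), nil
}

func main() {
	n, err := countTokens("How many tokens is this sentence?")
	if err != nil {
		panic(err)
	}
	fmt.Println("tokens:", n)
}
```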
feat(runtime): add token counting functionality across executors
- Implemented token counting in OpenAICompatExecutor, QwenExecutor, and IFlowExecutor.
- Added utilities for token counting and response formatting using `tiktoken-go/tokenizer`.
- Integrated token counting into translators for Gemini, Claude, and Gemini CLI.
- Extended token counting support to additional models, including GPT-5 variants.
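As a rough sketch of how an executor might total tokens across chat messages, building on a helper like the one above; the message shape and the per-message overhead are assumptions, not the project's actual logic:

```go
// Message is a minimal chat message shape used only for this sketch.
type Message struct {
	Role    string
	Content string
}

// countMessageTokens sums token counts over messages using a caller-supplied
// counter (e.g. the tiktoken-based helper above). The +4 per-message overhead
// is a common approximation for chat formats, not a value taken from this project.
func countMessageTokens(msgs []Message, count func(string) (int, error)) (int, error) {
	total := 0
	for _, m := range msgs {
		n, err := count(m.Role + "\n" + m.Content)
		if err != nil {
			return 0, err
		}
		total += n + 4
	}
	return total, nil
}
```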
docs: update environment variable instructions for multi-model support
- Added details for setting `ANTHROPIC_DEFAULT_OPUS_MODEL`, `ANTHROPIC_DEFAULT_SONNET_MODEL`, and `ANTHROPIC_DEFAULT_HAIKU_MODEL` for Claude Code version 2.x.x.
- Clarified usage of `ANTHROPIC_MODEL` and `ANTHROPIC_SMALL_FAST_MODEL` for Claude Code version 1.x.x.
- Expanded examples for setting environment variables across different models including Gemini, GPT-5, Claude, and Qwen3.
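For example, the variables might be pointed at proxy-served models as follows; the model names are illustrative placeholders:

```bash
# Claude Code 2.x.x -- per-tier model overrides (values are illustrative)
export ANTHROPIC_DEFAULT_OPUS_MODEL="gpt-5"
export ANTHROPIC_DEFAULT_SONNET_MODEL="gemini-2.5-pro"
export ANTHROPIC_DEFAULT_HAIKU_MODEL="qwen3-coder-plus"

# Claude Code 1.x.x -- legacy equivalents
export ANTHROPIC_MODEL="gpt-5"
export ANTHROPIC_SMALL_FAST_MODEL="qwen3-coder-plus"
```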
- Added instructions for configuring a Git repository as a backend for `config.yaml` and token storage.
- Included example environment variable configurations for Docker and Docker Compose.
- Updated both English (README.md) and Chinese (README_CN.md) documentation.
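A hypothetical Docker Compose sketch of such a setup; the service name, image, port, and environment variable names below are placeholders, not the project's documented settings:

```yaml
# Hypothetical sketch -- variable and image names are placeholders;
# consult the README for the real configuration keys.
services:
  cli-proxy-api:
    image: cliproxyapi:latest
    environment:
      - CONFIG_GIT_URL=https://github.com/your-name/your-config-repo.git
      - CONFIG_GIT_TOKEN=your-access-token
    ports:
      - "8317:8317"
```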
- Added `glm-4.6` model to registry and documentation.
- Updated Gemini CLI executor to pass configuration to `prepareGeminiCLITokenSource` for improved token management.