Model List
AIClient-2-API supports listing the models provided by backend LLM providers. Multiple providers are supported, and the default model list is defined in src/providers/provider-models.js.
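As an illustration only, a default model map in a file like src/providers/provider-models.js could be organized as a plain object keyed by provider. The provider keys and model groupings below are assumptions for the sketch; the actual file's structure may differ.

```javascript
// Illustrative sketch of a provider-to-models map. The provider keys
// ('gemini-cli-oauth', etc.) and groupings are assumptions, not the
// actual contents of src/providers/provider-models.js.
const PROVIDER_MODELS = {
  'gemini-cli-oauth': ['gemini-2.5-flash', 'gemini-2.5-pro'],
  'claude-kiro-oauth': ['claude-sonnet-4-6', 'claude-opus-4-6'],
  'openai-codex': ['gpt-5', 'gpt-5-codex'],
};

// Flatten the map when a single combined list is needed.
function allModels() {
  return Object.values(PROVIDER_MODELS).flat();
}
```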
OpenAI Compatible Interface
```bash
GET http://localhost:3000/v1/models
```

This returns a response mimicking the OpenAI models endpoint, containing the list of LLM models configured in the backend.
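As a minimal sketch, an OpenAI-compatible models response has the general shape below. The field names follow OpenAI's published list-models schema; the model IDs and the `owned_by` label are illustrative assumptions, since the real entries come from the configured backend providers.

```javascript
// Sketch: the general shape of an OpenAI-compatible /v1/models response.
// Model IDs and the owned_by value are illustrative assumptions.
function buildModelsResponse(modelIds) {
  return {
    object: 'list',
    data: modelIds.map((id) => ({
      id,
      object: 'model',
      created: Math.floor(Date.now() / 1000),
      owned_by: 'aiclient-2-api', // assumed owner label
    })),
  };
}

const resp = buildModelsResponse(['gemini-2.5-flash', 'claude-sonnet-4-6']);
console.log(JSON.stringify(resp, null, 2));
```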
Gemini Compatible Interface
```bash
GET http://localhost:3000/v1beta/models
```

This returns a response mimicking the Gemini models endpoint.
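For comparison, a Gemini-compatible response wraps the list in a `models` array, with resource names prefixed by `models/`. The extra fields shown (`displayName`, `supportedGenerationMethods`) follow Google's published schema; how this proxy fills them is an assumption of the sketch.

```javascript
// Sketch: the general shape of a Gemini-compatible /v1beta/models response.
// Gemini resource names are prefixed with "models/".
function buildGeminiModelsResponse(modelIds) {
  return {
    models: modelIds.map((id) => ({
      name: `models/${id}`,
      displayName: id, // assumed: the proxy echoes the id as display name
      supportedGenerationMethods: ['generateContent', 'streamGenerateContent'],
    })),
  };
}

console.log(JSON.stringify(buildGeminiModelsResponse(['gemini-2.5-flash'])));
```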
Supported Models Preview
Below is a preview of the pre-defined models for the major providers (these lists can be extended dynamically via configuration):
Google Gemini (GCP OAuth)
- gemini-2.5-flash
- gemini-2.5-pro
- gemini-3.1-pro-preview
- gemini-3.1-flash-lite-preview
- gemini-3-pro-preview
- gemini-3-flash-preview
Anthropic Claude (Kiro/CodeWhisperer)
- claude-3-7-sonnet-20250219
- claude-sonnet-4-6
- claude-opus-4-6
- claude-haiku-4-5
Grok (xAI)
- grok-3
- grok-3-mini
- grok-3-thinking
- grok-4
- grok-4-mini
- grok-4-thinking
OpenAI Codex
- gpt-5
- gpt-5-codex
- gpt-5.4
Qwen / iFlow
- qwen3-coder-plus
- qwen3-max
- deepseek-v3.2
- deepseek-r1
- kimi-k2.5
- glm-5
Automatic Model Selection (AUTO Mode)
If MODEL_PROVIDER is set to auto in the configuration, the system automatically routes each request to the best provider based on the requested model name:
- The system iterates through all configured provider pools.
- It matches the model name to the appropriate provider driver.
- If the primary account fails, it automatically switches to a backup account within the same or a compatible protocol.
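The selection-and-failover steps above can be sketched as follows. The pool shapes, provider keys, and prefix-based matching are assumptions made for illustration; the real implementation in the AIClient-2-API source may match models differently.

```javascript
// Sketch of AUTO-mode routing. Provider keys, pool shapes, and the
// prefix-matching rule are illustrative assumptions.
const providerPools = {
  'gemini-cli-oauth': { prefixes: ['gemini-'], accounts: ['primary', 'backup'] },
  'claude-kiro-oauth': { prefixes: ['claude-'], accounts: ['primary'] },
  'openai-codex': { prefixes: ['gpt-'], accounts: ['primary'] },
};

// Step 1-2: iterate the pools and match the model name to a driver.
function selectProvider(model) {
  for (const [provider, pool] of Object.entries(providerPools)) {
    if (pool.prefixes.some((p) => model.startsWith(p))) {
      return { provider, accounts: pool.accounts };
    }
  }
  return null; // no compatible driver configured
}

// Step 3: on failure, fall through to the next (backup) account in the pool.
async function callWithFailover(model, send) {
  const match = selectProvider(model);
  if (!match) throw new Error(`no provider for ${model}`);
  let lastErr;
  for (const account of match.accounts) {
    try {
      return await send(match.provider, account);
    } catch (err) {
      lastErr = err; // remember the error and try the backup account
    }
  }
  throw lastErr;
}
```

Prefix matching keeps the routing table small, but a production router would also need to handle models whose names do not encode their provider.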