Model List

AIClient-2-API supports listing the models provided by backend LLM services. Multiple providers are supported, and the default model list is defined in src/providers/provider-models.js.

OpenAI Compatible Interface

```bash
GET http://localhost:3000/v1/models
```

This returns a response mimicking the OpenAI models endpoint, listing the LLM models configured on the backend.
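The response follows the standard OpenAI list shape (`object: "list"` with a `data` array of model entries). A minimal parsing sketch, using a sample payload (the model entries and `owned_by` values here are illustrative assumptions; the actual list comes from src/providers/provider-models.js):

```python
import json

# Sample payload in the OpenAI /v1/models list shape.
# Entries are illustrative, not the server's actual output.
payload = json.loads("""
{
  "object": "list",
  "data": [
    {"id": "gemini-2.5-pro", "object": "model", "owned_by": "gemini-cli-oauth"},
    {"id": "claude-sonnet-4-6", "object": "model", "owned_by": "claude-kiro-oauth"}
  ]
}
""")

# Collect the model IDs, as a client picking a model would.
model_ids = [m["id"] for m in payload["data"]]
print(model_ids)
```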

Gemini Compatible Interface

```bash
GET http://localhost:3000/v1beta/models
```

This returns a response mimicking the Gemini models endpoint.
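The Gemini-style response wraps entries in a `models` array, with each model name prefixed by `models/`. A small sketch of reading that shape (the sample entries are assumptions for illustration):

```python
import json

# Sample payload in the Gemini /v1beta/models list shape.
# Entries are illustrative, not the server's actual output.
payload = json.loads("""
{
  "models": [
    {"name": "models/gemini-2.5-flash", "supportedGenerationMethods": ["generateContent"]},
    {"name": "models/gemini-2.5-pro", "supportedGenerationMethods": ["generateContent"]}
  ]
}
""")

# Strip the "models/" prefix to get bare model IDs.
names = [m["name"].removeprefix("models/") for m in payload["models"]]
print(names)
```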


Supported Models Preview

Below is a preview of pre-defined models for major providers (note: these can be dynamically extended via configuration):

Google Gemini (GCP OAuth)

  • gemini-2.5-flash
  • gemini-2.5-pro
  • gemini-3.1-pro-preview
  • gemini-3.1-flash-lite-preview
  • gemini-3-pro-preview
  • gemini-3-flash-preview

Anthropic Claude (Kiro/CodeWhisperer)

  • claude-3-7-sonnet-20250219
  • claude-sonnet-4-6
  • claude-opus-4-6
  • claude-haiku-4-5

Grok (xAI)

  • grok-3
  • grok-3-mini
  • grok-3-thinking
  • grok-4
  • grok-4-mini
  • grok-4-thinking

OpenAI Codex

  • gpt-5
  • gpt-5-codex
  • gpt-5.4

Qwen / iFlow

  • qwen3-coder-plus
  • qwen3-max
  • deepseek-v3.2
  • deepseek-r1
  • kimi-k2.5
  • glm-5

Automatic Model Selection (AUTO Mode)

If MODEL_PROVIDER is set to auto in the configuration, the system automatically matches the best provider based on the model name in the request:

  1. The system iterates through all configured provider pools.
  2. It matches the requested model name to the appropriate provider driver.
  3. If the primary account fails, it automatically fails over to a backup account using the same or a compatible protocol.
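The three steps above can be sketched roughly as follows. The pool structure, the prefix-based matching rule, and the account names are illustrative assumptions for demonstration, not the project's actual implementation:

```python
# Illustrative sketch of AUTO-mode routing; the real logic lives in the
# AIClient-2-API source. Pools, prefixes, and accounts are assumed here.
PROVIDER_POOLS = {
    "gemini-cli-oauth": {"prefixes": ("gemini-",), "accounts": ["primary", "backup"]},
    "claude-kiro-oauth": {"prefixes": ("claude-",), "accounts": ["primary", "backup"]},
    "openai-custom": {"prefixes": ("gpt-",), "accounts": ["primary"]},
}

def select_provider(model: str) -> str:
    """Steps 1-2: iterate the pools and match the model name to a driver."""
    for provider, pool in PROVIDER_POOLS.items():
        if model.startswith(pool["prefixes"]):
            return provider
    raise ValueError(f"no provider for model {model!r}")

def call_with_failover(model: str, send) -> str:
    """Step 3: try each account in the matched pool until one succeeds."""
    provider = select_provider(model)
    last_err = None
    for account in PROVIDER_POOLS[provider]["accounts"]:
        try:
            return send(provider, account, model)
        except RuntimeError as err:
            last_err = err  # this account failed; fall through to the next
    raise last_err

print(select_provider("gemini-2.5-pro"))
```

In this sketch, `send` stands in for the actual request to the backend; a failed primary account simply raises, and the loop moves on to the backup.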