GoModel routes OpenAI-compatible requests to many AI providers through a single gateway. Most providers work out of the box once you set their API key: set the env var, start GoModel, and call `/v1/chat/completions` or `/v1/responses` as usual.
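As a minimal sketch of that flow, the snippet below sets a key and sends one chat request. The gateway address and port (`localhost:8080`) and the model name are assumptions for illustration, not documented GoModel defaults; adjust them to your deployment.

```shell
# Assumed: GoModel listening on localhost:8080 (adjust to your deployment).
export OPENAI_API_KEY="sk-..."   # placeholder key

# Standard OpenAI-style request body; GoModel forwards it to the provider
# that serves the requested model.
curl -s http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "Hello"}]}'
```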
The pages in this section exist for providers whose setup is not purely
“set an API key”: auth flows that pull from cloud credentials, deployment-based
URLs, region/project requirements, dual native + OpenAI-compatible API
surfaces, or other quirks worth documenting. If a provider isn’t listed
separately, its default configuration is enough.
## Supported providers
| Provider | Credential | Guide |
|---|---|---|
| OpenAI | OPENAI_API_KEY | — |
| Anthropic | ANTHROPIC_API_KEY | — |
| Google Gemini | GEMINI_API_KEY | Google Gemini |
| Google Vertex AI | VERTEX_PROJECT + VERTEX_LOCATION + GCP credentials | Google Vertex AI |
| DeepSeek | DEEPSEEK_API_KEY | DeepSeek |
| Groq | GROQ_API_KEY | — |
| OpenRouter | OPENROUTER_API_KEY | — |
| Z.ai | ZAI_API_KEY (ZAI_BASE_URL optional) | — |
| xAI (Grok) | XAI_API_KEY | — |
| MiniMax | MINIMAX_API_KEY (MINIMAX_BASE_URL optional) | — |
| Azure OpenAI | AZURE_API_KEY + AZURE_BASE_URL (AZURE_API_VERSION optional) | Azure OpenAI |
| Amazon Bedrock | BEDROCK_BASE_URL (region or endpoint) + AWS credentials | Amazon Bedrock |
| Oracle GenAI | ORACLE_API_KEY + ORACLE_BASE_URL | Oracle GenAI |
| Ollama | OLLAMA_BASE_URL | Ollama |
| vLLM | VLLM_BASE_URL (VLLM_API_KEY optional) | vLLM |
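Putting a few rows of the table together, a multi-provider environment might be configured like this. The variable names come from the table above; all values are placeholders, and the local ports are common defaults for Ollama and vLLM rather than anything GoModel requires:

```shell
# Hosted providers: one API key each.
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."

# Google Vertex AI: project + region; credentials come from the GCP
# environment (e.g. Application Default Credentials).
export VERTEX_PROJECT="my-gcp-project"
export VERTEX_LOCATION="us-central1"

# Local models: base URLs instead of API keys.
export OLLAMA_BASE_URL="http://localhost:11434"
export VLLM_BASE_URL="http://localhost:8000"
```

With these set, starting GoModel exposes all of the configured providers behind the same OpenAI-compatible endpoints.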
## Why some providers have dedicated pages
These are the providers where most users hit friction:

- Google Vertex AI — needs a GCP project, region, and either Application Default Credentials or a service-account JSON key. Multi-region or multi-account setups use suffixed env vars.
- Amazon Bedrock — no API key of its own; authenticates through the AWS credential chain (env, profile, IAM Identity Center, instance/container roles). Requires explicit model access in the Bedrock console.
- Azure OpenAI — deployment-scoped base URLs, the `api-version` query parameter, and the `api-key` header instead of `Authorization: Bearer`.
- Oracle GenAI — requires an OCI IAM policy for `generativeaiapikey` and a region-specific OpenAI-compatible endpoint URL.
- Google Gemini (AI Studio) — two routing modes (native `generateContent` vs OpenAI-compatible) with different image-input behavior.
- DeepSeek — reasoning-effort mapping quirks for DeepSeek V4.
- Ollama / vLLM — local-model hosting with optional multi-instance setup through suffixed env vars and provider-qualified model IDs.
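To illustrate the last point: a provider-qualified model ID lets one request name both the provider and the model. The `provider/model` split below is a hypothetical sketch of the idea, not GoModel's actual parsing code, and the default-provider fallback is an assumption.

```python
def split_model_id(model: str, default_provider: str = "openai") -> tuple[str, str]:
    """Split a provider-qualified model ID like 'ollama/llama3' into
    (provider, model). Unqualified IDs fall back to a default provider."""
    if "/" in model:
        # Only the first '/' is the qualifier; the rest stays in the name.
        provider, _, name = model.partition("/")
        return provider, name
    return default_provider, model
```

A gateway can then use the provider half to pick which backend (and which suffixed env vars) a request should be routed to.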