# Supported AI Providers

Since the Envoy AI Gateway is designed to provide a unified API for routing and managing LLM/AI traffic, it supports various AI providers out of the box. "Supporting" a provider means two things: support for its API schema and support for its upstream authentication.
The former is configured in the `schema` field of the AIServiceBackend resource, while the latter is configured in the BackendSecurityPolicy resource.
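As a rough sketch, routing to OpenAI with API-key authentication could be expressed with the two resources below. The resource names, namespace, Secret, and backend reference are illustrative, and the exact attachment mechanism between the two resources has varied across API versions, so treat this as a shape to adapt against the API reference rather than a copy-paste configuration:

```yaml
apiVersion: aigateway.envoyproxy.io/v1alpha1
kind: AIServiceBackend
metadata:
  name: openai            # illustrative name
  namespace: default
spec:
  # The provider's API schema, as listed in the table of supported providers.
  schema:
    name: OpenAI
    version: v1
  backendRef:
    name: openai-backend  # an Envoy Gateway Backend resource (illustrative)
    kind: Backend
    group: gateway.envoyproxy.io
---
apiVersion: aigateway.envoyproxy.io/v1alpha1
kind: BackendSecurityPolicy
metadata:
  name: openai-api-key
  namespace: default
spec:
  # API-key authentication; the key is read from a Kubernetes Secret.
  type: APIKey
  apiKey:
    secretRef:
      name: openai-api-key  # Secret holding the provider API key (illustrative)
```

The same pattern applies to the other providers: only the `schema` values and the authentication stanza change.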

Below is a table of currently supported providers and their respective configurations.

| Provider Name | API Schema Config on AIServiceBackend | Upstream Authentication Config on BackendSecurityPolicy | Status | Note |
| --- | --- | --- | --- | --- |
| OpenAI | `{"name":"OpenAI","version":"v1"}` | API Key | ✅ | |
| AWS Bedrock | `{"name":"AWSBedrock"}` | AWS Bedrock Credentials | ✅ | |
| Azure OpenAI | `{"name":"AzureOpenAI","version":"2025-01-01-preview"}` | Azure Credentials | ✅ | |
| Google Gemini on AI Studio | `{"name":"OpenAI","version":"v1beta/openai"}` | API Key | ✅ | Only the OpenAI-compatible endpoint |
| Groq | `{"name":"OpenAI","version":"openai/v1"}` | API Key | ✅ | |
| Grok | `{"name":"OpenAI","version":"v1"}` | API Key | ✅ | |
| Together AI | `{"name":"OpenAI","version":"v1"}` | API Key | ✅ | |
| Cohere | `{"name":"OpenAI","version":"compatibility/v1"}` | API Key | ✅ | Only the OpenAI-compatible endpoint |
| Mistral | `{"name":"OpenAI","version":"v1"}` | API Key | ✅ | |
| DeepInfra | `{"name":"OpenAI","version":"v1/openai"}` | API Key | ✅ | Only the OpenAI-compatible endpoint |
| DeepSeek | `{"name":"OpenAI","version":"v1"}` | API Key | ✅ | |
| Hunyuan | `{"name":"OpenAI","version":"v1"}` | API Key | ✅ | |
| Tencent LLM Knowledge Engine | `{"name":"OpenAI","version":"v1"}` | API Key | ✅ | |
| Google Vertex AI | N/A | N/A | 🚧 | Work in progress: issue#609 |
| Anthropic on Vertex AI | N/A | N/A | 🚧 | Work in progress: issue#609 |
| SambaNova | `{"name":"OpenAI","version":"v1"}` | API Key | ✅ | |
| Self-hosted models | `{"name":"OpenAI","version":"v1"}` | N/A | ⚠️ | Depends on the API schema spoken by the self-hosted server; for example, vLLM speaks the OpenAI format. API Key auth can also be configured. |
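For the self-hosted case, a server such as vLLM that speaks the OpenAI format can be declared with the OpenAI schema and no BackendSecurityPolicy at all. A minimal sketch, assuming an in-cluster vLLM service; the names and the backend reference are illustrative:

```yaml
apiVersion: aigateway.envoyproxy.io/v1alpha1
kind: AIServiceBackend
metadata:
  name: self-hosted-vllm   # illustrative name
  namespace: default
spec:
  # vLLM exposes an OpenAI-compatible API, so the OpenAI schema applies.
  schema:
    name: OpenAI
    version: v1
  backendRef:
    name: vllm-backend     # Backend pointing at the vLLM service (illustrative)
    kind: Backend
    group: gateway.envoyproxy.io
```

If the self-hosted server also enforces an API key, a BackendSecurityPolicy of type API Key can be attached in the same way as for the hosted providers above.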