AI Gateway
The LeanMCP AI Gateway provides a unified API proxy for multiple LLM providers with built-in authentication, token tracking, and observability.
Features
- Multi-Provider Support: OpenAI, Anthropic, xAI (Grok), Fireworks, ElevenLabs
- Drop-in Replacement: Use official SDKs or LangChain with minimal code changes
- Authentication: Firebase JWT or API key authentication
- Session Tracking: Track requests across sessions for observability
Supported Providers
| Provider | Endpoint | Features |
|---|---|---|
| OpenAI | /v1/openai/* | Chat, Vision, DALL-E, TTS, Structured Output |
| Anthropic | /v1/anthropic/* | Messages, Vision, Structured Output |
| xAI (Grok) | /v1/xai/* | Chat with Web Search |
| Fireworks | /v1/fireworks/* | Open-source models (Llama, etc.) |
| ElevenLabs | /v1/elevenlabs/* | Text-to-Speech |
Authentication
All requests require authentication via the `Authorization` header:
- Firebase JWT: Standard Firebase ID token from your app
- API Key: LeanMCP API keys (prefixed with `leanmcp_`)
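For example, a minimal request with Python's `requests`, assuming the standard Bearer scheme; the gateway URL, model name, and exact proxied path below are placeholders:

```python
import requests

GATEWAY = "https://your-gateway.example.com"  # placeholder: your AI Gateway base URL
API_KEY = "leanmcp_..."                       # LeanMCP API key or a Firebase ID token

# Every request carries the credential in the Authorization header.
response = requests.post(
    f"{GATEWAY}/v1/openai/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "Hello!"}],
    },
)
print(response.json())
```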
Usage Examples
OpenAI SDK
Use the OpenAI SDK by simply changing the `baseURL`:
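A minimal sketch with the Python SDK, where the equivalent option is `base_url` (the Node SDK calls it `baseURL`); the gateway URL and model name are placeholders:

```python
from openai import OpenAI

# Point the official SDK at the gateway instead of api.openai.com.
client = OpenAI(
    base_url="https://your-gateway.example.com/v1/openai",  # placeholder gateway URL
    api_key="leanmcp_...",  # LeanMCP API key or Firebase ID token
)

completion = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello from the gateway."}],
)
print(completion.choices[0].message.content)
```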
Anthropic SDK
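The Anthropic SDK follows the same pattern; a minimal sketch, assuming the gateway proxies the Messages API under `/v1/anthropic` (URL and model name are placeholders):

```python
from anthropic import Anthropic

client = Anthropic(
    base_url="https://your-gateway.example.com/v1/anthropic",  # placeholder gateway URL
    api_key="leanmcp_...",
)

message = client.messages.create(
    model="claude-sonnet-4-20250514",  # example model; use any model your gateway exposes
    max_tokens=256,
    messages=[{"role": "user", "content": "Say hello from the gateway."}],
)
print(message.content[0].text)
```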
LangChain (Python)
LangChain provides a powerful abstraction for building LLM applications. The AI Gateway works seamlessly with LangChain by simply changing the `base_url` parameter.
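A minimal sketch covering both providers; the gateway URL, API key, and model names are placeholders, and parameter names can vary slightly across LangChain versions:

```python
from langchain_openai import ChatOpenAI
from langchain_anthropic import ChatAnthropic

GATEWAY = "https://your-gateway.example.com"  # placeholder gateway URL
API_KEY = "leanmcp_..."

# OpenAI models through the gateway
openai_llm = ChatOpenAI(
    model="gpt-4o-mini",
    base_url=f"{GATEWAY}/v1/openai",
    api_key=API_KEY,
)

# Anthropic models through the gateway
# (older langchain-anthropic releases call this parameter `anthropic_api_url`)
anthropic_llm = ChatAnthropic(
    model="claude-sonnet-4-20250514",
    base_url=f"{GATEWAY}/v1/anthropic",
    api_key=API_KEY,
)

print(openai_llm.invoke("Hello from OpenAI via the gateway").content)
print(anthropic_llm.invoke("Hello from Anthropic via the gateway").content)
```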
Streaming with LangChain
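Token-by-token streaming works through the gateway the same way it does against the provider directly; a sketch with `ChatOpenAI.stream()` (placeholder URL and model):

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-4o-mini",
    base_url="https://your-gateway.example.com/v1/openai",  # placeholder
    api_key="leanmcp_...",
)

# .stream() yields message chunks as they arrive from the gateway.
for chunk in llm.stream("Write a haiku about proxies."):
    print(chunk.content, end="", flush=True)
print()
```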
Structured Output with Pydantic
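A sketch using LangChain's `with_structured_output()` with a Pydantic schema (placeholder URL and model):

```python
from pydantic import BaseModel, Field
from langchain_openai import ChatOpenAI

class MovieReview(BaseModel):
    """Schema the model's answer must follow."""
    title: str = Field(description="Movie title")
    rating: int = Field(description="Rating from 1 to 10")
    summary: str = Field(description="One-sentence summary")

llm = ChatOpenAI(
    model="gpt-4o-mini",
    base_url="https://your-gateway.example.com/v1/openai",  # placeholder
    api_key="leanmcp_...",
)

structured_llm = llm.with_structured_output(MovieReview)
review = structured_llm.invoke("Review the movie Inception.")
print(review.title, review.rating, review.summary)
```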
Tool Calling with LangChain
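A sketch using the `@tool` decorator and `bind_tools()` (placeholder URL and model); the returned message exposes the model's tool calls for you to execute:

```python
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI

@tool
def get_weather(city: str) -> str:
    """Return the current weather for a city."""
    return f"It is sunny in {city}."  # stub implementation

llm = ChatOpenAI(
    model="gpt-4o-mini",
    base_url="https://your-gateway.example.com/v1/openai",  # placeholder
    api_key="leanmcp_...",
)

llm_with_tools = llm.bind_tools([get_weather])
response = llm_with_tools.invoke("What's the weather in Paris?")

# The model's tool calls are surfaced on the message; run them yourself
# or hand them to an agent executor.
for call in response.tool_calls:
    print(call["name"], call["args"])
```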
Multi-Model Chains
Orchestrate multiple providers in a single LangChain workflow. Requirements: install the relevant LangChain provider packages (for example `langchain-openai` and `langchain-anthropic`).
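A sketch of a two-step LCEL chain that drafts with an OpenAI model and revises with an Anthropic model, both routed through the gateway (URL, key, and models are placeholders):

```python
# pip install langchain-openai langchain-anthropic
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_openai import ChatOpenAI
from langchain_anthropic import ChatAnthropic

GATEWAY = "https://your-gateway.example.com"  # placeholder gateway URL
API_KEY = "leanmcp_..."

drafter = ChatOpenAI(model="gpt-4o-mini", base_url=f"{GATEWAY}/v1/openai", api_key=API_KEY)
reviewer = ChatAnthropic(model="claude-sonnet-4-20250514", base_url=f"{GATEWAY}/v1/anthropic", api_key=API_KEY)

draft_prompt = ChatPromptTemplate.from_template("Write a short product blurb about {topic}.")
review_prompt = ChatPromptTemplate.from_template("Tighten this blurb to one sentence:\n\n{draft}")

# One LCEL chain, two providers, one gateway.
chain = (
    draft_prompt
    | drafter
    | StrOutputParser()
    | (lambda draft: {"draft": draft})
    | review_prompt
    | reviewer
    | StrOutputParser()
)

print(chain.invoke({"topic": "an AI gateway"}))
```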
curl Examples
OpenAI Streaming
Anthropic Messages
xAI (Grok) with Web Search
Fireworks (Llama)
ElevenLabs TTS
Advanced Features
Structured Output (OpenAI)
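A sketch using the OpenAI SDK's `parse()` helper (available in recent `openai` Python releases) to get schema-validated output through the gateway; URL and model are placeholders:

```python
from pydantic import BaseModel
from openai import OpenAI

class Extraction(BaseModel):
    name: str
    email: str

client = OpenAI(
    base_url="https://your-gateway.example.com/v1/openai",  # placeholder
    api_key="leanmcp_...",
)

# parse() sends a JSON-schema response_format and validates the reply against it.
completion = client.beta.chat.completions.parse(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Contact: Ada Lovelace, ada@example.com"}],
    response_format=Extraction,
)
print(completion.choices[0].message.parsed)
```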
Session Tracking
Include a session ID to track requests across a conversation:
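A sketch that attaches a session identifier as a request header via the SDK's `default_headers` option; the `X-Session-Id` header name is hypothetical, so check your gateway configuration for the exact name:

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://your-gateway.example.com/v1/openai",  # placeholder
    api_key="leanmcp_...",
    # Hypothetical header name; use the one your gateway expects.
    default_headers={"X-Session-Id": "conversation-1234"},
)

client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "First turn of a tracked conversation."}],
)
```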
Bring Your Own API Key
To use your own provider API key instead of platform keys:
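A sketch that forwards your own provider key alongside the gateway credential; the `X-Provider-Api-Key` header name is hypothetical:

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://your-gateway.example.com/v1/openai",  # placeholder
    api_key="leanmcp_...",  # still authenticates you to the gateway
    # Hypothetical header name; the gateway would forward this key to the
    # provider instead of using a platform-managed key.
    default_headers={"X-Provider-Api-Key": "sk-your-own-openai-key"},
)
```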
Related
- Authentication Overview - Server-side authentication with `@Authenticated`
- OAuth Server & Proxy - Build OAuth authorization servers
- GPT Apps Guide - Build apps for ChatGPT