
OpenCode Integration

OpenCode is a CLI tool for AI-assisted coding that supports 75+ LLM providers. It is built on the AI SDK and works with any OpenAI-compatible API, so you can configure the LeanMCP AI Gateway as a custom provider.

Prerequisites

1. Get Credits: purchase credits at ship.leanmcp.com
2. Create API Key: create an API key at ship.leanmcp.com/api-keys with SDK permissions
3. Install OpenCode:

npm install -g opencode
# or
brew install opencode
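
Before configuring anything, it is worth confirming the CLI is actually on your PATH. The --version flag is an assumption about the installed build; fall back to opencode --help if it is not recognized:

# Confirm the binary resolves on your PATH
which opencode

# Print the installed version (flag assumed; try `opencode --help` if it is not supported)
opencode --version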

Configuration

Method 1: Using /connect Command

1. Run the connect command

opencode
/connect

2. Select 'Other'

Scroll down and select Other from the provider list.

┌  Add credential

◆  Select provider
│  ...
│  ● Other

3. Enter provider ID

Enter a unique ID like leanmcp:

┌  Add credential

◇  Enter provider id
│  leanmcp

4. Enter API Key

Enter your LeanMCP API key:

┌  Add credential

◇  Enter your API key
│  leanmcp_your_api_key_here

Method 2: Project Configuration

Create or update opencode.json in your project directory:
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "leanmcp": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "LeanMCP AI Gateway",
      "options": {
        "baseURL": "https://aigateway.leanmcp.com/v1/openai"
      },
      "models": {
        "gpt-5.2": {
          "name": "GPT-5.2 via LeanMCP"
        },
        "gpt-5.2": {
          "name": "GPT-5.2 via LeanMCP"
        },
        "gpt-5.2": {
          "name": "GPT-5.2 via LeanMCP"
        }
      }
    }
  }
}
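
As a quick sanity check of the project config, you can send a one-off prompt from the shell. The run subcommand and the -m provider/model flag are assumptions about the OpenCode CLI; check opencode --help for the exact syntax in your version:

# One-off prompt routed through the provider id defined in opencode.json
# (subcommand and flag assumed; verify against `opencode --help`)
opencode run -m leanmcp/gpt-5.2 "Summarize this repository"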

Method 3: Global Configuration

For system-wide configuration, create ~/.config/opencode/opencode.json:
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "leanmcp-openai": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "LeanMCP (OpenAI)",
      "options": {
        "baseURL": "https://aigateway.leanmcp.com/v1/openai"
      },
      "models": {
        "gpt-5.2": { "name": "GPT-5.2" },
        "gpt-5.2": { "name": "GPT-5.2" },
        "gpt-5.2": { "name": "GPT-5.2" }
      }
    },
    "leanmcp-anthropic": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "LeanMCP (Anthropic)",
      "options": {
        "baseURL": "https://aigateway.leanmcp.com/v1/anthropic"
      },
      "models": {
        "claude-sonnet-4-5-20250929": { "name": "Claude Sonnet 4.5" },
        "claude-opus-4-5-20251101": { "name": "Claude Opus 4.5" }
      }
    }
  }
}
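
If the global config directory does not exist yet, plain shell is enough to create it and open the file, for example:

# Create the global OpenCode config directory and edit the config file
mkdir -p ~/.config/opencode
${EDITOR:-nano} ~/.config/opencode/opencode.json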

Using LeanMCP in OpenCode

Once configured, select your LeanMCP models:
opencode
/models
Select LeanMCP AI Gateway or your configured provider name from the list.

Configuration Options

Option             Description
npm                AI SDK package (@ai-sdk/openai-compatible for the gateway)
name               Display name in the OpenCode UI
options.baseURL    LeanMCP AI Gateway endpoint
options.apiKey     API key; can be set here or via /connect
options.headers    Custom headers for requests
models             Map of available models
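
Putting several of these options together, a provider entry might look like the sketch below. The X-Team-Id header is a hypothetical illustration of options.headers, not something the gateway requires, and the inline apiKey value assumes the {env:...} substitution shown later in this guide:

{
  "provider": {
    "leanmcp": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "LeanMCP AI Gateway",
      "options": {
        "baseURL": "https://aigateway.leanmcp.com/v1/openai",
        "apiKey": "{env:LEANMCP_API_KEY}",
        "headers": {
          "X-Team-Id": "platform"
        }
      },
      "models": {
        "gpt-5.2": { "name": "GPT-5.2" }
      }
    }
  }
}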

Setting Token Limits

For proper context management, specify model limits:
{
  "provider": {
    "leanmcp": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "LeanMCP AI Gateway",
      "options": {
        "baseURL": "https://aigateway.leanmcp.com/v1/openai"
      },
      "models": {
        "gpt-5.2": {
          "name": "GPT-5.2",
          "limit": {
            "context": 128000,
            "output": 4096
          }
        }
      }
    }
  }
}

Environment Variables

Alternatively, set credentials via environment variables:
# Add to your shell profile (~/.bashrc, ~/.zshrc)
export LEANMCP_API_KEY="leanmcp_your_api_key_here"
Then reference in config:
{
  "provider": {
    "leanmcp": {
      "options": {
        "baseURL": "https://aigateway.leanmcp.com/v1/openai",
        "apiKey": "{env:LEANMCP_API_KEY}"
      }
    }
  }
}
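
Before launching OpenCode, confirm the variable is exported in the shell you start it from:

# Prints "set" only if the key is exported in the current shell
[ -n "$LEANMCP_API_KEY" ] && echo "LEANMCP_API_KEY is set" || echo "LEANMCP_API_KEY is missing"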

Multiple Providers via LeanMCP

Route all your AI providers through LeanMCP:
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "leanmcp-openai": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "OpenAI (via LeanMCP)",
      "options": {
        "baseURL": "https://aigateway.leanmcp.com/v1/openai"
      },
      "models": {
        "gpt-5.2": {},
        "gpt-5.2": {},
        "gpt-5.2": {}
      }
    },
    "leanmcp-anthropic": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Anthropic (via LeanMCP)",
      "options": {
        "baseURL": "https://aigateway.leanmcp.com/v1/anthropic"
      },
      "models": {
        "claude-sonnet-4-5-20250929": {},
        "claude-opus-4-5-20251101": {}
      }
    },
    "leanmcp-xai": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "xAI Grok (via LeanMCP)",
      "options": {
        "baseURL": "https://aigateway.leanmcp.com/v1/xai"
      },
      "models": {
        "grok-beta": {}
      }
    }
  }
}
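
With several providers configured, the provider prefix in the model id decides which gateway route a request takes. As before, the run subcommand and -m flag are assumptions about the CLI:

# Same CLI, different gateway routes (subcommand and flag assumed)
opencode run -m leanmcp-openai/gpt-5.2 "Explain this stack trace"
opencode run -m leanmcp-anthropic/claude-sonnet-4-5-20250929 "Review this diff"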

Verifying Setup

  1. Run OpenCode: opencode
  2. Select a LeanMCP model: /models
  3. Ask a question
  4. Check your LeanMCP Dashboard to see the logged request
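
To separate gateway problems from OpenCode problems, you can also call the gateway directly. This sketch assumes the /v1/openai base URL mirrors OpenAI's chat completions route (which is what @ai-sdk/openai-compatible expects) and that the gateway accepts the API key as a Bearer token:

# Direct request to the gateway, bypassing OpenCode entirely (route and auth scheme assumed)
curl -s https://aigateway.leanmcp.com/v1/openai/chat/completions \
  -H "Authorization: Bearer $LEANMCP_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-5.2", "messages": [{"role": "user", "content": "ping"}]}'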

Benefits

  • Unified Logging: all your OpenCode sessions are logged in one place
  • Multi-Provider: use OpenAI, Anthropic, and xAI through a single gateway
  • Cost Tracking: track spending across all your coding sessions
  • Security: detect whether sensitive code is being sent

Troubleshooting

  • Verify opencode.json is in your project root or ~/.config/opencode/
  • Check that the JSON syntax is valid (see the jq snippet below)
  • Restart OpenCode after config changes
  • Run /connect and re-enter your API key
  • Verify the API key has SDK permissions
  • Check you have credits in your account
  • Ensure the model name matches what LeanMCP supports
  • Check the baseURL matches the provider (openai, anthropic, etc.)
  • Try a different model to isolate the issue
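
For the JSON syntax check above, jq can validate the file without printing it:

# Exits non-zero and reports the parse error if opencode.json is not valid JSON
jq empty opencode.json && echo "opencode.json is valid JSON"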
