Custom Endpoint

Use this if you run:

  • Self-hosted models (vLLM, LiteLLM, etc.)
  • Enterprise proxy/gateway endpoints
  • Any OpenAI-compatible API server

To configure:

  1. Select Provider: Custom Endpoint
  2. Enter the endpoint URL (example: https://your-endpoint/v1)
  3. Enter the model ID exactly as your endpoint expects it
  4. If your endpoint requires authentication, paste the API key and save
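The steps above assume an OpenAI-compatible server. A minimal sketch of the request a client sends to such an endpoint, using only the Python standard library; BASE_URL, MODEL_ID, and API_KEY are placeholders, not real values:

```python
import json
import urllib.request

# Placeholder values -- substitute your own endpoint, model ID, and key.
BASE_URL = "https://your-endpoint/v1"
MODEL_ID = "my-model"
API_KEY = "YOUR_API_KEY"  # drop the Authorization header if no key is needed

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build a POST against the OpenAI-compatible /chat/completions route."""
    body = json.dumps({
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_chat_request("ping")
print(req.full_url)  # https://your-endpoint/v1/chat/completions
```

If this request shape works against your server (sent with urllib, curl, or any HTTP client), the endpoint should also work here.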

Notes:

  • The endpoint must be OpenAI-compatible (e.g. it serves the /v1/chat/completions route)
  • If your endpoint needs extra headers, add them at your gateway/proxy layer rather than in the client
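One way to handle the extra-headers note: inject them at a reverse proxy sitting in front of the model server. A hypothetical nginx fragment (the header name, value, and upstream address are illustrative, not required by any particular setup):

```nginx
# Hypothetical reverse proxy in front of a self-hosted model server.
# Extra headers are added here, so the client configuration stays simple.
location /v1/ {
    proxy_set_header X-Extra-Header "value";   # example extra header
    proxy_pass http://127.0.0.1:8000;          # example model server address
}
```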