OpenAI-compatible endpoint

You can configure Letta to use OpenAI-compatible ChatCompletions endpoints by setting OPENAI_API_BASE in your environment variables (in addition to setting OPENAI_API_KEY).

For example, to use OpenRouter, create an account on OpenRouter and then create an API key.

Once you have your API key, set both OPENAI_API_KEY and OPENAI_API_BASE in your environment variables.

Set the environment variables when you use docker run:

Terminal window
# replace `~/.letta/.persist/pgdata` with wherever you want to store your agent data
docker run \
-v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
-p 8283:8283 \
-e OPENAI_API_BASE="https://openrouter.ai/api/v1" \
-e OPENAI_API_KEY="your_openrouter_api_key" \
letta/letta:latest

See the self-hosting guide for more information on running Letta with Docker.
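
If you run the Letta server outside of Docker instead, a minimal sketch is to export the same variables in your shell before starting the server. This assumes Letta is installed locally (for example via pip) so that the letta server command is available:

Terminal window
# point Letta's OpenAI provider at OpenRouter (same values as the docker run example above)
export OPENAI_API_BASE="https://openrouter.ai/api/v1"
export OPENAI_API_KEY="your_openrouter_api_key"
# then start the server (assumes a local Letta installation)
letta server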

Once the Letta server is running, you can select OpenRouter models from the model dropdown in the ADE, or specify them when creating agents with the Python SDK, as in the sketch below.
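
Here is a minimal sketch using the letta-client Python SDK (pip install letta-client). The base URL matches the docker run example above; the model and embedding handles are placeholders, so substitute handles that your server actually lists (for example, in the ADE model dropdown):

from letta_client import Letta

# connect to the locally running Letta server
client = Letta(base_url="http://localhost:8283")

# create an agent backed by a model served through the OpenRouter endpoint;
# the handles below are placeholders, replace them with ones your server lists
agent = client.agents.create(
    model="openai/gpt-4o-mini",
    embedding="openai/text-embedding-3-small",
    memory_blocks=[
        {"label": "human", "value": "The user's name is Sarah."},
        {"label": "persona", "value": "You are a helpful assistant."},
    ],
)

print(agent.id)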

For information on how to configure agents to use OpenRouter or other providers with OpenAI-compatible endpoints, refer to our guide on using OpenAI.