OpenAI-compatible endpoint
You can configure Letta to use OpenAI-compatible ChatCompletions endpoints by setting OPENAI_API_BASE in your environment variables (in addition to setting OPENAI_API_KEY).
OpenRouter example
Create an account on OpenRouter, then create an API key.
Once you have your API key, set both OPENAI_API_KEY and OPENAI_API_BASE in your environment variables.
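For example, the two variables could be exported in your shell like this (the key value shown is a placeholder, not a real credential):

```shell
# Point Letta's OpenAI client at OpenRouter's OpenAI-compatible API
export OPENAI_API_BASE="https://openrouter.ai/api/v1"

# Set this to your OpenRouter API key (placeholder shown here)
export OPENAI_API_KEY="your_openrouter_api_key"
```

The same pattern works for any other OpenAI-compatible provider: point `OPENAI_API_BASE` at the provider's base URL and `OPENAI_API_KEY` at its key.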
Using with Docker
Set the environment variables when you use docker run:
```shell
# replace ~/.letta/.persist/pgdata with wherever you want to store your agent data
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e OPENAI_API_BASE="https://openrouter.ai/api/v1" \
  -e OPENAI_API_KEY="your_openrouter_api_key" \
  letta/letta:latest
```

See the self-hosting guide for more information on running Letta with Docker.
Once the Letta server is running, you can select OpenRouter models from the ADE dropdown or via the Python SDK.
For information on how to configure agents to use OpenRouter or other OpenAI-compatible endpoint providers, refer to our guide on using OpenAI.