
Run Letta with Docker

The Letta server can be connected to various LLM API backends (OpenAI, Anthropic, vLLM, Ollama, etc.). To enable access to these LLM API providers, set the appropriate environment variables when you use docker run:

Terminal window
# replace `~/.letta/.persist/pgdata` with wherever you want to store your agent data
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e OPENAI_API_KEY="your_openai_api_key" \
  letta/letta:latest

Environment variables will determine which LLM and embedding providers are enabled on your Letta server. For example, if you set OPENAI_API_KEY, then your Letta server will attempt to connect to OpenAI as a model provider. Similarly, if you set OLLAMA_BASE_URL, then your Letta server will attempt to connect to an Ollama server to provide local models as LLM options on the server.

If you have many different LLM API keys, you can also set up a .env file instead and pass that to docker run:

Terminal window
# using a .env file instead of passing environment variables
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  --env-file .env \
  letta/letta:latest

Once the Letta server is running, you can access it via port 8283 (e.g. sending REST API requests to http://localhost:8283/v1). You can also connect your server to the Letta ADE to access and manage your agents in a web interface.
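As a quick connectivity check, you can hit the REST API with curl once the container is up. The sketch below lists the agents on the server; the GET /v1/agents path is assumed from the /v1 REST prefix mentioned above, so adjust it if your Letta version differs:

```shell
# Quick connectivity check against a locally running Letta server.
# (Endpoint path assumed from the /v1 REST prefix; adjust if needed.)
curl --request GET \
  --url http://localhost:8283/v1/agents \
  --header 'Content-Type: application/json'
```

If you have password protection enabled (covered later on this page), add the same `Authorization: Bearer` header to this request.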

If you are using a .env file, it should contain environment variables for each of the LLM providers you wish to use (replace ... with your actual API keys and endpoint URLs):

Terminal window
# To use OpenAI
OPENAI_API_KEY=...
# To use Anthropic
ANTHROPIC_API_KEY=...
# To use with Ollama (replace with Ollama server URL)
OLLAMA_BASE_URL=...
# To use with Google AI
GEMINI_API_KEY=...
# To use with Azure
AZURE_API_KEY=...
AZURE_BASE_URL=...
# To use with vLLM (replace with vLLM server URL)
VLLM_API_BASE=...

When you use the latest tag, you will get the latest stable release of Letta.

The nightly image is a development image that is updated frequently from main (it is not recommended for production use). If you would like to use the development image, you can use the nightly tag instead of latest:

Terminal window
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e OPENAI_API_KEY="your_openai_api_key" \
  letta/letta:nightly

To password protect your server, include SECURE=true and LETTA_SERVER_PASSWORD=yourpassword in your docker run command:

Terminal window
# If LETTA_SERVER_PASSWORD isn't set, the server will autogenerate a password
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  --env-file .env \
  -e SECURE=true \
  -e LETTA_SERVER_PASSWORD=yourpassword \
  letta/letta:latest

With password protection enabled, you will have to provide your password as a bearer token in the Authorization header of your API requests:

Terminal window
curl --request POST \
  --url http://localhost:8283/v1/agents/$AGENT_ID/messages \
  --header 'Content-Type: application/json' \
  --header 'Authorization: Bearer yourpassword' \
  --data '{
    "messages": [
      {
        "role": "user",
        "text": "hows it going????"
      }
    ]
  }'