LM Studio
Setup LM Studio
- Download and install LM Studio and the model you want to test with
- Make sure to start the LM Studio server (see the sketch after this list for one way to start and verify it)
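If you prefer the command line, the `lms` CLI that ships with LM Studio can start the server, and a quick request against its OpenAI-compatible endpoint confirms it is up. This is a minimal sketch assuming the default port of 1234; exact CLI behavior may vary by LM Studio version.

```bash
# Start the LM Studio server (starting it from the LM Studio app also works)
lms server start

# Verify the server is responding; LM Studio exposes an OpenAI-compatible
# API on port 1234 by default and lists available models at this endpoint
curl -s http://localhost:1234/v1/models
```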
Enabling LM Studio with Docker
To enable LM Studio models when running the Letta server with Docker, set the `LMSTUDIO_BASE_URL` environment variable.
macOS/Windows:
Since LM Studio is running on the host network, you will need to use `host.docker.internal` to connect to the LM Studio server instead of `localhost`.
```bash
# replace `~/.letta/.persist/pgdata` with wherever you want to store your agent data
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e LMSTUDIO_BASE_URL="http://host.docker.internal:1234" \
  letta/letta:latest
```
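Before starting Letta, you can sanity-check that the LM Studio server is reachable from inside a container via `host.docker.internal`. This is a rough sketch assuming the default LM Studio port of 1234 and the public `curlimages/curl` image:

```bash
# Should return a JSON list of models if LM Studio is reachable from Docker
docker run --rm curlimages/curl -s http://host.docker.internal:1234/v1/models
```

If the request fails, double-check that the LM Studio server is actually running on the host.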
Linux:

Use `--network host` and `localhost`:
```bash
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  --network host \
  -e LMSTUDIO_BASE_URL="http://localhost:1234" \
  letta/letta:latest
```

See the self-hosting guide for more information on running Letta with Docker.
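Once the Letta server is up, you can check that it picked up the LM Studio endpoint by listing the models the server knows about. This is a hedged sketch: the `/v1/models/` path is an assumption based on Letta's REST API conventions and may differ across versions, so consult the API reference if it returns an error.

```bash
# List the LLM models the Letta server knows about; LM Studio models
# should appear alongside any other configured providers
curl -s http://localhost:8283/v1/models/
```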
Model support
The following models have been tested with Letta as of 7-11-2025 on LM Studio 0.3.18:
- `qwen3-30b-a3b`
- `qwen3-14b-mlx`
- `qwen3-8b-mlx`
- `qwen2.5-32b-instruct`
- `qwen2.5-14b-instruct-1m`
- `qwen2.5-7b-instruct`
- `meta-llama-3.1-8b-instruct`
Some models recommended on LM Studio, such as `mlx-community/ministral-8b-instruct-2410` and `bartowski/ministral-8b-instruct-2410`, may not work well with Letta because their default prompt templates are incompatible. Adjusting the templates can enable compatibility, but will impact model performance.