Send Message Streaming

agents.messages.stream(agent_id: str, **kwargs: MessageStreamParams) -> LettaStreamingResponse
post/v1/agents/{agent_id}/messages/stream

Process a user message and return the agent's response. This endpoint accepts a message from a user and processes it through the agent. It always streams the steps of the response, and also streams individual tokens if 'stream_tokens' is set to True.
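
A minimal sketch of such a call, assuming the returned streaming object can be iterated over Server-Sent Events (an assumption about the SDK's streaming interface); the agent ID, the input text, and stream_tokens=True are illustrative placeholders for the parameters documented below:

from letta_client import Letta

client = Letta(
    api_key="My API Key",
)
# Steps are always streamed; individual tokens are also streamed because stream_tokens=True.
stream = client.agents.messages.stream(
    agent_id="agent-123e4567-e89b-42d3-8456-426614174000",
    input="What's the weather like today?",
    stream_tokens=True,
)
# Each event is one of the LettaStreamingResponse variants documented under Returns.
for event in stream:
    print(event)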

Parameters
agent_id: str

The ID of the agent in the format 'agent-'

minLength: 42
maxLength: 42
(Deprecated) assistant_message_tool_kwarg: Optional[str]

The name of the message argument in the designated message tool. Still supported for legacy agent types, but deprecated for letta_v1_agent onward.

(Deprecated) assistant_message_tool_name: Optional[str]

The name of the designated message tool. Still supported for legacy agent types, but deprecated for letta_v1_agent onward.

background: Optional[bool]

Whether to process the request in the background (only used when streaming=true).

(Deprecated) enable_thinking: Optional[str]

If set to True, enables reasoning before responses or tool calls from the agent.

include_pings: Optional[bool]

Whether to include periodic keepalive ping messages in the stream to prevent connection timeouts (only used when streaming=true).

include_return_message_types: Optional[List[MessageType]]

Only return specified message types in the response. If None (default) returns all messages.

Accepts one of the following:
"system_message"
"user_message"
"assistant_message"
"reasoning_message"
"hidden_reasoning_message"
"tool_call_message"
"tool_return_message"
"approval_request_message"
"approval_response_message"
input: Optional[Union[str, Iterable[InputUnionMember1], null]]

Syntactic sugar for a single user message. Equivalent to messages=[{'role': 'user', 'content': input}].

Accepts one of the following:
InputUnionMember0 = str
InputUnionMember1 = Iterable[InputUnionMember1]
Accepts one of the following:
class TextContent:
text: str

The text content of the message.

signature: Optional[str]

Stores a unique identifier for any reasoning associated with this text content.

type: Optional[Literal["text"]]

The type of the message.

Accepts one of the following:
"text"
class ImageContent:
source: Source

The source of the image.

Accepts one of the following:
class SourceURLImage:
url: str

The URL of the image.

type: Optional[Literal["url"]]

The source type for the image.

Accepts one of the following:
"url"
class SourceBase64Image:
data: str

The base64 encoded image data.

media_type: str

The media type for the image.

detail: Optional[str]

What level of detail to use when processing and understanding the image (low, high, or auto to let the model decide)

type: Optional[Literal["base64"]]

The source type for the image.

Accepts one of the following:
"base64"
class SourceLettaImage:
file_id: str

The unique identifier of the image file persisted in storage.

data: Optional[str]

The base64 encoded image data.

detail: Optional[str]

What level of detail to use when processing and understanding the image (low, high, or auto to let the model decide)

media_type: Optional[str]

The media type for the image.

type: Optional[Literal["letta"]]

The source type for the image.

Accepts one of the following:
"letta"
type: Optional[Literal["image"]]

The type of the message.

Accepts one of the following:
"image"
class ToolCallContent:
id: str

A unique identifier for this specific tool call instance.

input: Dict[str, object]

The parameters being passed to the tool, structured as a dictionary of parameter names to values.

name: str

The name of the tool being called.

signature: Optional[str]

Stores a unique identifier for any reasoning associated with this tool call.

type: Optional[Literal["tool_call"]]

Indicates this content represents a tool call event.

Accepts one of the following:
"tool_call"
class ToolReturnContent:
content: str

The content returned by the tool execution.

is_error: bool

Indicates whether the tool execution resulted in an error.

tool_call_id: str

References the ID of the ToolCallContent that initiated this tool call.

type: Optional[Literal["tool_return"]]

Indicates this content represents a tool return event.

Accepts one of the following:
"tool_return"
class ReasoningContent:

Sent via the Anthropic Messages API

is_native: bool

Whether the reasoning content was generated by a reasoner model that processed this step.

reasoning: str

The intermediate reasoning or thought process content.

signature: Optional[str]

A unique identifier for this reasoning step.

type: Optional[Literal["reasoning"]]

Indicates this is a reasoning/intermediate step.

Accepts one of the following:
"reasoning"
class RedactedReasoningContent:

Sent via the Anthropic Messages API

data: str

The redacted or filtered intermediate reasoning content.

type: Optional[Literal["redacted_reasoning"]]

Indicates this is a redacted thinking step.

Accepts one of the following:
"redacted_reasoning"
class OmittedReasoningContent:

A placeholder for reasoning content we know is present, but isn't returned by the provider (e.g. OpenAI GPT-5 on ChatCompletions)

signature: Optional[str]

A unique identifier for this reasoning step.

type: Optional[Literal["omitted_reasoning"]]

Indicates this is an omitted reasoning step.

Accepts one of the following:
"omitted_reasoning"
class InputUnionMember1SummarizedReasoningContent:

The style of reasoning content returned by the OpenAI Responses API

id: str

The unique identifier for this reasoning step.

summary: Iterable[InputUnionMember1SummarizedReasoningContentSummary]

Summaries of the reasoning content.

index: int

The index of the summary part.

text: str

The text of the summary part.

encrypted_content: Optional[str]

The encrypted reasoning content.

type: Optional[Literal["summarized_reasoning"]]

Indicates this is a summarized reasoning step.

Accepts one of the following:
"summarized_reasoning"
max_steps: Optional[int]

Maximum number of steps the agent should take to process the request.

messages: Optional[Iterable[Message]]

The messages to be sent to the agent.

Accepts one of the following:
class MessageCreate:

Request to create a message

content: Union[List[LettaMessageContentUnion], str]

The content of the message.

Accepts one of the following:
ContentUnionMember0 = List[LettaMessageContentUnion]
Accepts one of the following:
class TextContent:
text: str

The text content of the message.

signature: Optional[str]

Stores a unique identifier for any reasoning associated with this text content.

type: Optional[Literal["text"]]

The type of the message.

Accepts one of the following:
"text"
class ImageContent:
source: Source

The source of the image.

Accepts one of the following:
class SourceURLImage:
url: str

The URL of the image.

type: Optional[Literal["url"]]

The source type for the image.

Accepts one of the following:
"url"
class SourceBase64Image:
data: str

The base64 encoded image data.

media_type: str

The media type for the image.

detail: Optional[str]

What level of detail to use when processing and understanding the image (low, high, or auto to let the model decide)

type: Optional[Literal["base64"]]

The source type for the image.

Accepts one of the following:
"base64"
class SourceLettaImage:
file_id: str

The unique identifier of the image file persisted in storage.

data: Optional[str]

The base64 encoded image data.

detail: Optional[str]

What level of detail to use when processing and understanding the image (low, high, or auto to let the model decide)

media_type: Optional[str]

The media type for the image.

type: Optional[Literal["letta"]]

The source type for the image.

Accepts one of the following:
"letta"
type: Optional[Literal["image"]]

The type of the message.

Accepts one of the following:
"image"
class ToolCallContent:
id: str

A unique identifier for this specific tool call instance.

input: Dict[str, object]

The parameters being passed to the tool, structured as a dictionary of parameter names to values.

name: str

The name of the tool being called.

signature: Optional[str]

Stores a unique identifier for any reasoning associated with this tool call.

type: Optional[Literal["tool_call"]]

Indicates this content represents a tool call event.

Accepts one of the following:
"tool_call"
class ToolReturnContent:
content: str

The content returned by the tool execution.

is_error: bool

Indicates whether the tool execution resulted in an error.

tool_call_id: str

References the ID of the ToolCallContent that initiated this tool call.

type: Optional[Literal["tool_return"]]

Indicates this content represents a tool return event.

Accepts one of the following:
"tool_return"
class ReasoningContent:

Sent via the Anthropic Messages API

is_native: bool

Whether the reasoning content was generated by a reasoner model that processed this step.

reasoning: str

The intermediate reasoning or thought process content.

signature: Optional[str]

A unique identifier for this reasoning step.

type: Optional[Literal["reasoning"]]

Indicates this is a reasoning/intermediate step.

Accepts one of the following:
"reasoning"
class RedactedReasoningContent:

Sent via the Anthropic Messages API

data: str

The redacted or filtered intermediate reasoning content.

type: Optional[Literal["redacted_reasoning"]]

Indicates this is a redacted thinking step.

Accepts one of the following:
"redacted_reasoning"
class OmittedReasoningContent:

A placeholder for reasoning content we know is present, but isn't returned by the provider (e.g. OpenAI GPT-5 on ChatCompletions)

signature: Optional[str]

A unique identifier for this reasoning step.

type: Optional[Literal["omitted_reasoning"]]

Indicates this is an omitted reasoning step.

Accepts one of the following:
"omitted_reasoning"
ContentUnionMember1 = str
role: Literal["user", "system", "assistant"]

The role of the participant.

Accepts one of the following:
"user"
"system"
"assistant"
batch_item_id: Optional[str]

The id of the LLMBatchItem that this message is associated with

group_id: Optional[str]

The multi-agent group that the message was sent in

name: Optional[str]

The name of the participant.

otid: Optional[str]

The offline threading id associated with this message

sender_id: Optional[str]

The id of the sender of the message, can be an identity id or agent id

type: Optional[Literal["message"]]

The message type to be created.

Accepts one of the following:
"message"
class ApprovalCreate:

Input to approve or deny a tool call request

(Deprecated) approval_request_id: Optional[str]

The message ID of the approval request

approvals: Optional[List[Approval]]

The list of approval responses

Accepts one of the following:
class ApprovalApprovalReturn:
approve: bool

Whether the tool has been approved

tool_call_id: str

The ID of the tool call that corresponds to this approval

reason: Optional[str]

An optional explanation for the provided approval status

type: Optional[Literal["approval"]]

The message type to be created.

Accepts one of the following:
"approval"
class ToolReturn:
status: Literal["success", "error"]
Accepts one of the following:
"success"
"error"
tool_call_id: str
tool_return: str
stderr: Optional[List[str]]
stdout: Optional[List[str]]
type: Optional[Literal["tool"]]

The message type to be created.

Accepts one of the following:
"tool"
(Deprecated) approve: Optional[bool]

Whether the tool has been approved

group_id: Optional[str]

The multi-agent group that the message was sent in

(Deprecated) reason: Optional[str]

An optional explanation for the provided approval status

type: Optional[Literal["approval"]]

The message type to be created.

Accepts one of the following:
"approval"
stream_tokens: Optional[bool]

Flag to determine if individual tokens should be streamed, rather than streaming per step (only used when streaming=true).

streaming: Optional[bool]

If True, returns a streaming response (Server-Sent Events). If False (default), returns a complete response.

(Deprecated) use_assistant_message: Optional[bool]

Whether the server should parse specific tool call arguments (default send_message) as AssistantMessage objects. Still supported for legacy agent types, but deprecated for letta_v1_agent onward.
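
For multi-modal input, the messages parameter takes MessageCreate-style entries whose content mixes the text and image parts described above. A hedged sketch, assuming the SDK accepts plain dicts matching those shapes (the agent ID, prompt text, and image URL are placeholders):

from letta_client import Letta

client = Letta(
    api_key="My API Key",
)
# Send one user message containing a text part and a URL-sourced image part.
stream = client.agents.messages.stream(
    agent_id="agent-123e4567-e89b-42d3-8456-426614174000",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is in this image?"},
                {"type": "image", "source": {"type": "url", "url": "https://example.com/photo.png"}},
            ],
        }
    ],
)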

Returns
LettaStreamingResponse = LettaStreamingResponse

Streaming response type for Server-Sent Events (SSE) endpoints. Each event in the stream will be one of these types.

Accepts one of the following:
class SystemMessage:

A message generated by the system. Never streamed back on a response, only used for cursor pagination.

Args:
    id (str): The ID of the message
    date (datetime): The date the message was created in ISO format
    name (Optional[str]): The name of the sender of the message
    content (str): The message content sent by the system

id: str
content: str

The message content sent by the system

date: datetime
is_err: Optional[bool]
message_type: Optional[Literal["system_message"]]

The type of the message.

Accepts one of the following:
"system_message"
name: Optional[str]
otid: Optional[str]
run_id: Optional[str]
sender_id: Optional[str]
seq_id: Optional[int]
step_id: Optional[str]
class UserMessage:

A message sent by the user. Never streamed back on a response, only used for cursor pagination.

Args:
    id (str): The ID of the message
    date (datetime): The date the message was created in ISO format
    name (Optional[str]): The name of the sender of the message
    content (Union[str, List[LettaUserMessageContentUnion]]): The message content sent by the user (can be a string or an array of multi-modal content parts)

id: str
content: Union[List[LettaUserMessageContentUnion], str]

The message content sent by the user (can be a string or an array of multi-modal content parts)

Accepts one of the following:
ContentUnionMember0 = List[LettaUserMessageContentUnion]
Accepts one of the following:
class TextContent:
text: str

The text content of the message.

signature: Optional[str]

Stores a unique identifier for any reasoning associated with this text content.

type: Optional[Literal["text"]]

The type of the message.

Accepts one of the following:
"text"
class ImageContent:
source: Source

The source of the image.

Accepts one of the following:
class SourceURLImage:
url: str

The URL of the image.

type: Optional[Literal["url"]]

The source type for the image.

Accepts one of the following:
"url"
class SourceBase64Image:
data: str

The base64 encoded image data.

media_type: str

The media type for the image.

detail: Optional[str]

What level of detail to use when processing and understanding the image (low, high, or auto to let the model decide)

type: Optional[Literal["base64"]]

The source type for the image.

Accepts one of the following:
"base64"
class SourceLettaImage:
file_id: str

The unique identifier of the image file persisted in storage.

data: Optional[str]

The base64 encoded image data.

detail: Optional[str]

What level of detail to use when processing and understanding the image (low, high, or auto to let the model decide)

media_type: Optional[str]

The media type for the image.

type: Optional[Literal["letta"]]

The source type for the image.

Accepts one of the following:
"letta"
type: Optional[Literal["image"]]

The type of the message.

Accepts one of the following:
"image"
ContentUnionMember1 = str
date: datetime
is_err: Optional[bool]
message_type: Optional[Literal["user_message"]]

The type of the message.

Accepts one of the following:
"user_message"
name: Optional[str]
otid: Optional[str]
run_id: Optional[str]
sender_id: Optional[str]
seq_id: Optional[int]
step_id: Optional[str]
class ReasoningMessage:

Representation of an agent's internal reasoning.

Args:
    id (str): The ID of the message
    date (datetime): The date the message was created in ISO format
    name (Optional[str]): The name of the sender of the message
    source (Literal["reasoner_model", "non_reasoner_model"]): Whether the reasoning content was generated natively by a reasoner model or derived via prompting
    reasoning (str): The internal reasoning of the agent
    signature (Optional[str]): The model-generated signature of the reasoning step

id: str
date: datetime
reasoning: str
is_err: Optional[bool]
message_type: Optional[Literal["reasoning_message"]]

The type of the message.

Accepts one of the following:
"reasoning_message"
name: Optional[str]
otid: Optional[str]
run_id: Optional[str]
sender_id: Optional[str]
seq_id: Optional[int]
signature: Optional[str]
source: Optional[Literal["reasoner_model", "non_reasoner_model"]]
Accepts one of the following:
"reasoner_model"
"non_reasoner_model"
step_id: Optional[str]
class HiddenReasoningMessage:

Representation of an agent's internal reasoning where reasoning content has been hidden from the response.

Args:
    id (str): The ID of the message
    date (datetime): The date the message was created in ISO format
    name (Optional[str]): The name of the sender of the message
    state (Literal["redacted", "omitted"]): Whether the reasoning content was redacted by the provider or simply omitted by the API
    hidden_reasoning (Optional[str]): The internal reasoning of the agent

id: str
date: datetime
state: Literal["redacted", "omitted"]
Accepts one of the following:
"redacted"
"omitted"
hidden_reasoning: Optional[str]
is_err: Optional[bool]
message_type: Optional[Literal["hidden_reasoning_message"]]

The type of the message.

Accepts one of the following:
"hidden_reasoning_message"
name: Optional[str]
otid: Optional[str]
run_id: Optional[str]
sender_id: Optional[str]
seq_id: Optional[int]
step_id: Optional[str]
class ToolCallMessage:

A message representing a request to call a tool (generated by the LLM to trigger tool execution).

Args:
    id (str): The ID of the message
    date (datetime): The date the message was created in ISO format
    name (Optional[str]): The name of the sender of the message
    tool_call (Union[ToolCall, ToolCallDelta]): The tool call

id: str
date: datetime
(Deprecated) tool_call: ToolCall
Accepts one of the following:
class ToolCall:
arguments: str
name: str
tool_call_id: str
class ToolCallDelta:
arguments: Optional[str]
name: Optional[str]
tool_call_id: Optional[str]
is_err: Optional[bool]
message_type: Optional[Literal["tool_call_message"]]

The type of the message.

Accepts one of the following:
"tool_call_message"
name: Optional[str]
otid: Optional[str]
run_id: Optional[str]
sender_id: Optional[str]
seq_id: Optional[int]
step_id: Optional[str]
tool_calls: Optional[ToolCalls]
Accepts one of the following:
ToolCallsUnionMember0 = List[ToolCall]
arguments: str
name: str
tool_call_id: str
class ToolCallDelta:
arguments: Optional[str]
name: Optional[str]
tool_call_id: Optional[str]
class ToolReturnMessage:

A message representing the return value of a tool call (generated by Letta executing the requested tool).

Args:
    id (str): The ID of the message
    date (datetime): The date the message was created in ISO format
    name (Optional[str]): The name of the sender of the message
    tool_return (str): The return value of the tool (deprecated, use tool_returns)
    status (Literal["success", "error"]): The status of the tool call (deprecated, use tool_returns)
    tool_call_id (str): A unique identifier for the tool call that generated this message (deprecated, use tool_returns)
    stdout (Optional[List[str]]): Captured stdout (e.g. prints, logs) from the tool invocation (deprecated, use tool_returns)
    stderr (Optional[List[str]]): Captured stderr from the tool invocation (deprecated, use tool_returns)
    tool_returns (Optional[List[ToolReturn]]): List of tool returns for multi-tool support

id: str
date: datetime
(Deprecated) status: Literal["success", "error"]
Accepts one of the following:
"success"
"error"
(Deprecated) tool_call_id: str
(Deprecated) tool_return: str
is_err: Optional[bool]
message_type: Optional[Literal["tool_return_message"]]

The type of the message.

Accepts one of the following:
"tool_return_message"
name: Optional[str]
otid: Optional[str]
run_id: Optional[str]
sender_id: Optional[str]
seq_id: Optional[int]
(Deprecated) stderr: Optional[List[str]]
(Deprecated) stdout: Optional[List[str]]
step_id: Optional[str]
tool_returns: Optional[List[ToolReturn]]
status: Literal["success", "error"]
Accepts one of the following:
"success"
"error"
tool_call_id: str
tool_return: str
stderr: Optional[List[str]]
stdout: Optional[List[str]]
type: Optional[Literal["tool"]]

The message type to be created.

Accepts one of the following:
"tool"
class AssistantMessage:

A message sent by the LLM in response to user input. Used in the LLM context.

Args:
    id (str): The ID of the message
    date (datetime): The date the message was created in ISO format
    name (Optional[str]): The name of the sender of the message
    content (Union[str, List[LettaAssistantMessageContentUnion]]): The message content sent by the agent (can be a string or an array of content parts)

id: str
content: Union[List[LettaAssistantMessageContentUnion], str]

The message content sent by the agent (can be a string or an array of content parts)

Accepts one of the following:
ContentUnionMember0 = List[LettaAssistantMessageContentUnion]
text: str

The text content of the message.

signature: Optional[str]

Stores a unique identifier for any reasoning associated with this text content.

type: Optional[Literal["text"]]

The type of the message.

Accepts one of the following:
"text"
ContentUnionMember1 = str
date: datetime
is_err: Optional[bool]
message_type: Optional[Literal["assistant_message"]]

The type of the message.

Accepts one of the following:
"assistant_message"
name: Optional[str]
otid: Optional[str]
run_id: Optional[str]
sender_id: Optional[str]
seq_id: Optional[int]
step_id: Optional[str]
class ApprovalRequestMessage:

A message representing a request for approval to call a tool (generated by the LLM to trigger tool execution).

Args:
    id (str): The ID of the message
    date (datetime): The date the message was created in ISO format
    name (Optional[str]): The name of the sender of the message
    tool_call (ToolCall): The tool call

id: str
date: datetime
(Deprecated) tool_call: ToolCall

The tool call that the LLM has requested to run

Accepts one of the following:
class ToolCall:
arguments: str
name: str
tool_call_id: str
class ToolCallDelta:
arguments: Optional[str]
name: Optional[str]
tool_call_id: Optional[str]
is_err: Optional[bool]
message_type: Optional[Literal["approval_request_message"]]

The type of the message.

Accepts one of the following:
"approval_request_message"
name: Optional[str]
otid: Optional[str]
run_id: Optional[str]
sender_id: Optional[str]
seq_id: Optional[int]
step_id: Optional[str]
tool_calls: Optional[ToolCalls]

The tool calls that the LLM has requested to run, which are pending approval

Accepts one of the following:
ToolCallsUnionMember0 = List[ToolCall]
arguments: str
name: str
tool_call_id: str
class ToolCallDelta:
arguments: Optional[str]
name: Optional[str]
tool_call_id: Optional[str]
class ApprovalResponseMessage:

A message representing a response from the user indicating whether a tool has been approved to run.

Args:
    id (str): The ID of the message
    date (datetime): The date the message was created in ISO format
    name (Optional[str]): The name of the sender of the message
    approve (bool): Whether the tool has been approved
    approval_request_id: The ID of the approval request
    reason (Optional[str]): An optional explanation for the provided approval status

id: str
date: datetime
(Deprecated) approval_request_id: Optional[str]

The message ID of the approval request

approvals: Optional[List[Approval]]

The list of approval responses

Accepts one of the following:
class ApprovalApprovalReturn:
approve: bool

Whether the tool has been approved

tool_call_id: str

The ID of the tool call that corresponds to this approval

reason: Optional[str]

An optional explanation for the provided approval status

type: Optional[Literal["approval"]]

The message type to be created.

Accepts one of the following:
"approval"
class ToolReturn:
status: Literal["success", "error"]
Accepts one of the following:
"success"
"error"
tool_call_id: str
tool_return: str
stderr: Optional[List[str]]
stdout: Optional[List[str]]
type: Optional[Literal["tool"]]

The message type to be created.

Accepts one of the following:
"tool"
(Deprecated) approve: Optional[bool]

Whether the tool has been approved

is_err: Optional[bool]
message_type: Optional[Literal["approval_response_message"]]

The type of the message.

Accepts one of the following:
"approval_response_message"
name: Optional[str]
otid: Optional[str]
(Deprecated) reason: Optional[str]

An optional explanation for the provided approval status

run_id: Optional[str]
sender_id: Optional[str]
seq_id: Optional[int]
step_id: Optional[str]
class LettaPing:

Ping messages are a keep-alive to prevent SSE streams from timing out during long-running requests.

message_type: Literal["ping"]

The type of the message.

Accepts one of the following:
"ping"
class LettaStopReason:

The stop reason from Letta indicating why the agent loop stopped execution.

stop_reason: StopReasonType

The reason why execution stopped.

Accepts one of the following:
"end_turn"
"error"
"llm_api_error"
"invalid_llm_response"
"invalid_tool_call"
"max_steps"
"no_tool_call"
"tool_rule"
"cancelled"
"requires_approval"
message_type: Optional[Literal["stop_reason"]]

The type of the message.

Accepts one of the following:
"stop_reason"
class LettaUsageStatistics:

Usage statistics for the agent interaction.

Attributes:
    completion_tokens (int): The number of tokens generated by the agent.
    prompt_tokens (int): The number of tokens in the prompt.
    total_tokens (int): The total number of tokens processed by the agent.
    step_count (int): The number of steps taken by the agent.

completion_tokens: Optional[int]

The number of tokens generated by the agent.

message_type: Optional[Literal["usage_statistics"]]
Accepts one of the following:
"usage_statistics"
prompt_tokens: Optional[int]

The number of tokens in the prompt.

run_ids: Optional[List[str]]

The background task run IDs associated with the agent interaction

step_count: Optional[int]

The number of steps taken by the agent.

total_tokens: Optional[int]

The total number of tokens processed by the agent.
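
Since each streamed event is one of these variants, a client typically branches on message_type. A minimal sketch, assuming the streaming object is iterable and that events expose the fields documented above; the handling choices and input text are illustrative:

from letta_client import Letta

client = Letta(
    api_key="My API Key",
)
stream = client.agents.messages.stream(
    agent_id="agent-123e4567-e89b-42d3-8456-426614174000",
    input="Summarize our last conversation.",
)
for event in stream:
    message_type = getattr(event, "message_type", None)
    if message_type == "assistant_message":
        print("assistant:", event.content)
    elif message_type == "reasoning_message":
        print("reasoning:", event.reasoning)
    elif message_type == "tool_call_message":
        print("tool call:", event.tool_calls)
    elif message_type == "stop_reason":
        print("stopped:", event.stop_reason)
    elif message_type == "usage_statistics":
        print("tokens used:", event.total_tokens)
    elif message_type == "ping":
        pass  # Keepalive only; safe to ignore.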

Send Message Streaming
from letta_client import Letta

# Instantiate the client with your API key.
client = Letta(
    api_key="My API Key",
)
# Open a streaming response for the given agent; the stream yields
# Server-Sent Events describing each step of the agent's response.
letta_streaming_response = client.agents.messages.stream(
    agent_id="agent-123e4567-e89b-42d3-8456-426614174000",
)
print(letta_streaming_response)
Returns Examples
{
  "id": "id",
  "content": "content",
  "date": "2019-12-27T18:11:19.117Z",
  "is_err": true,
  "message_type": "system_message",
  "name": "name",
  "otid": "otid",
  "run_id": "run_id",
  "sender_id": "sender_id",
  "seq_id": 0,
  "step_id": "step_id"
}
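
When the stream stops with requires_approval after an approval_request_message, the pending tool calls can be approved or denied by sending an ApprovalCreate-style entry in the messages parameter of a follow-up request. A hedged sketch reusing the client from the example above and assuming plain dicts are accepted for that parameter; the tool_call_id and reason values are placeholders:

# Approve the pending tool call reported in the approval_request_message.
approval_stream = client.agents.messages.stream(
    agent_id="agent-123e4567-e89b-42d3-8456-426614174000",
    messages=[
        {
            "type": "approval",
            "approvals": [
                {
                    "type": "approval",
                    "approve": True,
                    "tool_call_id": "tool_call_id",
                    "reason": "Approved by operator",
                }
            ],
        }
    ],
)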