
Retrieve Stream

client.runs.messages.stream(runID: string, body?: MessageStreamParams { batch_size, include_pings, poll_interval, starting_after }, options?: RequestOptions): MessageStreamResponse | Stream<LettaStreamingResponse>
POST /v1/runs/{run_id}/stream

Parameters
runID: string
body: MessageStreamParams { batch_size, include_pings, poll_interval, starting_after }
batch_size?: number | null

Number of entries to read per batch.

include_pings?: boolean | null

Whether to include periodic keepalive ping messages in the stream to prevent connection timeouts.

poll_interval?: number | null

Seconds to wait between polls when no new data.

starting_after?: number

Sequence ID to use as a cursor for pagination. The response will start streaming after this chunk sequence ID.
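
The body parameters can be combined to resume a stream from a known cursor and tune polling behavior. A minimal sketch, using the same client setup as the example further down; the parameter values are illustrative, not defaults:

import Letta from '@letta-ai/letta-client';

const client = new Letta({
  apiKey: 'My API Key',
});

// Resume streaming after a known chunk sequence ID and tune polling.
const stream = await client.runs.messages.stream('run_id', {
  starting_after: 42,  // cursor: start after this chunk sequence ID
  batch_size: 100,     // entries to read per batch
  poll_interval: 1,    // seconds to wait between polls when no new data
  include_pings: true, // keepalive pings to prevent connection timeouts
});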

Returns
MessageStreamResponse = unknown
Retrieve Stream
import Letta from '@letta-ai/letta-client';

const client = new Letta({
  apiKey: 'My API Key',
});

// Open the message stream for the given run.
const response = await client.runs.messages.stream('run_id');

console.log(response);
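
When the call returns the streaming variant (Stream<LettaStreamingResponse>), the chunks can be read incrementally. A minimal sketch, assuming the returned value is async-iterable; the cast to AsyncIterable<unknown> is only there because MessageStreamResponse is typed as unknown:

import Letta from '@letta-ai/letta-client';

const client = new Letta({
  apiKey: 'My API Key',
});

// Sketch: iterate the streaming variant chunk by chunk.
const stream = await client.runs.messages.stream('run_id', { include_pings: true });

for await (const chunk of stream as AsyncIterable<unknown>) {
  console.log(chunk); // each chunk is one LettaStreamingResponse event
}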
Returns Examples
{}