These docs are for the v1.0 alpha release of LangGraph. For the latest stable docs, please see the old sites: LangGraph Python | LangGraph JavaScript
LangGraph's streaming system surfaces real-time updates while a graph runs:
- Workflow progress — get state updates after each graph node is executed.
- LLM tokens — stream language model tokens as they’re generated.
- Custom updates — emit user-defined signals (e.g., “Fetched 10/100 records”).
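To make the three kinds of updates concrete, here is a pure-Python sketch of what a consumer of such a stream might see. This is a mock, not the real LangGraph API: the event shapes and node names are illustrative assumptions.

```python
# Illustrative mock only -- NOT the real LangGraph API.
# Each event is a (mode, payload) tuple mirroring the three update kinds above:
# "updates" (workflow progress), "messages" (LLM tokens), "custom" (user signals).

def run_graph():
    """Yield events the way a streaming graph run might surface them."""
    yield ("updates", {"fetch_node": {"records": 10}})  # progress after a node
    yield ("custom", "Fetched 10/100 records")          # user-defined signal
    yield ("messages", ("Hel", {"node": "llm_node"}))   # an LLM token chunk
    yield ("messages", ("lo!", {"node": "llm_node"}))   # next token chunk
    yield ("updates", {"llm_node": {"answer": "Hello!"}})

# A consumer can filter on the mode tag, e.g. to reassemble the token stream:
tokens = []
for mode, payload in run_graph():
    if mode == "messages":
        token, _meta = payload
        tokens.append(token)

print("".join(tokens))
```

Filtering on the mode tag like this is the same pattern used when consuming multiple streaming modes at once.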
## What’s possible with LangGraph streaming
- Stream LLM tokens — capture token streams from anywhere: inside nodes, subgraphs, or tools.
- Emit progress notifications from tools — send custom updates or progress signals directly from tool functions.
- Stream from subgraphs — include outputs from both the parent graph and any nested subgraphs.
- Use any LLM — stream tokens from any LLM, even if it’s not a LangChain model, using the `custom` streaming mode.
- Use multiple streaming modes — choose from `values` (full state), `updates` (state deltas), `messages` (LLM tokens + metadata), `custom` (arbitrary user data), or `debug` (detailed traces).
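The difference between the `values` and `updates` modes is easiest to see side by side: `values` emits the full accumulated state after each node, while `updates` emits only the delta that node produced. A minimal pure-Python sketch (not the LangGraph API; the node names and state keys are invented for illustration):

```python
# Illustrative sketch only -- simulates "values" vs "updates" streaming
# for the same two-node run. Node names and keys are hypothetical.

state = {}
node_outputs = [
    ("research", {"docs": ["a", "b"]}),
    ("write", {"draft": "summary of a, b"}),
]

values_stream = []   # "values" mode: full state snapshot after each node
updates_stream = []  # "updates" mode: only each node's delta

for node, delta in node_outputs:
    state.update(delta)
    values_stream.append(dict(state))     # copy of the whole state so far
    updates_stream.append({node: delta})  # just what this node changed
```

After the run, the last `values` snapshot contains everything, whereas each `updates` entry is small and names the node that produced it — which is why `updates` is the cheaper choice for progress reporting and `values` the simpler one for rendering current state.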