{ "cells": [ { "cell_type": "markdown", "id": "562ddb82", "metadata": {}, "source": [ "# How to stream LLM tokens from your graph\n", "\n", "In this example, we will stream tokens from the language model powering an\n", "agent. We will use a ReAct agent as an example.\n", "\n", "
\n", "

Note

\n", "

\n", " If you are using a version of @langchain/core < 0.2.3, when calling chat models or LLMs you need to call await model.stream() within your nodes to get token-by-token streaming events, and aggregate final outputs if needed to update the graph state. In later versions of @langchain/core, this occurs automatically, and you can call await model.invoke().\n", "
\n", " For more on how to upgrade @langchain/core, check out the instructions here.\n", "

\n", "
\n", "\n", "This how-to guide closely follows the others in this directory, showing how to\n", "incorporate the functionality into a prototypical agent in LangGraph.\n", "\n", "
\n", "

Streaming Support

\n", "

\n", " Token streaming is supported by many, but not all chat models. Check to see if your LLM integration supports token streaming here (doc). Note that some integrations may support general token streaming but lack support for streaming tool calls.\n", "

\n", "
\n", "\n", "
\n", "

Note

\n", "

\n", " In this how-to, we will create our agent from scratch to be transparent (but verbose). You can accomplish similar functionality using the createReactAgent({ llm, tools }) (API doc) constructor. This may be more appropriate if you are used to LangChain's AgentExecutor class.\n", "

\n", "
\n", "\n", "## Setup\n", "\n", "This guide will use OpenAI's GPT-4o model. We will optionally set our API key\n", "for [LangSmith tracing](https://smith.langchain.com/), which will give us\n", "best-in-class observability.\n", "\n", "---" ] }, { "cell_type": "code", "execution_count": 1, "id": "8e76833b", "metadata": { "lines_to_next_cell": 2 }, "outputs": [], "source": [ "// process.env.OPENAI_API_KEY = \"sk_...\";\n", "\n", "// Optional, add tracing in LangSmith\n", "// process.env.LANGCHAIN_API_KEY = \"ls__...\";\n", "// process.env.LANGCHAIN_CALLBACKS_BACKGROUND = \"true\";\n", "// process.env.LANGCHAIN_TRACING = \"true\";\n", "// process.env.LANGCHAIN_PROJECT = \"Stream Tokens: LangGraphJS\";" ] }, { "cell_type": "markdown", "id": "ab95dc97", "metadata": {}, "source": [ "## Define the state\n", "\n", "The state is the interface for all of the nodes in our graph.\n" ] }, { "cell_type": "code", "execution_count": 2, "id": "1648124b", "metadata": { "lines_to_next_cell": 2 }, "outputs": [], "source": [ "import { Annotation } from \"@langchain/langgraph\";\n", "import { BaseMessage } from \"@langchain/core/messages\";\n", "\n", "const StateAnnotation = Annotation.Root({\n", " messages: Annotation({\n", " reducer: (x, y) => x.concat(y),\n", " }),\n", "});" ] }, { "cell_type": "markdown", "id": "da50fbd8", "metadata": {}, "source": [ "## Set up the tools\n", "\n", "First define the tools you want to use. For this simple example, we'll create a placeholder search engine, but see the documentation [here](https://js.langchain.com/docs/how_to/custom_tools) on how to create your own custom tools." ] }, { "cell_type": "code", "execution_count": 3, "id": "a8f1ae1c", "metadata": { "lines_to_next_cell": 2 }, "outputs": [], "source": [ "import { tool } from \"@langchain/core/tools\";\n", "import { z } from \"zod\";\n", "\n", "const searchTool = tool((_) => {\n", " // This is a placeholder for the actual implementation\n", " return \"Cold, with a low of 3℃\";\n", "}, {\n", " name: \"search\",\n", " description:\n", " \"Use to surf the web, fetch current information, check the weather, and retrieve other information.\",\n", " schema: z.object({\n", " query: z.string().describe(\"The query to use in your search.\"),\n", " }),\n", "});\n", "\n", "await searchTool.invoke({ query: \"What's the weather like?\" });\n", "\n", "const tools = [searchTool];" ] }, { "cell_type": "markdown", "id": "19b27cb3", "metadata": {}, "source": [ "We can now wrap these tools in a prebuilt\n", "[ToolNode](/langgraphjs/reference/classes/langgraph_prebuilt.ToolNode.html).\n", "This object will actually run the tools (functions) whenever they are invoked by\n", "our LLM." ] }, { "cell_type": "code", "execution_count": 4, "id": "f02278b1", "metadata": { "lines_to_next_cell": 2 }, "outputs": [], "source": [ "import { ToolNode } from \"@langchain/langgraph/prebuilt\";\n", "\n", "const toolNode = new ToolNode(tools);" ] }, { "cell_type": "markdown", "id": "dd55ee5a", "metadata": {}, "source": [ "## Set up the model\n", "\n", "Now load the [chat model](https://js.langchain.com/docs/concepts/#chat-models).\n", "\n", "1. It should work with messages. We will represent all agent state in the form\n", " of messages, so it needs to be able to work well with them.\n", "2. It should work with\n", " [tool calling](https://js.langchain.com/docs/how_to/tool_calling/#passing-tools-to-llms),\n", " meaning it can return function arguments in its response.\n", "\n", "
\n", "

Note

\n", "

\n", " These model requirements are not general requirements for using LangGraph - they are just requirements for this one example.\n", "

\n", "
" ] }, { "cell_type": "code", "execution_count": 5, "id": "9c7210e7", "metadata": { "lines_to_next_cell": 2 }, "outputs": [], "source": [ "import { ChatOpenAI } from \"@langchain/openai\";\n", "\n", "const model = new ChatOpenAI({\n", " model: \"gpt-4o-mini\",\n", " temperature: 0,\n", " streaming: true\n", "});" ] }, { "cell_type": "markdown", "id": "73e59248", "metadata": {}, "source": [ "After you've done this, we should make sure the model knows that it has these\n", "tools available to call. We can do this by calling\n", "[bindTools](https://v01.api.js.langchain.com/classes/langchain_core_language_models_chat_models.BaseChatModel.html#bindTools)." ] }, { "cell_type": "code", "execution_count": 7, "id": "b4ff23ee", "metadata": { "lines_to_next_cell": 2 }, "outputs": [], "source": [ "const boundModel = model.bindTools(tools);" ] }, { "cell_type": "markdown", "id": "dbe67356", "metadata": {}, "source": [ "## Define the graph\n", "\n", "We can now put it all together." ] }, { "cell_type": "code", "execution_count": 8, "id": "0ba603bb", "metadata": {}, "outputs": [], "source": [ "import { StateGraph, END } from \"@langchain/langgraph\";\n", "import { AIMessage } from \"@langchain/core/messages\";\n", "\n", "const routeMessage = (state: typeof StateAnnotation.State) => {\n", " const { messages } = state;\n", " const lastMessage = messages[messages.length - 1] as AIMessage;\n", " // If no tools are called, we can finish (respond to the user)\n", " if (!lastMessage?.tool_calls?.length) {\n", " return END;\n", " }\n", " // Otherwise if there is, we continue and call the tools\n", " return \"tools\";\n", "};\n", "\n", "const callModel = async (\n", " state: typeof StateAnnotation.State,\n", ") => {\n", " // For versions of @langchain/core < 0.2.3, you must call `.stream()`\n", " // and aggregate the message from chunks instead of calling `.invoke()`.\n", " const { messages } = state;\n", " const responseMessage = await boundModel.invoke(messages);\n", " return { messages: [responseMessage] };\n", "};\n", "\n", "const workflow = new StateGraph(StateAnnotation)\n", " .addNode(\"agent\", callModel)\n", " .addNode(\"tools\", toolNode)\n", " .addEdge(\"__start__\", \"agent\")\n", " .addConditionalEdges(\"agent\", routeMessage)\n", " .addEdge(\"tools\", \"agent\");\n", "\n", "const agent = workflow.compile();" ] }, { "cell_type": "code", "execution_count": 9, "id": "a88cf20a", "metadata": {}, "outputs": [ { "data": { "image/png": 
"/9j/4AAQSkZJRgABAQAAAQABAAD/4gHYSUNDX1BST0ZJTEUAAQEAAAHIAAAAAAQwAABtbnRyUkdCIFhZWiAH4AABAAEAAAAAAABhY3NwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAQAA9tYAAQAAAADTLQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAlkZXNjAAAA8AAAACRyWFlaAAABFAAAABRnWFlaAAABKAAAABRiWFlaAAABPAAAABR3dHB0AAABUAAAABRyVFJDAAABZAAAAChnVFJDAAABZAAAAChiVFJDAAABZAAAAChjcHJ0AAABjAAAADxtbHVjAAAAAAAAAAEAAAAMZW5VUwAAAAgAAAAcAHMAUgBHAEJYWVogAAAAAAAAb6IAADj1AAADkFhZWiAAAAAAAABimQAAt4UAABjaWFlaIAAAAAAAACSgAAAPhAAAts9YWVogAAAAAAAA9tYAAQAAAADTLXBhcmEAAAAAAAQAAAACZmYAAPKnAAANWQAAE9AAAApbAAAAAAAAAABtbHVjAAAAAAAAAAEAAAAMZW5VUwAAACAAAAAcAEcAbwBvAGcAbABlACAASQBuAGMALgAgADIAMAAxADb/2wBDAAMCAgMCAgMDAwMEAwMEBQgFBQQEBQoHBwYIDAoMDAsKCwsNDhIQDQ4RDgsLEBYQERMUFRUVDA8XGBYUGBIUFRT/2wBDAQMEBAUEBQkFBQkUDQsNFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBT/wAARCAD5ANYDASIAAhEBAxEB/8QAHQABAAICAwEBAAAAAAAAAAAAAAYHAwUCBAgBCf/EAFIQAAEEAQIDAgUOCQkGBwAAAAEAAgMEBQYRBxIhEzEWFyJBlAgUFTJRVVZhcXSy0dLTIzY3QlSBkZOVGDVDUnWCkrO0JCUncpahMzRTZLHB8P/EABsBAQEAAwEBAQAAAAAAAAAAAAABAgMFBAYH/8QAMxEBAAECAQkFCAIDAAAAAAAAAAECEQMEEiExQVFSkdEUM2FxoQUTFSNiscHhgZIi8PH/2gAMAwEAAhEDEQA/AP1TREQEREBERAWG1cr0o+exPHXZ/WleGj9pWju37uevz47FTGlVrnkt5NrQ5zX/APpQhwLS4d7nuBa3cNAc4u5Ptbh/p+F5llxcF+ydua1fb65mcR5y9+5/Z0W+KKae8n+IW293fCrC++9D0ln1p4VYX34oeks+tPBXC+89D0Zn1J4K4X3noejM+pX5Pj6LoPCrC+/FD0ln1p4VYX34oeks+tPBXC+89D0Zn1J4K4X3noejM+pPk+PoaDwqwvvxQ9JZ9aeFWF9+KHpLPrTwVwvvPQ9GZ9SeCuF956HozPqT5Pj6Gg8KsL78UPSWfWu5UyFW+0uq2YbLR3mGQOA/Yun4K4X3noejM+pdS1oHTluQSuw1OGdp3bYrRCGZp+KRmzh+op8mds+n6TQ36KMR2bmkZ4Yb9qbJYeVwjZen5e1quJ2a2UgAOYegD9twdubfcuEnWuujN8YJgREWtBERAREQEREBERAREQEREBajV2Yfp/S+VyMQDpq1Z8kTXdxft5IP69lt1HuIVOW9onMxwtMkza7pWMaNy5zPLAA90luy24MROJTFWq8LGtsNP4ePAYapQjPN2LPLk88khO73n43OLnE+6StisNO1FeqQWYHc8MzGyMd7rSNwf2FZlhVMzVM1a0FEuIHFbS3C6LHv1JkzSfkJHRVIIa01madzW8z+SKFj3kNHUnbYbjchS1Up6pWhUfBp3Jx4/WDdSY59mTEZzR2ON2ahK6NocyaIBwdHL0Ba5paeXqW9CsR2cp6pjT+N4q6b0m2tetUc3hfZeHJ1cdbnB55IWwtDY4XeS5sjnOkJAZs0O5S4KQWuP2gqOuW6Qs571vnX2m0WxS052wmw4bthE5j7LtDuNm8+53A2VUx5fWendd8Ltfax0nlrtuxpGzicxDp6g+4+neklrTDnij3LWu7J43G4aehPnUA4t4/Wep5tTDMYbX+W1Bj9VwW8fUxsEwwsOJguRSRyRtjIjsSGJpJGz5ec9GgDoHpi3x20TT1je0ocpYsahozR17VCnjbVh8DpI2yMLzHE4NYWvb5ZPLuSN9wQNXwF4943jngrNyrRu465XsWY5K89KyyMRssSRRubNJExj3OawOcxpJYSWuAIXW4S6fu4zjFxpyVrG2KkGSy2PdVtzQOY21GzHQNJY4jZ7Wv529NwDzDv3Wr9THYyGl8PlNCZjT2axuSxeUylr19YovbQswy3pJY3Q2NuR5c2Zp5Qdxyu3A2QXgiIg6+QoV8rQs0rcTZ6tmN0MsT+57HDZwPyglajQ1+e/puEWpe3t1JZqM0p33kfDK6IvO/9bk5v1rfqM8PG9pp+S4N+S/dtXI+YbbxyTvdGdvjZyn9a9FPc1X3x+V2JMiIvOgiIgIiICIiAiIgIiICIiAiIgilOdmg3mjb2iwDnl1O315Km53MMp7mN3J5H9G7bMOxDe0x6r4RaG1/kY8lqPSWEz95sQhZayFGKeQRgkhoc4E8u7nHb4ypa9jZGOY9oexw2LXDcEe4VGn8PsdCScbZyGFB/osdbfHEPc2iO8bf1NH/YL0TVRiaa5tPO/wDv8stEo8fU28KC0N8W+luUEkD2Jg2B8/5vxBSbR/DvS3D2GzFpjT2M0/FZc107MbUZAJSNwC4NA323Pf7qw+BNj4VZ799D90ngTY+FWe/fQ/dJ7vD4/SUtG9KEUX8CbHwqz376H7pRO9jstX4q4PTzNU5j2OuYW/flJlh7TtYZ6bGbfg/a8tiTfp38vUed7vD4/SS0b1qLS6s0XgNd4xuO1HhaGdx7ZBM2rka7Z4w8AgO5XAjcBxG/xldHwJsfCrPfvofuk8CbHwqz376H7pPd4fH6SWje0DfU3cKWBwbw40u0PGzgMTB1G4Ox8n3QP2LZ6Z4K6A0Zl4srgNF4HDZOIObHco4+KGVocNnAOa0EbgkFdzwJsfCrPfvoful98AKdh3+8MhlcqzffsbV14iPysZytcPicCEzMONdfKP8AhaHHK5Dwu7fDYqXnqP5ochkYXeRCzqHRRuHfKe7p7QbuJB5WuksEEdaCOGFjYoo2hjGMGwa0DYADzBfKtWGlXjr14Y68EbQ1kUTQ1rQO4ADoAsqwrriYzadUEiIi1IIiICIiAiIgIiICIiAiIgIiICIiAiIgKvssW+P7SwJPN4MZfYebb11jd/P8nm/WPPYKr/K7+P7S3Vu3gxl+hA3/APNY3u8+3ydO7fzILAREQEREBERAREQEREBERAREQEREBERAREQEREBERAREQFXuWA/lA6VPM0HwXzHk7dT/ALXjOu+3d+vzj9VhKvctt/KC0r1PN4L5jYcv/u8Z5/8A9/2QWEiIgIiICIiAiIgIiICIiAiIgIiICIiAiIgIiICIonf1ZkbVyxBg6NazFXkMMtu7O6JhkG4c1g
axxdykbE9ADuBuQdtuHh1Yk2pW10sRQj2d1h+gYP0ub7tPZ3WH6Bg/S5vu1v7LXvjnBZN14D1j6vbK6e9URXxNrhXO7UOJjuadGPizAd28s9is5r2O9b78p9bjbYeUHg+YL2L7O6w/QMH6XN92qgz3qf5tQ+qDw/Fqxj8MMzjqvYmoLEhinmaOWKdx7PfnY07D/lZ/V6uy1745wWelkUI9ndYfoGD9Lm+7T2d1h+gYP0ub7tOy1745wWTdFCPZ3WH6Bg/S5vu1li1flsW5kmdoU4qBcGvtUbD5OwJOwc9jmDyN9t3AnbfcjYFwk5LibLT/ADBZMkRF5EEREBERAREQEREBERAREQEREBERAVeaGO+BeT3m/eJ+M+upVYarzQv8wP8An13/AFUq9+T93V5x+V2JAiItiCIiAiLo2M5j6uXqYua7BHkrcckteo6QCWVjOXnc1veQ3mbufNzD3UHeUd4jnbh7qg9Nxi7RG43/AKJykSjnEj8neqf7Ktf5LluwO9o84+7KnXCxGe1HyLkuLPaN+RclxmIiIgIiICIiAiIgIiICIiAiIgIiICrzQv8AMD/n13/VSqw1Xmhf5gf8+u/6qVe/J+7q84/K7EgXkPiHrLUMOqb+t9KXNSMw2K1ZVw1qfI6gIpTO9dx1rEEOOEZa6Pdzm9o5zXhwLhuAvXirXOepw4dajyOSvZDTgnnyMxtWGtuWGRmc7bzsjbIGRzdP/FYGv7/K6lWqJnUig+Kua1DndU8QMQNRarp68hzFSrpvT2IsWIaU+NeIfwjhFs0hwNkvlc4FnJ0LdgDtci/iXxc1xxGOCuWKR0/lX4jHMg1XLi2U+SGNzJpKrKsrbAe55fvI4gjyQG8u5kHE/wBTzq3VeuM7k9PS4fADJyxyx52tm8tWu1ntjYwymrFIK80gDBsTyggNDgdtzZ2p+AGhda5o5jOYQXctJCyC1aiszV/XjWDZonZE9rZQPceHbDp3LDNmbiqW4jU+tNea+xeoNW5vGXcLpfEWew0/k5a1aO/JDZ7WVnLsS3ni6NOzXD2zSQNtHgqLuK3ELgJn83lMvDksroqzasyY7KT0w+ZgqOJAie0DmMji4Do4BoO4a3b0xDofCQZzN5iOly5HNVoad+btX/hoog8Rt5ebZuwlf1aATzdSdhtHstwH0Nm9OadwdvCE4/T0XY4oQ3LEU1WPkDC1szJBIQWgAguPNsN99llmyJ8o5xI/J3qn+yrX+S5SMDYAKOcSPyd6p/sq1/kuXqwO9o84+7KnXCxGe0b8i5Liz2jfkXJcZiIiICIiAiIgIiICIiAiIgIiICIiAq80L/MD/n13/VSqw1XczMhpjMzY7HYufO0rEs9thqPa19RznCSSKQyFrBu6YFg5g4tcQG7Rlx92TzGbVRe0zadOjVfqsarJCi0nstnvgZlfSqX36ey2e+BmV9Kpffr05n1R/aOq2btFpPZbPfAzK+lUvv1F7vGOtj+IWP0PYwd+LVWQqPu1scZ6vNJCzfmdzdtyjucdidyGkgbApmfVH9o6llhotJ7LZ74GZX0ql9+nstnvgZlfSqX36Zn1R/aOpZu1HOJH5O9U/wBlWv8AJcux7LZ74GZX0ql9+sWQx+e1Tj56EmElxVWaMtsOtWYjJIzY7xs7NzgHO9rzEgNDidiRsc8O2HXFdVUWib646kRabrBZ7RvyLktZhs/Xy7WRFrqWSFeKxYxdl7PXNVsnNyiRrHOA6se3mBLSWO5XHZbNcViIiICIiAiIgIiICIiAiIgIiICL45wY0ucQ1oG5J7gtDG+xqew2SOSaliIJz7URublIzF0IduS2Lmee7lc50QIPZn8IHGfIWdSiatiZZadMxwyszkXZSRSgyeXHCNyS7kad3lvKO0YW85Dg3bY3FU8PDJDRqxVIpJpLD2xMDQ6SR5fI87d7nOcST5ySs1atDSrRV68TIIImCOOKJoa1jQNg0AdAAOmyyoCIiAvzx4g+pl43Z71XVTWVbUWlaufnM2ZxcbrtoxQVKksEQgeRX84sRggAg7v3Pu/ocq/yHLNx8wHKGl1fTOR5zueZoktUeXp3bHsnf4flQWAiIgIiINbmcFBmIXDtZqVrZoZepuDJ4w17XgB2x8kuY3dpBa4dHAgkLpw5y5jrorZuGGIWrskNCxSEkkb4gznZ2/k7Qv6Pb1cWuLAQ4OkEY3y+OaHtLXAOaRsQe4oPqKMCrNoam31jBLa05SqNiZjasTprUJEnVzCXbvYI3H8GAXARAMDiQ1SSOVkrS5j2vaCW7tO43B2I/UQR+pBzREQEREBERAREQEREBEWK1P61rTTcj5ezYX8kY3c7Yb7AecoNBZEOsr1zHu5J8JUdJTyVK5j+eO690bHBjXv8l0bQ883K1wL9m8wMcjDJFodBx8mi8I7tcpMZKkcxfmz/ALbu9ocRMB0DxzbFo6AjYdAFvkBERAREQFX3DgnVeodQa435qOREWOxDt9w+jAXkTjrttLLLM4Ee2jbCfc256ltS8QsrY0pjJnR4iu8Mz+Qhc5ruXYO9ZROHdI8Edo4Hdkbths+RrmTqvXiqQRwQRshhiaGMjjaGtY0DYAAdwA8yDIiIgIiICIiAo9fqeC5tZShEGUS+W7kadeo+eaw7kA54g078/kAloa4v67DmO5kKIMdexHbrxTwvEkUrQ9jx3OaRuCsi0OBgmxeZy2O7C++kXNvQ3bdgTRudM+TtII9zzNDCwO5T0AlaGnYcrd8gIiICIiAiIgIi0uY1tp7T9oVsnnMdj7JHN2Nm0xj9vd5Sd9lnTRVXNqYvK2u3SKLeNLR3wpxHpsf1qM8S7/DbivoTM6Sz+o8VNispB2MoZfja9pBDmPad/bNe1rhv03aNwR0W3s+NwTylc2dzY6F4gaXhlqaMOpN9TUnS0his7kInZicQlw7Z8fNzvD42CVr9vKjc157yp8vzi9RTwXo8FfVE6vv6jzeLkx+Hpmticp65YIrhmcPwkZ323EbXBw72l+x+P3p40tHfCnEemx/WnZ8bgnlJmzuSlFFvGlo74U4j02P608aWjvhTiPTY/rTs+NwTykzZ3JSobns7kNQZeTTmm5ewkiLRlczy8zcewjfsotxyvsub3NO4ia4SPB3jjm1GS4jVdZ51ml9LZypA+WPnt5eKeNzoWEe0rNduJZj7uxZGOrtzysdOsHg6Gm8XDjsbWbVpw8xbG0kkuc4ue9zjuXOc5znOc4lznOJJJJK1VUVUTauLJaz5gcDQ0xiK2MxlcVqVcEMZzFxJJLnOc5xLnvc4lznuJc5ziSSSStgiLBBERAREQEREBERBHrVH/iDjbjcZPJ/uu1E/JNsbRQ/ha5bC6L85z/KcHfmiJw/OUhVMZT1QHCqHibh3y690xzwYvIQvu+EtVsNdxmp7wyR9p1kfyktcerRDIPzlc6AiIgIiICIiDpZq47H4e9aYAXwQSStB91rSR/8ACiOkqkdbAUpAOaezEyeeZ3V80jmgue4nqSSf1d3cFJ9VfixmPmc30Co9pr8XMV80i+gF0MDRhT5rsbJERZoIiICIiDq5L
G1stTkrWoxJE/49i0jqHNI6tcDsQ4dQQCOq7+g8pPmtF4O9af2tmenE+WTbbndyjd23m3PXb41iWHhZ+TnTnzGL6KxxdODPhMfaei7EpREXOQREQERRvXWs4NFYgWHRizcnf2VWrzcvav7ySfM1o3JPuDYbkgHZh4dWLXFFEXmRucnlqOEqOt5G5XoVW+2ntStjYPlc4gKMS8YdHQvLTnIXEdN445Hj9oaQqPydq1ncj7IZWw6/e68skg8mIb+1jb3Mb0HQdTsCST1WNfW4XsPDin5tc38P3cvC8fHNo336b6PL9hPHNo336b6PL9hUci3fA8m4qucdC8KC4kep00nqn1Y2O1JXuRnh7kpPZjKuEUgbHYYd3wcu3N+FfynoNgHu9xe7vHNo336b6PL9hUcifA8m4qucdC8Lx8c2jffpvo8v2F9Zxk0a923s3G343wyNH7S1UaifA8m4qucdC8PS2H1BjNQ13T4vIVchE08rnVpWyBp9w7HofiK2C8sQGSlejvUp5KN+P2lquQ17fiPQhw6DyXAg7dQVevDfXw1jSmr22sgy9MNE8bPaytPdKweZpIII72kEdRsTxcu9l1ZLT7yib0+sLr1JkiIuEjV6q/FjMfM5voFR7TX4uYr5pF9AKQ6q/FjMfM5voFR7TX4uYr5pF9ALo4Pcz5/hdjvWHSMgkdCxsswaSxjncoc7boCdjt18+xXnbhbx61RjOCuY1nrzFRWK9S9bgqzY+6JrN2f2Qkrx1hD2MbWbO5I2u5jzAcxDeq9Grz3DwC1dLoHUugp8jhYsA6/Nl8DloTK65DZN4XImzxFoZyteXNJa8kjboFJvsRIG+qEn0tazNTiHpg6QtUMLLn4vWuQbkI7NaJwbK1rwxm0rXOYOTbY842cQsFfjfnZ7FXEan0dNo6bUGLt2sJZjybbTnvih7V0UoaxphlDDzgAuHku8rcLW5ngRqji5kM3e4i3MNRdPp2xp+hU086WaOHt3NdJZe+VrCXbxx7MA2AB3J713cdwo11q/VWmsjr+/gmVNNU7UNRmBMz33LE8Brunl7RrRGBGX7MbzdXnyugU/yGj0lxxzGmuGHBbGRYt2q9UarwjJmz5XLCoyR8UETpOad7Xl8rzINm7Eu2cSRsvQmPmns0K01msadmSJr5a5eH9k8gEs5h0Ox3G46HZefrHBbXzuCGB4e2KOhdRV8fUkx0kmV9ctHZsa1lWxHyscWTNAcXAefbleFdmg9P29KaJwGFv5KTMXsdQgqT5CbfnsvZGGukO5J3cQT1JPXqSrTfaN6sPCz8nOnPmMX0VmWHhZ+TnTnzGL6KuL3M+cfaV2JSiIucgiIgKguLOSdkuIliBziYsbVjgjae5rpPwjyPlHZA/8gV+qguLONdjOIc87mkRZOrHPG89znx/g3gfIOyP98Lvexc3tWnXaben4uuyUWRdfI34sXRntziUwwsL3iGF8r9h7jGAucfiAJUVHFvT5/os5/wBO5D7hfb1YlFGiqYhrTJzg1pJIAHUk+ZUnS9VBh7uQqPZBjzhLdtlSKdmagde8p/I2R1MeWGFxB9sXBp3LQp2zijp++9tXsc0e3PZ7P0/fY079OrjAAB17ydlHuH2hNXaDix+n2v0/e0zQkc2K9M2UX3V9yWsLAOTmG4HPzdw9ruvJiV111U+5q0bbWndb8qxT8br9eHKZKTSxbp7F5mTD3L/sg3tGltgQiVkXJ5Td3NJBc0jcgcwG56/EzihmJsPrmjpfCTXIMLRniu5pt8VjVnMBftCNiXvja5rjsW7HoDus+R4TZe3w61hgGWaQuZjOzZOu9z39m2J9tkwDzybh3K0jYAjfz+dYNQ8NNYV/DnH6cs4WTCaqE00gybpmTVbEsAikLeRpD2u5Wnrtsfd8+iqcozbTfTHhfb+hY+i55bWjsFNNI+aaShA98kji5znGNpJJPeSfOtwoLj9b4rRuMoYO+3KSXcfWhrTOp4W9PEXNjaCWyMhLXD4wVn8bunj/AEWd/wCnch9wvbTi4cRETVF/NEzW20VknYfXuAsscWiac0pQPz2StIA/xiN391RvC5qtn8dHdqCw2B5IAtVpa8nQ7HdkjWuHd5x1Uk0TjXZnXuArMbzNgnN2Uj8xkbSQf8ZjH95TKJonArmrVafsyp1vSCIi/MFavVX4sZj5nN9AqPaa/FzFfNIvoBSnM03ZHEXqjCA+eCSIE+YuaR/9qIaSuR2MDThB5LNaFkFiB3R8MjWgOY4HqCD+0bEdCF0MDThTHiuxuERFmgiIgIiICw8LPyc6c+YxfRWPJ5StiKj7NqURxt6Ad7nuPQNa0dXOJIAaNySQB1K2GhMXPhNGYSjaZ2dmCnEyWPffkfyjdu/n2PTf4lji6MGfGY+09V2N6iIucgiIgKOa50ZBrXDis+QVrcL+1q2uXmMT+7qOm7SNwRv3HoQQCJGi2YeJVhVxXRNpgeXcrUtafyHrDLVzj7nXla87slH9aN/c8d3d1G43DT0WNenMli6WZqPq36kF6s/20NmJsjD8rSCFGJeEGjpXFxwNdpPXaNz2D9gIC+twvbmHNPzaJv4fstCikV5eJvRvvHF+9k+0nib0b7xxfvZPtLd8cybhq5R1LQo1FeXib0b7xxfvZPtJ4m9G+8cX72T7SfHMm4auUdS0KNRXl4m9G+8cX72T7S+s4O6NY7f2Cgd8T3vcP2F2yfHMm4auUdS0b1F1hLkLzKNGCS/ff7WrXAc8/GeuzR1HlOIA36lXtw40ENG0Zp7T2T5e3ymeRntI2j2sTD3loJJ3PVxJOwGzWyLEYLG4CuYMZQrY+EncsrRNjDj7p2HU/GV31xMu9qVZXT7uiLU+srq1CIi4aC0uY0Vp/UNgWMpg8bkZwOUS2qkcjwPc3cCdlukWVNdVE3pm0mpFvFXoz4J4T+HxfZTxV6M+CeE/h8X2VKUW7tGNxzzlbzvRbxV6M+CeE/h8X2U8VejPgnhP4fF9lSlE7Rjcc85LzvRbxV6M+CeE/h8X2U8VejPgnhP4fF9lSlE7Rjcc85LzvaPFaG05grLbOOwGMoWG78s1apHG9u/fsQNxut4iLVVXVXN6pumsREWAIiICIiAiIgIiICIiAiIgIiICIiD/2Q==" }, "metadata": {}, "output_type": "display_data" } ], "source": [ "import * as tslab from \"tslab\";\n", "\n", "const runnableGraph = agent.getGraph();\n", "const image = await runnableGraph.drawMermaidPng();\n", "const arrayBuffer = await image.arrayBuffer();\n", "\n", "await tslab.display.png(new Uint8Array(arrayBuffer));" ] }, { "cell_type": "markdown", "id": "b00e15e2", "metadata": {}, "source": [ "## Streaming LLM 
Tokens\n", "\n", "You can access the LLM tokens as they are produced by each node with two methods:\n", "\n", "- The `stream` method along with `streamMode: \"messages\"`\n", "- The `streamEvents` method\n", "\n", "### The stream method\n", "\n", "
\n", "

Compatibility

\n", "

\n", " This section requires @langchain/langgraph>=0.2.20. For help upgrading, see this guide.\n", "

\n", "
\n", "\n", "For this method, you must be using an LLM that supports streaming as well and enable it when constructing the LLM (e.g. `new ChatOpenAI({ model: \"gpt-4o-mini\", streaming: true })`) or call `.stream` on the internal LLM call." ] }, { "cell_type": "code", "execution_count": 15, "id": "5af113ef", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "ai MESSAGE TOOL CALL CHUNK: \n", "ai MESSAGE TOOL CALL CHUNK: {\"\n", "ai MESSAGE TOOL CALL CHUNK: query\n", "ai MESSAGE TOOL CALL CHUNK: \":\"\n", "ai MESSAGE TOOL CALL CHUNK: current\n", "ai MESSAGE TOOL CALL CHUNK: weather\n", "ai MESSAGE TOOL CALL CHUNK: in\n", "ai MESSAGE TOOL CALL CHUNK: Nepal\n", "ai MESSAGE TOOL CALL CHUNK: \"}\n", "ai MESSAGE CONTENT: \n", "tool MESSAGE CONTENT: Cold, with a low of 3℃\n", "ai MESSAGE CONTENT: \n", "ai MESSAGE CONTENT: The\n", "ai MESSAGE CONTENT: current\n", "ai MESSAGE CONTENT: weather\n", "ai MESSAGE CONTENT: in\n", "ai MESSAGE CONTENT: Nepal\n", "ai MESSAGE CONTENT: is\n", "ai MESSAGE CONTENT: cold\n", "ai MESSAGE CONTENT: ,\n", "ai MESSAGE CONTENT: with\n", "ai MESSAGE CONTENT: a\n", "ai MESSAGE CONTENT: low\n", "ai MESSAGE CONTENT: temperature\n", "ai MESSAGE CONTENT: of\n", "ai MESSAGE CONTENT: \n", "ai MESSAGE CONTENT: 3\n", "ai MESSAGE CONTENT: ℃\n", "ai MESSAGE CONTENT: .\n", "ai MESSAGE CONTENT: \n" ] } ], "source": [ "import { isAIMessageChunk } from \"@langchain/core/messages\";\n", "\n", "const stream = await agent.stream(\n", " { messages: [{ role: \"user\", content: \"What's the current weather in Nepal?\" }] },\n", " { streamMode: \"messages\" },\n", ");\n", "\n", "for await (const [message, _metadata] of stream) {\n", " if (isAIMessageChunk(message) && message.tool_call_chunks?.length) {\n", " console.log(`${message.getType()} MESSAGE TOOL CALL CHUNK: ${message.tool_call_chunks[0].args}`);\n", " } else {\n", " console.log(`${message.getType()} MESSAGE CONTENT: ${message.content}`);\n", " }\n", "}" ] }, { "cell_type": "markdown", "id": "f8332924", "metadata": {}, "source": [ "### The streamEvents method\n", "\n", "You can also use the `streamEvents` method like this:" ] }, { "cell_type": "code", "execution_count": 17, "id": "ec7c31a2", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[\n", " {\n", " name: 'search',\n", " args: '',\n", " id: 'call_fNhlT6qSYWdJGPSYaVqLtTKO',\n", " index: 0,\n", " type: 'tool_call_chunk'\n", " }\n", "]\n", "[\n", " {\n", " name: undefined,\n", " args: '{\"',\n", " id: undefined,\n", " index: 0,\n", " type: 'tool_call_chunk'\n", " }\n", "]\n", "[\n", " {\n", " name: undefined,\n", " args: 'query',\n", " id: undefined,\n", " index: 0,\n", " type: 'tool_call_chunk'\n", " }\n", "]\n", "[\n", " {\n", " name: undefined,\n", " args: '\":\"',\n", " id: undefined,\n", " index: 0,\n", " type: 'tool_call_chunk'\n", " }\n", "]\n", "[\n", " {\n", " name: undefined,\n", " args: 'current',\n", " id: undefined,\n", " index: 0,\n", " type: 'tool_call_chunk'\n", " }\n", "]\n", "[\n", " {\n", " name: undefined,\n", " args: ' weather',\n", " id: undefined,\n", " index: 0,\n", " type: 'tool_call_chunk'\n", " }\n", "]\n", "[\n", " {\n", " name: undefined,\n", " args: ' today',\n", " id: undefined,\n", " index: 0,\n", " type: 'tool_call_chunk'\n", " }\n", "]\n", "[\n", " {\n", " name: undefined,\n", " args: '\"}',\n", " id: undefined,\n", " index: 0,\n", " type: 'tool_call_chunk'\n", " }\n", "]\n" ] } ], "source": [ "const eventStream = await agent.streamEvents(\n", " { messages: [{ role: \"user\", 
content: \"What's the weather like today?\" }] },\n", " {\n", " version: \"v2\",\n", " }\n", ");\n", "\n", "for await (const { event, data } of eventStream) {\n", " if (event === \"on_chat_model_stream\" && isAIMessageChunk(data.chunk)) {\n", " if (data.chunk.tool_call_chunks !== undefined && data.chunk.tool_call_chunks.length > 0) {\n", " console.log(data.chunk.tool_call_chunks);\n", " }\n", " }\n", "}" ] }, { "cell_type": "code", "execution_count": null, "id": "5d6f7346", "metadata": {}, "outputs": [], "source": [] } ], "metadata": { "kernelspec": { "display_name": "TypeScript", "language": "typescript", "name": "tslab" }, "language_info": { "codemirror_mode": { "mode": "typescript", "name": "javascript", "typescript": true }, "file_extension": ".ts", "mimetype": "text/typescript", "name": "typescript", "version": "3.7.2" } }, "nbformat": 4, "nbformat_minor": 5 }