
Commit fada375

docs[patch]: Adds docs on cancelling execution (#6364)
* Adds docs on cancelling execution
* Fix build, remove stray TODOs
* Fix regex
* Fix
* Update regex
* Test
* log
* fix
* Fix
1 parent dbcf167 commit fada375

File tree

6 files changed: +347 -10 lines changed

docs/core_docs/.gitignore

+45 -1
@@ -192,6 +192,8 @@ docs/how_to/chat_streaming.md
 docs/how_to/chat_streaming.mdx
 docs/how_to/character_text_splitter.md
 docs/how_to/character_text_splitter.mdx
+docs/how_to/cancel_execution.md
+docs/how_to/cancel_execution.mdx
 docs/how_to/callbacks_runtime.md
 docs/how_to/callbacks_runtime.mdx
 docs/how_to/callbacks_custom_events.md
@@ -208,8 +210,32 @@ docs/how_to/assign.md
 docs/how_to/assign.mdx
 docs/how_to/agent_executor.md
 docs/how_to/agent_executor.mdx
+docs/integrations/text_embedding/togetherai.md
+docs/integrations/text_embedding/togetherai.mdx
+docs/integrations/text_embedding/openai.md
+docs/integrations/text_embedding/openai.mdx
+docs/integrations/text_embedding/azure_openai.md
+docs/integrations/text_embedding/azure_openai.mdx
+docs/integrations/retrievers/exa.md
+docs/integrations/retrievers/exa.mdx
+docs/integrations/retrievers/bedrock-knowledge-bases.md
+docs/integrations/retrievers/bedrock-knowledge-bases.mdx
+docs/integrations/llms/openai.md
+docs/integrations/llms/openai.mdx
+docs/integrations/llms/mistralai.md
+docs/integrations/llms/mistralai.mdx
 docs/integrations/llms/mistral.md
 docs/integrations/llms/mistral.mdx
+docs/integrations/llms/google_vertex_ai.md
+docs/integrations/llms/google_vertex_ai.mdx
+docs/integrations/llms/fireworks.md
+docs/integrations/llms/fireworks.mdx
+docs/integrations/llms/cohere.md
+docs/integrations/llms/cohere.mdx
+docs/integrations/llms/bedrock.md
+docs/integrations/llms/bedrock.mdx
+docs/integrations/llms/azure.md
+docs/integrations/llms/azure.mdx
 docs/integrations/chat/togetherai.md
 docs/integrations/chat/togetherai.mdx
 docs/integrations/chat/openai.md
@@ -232,5 +258,23 @@ docs/integrations/chat/azure.md
 docs/integrations/chat/azure.mdx
 docs/integrations/chat/anthropic.md
 docs/integrations/chat/anthropic.mdx
+docs/integrations/document_loaders/web_loaders/web_puppeteer.md
+docs/integrations/document_loaders/web_loaders/web_puppeteer.mdx
 docs/integrations/document_loaders/web_loaders/web_cheerio.md
-docs/integrations/document_loaders/web_loaders/web_cheerio.mdx
+docs/integrations/document_loaders/web_loaders/web_cheerio.mdx
+docs/integrations/document_loaders/web_loaders/recursive_url_loader.md
+docs/integrations/document_loaders/web_loaders/recursive_url_loader.mdx
+docs/integrations/document_loaders/web_loaders/pdf.md
+docs/integrations/document_loaders/web_loaders/pdf.mdx
+docs/integrations/document_loaders/web_loaders/firecrawl.md
+docs/integrations/document_loaders/web_loaders/firecrawl.mdx
+docs/integrations/document_loaders/file_loaders/unstructured.md
+docs/integrations/document_loaders/file_loaders/unstructured.mdx
+docs/integrations/document_loaders/file_loaders/text.md
+docs/integrations/document_loaders/file_loaders/text.mdx
+docs/integrations/document_loaders/file_loaders/pdf.md
+docs/integrations/document_loaders/file_loaders/pdf.mdx
+docs/integrations/document_loaders/file_loaders/directory.md
+docs/integrations/document_loaders/file_loaders/directory.mdx
+docs/integrations/document_loaders/file_loaders/csv.md
+docs/integrations/document_loaders/file_loaders/csv.mdx
docs/core_docs/docs/how_to/cancel_execution.ipynb

+297
@@ -0,0 +1,297 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# How to cancel execution\n",
+ "\n",
+ "```{=mdx}\n",
+ ":::info Prerequisites\n",
+ "\n",
+ "This guide assumes familiarity with the following concepts:\n",
+ "\n",
+ "- [LangChain Expression Language](/docs/concepts/#langchain-expression-language)\n",
+ "- [Chains](/docs/how_to/sequence/)\n",
+ "- [Streaming](/docs/how_to/streaming/)\n",
+ "\n",
+ ":::\n",
+ "```\n",
+ "\n",
+ "When building longer-running chains or [LangGraph](https://langchain-ai.github.io/langgraphjs/) agents, you may want to interrupt execution in situations such as a user leaving your app or submitting a new query.\n",
+ "\n",
+ "[LangChain Expression Language (LCEL)](/docs/concepts#langchain-expression-language) supports aborting runnables that are in-progress via a runtime [signal](https://developer.mozilla.org/en-US/docs/Web/API/AbortController/signal) option.\n",
+ "\n",
+ "```{=mdx}\n",
+ ":::caution Compatibility\n",
+ "\n",
+ "Built-in signal support requires `@langchain/core>=0.2.20`. Please see here for a [guide on upgrading](/docs/how_to/installation/#installing-integration-packages).\n",
+ "\n",
+ ":::\n",
+ "```\n",
+ "\n",
+ "**Note:** Individual integrations like chat models or retrievers may have missing or differing implementations for aborting execution. Signal support as described in this guide will apply in between steps of a chain.\n",
+ "\n",
+ "To see how this works, construct a chain such as the one below that performs [retrieval-augmented generation](/docs/tutorials/rag). It answers questions by first searching the web using [Tavily](/docs/integrations/retrievers/tavily), then passing the results to a chat model to generate a final answer:\n",
+ "\n",
+ "```{=mdx}\n",
+ "import ChatModelTabs from \"@theme/ChatModelTabs\";\n",
+ "\n",
+ "<ChatModelTabs />\n",
+ "```"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 1,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "// @lc-docs-hide-cell\n",
+ "import { ChatAnthropic } from \"@langchain/anthropic\";\n",
+ "\n",
+ "const llm = new ChatAnthropic({\n",
+ " model: \"claude-3-5-sonnet-20240620\",\n",
+ "});"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 2,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import { TavilySearchAPIRetriever } from \"@langchain/community/retrievers/tavily_search_api\";\n",
+ "import type { Document } from \"@langchain/core/documents\";\n",
+ "import { StringOutputParser } from \"@langchain/core/output_parsers\";\n",
+ "import { ChatPromptTemplate } from \"@langchain/core/prompts\";\n",
+ "import { RunnablePassthrough, RunnableSequence } from \"@langchain/core/runnables\";\n",
+ "\n",
+ "const formatDocsAsString = (docs: Document[]) => {\n",
+ " return docs.map((doc) => doc.pageContent).join(\"\\n\\n\")\n",
+ "}\n",
+ "\n",
+ "const retriever = new TavilySearchAPIRetriever({\n",
+ " k: 3,\n",
+ "});\n",
+ "\n",
+ "const prompt = ChatPromptTemplate.fromTemplate(`\n",
+ "Use the following context to answer questions to the best of your ability:\n",
+ "\n",
+ "<context>\n",
+ "{context}\n",
+ "</context>\n",
+ "\n",
+ "Question: {question}`)\n",
+ "\n",
+ "const chain = RunnableSequence.from([\n",
+ " {\n",
+ " context: retriever.pipe(formatDocsAsString),\n",
+ " question: new RunnablePassthrough(),\n",
+ " },\n",
+ " prompt,\n",
+ " llm,\n",
+ " new StringOutputParser(),\n",
+ "]);"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "If you invoke it normally, you can see it returns up-to-date information:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 3,
+ "metadata": {},
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Based on the provided context, the current weather in San Francisco is:\n",
+ "\n",
+ "Temperature: 17.6°C (63.7°F)\n",
+ "Condition: Sunny\n",
+ "Wind: 14.4 km/h (8.9 mph) from WSW direction\n",
+ "Humidity: 74%\n",
+ "Cloud cover: 15%\n",
+ "\n",
+ "The information indicates it's a sunny day with mild temperatures and light winds. The data appears to be from August 2, 2024, at 17:00 local time.\n"
+ ]
+ }
+ ],
+ "source": [
+ "await chain.invoke(\"what is the current weather in SF?\");"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now, let's interrupt it early. Initialize an [`AbortController`](https://developer.mozilla.org/en-US/docs/Web/API/AbortController) and pass its `signal` property into the chain execution. To illustrate the fact that the cancellation occurs as soon as possible, set a timeout of 100ms:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 4,
+ "metadata": {},
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Error: Aborted\n",
+ " at EventTarget.<anonymous> (/Users/jacoblee/langchain/langchainjs/langchain-core/dist/utils/signal.cjs:19:24)\n",
+ " at [nodejs.internal.kHybridDispatch] (node:internal/event_target:825:20)\n",
+ " at EventTarget.dispatchEvent (node:internal/event_target:760:26)\n",
+ " at abortSignal (node:internal/abort_controller:370:10)\n",
+ " at AbortController.abort (node:internal/abort_controller:392:5)\n",
+ " at Timeout._onTimeout (evalmachine.<anonymous>:7:29)\n",
+ " at listOnTimeout (node:internal/timers:573:17)\n",
+ " at process.processTimers (node:internal/timers:514:7)\n",
+ "timer1: 103.204ms\n"
+ ]
+ }
+ ],
+ "source": [
+ "const controller = new AbortController();\n",
+ "\n",
+ "const startTimer = console.time(\"timer1\");\n",
+ "\n",
+ "setTimeout(() => controller.abort(), 100);\n",
+ "\n",
+ "try {\n",
+ " await chain.invoke(\"what is the current weather in SF?\", {\n",
+ " signal: controller.signal,\n",
+ " });\n",
+ "} catch (e) {\n",
+ " console.log(e);\n",
+ "}\n",
+ "\n",
+ "console.timeEnd(\"timer1\");"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "And you can see that execution ends after just over 100ms. Looking at [this LangSmith trace](https://smith.langchain.com/public/63c04c3b-2683-4b73-a4f7-fb12f5cb9180/r), you can see that the model is never called.\n",
+ "\n",
+ "## Streaming\n",
+ "\n",
+ "You can pass a `signal` when streaming too. This gives you more control than using a `break` statement within the `for await... of` loop to cancel the current run, since a `break` will only trigger after final output has already started streaming. The below example uses a `break` statement - note the time elapsed before cancellation occurs:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 5,
+ "metadata": {},
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "chunk \n",
+ "timer2: 3.990s\n"
+ ]
+ }
+ ],
+ "source": [
+ "const startTimer2 = console.time(\"timer2\");\n",
+ "\n",
+ "const stream = await chain.stream(\"what is the current weather in SF?\");\n",
+ "\n",
+ "for await (const chunk of stream) {\n",
+ " console.log(\"chunk\", chunk);\n",
+ " break;\n",
+ "}\n",
+ "\n",
+ "console.timeEnd(\"timer2\");"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now compare this to using a signal. Note that you will need to wrap the stream in a `try/catch` block:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 6,
+ "metadata": {},
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Error: Aborted\n",
+ " at EventTarget.<anonymous> (/Users/jacoblee/langchain/langchainjs/langchain-core/dist/utils/signal.cjs:19:24)\n",
+ " at [nodejs.internal.kHybridDispatch] (node:internal/event_target:825:20)\n",
+ " at EventTarget.dispatchEvent (node:internal/event_target:760:26)\n",
+ " at abortSignal (node:internal/abort_controller:370:10)\n",
+ " at AbortController.abort (node:internal/abort_controller:392:5)\n",
+ " at Timeout._onTimeout (evalmachine.<anonymous>:7:38)\n",
+ " at listOnTimeout (node:internal/timers:573:17)\n",
+ " at process.processTimers (node:internal/timers:514:7)\n",
+ "timer3: 100.684ms\n"
+ ]
+ }
+ ],
+ "source": [
+ "const controllerForStream = new AbortController();\n",
+ "\n",
+ "const startTimer3 = console.time(\"timer3\");\n",
+ "\n",
+ "setTimeout(() => controllerForStream.abort(), 100);\n",
+ "\n",
+ "try {\n",
+ " const streamWithSignal = await chain.stream(\"what is the current weather in SF?\", {\n",
+ " signal: controllerForStream.signal\n",
+ " });\n",
+ " for await (const chunk of streamWithSignal) {\n",
+ " console.log(chunk);\n",
+ " break;\n",
+ " } \n",
+ "} catch (e) {\n",
+ " console.log(e); \n",
+ "}\n",
+ "\n",
+ "console.timeEnd(\"timer3\");"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Related\n",
+ "\n",
+ "- [Pass through arguments from one step to the next](/docs/how_to/passthrough)\n",
+ "- [Dispatching custom events](/docs/how_to/callbacks_custom_events)"
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "TypeScript",
+ "language": "typescript",
+ "name": "tslab"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "mode": "typescript",
+ "name": "javascript",
+ "typescript": true
+ },
+ "file_extension": ".ts",
+ "mimetype": "text/typescript",
+ "name": "typescript",
+ "version": "3.7.2"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
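Distilled from the guide added above, the core pattern is passing an `AbortController`'s `signal` through a runnable's call options. A minimal standalone sketch follows; the prompt, model, question, and timeout are illustrative placeholders rather than part of this commit, and it assumes `@langchain/core` >= 0.2.20:

import { StringOutputParser } from "@langchain/core/output_parsers";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { ChatAnthropic } from "@langchain/anthropic";

const prompt = ChatPromptTemplate.fromTemplate("Answer briefly: {question}");
const llm = new ChatAnthropic({ model: "claude-3-5-sonnet-20240620" });
const chain = prompt.pipe(llm).pipe(new StringOutputParser());

const controller = new AbortController();
// Abort from anywhere: a timeout, a "stop" button handler, a route change, etc.
setTimeout(() => controller.abort(), 100);

try {
  // The same `signal` call option also works with .stream().
  await chain.invoke({ question: "What is LCEL?" }, { signal: controller.signal });
} catch (e) {
  // Once the signal fires, in-progress runnables throw and later steps never run.
  console.log(e);
}

Passing the same `signal` to `.stream()` behaves analogously: once the controller aborts, the `for await` loop throws instead of yielding further chunks.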

docs/core_docs/docs/how_to/index.mdx

+1
@@ -40,6 +40,7 @@ LangChain Expression Language is a way to create arbitrary custom chains. It is
 - [How to: add message history](/docs/how_to/message_history)
 - [How to: route execution within a chain](/docs/how_to/routing)
 - [How to: add fallbacks](/docs/how_to/fallbacks)
+- [How to: cancel execution](/docs/how_to/cancel_execution/)
 
 ## Components
 
docs/core_docs/docs/integrations/document_loaders/web_loaders/pdf.ipynb

+1 -3
@@ -68,9 +68,7 @@
 "source": [
 "## Instantiation\n",
 "\n",
-"Now we can instantiate our model object and load documents:\n",
-"\n",
-"- TODO: Update model instantiation with relevant params."
+"Now we can instantiate our model object and load documents:"
 ]
 },
 {
docs/core_docs/docs/integrations/llms/bedrock.ipynb

-4
@@ -35,10 +35,6 @@
 "## Overview\n",
 "### Integration details\n",
 "\n",
-"- TODO: Fill in table features.\n",
-"- TODO: Remove JS support link if not relevant, otherwise ensure link is correct.\n",
-"- TODO: Make sure API reference links are correct.\n",
-"\n",
 "| Class | Package | Local | Serializable | [PY support](https://python.langchain.com/docs/integrations/llms/bedrock) | Package downloads | Package latest |\n",
 "| :--- | :--- | :---: | :---: | :---: | :---: | :---: |\n",
 "| [Bedrock](https://api.js.langchain.com/classes/langchain_community_llms_bedrock.Bedrock.html) | [@langchain/community](https://api.js.langchain.com/modules/langchain_community_llms_bedrock.html) | ❌ | ✅ | ✅ | ![NPM - Downloads](https://img.shields.io/npm/dm/@langchain/community?style=flat-square&label=%20&) | ![NPM - Version](https://img.shields.io/npm/v/@langchain/community?style=flat-square&label=%20&) |\n",
0 commit comments