Checked other resources
- I added a very descriptive title to this issue.
- I searched the LangChain.js documentation with the integrated search.
- I used the GitHub search to find a similar question and didn't find it.
- I am sure that this is a bug in LangChain.js rather than my code.
- The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
Example Code
```typescript
const model = new AzureChatOpenAI({
  modelName: 'o4-mini',
  streaming: true,
  azureOpenAIApiKey: process.env.AZURE_OPENAI_API_KEY_41,
  azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_API_INSTANCE,
  azureOpenAIApiDeploymentName: 'o4-mini',
  azureOpenAIApiVersion: '2025-04-01-preview',
  reasoningEffort: 'low',
  __includeRawResponse: true,
  timeout: 220000,
  reasoning: {
    summary: 'auto',
  },
});
// ...
const pipedLlm = ragPrompt.pipe(model);
const response = await pipedLlm.invoke({
  //...
});
```
Error Message and Stack Trace (if applicable)
Troubleshooting URL: https://js.langchain.com/docs/troubleshooting/errors/MODEL_NOT_FOUND/
```
@web:dev:
@web:dev:     at APIError.generate (webpack-internal:///(rsc)/../../node_modules/.pnpm/[email protected][email protected][email protected][email protected]/node_modules/openai/error.mjs:69:20)
@web:dev:     at AzureOpenAI.makeStatusError (webpack-internal:///(rsc)/../../node_modules/.pnpm/[email protected][email protected][email protected][email protected]/node_modules/openai/core.mjs:333:65)
@web:dev:     at AzureOpenAI.makeRequest (webpack-internal:///(rsc)/../../node_modules/.pnpm/[email protected][email protected][email protected][email protected]/node_modules/openai/core.mjs:377:30)
@web:dev:     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
@web:dev:     at async eval (webpack-internal:///(rsc)/../../node_modules/.pnpm/@[email protected]_@[email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/@langchain/openai/dist/chat_models.js:2370:24)
@web:dev:     at async RetryOperation.eval [as _fn] (webpack-internal:///(rsc)/../../node_modules/.pnpm/[email protected]/node_modules/p-retry/index.js:50:12) {
@web:dev:   status: 404,
@web:dev:   headers: {
@web:dev:     'apim-request-id': '2863c8c1-d34a-4cf1-9868-45a5866f3286',
@web:dev:     'content-length': '56',
@web:dev:     'content-type': 'application/json',
@web:dev:     date: 'Fri, 13 Jun 2025 06:42:15 GMT',
@web:dev:     'strict-transport-security': 'max-age=31536000; includeSubDomains; preload',
@web:dev:     'x-content-type-options': 'nosniff'
@web:dev:   },
@web:dev:   request_id: undefined,
@web:dev:   error: { code: '404', message: 'Resource not found' },
@web:dev:   code: '404',
@web:dev:   param: undefined,
@web:dev:   type: undefined,
@web:dev:   lc_error_code: 'MODEL_NOT_FOUND',
@web:dev:   attemptNumber: 1,
@web:dev:   retriesLeft: 6,
@web:dev:   pregelTaskId: '3923942f-e3cd-557f-b379-b87d914c76ea'
@web:dev: }
```
Description
AzureChatOpenAI throws a 404 (MODEL_NOT_FOUND) error when invoking the model with reasoning summaries enabled. My goal is to stream thinking tokens, and I followed the implementation here.
- If I remove the reasoning parameter from the model instantiation, I get a response, but it is missing the reasoning tokens.
- This is reportedly fixed in the Python package by passing use_responses_api=True on instantiation here. This parameter is not available in the JS library.
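As an interim workaround (a sketch only, not a confirmed fix), the reasoning-summary request can be issued against the Azure Responses API directly via the openai SDK, bypassing AzureChatOpenAI's Chat Completions path. The deployment name, API version, and env var names below are assumptions carried over from my config above; the Responses API support in the openai SDK and on the given Azure API version is my assumption as well.

```typescript
// Hypothetical workaround sketch, assuming the `openai` SDK's Responses API
// works against the Azure deployment above. Pure request-building is split
// out so it can be checked without network access.
function buildReasoningRequest(input: string) {
  return {
    model: 'o4-mini', // Azure deployment name from the issue (assumption)
    input,
    reasoning: {
      effort: 'low' as const, // mirrors reasoningEffort: 'low'
      summary: 'auto' as const, // mirrors reasoning: { summary: 'auto' }
    },
    stream: true as const, // stream reasoning-summary deltas as events
  };
}

// Untested client usage (requires `openai` and valid Azure credentials):
// import { AzureOpenAI } from 'openai';
// const client = new AzureOpenAI({
//   apiKey: process.env.AZURE_OPENAI_API_KEY_41,
//   endpoint: `https://${process.env.AZURE_OPENAI_API_INSTANCE}.openai.azure.com`,
//   apiVersion: '2025-04-01-preview',
// });
// const stream = await client.responses.create(buildReasoningRequest('...'));
// for await (const event of stream) {
//   // reasoning summary text arrives in dedicated stream events
// }
```

If this works, it at least confirms the deployment itself supports reasoning summaries, and that the 404 comes from AzureChatOpenAI routing the request to the Chat Completions endpoint.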
System Info
```
"@langchain/aws": "^0.1.1",
"@langchain/core": "^0.3.57",
"@langchain/google-webauth": "^0.2.9",
"@langchain/groq": "^0.1.2",
"@langchain/langgraph": "^0.2.45",
"@langchain/openai": "^0.5.12",
"@langchain/pinecone": "^0.1.1",
"@langchain/textsplitters": "^0.1.0",
"langchain": "^0.3.2",
"langsmith": "^0.2.10",
```