It seems the Ollama client doesn't accept non-text tool message content:
```
[2025-05-06 22:27:49.692 -0700] ERROR: LLM error: 1:llm:ChatOllama (3ms)
    error: "Non string tool message content is not supported

Error: Non string tool message content is not supported
    at convertToolMessageToOllama (file:///Users/ryosuke.iwanaga/.npm/_npx/25982cb890102a2e/node_modules/aethr/node_modules/@langchain/ollama/dist/utils.js:119:15)
    at file:///Users/ryosuke.iwanaga/.npm/_npx/25982cb890102a2e/node_modules/aethr/node_modules/@langchain/ollama/dist/utils.js:140:20
    at Array.flatMap (<anonymous>)
    at convertToOllamaMessages (file:///Users/ryosuke.iwanaga/.npm/_npx/25982cb890102a2e/node_modules/aethr/node_modules/@langchain/ollama/dist/utils.js:129:21)
    at ChatOllama._streamResponseChunks (file:///Users/ryosuke.iwanaga/.npm/_npx/25982cb890102a2e/node_modules/aethr/node_modules/@langchain/ollama/dist/chat_models.js:730:32)
    at _streamResponseChunks.next (<anonymous>)
    at ChatOllama._generateUncached (file:///Users/ryosuke.iwanaga/.npm/_npx/25982cb890102a2e/node_modules/aethr/node_modules/@langchain/core/dist/language_models/chat_models.js:211:34)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async ChatOllama.invoke (file:///Users/ryosuke.iwanaga/.npm/_npx/25982cb890102a2e/node_modules/aethr/node_modules/@langchain/core/dist/language_models/chat_models.js:88:24)
    at async RunnableSequence.invoke (file:///Users/ryosuke.iwanaga/.npm/_npx/25982cb890102a2e/node_modules/aethr/node_modules/@langchain/core/dist/runnables/base.js:1280:27)"
[2025-05-06 22:27:49.694 -0700] INFO: Test result: FAIL, Summary:
Error: Non string tool message content is not supported
```
Previously, we converted non-text responses (arrays) to text, but that conversion was dropped in #2 since most providers accept the array as-is.
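A minimal sketch of the kind of conversion that was dropped, assuming tool results arrive either as a plain string or as an array of content parts (the helper name and the content-part shape here are illustrative, not from the codebase):

```typescript
// Hypothetical shape for a content part in a tool message; real
// LangChain content parts carry more fields, but only `type` and
// `text` matter for this conversion.
type ContentPart = { type: string; text?: string };

// Flatten non-string tool message content into plain text so that
// backends like Ollama, which reject array content, can consume it.
function toolContentToString(content: string | ContentPart[]): string {
  if (typeof content === "string") return content;
  // Keep only the text parts; non-text parts (e.g. images) are dropped.
  return content
    .filter((part) => part.type === "text" && typeof part.text === "string")
    .map((part) => part.text)
    .join("\n");
}
```

Applying something like this to tool messages before they reach `ChatOllama` would avoid the error above, at the cost of discarding non-text parts.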
ref. langchain-ai/langchainjs#8151