LLM node not streaming #17085

Open · 5 tasks done
DavideDelbianco opened this issue Mar 28, 2025 · 3 comments


Self Checks

  • This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [FOR CHINESE USERS] Please be sure to submit issues in English, otherwise they will be closed. Thank you! :)
  • Please do not modify this template :) and fill in all the required fields.

Dify version

1.1.3

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

I've seen issue #13626, where the bug was supposedly fixed.
I've seen issue #15700, where the suggested solution is to remove the IF/ELSE node before the Question Classifier node, and where the bug also seems to be related to crossing Question Classifier nodes.
I've seen issue #16882, where the solution is to place the streaming LLM + Answer nodes inside a Loop node.

In the video you can see me:

  1. Use the streaming node inside the Loop node (it works).
  2. Use the same flow with the streaming node outside the Loop node (it does not work).
  3. Remove the connection coming from the non-executed branch of the first Question Classifier (it works, even with an IF/ELSE node before the second Classifier node).
  4. Execute a simple flow that seems to work in a freshly created chatflow.
  5. Tear down my flow (upgraded from previous versions) to make it exactly like the working example (it does not work).
  6. Demonstrate that adding an IF/ELSE node before the Classifier nodes that merge into the LLM node breaks the streaming output.

It seems to me that there are multiple issues related to the combination of Question Classifier nodes, IF/ELSE nodes, and branches merging before an LLM node, but only when one of the branches remains unexecuted. Points 4 and 5 yielding different results also suggest something may be broken in DSL upgraded from older versions. A reduced sketch of the failing topology is shown below.
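
To make the topology concrete, here is a minimal sketch assuming a plain adjacency-list graph in Python. The node names and the representation are illustrative only and are not Dify's internal code:

```python
# Hypothetical model of the failing topology (NOT Dify's internal
# graph representation; node names are illustrative).
# classifier_1 has two branches; at runtime only one executes, but
# both (one via an IF/ELSE node) merge into the same downstream path.
edges = {
    "start": ["classifier_1"],
    "classifier_1": ["if_else", "classifier_2"],  # branch A / branch B
    "if_else": ["classifier_2"],                  # branch A rejoins here
    "classifier_2": ["llm"],
    "llm": ["answer"],
}

def downstream(node: str, edges: dict[str, list[str]]) -> set[str]:
    """Collect every node id reachable from `node`."""
    seen: set[str] = set()
    stack = [node]
    while stack:
        for nxt in edges.get(stack.pop(), []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

# Both branches of classifier_1 can reach the Answer node, yet only one
# of them executes. If the engine's streaming-eligibility check does not
# account for the unexecuted branch that still points at the merge node,
# the Answer node may fall back to blocking output.
print(downstream("classifier_1", edges))
# -> {'if_else', 'classifier_2', 'llm', 'answer'} (set order may vary)
```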

https://drive.google.com/file/d/1uwQ142aKNAQKbwCex2dDgFQ8-G_hmNUS/view?usp=drive_link

✔️ Expected Behavior

LLM nodes should stream their output independently of what happened in previous branches with Classifier nodes.

❌ Actual Behavior

The LLM node does not stream its output under more than one condition when a Classifier node was used earlier in the flow.

dosubot bot added the 🐞 bug (Something isn't working) label on Mar 28, 2025

dosubot bot commented Mar 28, 2025

There are a few relevant discussions and issues related to the problem you're experiencing with the LLM node not streaming output when used with Classifier and IF/ELSE nodes:

  1. Discussion on Slow Response with Streaming: There is an ongoing discussion about slow response times when using the Dify chatflow's API with streaming enabled [1].

  2. Issue with Streaming Output Failing: A closed issue addressed a problem where streaming output failed when LLM nodes enabled error handling. This was resolved by a pull request that fixed the stream output error [2].

  3. Classifier Node and IF/ELSE Node Issues: There is a known limitation in version v0.15.3 where the parallel chatflow gets stuck at the Question Classifier node when there is a connection between the Variable Assigner and If/Else node. This is expected to be fixed in the upcoming v1.0.0 release [3].

  4. Nested IF/ELSE Workflow Issue: The LLM node not streaming output in certain conditions, especially with nested IF/ELSE nodes, was due to an issue in the _fetch_node_ids_in_reachable_branch method. This was addressed in Dify version 0.14.2 by PR #12105 ("fix: issue #12068 by test is answer in the ids") [4] (see the sketch at the end of this comment).

It seems like your issue might be related to these known problems, particularly with the handling of branches and the execution order in workflows involving Classifier and IF/ELSE nodes. You might want to check if your setup aligns with the solutions provided in these discussions and issues. If the problem persists, consider whether any recent updates or patches might address your specific scenario.
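
The method name `_fetch_node_ids_in_reachable_branch` comes from the issue history above; its body is not shown there. The sketch below is only a guess at the general shape of such a reachable-branch walk, not Dify's actual implementation:

```python
def fetch_node_ids_in_reachable_branch(
    edges: dict[str, list[str]], branch_root: str
) -> set[str]:
    """Hypothetical sketch: starting from the branch the classifier
    actually selected, collect every node id that can still run.
    (Illustrative only; the real method in Dify may differ.)"""
    reachable: set[str] = set()
    stack = [branch_root]
    while stack:
        node = stack.pop()
        if node in reachable:
            continue
        reachable.add(node)
        stack.extend(edges.get(node, []))
    return reachable
```

Per the PR title quoted above, the fix reportedly amounts to testing whether the Answer node's id appears in the returned set before deciding how output is emitted.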



ft4710403 commented Mar 29, 2025

I found a workaround: wrap the nodes whose streaming output is broken inside a Loop component that iterates only once, and put the LLM and Answer components inside it. This restores normal streaming output.

[Image: screenshot of the Loop workaround]
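
Expressed in the same illustrative adjacency-list form as the sketch in the issue body (again hypothetical, not Dify's DSL), the workaround changes the graph roughly like this:

```python
# Before (streaming broken): the classifier branches merge straight
# into the LLM node.
before = {
    "classifier_2": ["llm"],
    "llm": ["answer"],
}

# After (workaround): the LLM and Answer nodes are moved inside a Loop
# node configured to run exactly once, so the streaming path no longer
# crosses the merge point directly.
after = {
    "classifier_2": ["loop"],  # loop is configured for 1 iteration
    "loop": ["llm"],           # loop body: llm -> answer
    "llm": ["answer"],
}
```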

DavideDelbianco (Author) commented

@ft4710403 it's among the tests that I've done, but the issue needs to be fixed 😊
