The LLM node following multiple consecutive IF-ELSE nodes cannot perform streaming output normally. #13264
The issue you're experiencing with the LLM node not performing streaming output normally after multiple consecutive IF-ELSE nodes is a known problem in Dify workflows (version 0.14.2). It has been addressed in pull request #12105. To resolve it, update to the latest version of Dify once the fix is released [1].
#12105 does not fix my issue.
Version 0.15.3 has the same problem.
Hi, @berg-wang. I'm Dosu, and I'm helping the Dify team manage their backlog. I'm marking this issue as stale.
Issue Summary: The LLM node following multiple consecutive IF-ELSE nodes does not stream its output. The fix in #12105 did not resolve it, and the problem is still reported on 0.15.3.
Next Steps:
Thank you for your understanding and contribution!
Self Checks
Dify version
0.15.2, 0.14.2
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
In the chatflow, input p1=1 and p2=2 to reproduce the bug.
The DSL file of the chatflow (not reproduced here) wires multiple consecutive IF-ELSE nodes in front of an LLM node and a Direct Reply node.
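As a rough illustration only, the graph topology would look something like the sketch below. This is a minimal hypothetical example, not the reporter's DSL: the node IDs, names, and field values are assumptions based on the general shape of Dify's exported YAML format.

```yaml
# Hypothetical minimal chatflow sketch -- not the original DSL from this issue.
# Node IDs and field names are assumptions, included only to show the topology.
app:
  mode: advanced-chat
  name: if-else-streaming-repro
workflow:
  graph:
    nodes:
      - id: start
        data:
          type: start            # exposes the p1 and p2 inputs
      - id: if-else-1
        data:
          type: if-else          # first condition, e.g. checks p1
      - id: if-else-2
        data:
          type: if-else          # second consecutive condition, e.g. checks p2
      - id: llm
        data:
          type: llm              # the node whose output should stream
      - id: answer
        data:
          type: answer           # Direct Reply node that outputs the LLM result
    edges:
      - source: start
        target: if-else-1
      - source: if-else-1
        sourceHandle: "true"     # branch assumed taken when p1 = 1
        target: if-else-2
      - source: if-else-2
        sourceHandle: "true"     # branch assumed taken when p2 = 2
        target: llm
      - source: llm
        target: answer
```

With a layout like this, the Direct Reply node is expected to emit the LLM output incrementally; per the report, the response instead arrives only after the LLM node has finished.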
✔️ Expected Behavior
The Direct Reply node streams the response as the LLM node generates it.
❌ Actual Behavior
The Direct Reply node does not stream; the full response only appears after the LLM node has finished.