LLM node not streaming #17085
There are a few relevant discussions and issues related to the problem you're experiencing with the LLM node not streaming output when used with Classifier and IF/ELSE nodes:
It seems like your issue might be related to these known problems, particularly the handling of branches and the execution order in workflows involving Classifier and IF/ELSE nodes. You might want to check whether your setup aligns with the solutions provided in those discussions and issues. If the problem persists, consider whether any recent updates or patches address your specific scenario.
@ft4710403 it's among the tests I've done, but the issue still needs to be fixed 😊
Self Checks
Dify version
1.1.3
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
I've seen the issue #13626 where the bug is supposedly fixed
I've seen the issue #15700 where the solution seems to be to remove the IF/ELSE node before the Question classifier node
I've seen the issue #15700 where the bug seems to be related to crossing Question Classifier nodes
I've seen the issue #16882 where the solution is to place the streaming LLM+Answer nodes inside a Loop Node
In the video you can see me:
It seems to me that there are multiple issues related to the Question Classifier node + IF/ELSE node + merging branches before an LLM node, but only if one of the branches remains unexecuted, and possibly something broken in old DSL as well, given that points 4 and 5 yield different results
https://drive.google.com/file/d/1uwQ142aKNAQKbwCex2dDgFQ8-G_hmNUS/view?usp=drive_link
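For readers who can't open the video, the topology described above can be sketched as a simplified DSL-style fragment. This is a hypothetical illustration of the graph shape only; the node and field names are not guaranteed to match Dify's actual DSL schema:

```yaml
# Hypothetical sketch of the workflow topology that fails to stream.
# Field names are illustrative, not exact Dify DSL.
nodes:
  - id: start
    type: start
  - id: classifier
    type: question-classifier   # routes the query into one of two classes
  - id: if-else
    type: if-else               # further branches one classifier output
  - id: llm
    type: llm                   # streaming LLM; both branches merge here
  - id: answer
    type: answer                # returns the LLM output to the user
edges:
  - { source: start,      target: classifier }
  - { source: classifier, target: if-else }   # class 1
  - { source: classifier, target: llm }       # class 2 (may stay unexecuted)
  - { source: if-else,    target: llm }
  - { source: llm,        target: answer }
# Symptom: when one of the branches merging into "llm" is never executed,
# the LLM output arrives as a single block instead of streaming.
```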
✔️ Expected Behavior
LLM nodes should stream the output independently of what happened in previous branches with Classifier nodes
❌ Actual Behavior
The LLM node does not stream the output under more than one condition when a Classifier node was used earlier in the workflow