The LLM node following multiple consecutive IF-ELSE nodes cannot perform streaming output normally. #13264

Closed
berg-wang opened this issue Feb 6, 2025 · 7 comments
Labels
🌊 feat:workflow Workflow related stuff.

Comments


berg-wang commented Feb 6, 2025

Self Checks

  • This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [FOR CHINESE USERS] Please submit issues in English; otherwise they will be closed. Thank you! :)
  • Please do not modify this template :) and fill in all the required fields.

Dify version

0.15.2, 0.14.2

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

In the chatflow, input p1=1, p2=2 to reproduce the bug.

[Image: screenshot of the workflow (start → IF-ELSE → IF-ELSE → LLM → Answer)]

The DSL file for the workflow is as follows:

app:
  description: ''
  icon: 🤖
  icon_background: '#FFEAD5'
  mode: advanced-chat
  name: 聊天助手
  use_icon_as_answer_icon: false
kind: app
version: 0.1.5
workflow:
  conversation_variables: []
  environment_variables: []
  features:
    file_upload:
      allowed_file_extensions:
      - .JPG
      - .JPEG
      - .PNG
      - .GIF
      - .WEBP
      - .SVG
      allowed_file_types:
      - image
      allowed_file_upload_methods:
      - local_file
      - remote_url
      enabled: false
      fileUploadConfig:
        audio_file_size_limit: 50
        batch_count_limit: 10
        file_size_limit: 15
        image_file_size_limit: 10
        video_file_size_limit: 100
        workflow_file_upload_limit: 10
      image:
        enabled: false
        number_limits: 3
        transfer_methods:
        - local_file
        - remote_url
      number_limits: 3
    opening_statement: ''
    retriever_resource:
      enabled: true
    sensitive_word_avoidance:
      enabled: false
    speech_to_text:
      enabled: false
    suggested_questions: []
    suggested_questions_after_answer:
      enabled: false
    text_to_speech:
      enabled: false
      language: ''
      voice: ''
  graph:
    edges:
    - data:
        isInIteration: false
        sourceType: start
        targetType: if-else
      id: 1736473141049-source-1736500039341-target
      selected: false
      source: '1736473141049'
      sourceHandle: source
      target: '1736500039341'
      targetHandle: target
      type: custom
      zIndex: 0
    - data:
        isInIteration: false
        sourceType: llm
        targetType: answer
      id: llm-source-1736905725300-target
      source: llm
      sourceHandle: source
      target: '1736905725300'
      targetHandle: target
      type: custom
      zIndex: 0
    - data:
        isInIteration: false
        sourceType: if-else
        targetType: if-else
      id: 1736500039341-true-1736912452686-target
      source: '1736500039341'
      sourceHandle: 'true'
      target: '1736912452686'
      targetHandle: target
      type: custom
      zIndex: 0
    - data:
        isInIteration: false
        sourceType: if-else
        targetType: llm
      id: 1736912452686-false-llm-target
      source: '1736912452686'
      sourceHandle: 'false'
      target: llm
      targetHandle: target
      type: custom
      zIndex: 0
    - data:
        isInIteration: false
        sourceType: if-else
        targetType: answer
      id: 1736912452686-true-1736500282728-target
      source: '1736912452686'
      sourceHandle: 'true'
      target: '1736500282728'
      targetHandle: target
      type: custom
      zIndex: 0
    - data:
        isInIteration: false
        sourceType: if-else
        targetType: llm
      id: 1736500039341-false-llm-target
      selected: false
      source: '1736500039341'
      sourceHandle: 'false'
      target: llm
      targetHandle: target
      type: custom
      zIndex: 0
    nodes:
    - data:
        desc: ''
        selected: false
        title: 开始
        type: start
        variables:
        - label: p1
          max_length: 48
          options:
          - rag
          - coding
          required: true
          type: number
          variable: p1
        - label: p2
          max_length: 48
          options: []
          required: true
          type: number
          variable: p2
      height: 116
      id: '1736473141049'
      position:
        x: -168.62450458982806
        y: -116.76570389946635
      positionAbsolute:
        x: -168.62450458982806
        y: -116.76570389946635
      selected: false
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 244
    - data:
        context:
          enabled: true
          variable_selector:
          - sys
          - query
        desc: ''
        memory:
          query_prompt_template: '{{#sys.query#}}'
          role_prefix:
            assistant: ''
            user: ''
          window:
            enabled: true
            size: 10
        model:
          completion_params:
            temperature: 0.1
          mode: chat
          name: qwen2.5-instruct-7b
          provider: xinference
        prompt_template:
        - id: 297c5dcb-fcaa-4fcc-8f5e-a958a38cdd33
          role: system
          text: '{{#context#}}'
        selected: false
        title: LLM
        type: llm
        variables: []
        vision:
          enabled: false
      height: 98
      id: llm
      position:
        x: 901.7295665243596
        y: -50.052447015913074
      positionAbsolute:
        x: 901.7295665243596
        y: -50.052447015913074
      selected: false
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 244
    - data:
        cases:
        - case_id: 'true'
          conditions:
          - comparison_operator: '='
            id: 67f95487-2a09-4819-a7cd-8689ee4478a9
            value: '1'
            varType: number
            variable_selector:
            - '1736473141049'
            - p1
          id: 'true'
          logical_operator: and
        desc: ''
        selected: false
        title: 条件分支
        type: if-else
      height: 126
      id: '1736500039341'
      position:
        x: 177.47424548618807
        y: -124.68275136410372
      positionAbsolute:
        x: 177.47424548618807
        y: -124.68275136410372
      selected: false
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 244
    - data:
        answer: p1={{#1736473141049.p1#}},p2={{#1736473141049.p2#}}
        desc: ''
        selected: false
        title: 直接回复 2
        type: answer
        variables: []
      height: 122
      id: '1736500282728'
      position:
        x: 901.7295665243596
        y: -259.7791887951555
      positionAbsolute:
        x: 901.7295665243596
        y: -259.7791887951555
      selected: false
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 244
    - data:
        answer: '{{#llm.text#}}'
        desc: ''
        selected: false
        title: 直接回复 4
        type: answer
        variables: []
      height: 103
      id: '1736905725300'
      position:
        x: 1266.2290315905668
        y: -50.052447015913074
      positionAbsolute:
        x: 1266.2290315905668
        y: -50.052447015913074
      selected: false
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 244
    - data:
        cases:
        - case_id: 'true'
          conditions:
          - comparison_operator: '='
            id: e356acb3-6154-4f45-9e7f-af42f39e353b
            value: '1'
            varType: number
            variable_selector:
            - '1736473141049'
            - p2
          id: 'true'
          logical_operator: and
        desc: ''
        selected: false
        title: 条件分支 3
        type: if-else
      height: 126
      id: '1736912452686'
      position:
        x: 521.2606128153778
        y: -268.81052581580366
      positionAbsolute:
        x: 521.2606128153778
        y: -268.81052581580366
      selected: false
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 244
    viewport:
      x: 404.110405425826
      y: 482.0449711931172
      zoom: 0.6597539553864473

✔️ Expected Behavior

The Direct-Reply (Answer) node can handle streaming data responses.

❌ Actual Behavior

The Direct-Reply (Answer) node is unable to handle streaming data responses.
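For anyone triaging this, one way to tell a genuinely streamed reply from a buffered one is to count the `message` chunks in the SSE stream returned by the chat-messages API: a healthy stream carries many small deltas, while the buggy path dumps the whole answer in a single chunk. Below is a minimal offline sketch of that check; the event names follow Dify's documented SSE format, but treat the exact field names as assumptions.

```python
import json

def count_message_chunks(raw_sse: str) -> int:
    """Count 'message' events in a raw SSE payload from the chat-messages API."""
    count = 0
    for line in raw_sse.splitlines():
        if line.startswith("data: "):
            event = json.loads(line[len("data: "):])
            if event.get("event") == "message":
                count += 1
    return count

# A streamed reply carries several small 'message' deltas...
streamed = "\n".join(
    'data: {"event": "message", "answer": "%s"}' % tok
    for tok in ["Hel", "lo", " there"]
) + '\ndata: {"event": "message_end"}'

# ...while the buggy path delivers the whole answer as one chunk.
buffered = (
    'data: {"event": "message", "answer": "Hello there"}\n'
    'data: {"event": "message_end"}'
)

print(count_message_chunks(streamed))  # 3
print(count_message_chunks(buffered))  # 1
```

Running this against the live endpoint (with `"response_mode": "streaming"`) on an affected workflow should show the buffered pattern: a single large `message` chunk instead of many deltas.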


dosubot bot commented Feb 6, 2025

The issue you're experiencing with the LLM node not performing streaming output normally after multiple consecutive IF-ELSE nodes is a known problem in Dify workflows (version 0.14.2). This issue has been addressed in a recent update through pull request #12105. To resolve this, you should update to the latest version of Dify once the fix is released [1].


@dosubot added the 🌊 feat:workflow label Feb 6, 2025
@crazywoola added and then removed the 🤔 cant-reproduce label Feb 6, 2025
berg-wang (Author) commented:

#12105 cannot fix my issue.


Wasim-yang commented Feb 8, 2025

0.15.3 has the same issue.


ITMeow commented Feb 14, 2025

An LLM node placed after an HTTP Request node also cannot achieve streaming output.

[Image: screenshot of a workflow with an HTTP Request node before the LLM node]

lasthash commented:

> An LLM node placed after an HTTP Request node also cannot achieve streaming output.

After selecting the default value for the exception (error handling) mode in the HTTP Request node, I encountered the same issue.


Kain-90 commented Mar 3, 2025

[Image: screenshot of the minimal two-branch workflow]

I have found the simplest example to reproduce this issue.

Dify version: 0.15.3 (self-hosted)

app:
  description: ''
  icon: 🤖
  icon_background: '#FFEAD5'
  mode: advanced-chat
  name: '1'
  use_icon_as_answer_icon: false
kind: app
version: 0.1.5
workflow:
  conversation_variables: []
  environment_variables: []
  features:
    file_upload:
      allowed_file_extensions:
      - .doc
      - .docx
      allowed_file_types:
      - custom
      allowed_file_upload_methods:
      - remote_url
      - local_file
      enabled: true
      image:
        enabled: false
        number_limits: 3
        transfer_methods:
        - local_file
        - remote_url
      number_limits: 1
    opening_statement: ''
    retriever_resource:
      enabled: true
    sensitive_word_avoidance:
      enabled: false
    speech_to_text:
      enabled: false
    suggested_questions: []
    suggested_questions_after_answer:
      enabled: false
    text_to_speech:
      enabled: false
      language: ''
      voice: ''
  graph:
    edges:
    - data:
        isInIteration: false
        sourceType: start
        targetType: if-else
      id: 1739251774129-source-1740729243446-target
      selected: false
      source: '1739251774129'
      sourceHandle: source
      target: '1740729243446'
      targetHandle: target
      type: custom
      zIndex: 0
    - data:
        isInIteration: false
        sourceType: llm
        targetType: answer
      id: 1740729249423-source-answer-target
      source: '1740729249423'
      sourceHandle: source
      target: answer
      targetHandle: target
      type: custom
      zIndex: 0
    - data:
        isInIteration: false
        sourceType: if-else
        targetType: llm
      id: 1740729243446-true-1740729249423-target
      source: '1740729243446'
      sourceHandle: 'true'
      target: '1740729249423'
      targetHandle: target
      type: custom
      zIndex: 0
    - data:
        isInIteration: false
        sourceType: if-else
        targetType: llm
      id: 1740993386020-true-1740729249423-target
      source: '1740993386020'
      sourceHandle: 'true'
      target: '1740729249423'
      targetHandle: target
      type: custom
      zIndex: 0
    - data:
        isInIteration: false
        sourceType: start
        targetType: if-else
      id: 1739251774129-source-1740993386020-target
      source: '1739251774129'
      sourceHandle: source
      target: '1740993386020'
      targetHandle: target
      type: custom
      zIndex: 0
    nodes:
    - data:
        desc: ''
        selected: false
        title: Start
        type: start
        variables: []
      height: 54
      id: '1739251774129'
      position:
        x: 30
        y: 263
      positionAbsolute:
        x: 30
        y: 263
      selected: false
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 244
    - data:
        answer: '{{#1740729249423.text#}}'
        desc: ''
        selected: false
        title: Answer
        type: answer
        variables: []
      height: 103
      id: answer
      position:
        x: 1246
        y: 263
      positionAbsolute:
        x: 1246
        y: 263
      selected: false
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 244
    - data:
        cases:
        - case_id: 'true'
          conditions:
          - comparison_operator: empty
            id: 3add315b-8fa0-468f-b8bd-7521175862e6
            value: ''
            varType: string
            variable_selector:
            - sys
            - query
          id: 'true'
          logical_operator: and
        desc: ''
        selected: false
        title: IF-ELSE
        type: if-else
      height: 126
      id: '1740729243446'
      position:
        x: 399.42857142857144
        y: 163
      positionAbsolute:
        x: 399.42857142857144
        y: 163
      selected: false
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 244
    - data:
        context:
          enabled: false
          variable_selector: []
        desc: ''
        model:
          completion_params:
            temperature: 0.7
          mode: chat
          name: qwen-turbo
          provider: tongyi
        prompt_template:
        - id: 1a79a159-bec1-4051-a7e4-a549acef2e68
          role: system
          text: you're a helpful assistant!
        - id: a1eab112-e2ed-44b8-a129-f3b601945b3c
          role: user
          text: '{{#sys.query#}}'
        selected: true
        title: LLM
        type: llm
        variables: []
        vision:
          enabled: false
      height: 98
      id: '1740729249423'
      position:
        x: 942
        y: 263
      positionAbsolute:
        x: 942
        y: 263
      selected: true
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 244
    - data:
        cases:
        - case_id: 'true'
          conditions:
          - comparison_operator: not empty
            id: ca1524c8-64c4-42d1-9c8c-aad54afc4f02
            value: ''
            varType: string
            variable_selector:
            - sys
            - user_id
          id: 'true'
          logical_operator: and
        desc: ''
        selected: false
        title: IF-ELSE
        type: if-else
      height: 126
      id: '1740993386020'
      position:
        x: 399.42857142857144
        y: 363.4285714285714
      positionAbsolute:
        x: 399.42857142857144
        y: 363.4285714285714
      selected: false
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 244
    viewport:
      x: 6.810069538623679
      y: 176.66501435412357
      zoom: 0.6999251370153812
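The DSL above makes the trigger visible: the LLM node `1740729249423` has two incoming edges, one from each IF-ELSE branch, and this fan-in appears to be what defeats the streaming analysis. A quick way to spot such fan-in points in any exported DSL is to compute each node's in-degree from the edges list. A minimal sketch, with the (source, target) pairs copied from the graph above:

```python
from collections import Counter

# (source, target) pairs taken from the edges section of the DSL above.
edges = [
    ("1739251774129", "1740729243446"),  # start -> if-else
    ("1740729249423", "answer"),         # llm -> answer
    ("1740729243446", "1740729249423"),  # if-else (true) -> llm
    ("1740993386020", "1740729249423"),  # if-else (true) -> llm
    ("1739251774129", "1740993386020"),  # start -> if-else
]

# Count incoming edges per node.
in_degree = Counter(target for _, target in edges)

# Nodes reachable from more than one branch; these fan-in points are
# where streaming degrades to buffered output in the affected versions.
fan_in = [node for node, deg in in_degree.items() if deg > 1]
print(fan_in)  # ['1740729249423'] — the LLM node
```

Applying the same check to berg-wang's original DSL flags the `llm` node for the same reason: both the `false` branch of `1736500039341` and the `false` branch of `1736912452686` feed into it.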


dosubot bot commented Apr 3, 2025

Hi, @berg-wang. I'm Dosu, and I'm helping the Dify team manage their backlog. I'm marking this issue as stale.

Issue Summary:

  • A bug in the self-hosted version of Dify (0.15.2, 0.14.2) affects the LLM node's streaming output after multiple IF-ELSE nodes.
  • Despite a suggested fix in pull request #12105 ("fix: issue #12068 by test is answer in the ids"), the issue persists in version 0.15.3.
  • Other users, including Wasim-yang and ITMeow, confirm the ongoing problem.
  • Kain-90 provided a YAML configuration to reproduce the issue, showing consistency across setups.
  • The community is actively engaged, with multiple reactions and comments.

Next Steps:

  • Please confirm if this issue is still relevant to the latest version of the Dify repository. If so, you can keep the discussion open by commenting here.
  • If there is no further activity, this issue will be automatically closed in 15 days.

Thank you for your understanding and contribution!

@dosubot added the stale label Apr 3, 2025
@dosubot closed this as not planned Apr 18, 2025
@dosubot removed the stale label Apr 18, 2025

6 participants