
fix(llm/nodes.py): Ensure that the output returns without any exceptions #14880

Merged
merged 1 commit into langgenius:main on Mar 5, 2025

Conversation

@auxpd (Contributor) commented Mar 4, 2025

Summary

When a workflow is used for tool encapsulation, LLMNode._run in core/workflow/nodes/llm/node.py can fail before result_text is initialized, which causes the final outputs dict to reference an unbound local variable.

I append the correct output directly to the LLM result, so the outputs are only built after the preceding steps have completed without exceptions.
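
To make the failure mode concrete, here is a minimal, self-contained sketch of the pattern (the names here are illustrative only, not the actual node.py code): a variable is assigned inside the try block, the except block does not return, and the outputs dict is built after the try/except.

def run_node(invoke_llm):
    """Simplified stand-in for LLMNode._run, showing the failure mode."""
    try:
        # Only assigned if the invocation succeeds.
        result_text, usage, finish_reason = invoke_llm()
    except Exception as e:
        print(f"node run failed: {e}")
        # No return here, so execution falls through to the code below.

    # If the except branch ran, result_text was never bound, and this line
    # raises: UnboundLocalError: cannot access local variable 'result_text' ...
    return {"text": result_text, "usage": usage, "finish_reason": finish_reason}


def failing_llm():
    raise RuntimeError("provider error")


run_node(failing_llm)  # reproduces the UnboundLocalError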

Fixes #14876


Checklist

Important

Please review the checklist below before submitting your pull request.

  • This change requires a documentation update, included: Dify Document
  • I understand that this PR may be closed in case there was no previous discussion or issues. (This doesn't apply to typos!)
  • I've added a test for each change that was introduced, and I tried as much as possible to make a single atomic change.
  • I've updated the documentation accordingly.
  • I ran dev/reformat (backend) and cd web && npx lint-staged (frontend) to appease the lint gods

@dosubot dosubot bot added the size:M (This PR changes 30-99 lines, ignoring generated files.) and 🐞 bug (Something isn't working) labels on Mar 4, 2025
@auxpd (Contributor, Author) commented Mar 4, 2025

I have a question: why does the source code pull this out of the exception catch? I'm not sure if I'm missing some design intent.

@crazywoola crazywoola requested a review from laipz8200 March 5, 2025 02:43
@laipz8200 (Member) commented:

Can this change fix #14876? It seems that unless the issue occurs at

outputs = {"text": result_text, "usage": jsonable_encoder(usage), "finish_reason": finish_reason}

there should be no difference between the two.

@auxpd (Contributor, Author) commented Mar 5, 2025

This is the error stack trace from before the fix:
2025-03-01 18:56:13.046 ERROR [ThreadPoolExecutor-10_5] [graph_engine.py:815] - Node 信息提取 run failed
Traceback (most recent call last):
File "/app/api/core/workflow/graph_engine/graph_engine.py", line 626, in _run_node
for item in generator:
^^^^^^^^^
File "/app/api/core/workflow/nodes/base/node.py", line 84, in run
yield from result
File "/app/api/core/workflow/nodes/llm/node.py", line 212, in _run
outputs = {"text": result_text, "usage": jsonable_encoder(usage), "finish_reason": finish_reason}
^^^^^^^^^^^
UnboundLocalError: cannot access local variable 'result_text' where it is not associated with a value
2025-03-01 18:56:13.047 ERROR [ThreadPoolExecutor-10_5] [graph_engine.py:576] - Unknown Error when generating in parallel
Traceback (most recent call last):
File "/app/api/core/workflow/graph_engine/graph_engine.py", line 553, in _run_parallel_node
for item in generator:
^^^^^^^^^
File "/app/api/core/workflow/graph_engine/graph_engine.py", line 312, in _run
raise e
File "/app/api/core/workflow/graph_engine/graph_engine.py", line 283, in _run
for item in generator:
^^^^^^^^^
File "/app/api/core/workflow/graph_engine/graph_engine.py", line 816, in _run_node
raise e
File "/app/api/core/workflow/graph_engine/graph_engine.py", line 626, in _run_node
for item in generator:
^^^^^^^^^
File "/app/api/core/workflow/nodes/base/node.py", line 84, in run
yield from result
File "/app/api/core/workflow/nodes/llm/node.py", line 212, in _run
outputs = {"text": result_text, "usage": jsonable_encoder(usage), "finish_reason": finish_reason}
^^^^^^^^^^^
UnboundLocalError: cannot access local variable 'result_text' where it is not associated with a value

@auxpd (Contributor, Author) commented Mar 5, 2025

If "outputs" dict is expected to be returned with a normal response, would it be more appropriate to put it in the try struct?

@laipz8200 (Member) commented Mar 5, 2025

The root issue is that a return statement is missing in the except block, but your method can also resolve the problem.
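
In terms of the simplified sketch above, either remedy avoids the UnboundLocalError; the following is only an illustration of the two options, not the merged diff.

def run_node_fixed(invoke_llm):
    try:
        result_text, usage, finish_reason = invoke_llm()
        # Option A (roughly this PR's approach): build the outputs while still
        # inside the try block, so it only happens after a successful invocation.
        return {"text": result_text, "usage": usage, "finish_reason": finish_reason}
    except Exception as e:
        print(f"node run failed: {e}")
        # Option B (the missing-return fix): return (or raise) here so control
        # never falls through to code that reads variables bound in the try block.
        return None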

@laipz8200 (Member) left a review comment

LGTM

@dosubot dosubot bot added the lgtm (This PR has been approved by a maintainer) label on Mar 5, 2025
@laipz8200 laipz8200 merged commit 4f6a4f2 into langgenius:main Mar 5, 2025
6 checks passed
jackzhuo pushed a commit to jackzhuo/dify that referenced this pull request Mar 14, 2025
parambharat pushed a commit to parambharat/dify that referenced this pull request Mar 31, 2025