
deepseek reports an error when using Response Format #11677

Closed
zhaobingshuang opened this issue Dec 16, 2024 · 1 comment · Fixed by #11678
Labels
🐞 bug Something isn't working

Comments

@zhaobingshuang
Contributor

zhaobingshuang commented Dec 16, 2024

Self Checks

  • This is only for bug report, if you would like to ask a question, please head to Discussions.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [FOR CHINESE USERS] Please be sure to submit issues in English, otherwise they will be closed. Thank you! :)
  • Please do not modify this template :) and fill in all the required fields.

Dify version

0.13.2

Cloud or Self Hosted

Self Hosted (Docker), Self Hosted (Source)

Steps to reproduce

1. Create an LLM node in the workflow, select deepseek-chat as the model, and set the Response Format to text or json_object.
2. Run the node. It fails with:

[deepseek] Error: API request failed with status code 422: Failed to deserialize the JSON body into the target type: response_format: invalid type: string "text", expected internally tagged enum ResponseFormat at line 1 column 88

✔️ Expected Behavior

❌ Actual Behavior

[deepseek] Error: API request failed with status code 422: Failed to deserialize the JSON body into the target type: response_format: invalid type: string "text", expected internally tagged enum ResponseFormat at line 1 column 88
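The error message itself points at the shape mismatch: the request body carries response_format as a bare string, while the DeepSeek API's deserializer expects an "internally tagged enum", i.e. an object whose "type" field names the variant. A minimal sketch of the two payload shapes (the field layout is inferred from the error message and DeepSeek's OpenAI-compatible API, not taken from Dify's code):

```python
import json

# Shape that triggers the 422: response_format sent as a bare string.
bad = {"response_format": "text"}  # rejected: invalid type: string "text"

# Shape the API accepts: an internally tagged object, where the "type"
# field selects the variant ("text" or "json_object").
good = {"response_format": {"type": "text"}}

print(json.dumps(good))  # → {"response_format": {"type": "text"}}
```
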

Copy link

dosubot bot commented Dec 16, 2024

The error you're encountering suggests that the ResponseFormat parameter is not being recognized correctly when set to "text" or "json_object". This could be due to a mismatch between the expected values for the ResponseFormat and what is being provided.

In the Dify platform, the valid options for the 'Response Format' when configuring an LLM node are "text", "json_object", and "json_schema" [1][2]. However, the error message indicates that there might be an issue with how these values are being processed or expected by the deepseek-chat model.

Here are a few steps you can take to troubleshoot and potentially resolve this issue:

  1. Verify Configuration: Double-check the configuration of your LLM node to ensure that the Response Format is set correctly to one of the valid options.

  2. Check for Updates: Ensure that you are using the latest version of the Dify platform and the deepseek-chat model, as there might have been updates or bug fixes related to this issue.

  3. Review Model Integration: Make sure that the deepseek-chat model is integrated correctly according to the Dify documentation, especially regarding the ResponseFormat parameter.

If the issue persists, it might be necessary to look into the codebase or configuration files to ensure that the ResponseFormat enum is correctly defined and used in the model's API request handling.
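One way such a fix could look is to normalize the parameter before building the request: wrap a bare string into the object form the API expects. This is a hypothetical sketch for illustration only, not the actual change merged in the linked PR:

```python
def normalize_response_format(value):
    """Wrap a bare string like "text" or "json_object" into the
    {"type": ...} object form expected by the DeepSeek API.
    Hypothetical helper; not the actual Dify patch."""
    if isinstance(value, str):
        return {"type": value}
    return value

print(normalize_response_format("text"))                  # {'type': 'text'}
print(normalize_response_format({"type": "json_object"})) # unchanged
```
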

