deepseek reports an error when using Response Format #11677
In the Dify platform, the valid options for the 'Response Format' when configuring an LLM node are "text", "json_object", and "json_schema" [1][2]. However, the error message suggests there is an issue with how these values are serialized before being sent to the model provider's API. Here are a few steps you can take to troubleshoot and potentially resolve this issue:
If the issue persists, it might be necessary to look into the codebase or configuration files to ensure that the response_format value is being sent in the shape the provider's API expects.
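The 422 message ("expected internally tagged enum ResponseFormat") hints at the likely mismatch: the server appears to deserialize response_format as a tagged object, while the client sends a bare string. The sketch below is illustrative only — the field names are assumptions based on the error text, not confirmed Dify or DeepSeek internals:

```python
import json

# Assumed payload shape that triggers the 422 — response_format sent as
# a bare string, which a tagged-enum deserializer cannot accept:
bad_payload = {
    "model": "deepseek-chat",
    "response_format": "text",
}

# Assumed payload shape an internally tagged enum would accept —
# an object carrying the variant in a tag field (here "type"):
good_payload = {
    "model": "deepseek-chat",
    "response_format": {"type": "text"},  # or {"type": "json_object"}
}

print(json.dumps(good_payload))
```

If the client-side plugin builds the request body, checking where response_format is attached to the payload is a reasonable first place to look.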
Self Checks
Dify version
0.13.2
Cloud or Self Hosted
Self Hosted (Docker), Self Hosted (Source)
Steps to reproduce
1. Create an LLM node in the workflow, select deepseek-chat as the model, and set the Response Format to text or json_object

2. Run the node
[deepseek] Error: API request failed with status code 422: Failed to deserialize the JSON body into the target type: response_format: invalid type: string "text", expected internally tagged enum ResponseFormat at line 1 column 88

✔️ Expected Behavior
The node runs successfully and returns model output in the selected response format.
❌ Actual Behavior
[deepseek] Error: API request failed with status code 422: Failed to deserialize the JSON body into the target type: response_format: invalid type: string "text", expected internally tagged enum ResponseFormat at line 1 column 88
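To make the error message concrete, here is a minimal sketch of what an "internally tagged enum" deserializer effectively does with the response_format field. This is a hypothetical stand-in (the function name and tag field are illustrative), not the actual server code, but it reproduces why a bare string like "text" is rejected while a tagged object passes:

```python
# Variants the sketch accepts; mirrors the formats named in the report.
VALID_TYPES = {"text", "json_object", "json_schema"}

def deserialize_response_format(value):
    """Mimic an internally tagged enum: the variant must arrive as an
    object carrying a tag field ("type"), never as a bare string."""
    if not isinstance(value, dict):
        raise ValueError(
            f"invalid type: {type(value).__name__} {value!r}, "
            "expected internally tagged enum ResponseFormat"
        )
    tag = value.get("type")
    if tag not in VALID_TYPES:
        raise ValueError(f"unknown variant {tag!r}")
    return tag

deserialize_response_format({"type": "text"})  # accepted
# deserialize_response_format("text")          # raises ValueError, like the 422
```

Under this reading, the fix belongs on the client side: wrap the selected format in an object (e.g. {"type": "text"}) before the request is sent.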