
Azure OpenAI o1 default temperature is 0, causing an unsupported value error #10053


Closed · kelvintsim opened this issue Oct 30, 2024 · 1 comment · Fixed by #10067

Comments

@kelvintsim

Self Checks

  • This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [FOR CHINESE USERS] Please be sure to submit issues in English, otherwise they will be closed. Thank you! :)
  • Please do not modify this template :) and fill in all the required fields.

Dify version

0.10.2

Cloud or Self Hosted

Self Hosted (Docker), Self Hosted (Source)

Steps to reproduce

Use Azure OpenAI o1-preview or Azure OpenAI o1-mini in an agent.
[Screenshot: 2024-10-30 2:18 PM]

Default temperature is set to 0
[Screenshot: 2024-10-30 12:26 PM]

✔️ Expected Behavior

The default temperature should be 1 when using o1-preview or o1-mini, since these models only support the default value of 1.

❌ Actual Behavior

The default temperature is 0, so requests to o1-preview and o1-mini fail with an unsupported value error.
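
For context, the Azure OpenAI o1-series models only accept the default temperature of 1, so any request that sends a temperature of 0 is rejected. Below is a minimal repro sketch using the openai Python SDK (v1.x) directly; the endpoint, key, API version, and deployment name are illustrative placeholders, not values taken from this issue.

```python
# Minimal repro sketch with the openai Python SDK (v1.x) against an Azure deployment.
# Endpoint, key, API version, and deployment name are illustrative placeholders.
import os

from openai import AzureOpenAI, BadRequestError

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-09-01-preview",
)

messages = [{"role": "user", "content": "Hello"}]

try:
    # o1-preview / o1-mini only accept the default temperature (1);
    # passing 0 is rejected by the service.
    client.chat.completions.create(
        model="o1-preview",  # Azure deployment name (placeholder)
        messages=messages,
        temperature=0,
    )
except BadRequestError as exc:
    print("temperature=0 rejected:", exc)

# Omitting temperature (or passing 1 explicitly) goes through.
response = client.chat.completions.create(model="o1-preview", messages=messages)
print(response.choices[0].message.content)
```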


dosubot bot commented Oct 30, 2024

Hello @kelvintsim! I'm here to assist you with any bugs, questions, or contributions. Let's work together to resolve the issue you're experiencing.

The default temperature setting in the Dify codebase for Azure OpenAI models is indeed 0.0 [1]. If you're encountering an error indicating that a temperature of 0 is unsupported by the model, you might need to adjust the default temperature setting to 1 for the specific models you're using, such as o1-preview or o1-mini.

To resolve this, you can manually change the temperature setting in the configuration to 1 through the user interface or by modifying the relevant configuration files if you're self-hosting. This should help avoid the "Bad Request Error" you're experiencing.
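
As an illustration only (this is not Dify's actual code, and not necessarily what #10067 changed), a self-hosted integration layer could defensively normalize parameters before dispatching to o1-family deployments, dropping an unsupported temperature so the service default of 1 applies:

```python
# Hypothetical helper, not Dify's implementation: strip a temperature value the
# o1-family models would reject so the service falls back to its default (1).
O1_MODELS = {"o1-preview", "o1-mini"}


def normalize_params(model: str, params: dict) -> dict:
    params = dict(params)  # copy so the caller's dict is not mutated
    if model in O1_MODELS and params.get("temperature", 1) != 1:
        params.pop("temperature", None)
    return params


print(normalize_params("o1-mini", {"temperature": 0}))  # {} -> service default (1)
print(normalize_params("gpt-4o", {"temperature": 0}))   # {'temperature': 0} unchanged
```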

To continue talking to Dosu, mention @dosu.
