Is vllm==0.8.3 causing some incompatibility problems? #602
Comments
vllm==0.7.1, it works
This is an automated vacation reply from QQ Mail. Offline, sorry.
Same question about VLLMModelConfig.__init__() got an unexpected keyword argument 'max_num_batched_tokens'
Hi everyone, yes, you need to use the pinned version of [...]. There's a separate issue with DP>1 that is being tracked here: huggingface/lighteval#670
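Since most of the breakage in this thread comes from a drifted dependency pin, a small stdlib check can fail fast before a long eval run. This is only a sketch: the package name and the "known-good" version 0.7.2 are assumptions taken from comments in this thread, not an official lighteval pin.

```python
from importlib.metadata import version, PackageNotFoundError


def check_pin(package: str, required: str) -> bool:
    """Return True only if `package` is installed at exactly `required`."""
    try:
        return version(package) == required
    except PackageNotFoundError:
        # Not installed at all counts as a failed pin.
        return False


# Example: warn before launching an evaluation if the pin drifted.
# (0.7.2 is the version a commenter above reported as working, not a guarantee.)
if not check_pin("vllm", "0.7.2"):
    print("warning: vllm is not pinned to 0.7.2; lighteval may break")
```

An exact-match check is deliberately strict; a looser range comparison would miss exactly the kind of minor-version API change (0.7.x vs 0.8.x) that this issue is about.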
Thanks!
Well, I tried cloning the latest version of the lighteval repository and installing from source.
The discrepancy in DeepSeek-R1-Distill-Qwen-7B's performance on AIME24 seems too large? If anyone gets similar or different results, we can share them and have a further discussion.
@lewtun please tell me which version of lighteval is compatible; I am also struggling with that
Hi, you can follow what I mentioned before:
After I increased max_new_token from 16k to 28k, I got higher performance on the 7B model but slightly lower performance on the 1.5B model.
Do you encounter the same problem mentioned in #463 when using [...]?
Hi there, we are encountering the truncation problem of [...]
Hi @Nativu5, I think you can mostly ignore the truncation warning, or alternatively set [...]
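The truncation warning above usually means the prompt exceeded the model's context budget and was cut. A minimal sketch of what such truncation does to a tokenized prompt is below; the function name, the `keep` parameter, and the token lists are all illustrative, not lighteval's actual API.

```python
def truncate_prompt(token_ids: list[int], max_len: int, keep: str = "end") -> list[int]:
    """Truncate a tokenized prompt to at most `max_len` tokens.

    keep="end" keeps the most recent tokens (the usual choice for chat-style
    prompts, where the latest turns matter most); keep="start" keeps the
    beginning instead.
    """
    if len(token_ids) <= max_len:
        return token_ids
    return token_ids[-max_len:] if keep == "end" else token_ids[:max_len]


# Prompts already under the budget pass through unchanged:
print(truncate_prompt([1, 2], 5))        # → [1, 2]
# Oversized prompts lose their oldest tokens by default:
print(truncate_prompt([1, 2, 3, 4, 5], 3))  # → [3, 4, 5]
```

This also illustrates why the warning is often ignorable: if only boilerplate at the head of the prompt is dropped, scores are barely affected, but silently losing the task instructions would not be.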
First I had vllm==0.8.3 and lighteval==0.8.1dev,
but ran into an AttributeError.
Then I followed some suggestions, like checking out the git repository at a certain version, and that problem really did disappear.
However, a new error says VLLMModelConfig.__init__() got an unexpected keyword argument 'max_num_batched_tokens'.
Then I remembered that when I first cloned the project, the vllm version was 0.7.2.
I pip installed version 0.7.2, but more problems arose.
:<
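The `unexpected keyword argument` error in the comment above is ordinary Python behavior: a config class from one library version is being constructed with a parameter that only exists in another version. A defensive sketch is to filter kwargs against the constructor's signature before calling it. The class below is a hypothetical stand-in, not lighteval's real `VLLMModelConfig`, and `build_config` is an illustrative helper, not part of any library.

```python
import inspect


class OldStyleConfig:
    """Hypothetical stand-in for an older config class that predates
    the `max_num_batched_tokens` field."""

    def __init__(self, model_name: str, dtype: str = "auto"):
        self.model_name = model_name
        self.dtype = dtype


def build_config(cls, **kwargs):
    """Construct `cls`, silently dropping any kwargs its __init__ rejects."""
    accepted = set(inspect.signature(cls.__init__).parameters) - {"self"}
    filtered = {k: v for k, v in kwargs.items() if k in accepted}
    return cls(**filtered)


# Passing the newer key directly would raise TypeError, exactly as in the
# traceback above; routing through build_config drops it instead:
cfg = build_config(OldStyleConfig, model_name="qwen", max_num_batched_tokens=8192)
print(cfg.model_name)  # → qwen
```

Silently dropping unknown options trades a crash for possibly ignored settings, so in practice matching the pinned versions (as the maintainer suggests earlier in the thread) is the cleaner fix; this sketch is mainly useful for diagnosing which side of the version mismatch you are on.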