
AttributeError: 'LlamaAttention' object has no attribute 'rotary_emb' when running llama-1-7b (raised in int_llama_layer.py at self.rotary_emb = copy.deepcopy(org_module.rotary_emb)) #104

Open
WX-yh opened this issue Jan 5, 2025 · 2 comments

Comments


WX-yh commented Jan 5, 2025

I encountered an error, "AttributeError: 'LlamaAttention' object has no attribute 'rotary_emb'", when running the code with llama-1-7b.
It happens in int_llama_layer.py:
self.rotary_emb = copy.deepcopy(org_module.rotary_emb)
What should I do?

@mostafaelhoushi

Downgrading transformers to 4.31.0 solved the problem for me:

pip install transformers==4.31.0
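Editor's note: the downgrade works because newer transformers releases moved the rotary embedding off the per-layer LlamaAttention module (in recent versions it lives on the top-level model instead). If downgrading is not an option, one hedged workaround is to look the attribute up in both places. The helper below is a sketch, not code from this repository; the `org_model` parameter is a hypothetical addition.

```python
import copy


def get_rotary_emb(org_module, org_model=None):
    """Return a deep copy of the rotary embedding, wherever it lives.

    org_module: the original attention layer (old transformers layout).
    org_model:  the original top-level model, where newer transformers
                releases keep a single shared rotary_emb (assumption).
    """
    if hasattr(org_module, "rotary_emb"):
        # transformers <= 4.31 layout: rotary_emb on each attention layer
        return copy.deepcopy(org_module.rotary_emb)
    if org_model is not None and hasattr(org_model, "rotary_emb"):
        # newer layout: rotary_emb on the model itself
        return copy.deepcopy(org_model.rotary_emb)
    raise AttributeError(
        "rotary_emb not found on the attention layer or the model; "
        "check your transformers version"
    )
```

With this helper, the failing line in int_llama_layer.py would become `self.rotary_emb = get_rotary_emb(org_module, org_model)`, at the cost of threading the model reference through to the layer constructor.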


WX-yh commented Jan 27, 2025

> Downgrading transformers to 4.31.0 solved the problem for me:
>
> pip install transformers==4.31.0

thank you!
