Update README.md #2

Closed

krenax wants to merge 2 commits

Conversation


@krenax krenax commented Dec 14, 2024

The example programs fail without these installation steps.

@davidmezzetti
Member

Hello, I appreciate the PR!

Would you mind sharing the error you're encountering? Installing autoawq[kernels] is supposed to install autoawq-kernels.
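
For reference, a quick way to confirm whether the kernels package actually resolved (a sketch for checking the environment, not part of this PR) is to inspect pip's metadata after installing the extra:

```bash
# Install the extra, then confirm the kernels package actually resolved
pip install "autoawq[kernels]"
pip show autoawq-kernels flash-attn   # prints metadata if installed, warns if a package is missing
```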

@krenax
Author

krenax commented Dec 15, 2024

The installation returns the following error:

```
Collecting flash-attn>=2.2.0
  Using cached flash_attn-2.7.2.post1.tar.gz (3.1 MB)
  Preparing metadata (setup.py) ... error
  error: subprocess-exited-with-error

  × python setup.py egg_info did not run successfully.
  │ exit code: 1
  ╰─> [6 lines of output]
      Traceback (most recent call last):
        File "<string>", line 2, in <module>
        File "<pip-setuptools-caller>", line 34, in <module>
        File "/tmp/pip-install-8w16qsvf/flash-attn_3fc2607ec6af40a481216e2b4a618a20/setup.py", line 11, in <module>
          from packaging.version import parse, Version
      ModuleNotFoundError: No module named 'packaging'
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
```
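
The traceback shows flash-attn's setup.py importing packaging before it is available in the build environment. A commonly suggested workaround (a sketch under the assumption that torch and a CUDA toolchain are already set up, and not necessarily what this PR adds to the README) is to pre-install the build prerequisites and skip pip's build isolation:

```bash
# Workaround sketch: assumes torch is already installed and a CUDA toolchain is available
pip install packaging ninja wheel            # packaging is imported by flash-attn's setup.py; ninja speeds up the build
pip install flash-attn --no-build-isolation  # build against the current environment instead of an isolated one
```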

@davidmezzetti
Member

I figured that was the case. I submitted this issue previously: casper-hansen/AutoAWQ#678

I also put in this PR to the upstream project to try to fix the underlying issue: Dao-AILab/flash-attention#1380

But as of now it isn't fixed, so perhaps this is a good solution. Are you sure it used the GPU without flash-attn installed?
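
One quick way to answer the GPU question (a sanity-check sketch, assuming a PyTorch-based stack and nvidia-smi available on the machine) is to check CUDA visibility and watch utilization while the example runs:

```bash
# Confirm PyTorch sees the GPU, then watch utilization while the example program runs
python -c "import torch; print(torch.cuda.is_available())"
nvidia-smi --loop=2   # utilization should rise during generation if the GPU is actually used
```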

@davidmezzetti
Member

There were a couple of other updates regarding the install, so I updated the README to include them. Thank you for submitting this issue.

@davidmezzetti davidmezzetti added this to the v0.2.0 milestone Dec 17, 2024
@davidmezzetti davidmezzetti self-assigned this Dec 17, 2024
@krenax krenax changed the title from Update README.md to update README.md Jan 9, 2025
@krenax krenax changed the title from update README.md to Update README.md Jan 13, 2025