Python and flash_attn can be incompatible #320

@clessig

What happened?

We declare python >=3.11 and <3.13, but the pinned flash_attn wheel is built only for CPython 3.12 (the cp312 tag), so installation fails under Python 3.11:

{ url = "https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp312-cp312-linux_x86_64.whl", marker = "sys_platform == 'linux'" },
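One way to reconcile the two constraints is to gate flash_attn behind a PEP 508 environment marker, so it is only requested on interpreters the cp312 wheel supports. A minimal sketch, assuming a standard pyproject.toml layout (the table names and version pin below are illustrative, not the project's actual config):

    [project]
    requires-python = ">=3.11,<3.13"
    dependencies = [
        # Hypothetical guard: only request flash-attn where the cp312 Linux wheel applies.
        "flash-attn==2.7.4.post1 ; python_version == '3.12' and sys_platform == 'linux'",
    ]

The alternative is to raise requires-python to ">=3.12,<3.13" so the resolver never sees an interpreter the pinned wheel cannot satisfy.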

What are the steps to reproduce the bug?

Install the project with Python 3.11.

Version

develop

Platform (OS and architecture)

Linux

Relevant log output

Accompanying data

No response

Organisation

No response

Metadata

Labels

bug (Something isn't working), infra (Issues related to infrastructure)

Status

Done
