
[Flex Attention] Remove forked pytorch usage from benchmarks workflow #3887

Closed
@vlad-penkin

Description


# Install Pytorch with FlexAttention XPU support enabled

Our guiding principle is to use the upstream PyTorch repo only. If some PRs are not yet merged, list them here:
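A minimal sketch of what the reworked install step could look like, assuming upstream nightly XPU wheels are sufficient; the index URL follows PyTorch's published nightly wheel layout, but the exact channel pinned by the benchmarks workflow is an assumption:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Install PyTorch with FlexAttention XPU support enabled.
# Use upstream nightly XPU wheels instead of the forked repo
# (index URL assumed; adjust to the channel the workflow pins).
pip install --pre torch --index-url https://download.pytorch.org/whl/nightly/xpu

# Sanity check: confirm the installed build exposes the XPU backend.
python -c "import torch; print(torch.__version__, torch.xpu.is_available())"
```

If a required PR is not yet merged upstream, the workflow would temporarily fall back to building from a branch that cherry-picks it, and that PR would be listed in this issue until it lands.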
