
[Flex Attention] Apply patch from pytorch#143553 instead of using fork #3945


Merged: 1 commit merged into main on Apr 16, 2025

Conversation

anmyachev (Contributor)

No description provided.

@@ -18,3 +18,4 @@ cd "$REPO_ROOT"

# curl -sSL https://github.com/pytorch/pytorch/pull/126516.diff | git apply -
git apply "${SCRIPT_DIR}/pytorch_fp64.patch"
curl -sSL https://github.com/pytorch/pytorch/pull/143553.diff | git apply --exclude=test/inductor/test_flex_attention.py --exclude=test/inductor/test_flex_decoding.py -
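For context, the added line downloads the diff for pytorch#143553 and pipes it directly into git apply, excluding the two test files. As a hypothetical local check (a sketch, not part of this PR), the same pipeline can be run with git apply --check to verify that the upstream patch still applies cleanly before modifying the tree:

# Dry run (sketch, not in the PR): --check reports whether the patch would
# apply, without changing any files; the trailing "-" reads the diff from stdin.
curl -sSL https://github.com/pytorch/pytorch/pull/143553.diff \
  | git apply --check \
      --exclude=test/inductor/test_flex_attention.py \
      --exclude=test/inductor/test_flex_decoding.py -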
anmyachev (Contributor Author):


The test files currently contain a lot of changes. Since those changes are not needed to run the benchmarks, I excluded them to keep the patch application simple.

@anmyachev anmyachev marked this pull request as ready for review April 16, 2025 19:30
@anmyachev (Contributor Author):

@whitneywhtsang do you want to merge #3943 before?

@whitneywhtsang (Contributor):

> @whitneywhtsang do you want to merge #3943 before?

It is now merged.

@whitneywhtsang whitneywhtsang merged commit 4a53fb0 into main Apr 16, 2025
6 checks passed
@whitneywhtsang whitneywhtsang deleted the amyachev/3887 branch April 16, 2025 19:35
Successfully merging this pull request may close these issues:

[Flex Attention] Remove forked pytorch usage from benchmarks workflow