Commit feff99f

update flash attn select (#54630) (#54716)

1 parent: 570daa1

File tree

1 file changed (+1, -1)


python/paddle/nn/functional/flash_attention.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -81,7 +81,7 @@ def _math_attention(


 def _select_sdp_cuda(head_dim):
-    if head_dim < 128:
+    if head_dim <= 128:
         return "flash_attn"
     else:
         return "mem_efficient"
```
