misc: jit: Deprecate load_cuda_ops()
#1066
Conversation
Would you mind checking the failed case in the CI run below?
https://ci.tlcpack.ai/blue/organizations/jenkins/flashinfer-ci/detail/lequn%2Fremove-load_cuda_ops/2/pipeline/16

[2025-05-19T07:21:06.255Z] !!!!! _pytest.outcomes.Exit: 'tuple' object has no attribute 'ninja_path' !!!!!!
Force-pushed from 7cf6b0f to 3274066.
Can we still keep `load_cuda_ops` for a while? Some of the other PRs still depend on it.

Can you point to the other use cases for `load_cuda_ops()`?

They come from some of the PRs that have not been merged yet.
Force-pushed from 3274066 to 939a30d.
Updated. Added deprecation warning.
LGTM, thank you!
Part of AOT Refactor (flashinfer-ai#1064). This PR changes the callers of `load_cuda_ops()` to use `JitSpec` directly, and adds a deprecation warning for `load_cuda_ops()`.
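The deprecation path could look roughly like the sketch below. This is a minimal illustration only: the `JitSpec` stub and the `load_cuda_ops()` signature shown here are assumptions standing in for flashinfer's real JIT interfaces, not its actual API.

```python
# Illustrative sketch only: this JitSpec stub and the load_cuda_ops() signature
# are assumptions, not flashinfer's real JIT code.
import warnings
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class JitSpec:
    """Hypothetical minimal stand-in for the real JitSpec."""
    name: str
    sources: List[str]
    extra_cuda_cflags: List[str] = field(default_factory=list)

    def build_and_load(self):
        # The real implementation would compile the CUDA sources and load the
        # resulting extension module; stubbed out here for illustration.
        return f"<module {self.name}>"


def load_cuda_ops(name: str, sources: List[str],
                  extra_cuda_cflags: Optional[List[str]] = None):
    """Deprecated shim: callers should construct a JitSpec directly."""
    warnings.warn(
        "load_cuda_ops() is deprecated; use JitSpec directly.",
        DeprecationWarning,
        stacklevel=2,  # attribute the warning to the caller, not this wrapper
    )
    spec = JitSpec(name, sources, extra_cuda_cflags or [])
    return spec.build_and_load()
```

With a shim like this, existing callers keep working while emitting a `DeprecationWarning`, and new code can construct the `JitSpec` itself.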