
Commit 9c264c0

misc: update README.md (#1003)
1 parent 382a4d7 commit 9c264c0

2 files changed: +18 −3 lines changed

README.md

+3 −3

````diff
@@ -48,15 +48,15 @@ Using our PyTorch API is the easiest way to get started:
 We provide prebuilt python wheels for Linux. Install FlashInfer with the following command:
 
 ```bash
-# For CUDA 12.4 & torch 2.5
-pip install flashinfer-python -i https://flashinfer.ai/whl/cu124/torch2.5
+# For CUDA 12.6 & torch 2.6
+pip install flashinfer-python -i https://flashinfer.ai/whl/cu126/torch2.6
 # For other CUDA & torch versions, check https://docs.flashinfer.ai/installation.html
 ```
 
 To try the latest features from the main branch, use our nightly-built wheels:
 
 ```bash
-pip install flashinfer-python -i https://flashinfer.ai/whl/nightly/cu124/torch2.4
+pip install flashinfer-python -i https://flashinfer.ai/whl/nightly/cu126/torch2.6
 ```
 
 For a JIT version (compiling every kernel from scratch, [NVCC](https://developer.nvidia.com/cuda-downloads) is required), install from [PyPI](https://pypi.org/project/flashinfer-python/):
````
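Both updated index URLs follow the pattern `cu<CUDA version>/torch<major.minor>`. As a quick sanity check before picking an index, you can print the locally installed torch and CUDA versions and match them against the URL path. The probe below is an illustrative sketch using standard PyTorch attributes; it is not part of the commit:

```bash
# Print the installed torch version and the CUDA version it was built against;
# torch.__version__ and torch.version.cuda are standard PyTorch attributes.
python -c "import torch; print(torch.__version__, torch.version.cuda)"

# e.g. output "2.6.0 12.6" maps to the cu126/torch2.6 index from the diff above:
pip install flashinfer-python -i https://flashinfer.ai/whl/cu126/torch2.6
```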

docs/installation.rst

+15
````diff
@@ -27,6 +27,21 @@ Quick Start
 The easiest way to install FlashInfer is via pip, we host wheels with indexed URL for different PyTorch versions and CUDA versions. Please note that the package currently used by FlashInfer is named ``flashinfer-python``, not ``flashinfer``.
 
 .. tabs::
+    .. tab:: PyTorch 2.6
+
+        .. tabs::
+
+            .. tab:: CUDA 12.6
+
+                .. code-block:: bash
+
+                    pip install flashinfer-python -i https://flashinfer.ai/whl/cu126/torch2.6/
+
+            .. tab:: CUDA 12.4
+
+                .. code-block:: bash
+
+                    pip install flashinfer-python -i https://flashinfer.ai/whl/cu124/torch2.6/
 
 
     .. tab:: PyTorch 2.5
````
