@@ -57,7 +57,7 @@ following platforms. See [Building on Unsupported
Platforms](#building-on-unsupported-platforms) if you are attempting
to build Triton on a platform that is not listed here.

-* [Ubuntu 20.04, x86-64](#building-for-ubuntu-2004)
+* [Ubuntu 22.04, x86-64](#building-for-ubuntu-2204)

* [Jetpack 4.x, NVIDIA Jetson (Xavier, Nano, TX2)](#building-for-jetpack-4x)
@@ -67,9 +67,9 @@ If you are developing or debugging Triton, see [Development and
Incremental Builds](#development-and-incremental-builds) for information
on how to perform incremental build.

-## Building for Ubuntu 20.04
+## Building for Ubuntu 22.04

-For Ubuntu-20.04, build.py supports both a Docker build and a
+For Ubuntu-22.04, build.py supports both a Docker build and a
non-Docker build.

* [Build using Docker](#building-with-docker) and the TensorFlow and PyTorch
@@ -274,7 +274,7 @@ issues since non-supported versions are not tested.
## Building for Windows 10
For Windows 10, build.py supports both a Docker build and a non-Docker
-build in a similar way as described for [Ubuntu](#building-for-ubuntu-2004). The primary
+build in a similar way as described for [Ubuntu](#building-for-ubuntu-2204). The primary
difference is that the minimal/base image used as the base of
Dockerfile.buildbase image can be built from the provided
[Dockerfile.win10.min](https://github.com/triton-inference-server/server/blob/main/Dockerfile.win10.min)
@@ -378,7 +378,7 @@ platforms by reading the above documentation and then follow the
process for the supported platform that most closely matches the
platform you are interested in (for example, if you are trying to
build for RHEL/x86-64 then follow the [ Building for Ubuntu
-20.04](#building-for-ubuntu-2004) process. You will likely need to
+22.04](#building-for-ubuntu-2204) process. You will likely need to
make changes in the following areas and then manually run docker_build
and cmake_build or the equivalent commands to perform a build.
@@ -410,7 +410,7 @@ and cmake_build or the equivalent commands to perform a build.
[TensorFlow](https://github.com/triton-inference-server/tensorflow_backend)
backend extracts pre-built shared libraries from the TensorFlow NGC
container as part of the build. This container is only available for
-Ubuntu-20.04 / x86-64, so if you require the TensorFlow backend for
+Ubuntu-22.04 / x86-64, so if you require the TensorFlow backend for
your platform you will need download the TensorFlow container and
modify its build to produce shared libraries for your platform. You
must use the TensorFlow source and build scripts from within the NGC
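The hunks above only retarget the documented platform from Ubuntu 20.04 to 22.04; the build.py workflow they describe is unchanged. As a hedged illustration of that workflow (the specific flags here are assumptions, not taken from this diff — verify them against `./build.py --help` on your release branch), a Docker-based build might be composed like this:

```shell
# Sketch of composing a Docker-based Triton build command on Ubuntu 22.04.
# The flags below are assumed typical build.py options; check
# `./build.py --help` in your checkout before running. The command is
# echoed rather than executed so it can be reviewed first (a real build
# takes hours and requires Docker plus the NGC base containers).
BUILD_CMD="./build.py -v --enable-logging --enable-stats --enable-gpu --endpoint=http --endpoint=grpc --backend=onnxruntime"
echo "$BUILD_CMD"
```

Echoing the composed command as a dry run is just a review aid; drop the `echo` indirection to actually launch the build.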