-This tutorial takes `mmdeploy-1.0.0rc0-windows-amd64-onnxruntime1.8.1.zip` and `mmdeploy-1.0.0rc0-windows-amd64-cuda11.1-tensorrt8.2.3.0.zip` as examples to show how to use the prebuilt packages.
+This tutorial takes `mmdeploy-1.0.0rc1-windows-amd64-onnxruntime1.8.1.zip` and `mmdeploy-1.0.0rc1-windows-amd64-cuda11.1-tensorrt8.2.3.0.zip` as examples to show how to use the prebuilt packages.
The directory structure of the prebuilt package is as follows: the `dist` folder contains the model converter, and the `sdk` folder contains everything related to model inference.
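As a rough sketch of that layout (only `dist` and `sdk` are stated here; the subfolders shown are assumptions inferred from the paths used later in this tutorial):

```
mmdeploy-1.0.0rc1-windows-amd64-onnxruntime1.8.1
|-- dist            # model converter (mmdeploy wheel)
`-- sdk             # model inference
    |-- bin         # runtime DLLs
    |-- example     # C API demos (e.g. image_classification)
    `-- python      # SDK Python API (mmdeploy_python), assumed location
```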
@@ -80,9 +80,9 @@ In order to use `ONNX Runtime` backend, you should also do the following steps.
5. Install `mmdeploy` (Model Converter) and `mmdeploy_python` (SDK Python API).
@@ -329,15 +329,15 @@ The following describes how to use the SDK's C API for inference
:point_right: The purpose is to let the exe find the relevant DLLs.
-If you choose to add environment variables, add the runtime library path of `mmdeploy` (`mmdeploy-1.0.0rc0-windows-amd64-onnxruntime1.8.1\sdk\bin`) to `PATH`.
+If you choose to add environment variables, add the runtime library path of `mmdeploy` (`mmdeploy-1.0.0rc1-windows-amd64-onnxruntime1.8.1\sdk\bin`) to `PATH`.
If you choose to copy the dynamic libraries, copy the DLLs in the `bin` directory to the directory containing the newly compiled exe (`build/Release`).
3. Inference:
It is recommended to use `CMD` here.
-Under `mmdeploy-1.0.0rc0-windows-amd64-onnxruntime1.8.1\\sdk\\example\\build\\Release` directory:
+Under `mmdeploy-1.0.0rc1-windows-amd64-onnxruntime1.8.1\\sdk\\example\\build\\Release` directory:
```
.\image_classification.exe cpu C:\workspace\work_dir\onnx\resnet\ C:\workspace\mmclassification\demo\demo.JPEG
@@ -347,15 +347,15 @@ The following describes how to use the SDK's C API for inference
1. Build examples
-Under `mmdeploy-1.0.0rc0-windows-amd64-cuda11.1-tensorrt8.2.3.0\\sdk\\example` directory
+Under `mmdeploy-1.0.0rc1-windows-amd64-cuda11.1-tensorrt8.2.3.0\\sdk\\example` directory
```
// Path should be modified according to the actual location
@@ -365,15 +365,15 @@ The following describes how to use the SDK's C API for inference
:point_right: The purpose is to let the exe find the relevant DLLs.
-If you choose to add environment variables, add the runtime library path of `mmdeploy` (`mmdeploy-1.0.0rc0-windows-amd64-cuda11.1-tensorrt8.2.3.0\sdk\bin`) to `PATH`.
+If you choose to add environment variables, add the runtime library path of `mmdeploy` (`mmdeploy-1.0.0rc1-windows-amd64-cuda11.1-tensorrt8.2.3.0\sdk\bin`) to `PATH`.
If you choose to copy the dynamic libraries, copy the DLLs in the `bin` directory to the directory containing the newly compiled exe (`build/Release`).
3. Inference
It is recommended to use `CMD` here.
-Under `mmdeploy-1.0.0rc0-windows-amd64-cuda11.1-tensorrt8.2.3.0\\sdk\\example\\build\\Release` directory
+Under `mmdeploy-1.0.0rc1-windows-amd64-cuda11.1-tensorrt8.2.3.0\\sdk\\example\\build\\Release` directory
```
.\image_classification.exe cuda C:\workspace\work_dir\trt\resnet C:\workspace\mmclassification\demo\demo.JPEG