YOLOv8 Multi-batch TensorRT Inference (Python)


Resource description:

PyTorch => ONNX => TensorRT (Python inference)
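As a quick orientation for the PyTorch => ONNX step, the snippet below uses the plain `ultralytics` export API. This is only an illustrative sketch of the "Normal Usage" path described in the README that follows; the bundled `export-det.py` is what additionally bakes the bbox decoder and NMS into the graph, and the weight name and arguments here are merely examples.

```python
# Plain PyTorch -> ONNX export via the ultralytics API (illustrative sketch only).
# The bundled export-det.py is the script that adds the End2End NMS postprocess.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # any YOLOv8 detection weight
model.export(format="onnx", opset=11, imgsz=640, simplify=True)
# A yolov8n.onnx file is written next to the weight.
```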
# YOLOv8-TensorRT

`YOLOv8` using TensorRT accelerate !

---

[![Build Status](https://img.shields.io/endpoint.svg?url=https%3A%2F%2Factions-badge.atrox.dev%2Fatrox%2Fsync-dotenv%2Fbadge&style=flat)](https://github.com/triple-Mu/YOLOv8-TensorRT) [![Python Version](https://img.shields.io/badge/Python-3.8--3.10-FFD43B?logo=python)](https://github.com/triple-Mu/YOLOv8-TensorRT) [![img](https://badgen.net/badge/icon/tensorrt?icon=azurepipelines&label)](https://developer.nvidia.com/tensorrt) [![C++](https://img.shields.io/badge/CPP-11%2F14-yellow)](https://github.com/triple-Mu/YOLOv8-TensorRT) [![img](https://badgen.net/github/license/triple-Mu/YOLOv8-TensorRT)](https://github.com/triple-Mu/YOLOv8-TensorRT/blob/main/LICENSE) [![img](https://badgen.net/github/prs/triple-Mu/YOLOv8-TensorRT)](https://github.com/triple-Mu/YOLOv8-TensorRT/pulls) [![img](https://img.shields.io/github/stars/triple-Mu/YOLOv8-TensorRT?color=ccf)](https://github.com/triple-Mu/YOLOv8-TensorRT)

---

# Prepare the environment

1. Install `CUDA` following the [`CUDA official website`](https://docs.nvidia.com/cuda/cuda-installation-guide-linux/index.html#download-the-nvidia-cuda-toolkit).

   🚀 RECOMMENDED `CUDA` >= 11.4

2. Install `TensorRT` following the [`TensorRT official website`](https://developer.nvidia.com/nvidia-tensorrt-8x-download).

   🚀 RECOMMENDED `TensorRT` >= 8.4

3. Install the Python requirements.

   ``` shell
   pip install -r requirements.txt
   ```

4. Install the [`ultralytics`](https://github.com/ultralytics/ultralytics) package for ONNX export or TensorRT API building.

   ``` shell
   pip install ultralytics
   ```

5. Prepare your own PyTorch weights such as `yolov8s.pt` or `yolov8s-seg.pt`.

***NOTICE:*** Please use the latest `CUDA` and `TensorRT` so that you can achieve the fastest speed. If you have to use a lower version of `CUDA` and `TensorRT`, please read the relevant issues carefully.

# Normal Usage

If you get your ONNX model from the original [`ultralytics`](https://github.com/ultralytics/ultralytics) repo, you have to build the engine yourself and can only use the `c++` inference code to deserialize the engine and run inference. You can get more information in [`Normal.md`](docs/Normal.md); the other scripts won't work with such an engine.

# Export End2End ONNX with NMS

You can export your ONNX model with the `ultralytics` API and, at the same time, add postprocessing such as the bbox decoder and `NMS` into the ONNX model.

``` shell
python3 export-det.py \
--weights yolov8s.pt \
--iou-thres 0.65 \
--conf-thres 0.25 \
--topk 100 \
--opset 11 \
--sim \
--input-shape 1 3 640 640 \
--device cuda:0
```

#### Description of all arguments

- `--weights` : The PyTorch model you trained.
- `--iou-thres` : IoU threshold for the NMS plugin.
- `--conf-thres` : Confidence threshold for the NMS plugin.
- `--topk` : Max number of detection bboxes.
- `--opset` : ONNX opset version, default is 11.
- `--sim` : Whether to simplify your ONNX model.
- `--input-shape` : Input shape for your model; it must have 4 dimensions.
- `--device` : The CUDA device you export the engine on.

You will get an ONNX model whose prefix is the same as the input weights.
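Before building an engine, it can be worth confirming that the batch dimension you passed via `--input-shape` was actually baked into the exported model. The sketch below is my own addition, not a repo script; it only assumes `pip install onnx` and prints the input/output shapes of the End2End ONNX file.

```python
# Inspect an exported End2End ONNX model (sanity-check sketch, not part of the repo).
import onnx

model = onnx.load("yolov8s.onnx")  # path produced by export-det.py
onnx.checker.check_model(model)    # structural validity check

def dims(value_info):
    """Return a graph input/output shape as ints or symbolic dim names."""
    return [d.dim_value if d.dim_value > 0 else d.dim_param
            for d in value_info.type.tensor_type.shape.dim]

for inp in model.graph.input:
    print("input :", inp.name, dims(inp))
for out in model.graph.output:
    print("output:", out.name, dims(out))
```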
### Just Taste First

If you just want to have a quick taste, you can download one of the ONNX models below, which were exported by the `YOLOv8` package and modified by me:

- [**YOLOv8-n**](https://triplemu-shared.oss-cn-beijing.aliyuncs.com/models/yolov8n.onnx?OSSAccessKeyId=LTAI5tNk9iiMqhFC64jCcgpv&Expires=2690974569&Signature=3ct9pnRygBduWdgAtfKOQAt4PeU%3D)
- [**YOLOv8-s**](https://triplemu-shared.oss-cn-beijing.aliyuncs.com/models/yolov8s.onnx?OSSAccessKeyId=LTAI5tNk9iiMqhFC64jCcgpv&Expires=10000000001690974000&Signature=cbHjUwmRsYdvilcirzjBI6%2BzmvI%3D)
- [**YOLOv8-m**](https://triplemu-shared.oss-cn-beijing.aliyuncs.com/models/yolov8m.onnx?OSSAccessKeyId=LTAI5tNk9iiMqhFC64jCcgpv&Expires=101690974603&Signature=XnJnQqbKsnJSKSgqVQ41kxoeETU%3D)
- [**YOLOv8-l**](https://triplemu-shared.oss-cn-beijing.aliyuncs.com/models/yolov8l.onnx?OSSAccessKeyId=LTAI5tNk9iiMqhFC64jCcgpv&Expires=2690974619&Signature=djxvNzcaFosHrMS5ylWh1R0%2Ff8E%3D)
- [**YOLOv8-x**](https://triplemu-shared.oss-cn-beijing.aliyuncs.com/models/yolov8x.onnx?OSSAccessKeyId=LTAI5tNk9iiMqhFC64jCcgpv&Expires=2690974637&Signature=DMmuT2wlfBzai%2BBpYJFcmNbkMKU%3D)

# Build End2End Engine from ONNX

### 1. Build Engine by TensorRT ONNX Python api

You can export a TensorRT engine from ONNX with [`build.py`](build.py).

Usage:

``` shell
python3 build.py \
--weights yolov8s.onnx \
--iou-thres 0.65 \
--conf-thres 0.25 \
--topk 100 \
--fp16 \
--device cuda:0
```

#### Description of all arguments

- `--weights` : The ONNX model you downloaded.
- `--iou-thres` : IoU threshold for the NMS plugin.
- `--conf-thres` : Confidence threshold for the NMS plugin.
- `--topk` : Max number of detection bboxes.
- `--fp16` : Whether to export a half-precision engine.
- `--device` : The CUDA device you export the engine on.

You can modify `iou-thres`, `conf-thres` and `topk` yourself.

### 2. Export Engine by Trtexec Tools

You can export a TensorRT engine with the [`trtexec`](https://github.com/NVIDIA/TensorRT/tree/main/samples/trtexec) tool.

Usage:

``` shell
/usr/src/tensorrt/bin/trtexec \
--onnx=yolov8s.onnx \
--saveEngine=yolov8s.engine \
--fp16
```

**If you installed TensorRT from a Debian package, `trtexec` is located at `/usr/src/tensorrt/bin/trtexec`.**

**If you installed TensorRT from a tar package, `trtexec` is under the `bin` folder of the directory you extracted it to.**

# Build TensorRT Engine by TensorRT API

Please see more information in [`API-Build.md`](docs/API-Build.md).

***Notice !!!*** We don't support the YOLOv8-seg model for now !!!

# Inference

## 1. Infer with python script

You can infer images with the engine using [`infer-det.py`](infer-det.py).

Usage:

``` shell
python3 infer-det.py \
--engine yolov8s.engine \
--imgs data \
--show \
--out-dir outputs \
--device cuda:0
```

#### Description of all arguments

- `--engine` : The engine you exported.
- `--imgs` : The path of the images you want to detect.
- `--show` : Whether to show detection results.
- `--out-dir` : Where to save detection result images. It has no effect when the `--show` flag is used.
- `--device` : The CUDA device you use.
- `--profile` : Profile the TensorRT engine.
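Since this package focuses on multi-batch inference (see `infer-det-bach4.py` in the file list), the usual preprocessing step is to letterbox several images to the engine's input resolution and stack them into one NCHW tensor. The sketch below is a generic illustration under that assumption, not the repo's own code; the function names are mine.

```python
# Letterbox a list of images into one (N, 3, H, W) float32 batch.
# Generic illustration only; requires opencv-python and numpy.
import cv2
import numpy as np

def letterbox(img, new_shape=(640, 640), color=(114, 114, 114)):
    """Resize keeping aspect ratio, then pad to new_shape (h, w)."""
    h, w = img.shape[:2]
    r = min(new_shape[0] / h, new_shape[1] / w)
    nh, nw = int(round(h * r)), int(round(w * r))
    resized = cv2.resize(img, (nw, nh), interpolation=cv2.INTER_LINEAR)
    top, left = (new_shape[0] - nh) // 2, (new_shape[1] - nw) // 2
    canvas = np.full((new_shape[0], new_shape[1], 3), color, dtype=np.uint8)
    canvas[top:top + nh, left:left + nw] = resized
    return canvas, r, (left, top)  # ratio and padding are needed to map boxes back

def make_batch(paths, size=(640, 640)):
    """Read the images, letterbox each one and stack them into an NCHW batch."""
    tensors = []
    for p in paths:
        padded, _, _ = letterbox(cv2.imread(p), size)
        rgb = cv2.cvtColor(padded, cv2.COLOR_BGR2RGB)
        tensors.append(rgb.transpose(2, 0, 1).astype(np.float32) / 255.0)
    return np.stack(tensors, axis=0)

if __name__ == "__main__":
    batch = make_batch(["data/bus.jpg", "data/zidane.jpg", "data/bus1.jpg", "data/zidane1.jpg"])
    print(batch.shape)  # (4, 3, 640, 640) -- matches a batch-4 engine input
```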
## 2. Infer with C++

You can infer with C++ using the code in [`csrc/detect/end2end`](csrc/detect/end2end).

### Build:

Please set your own libraries in [`CMakeLists.txt`](csrc/detect/end2end/CMakeLists.txt) and modify `CLASS_NAMES` and `COLORS` in [`main.cpp`](csrc/detect/end2end/main.cpp).

``` shell
export root=${PWD}
cd csrc/detect/end2end
mkdir -p build && cd build
cmake ..
make
mv yolov8 ${root}
cd ${root}
```

Usage:

``` shell
# infer image
./yolov8 yolov8s.engine data/bus.jpg
# infer images
./yolov8 yolov8s.engine data
# infer video
./yolov8 yolov8s.engine data/test.mp4 # the video path
```

# TensorRT Segment Deploy

Please see more information in [`Segment.md`](docs/Segment.md).

# TensorRT Pose Deploy

Please see more information in [`Pose.md`](docs/Pose.md).

# DeepStream Detection Deploy

See more in [`README.md`](csrc/deepstream/README.md).

# Jetson Deploy

Only tested on `Jetson-NX 4GB`. See more in [`Jetson.md`](docs/Jetson.md).

# Profile your engine

If you want to profile the TensorRT engine:

``` shell
python3 trt-profile.py --engine yolov8s.engine --device cuda:0
```

# Refuse To Use PyTorch for Model Inference !!!

If you need to break away from PyTorch and use TensorRT for inference, you can get more information from [`infer-det-without-torch.py`](infer-det-without-torch.py). The usage is the same as the PyTorch version, but its performance is much worse. You can use `cuda-python` or `pycuda` for inference. Please install one of them:

```shell
pip install cuda-python
# or
pip install pycuda
```

Usage:

``` shell
python3 infer-det-without-torch.py \
--engine yolov8s.engine \
--imgs data \
--show \
--out-dir outputs \
--method cudart
```

#### Description of all arguments

- `--engine` : The engine you exported.
- `--imgs` : The path of the images you want to detect.
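For reference, here is a minimal sketch of what torch-free inference generally looks like with the TensorRT 8.x binding API plus `pycuda`. It is not the repo's `infer-det-without-torch.py`; it assumes a fixed-shape engine (e.g. the bundled `yolov8n_bach4.engine`) and feeds random data purely to show the host/device data flow.

```python
# Minimal torch-free TensorRT inference sketch (TensorRT 8.x + pycuda), illustration only.
import numpy as np
import pycuda.autoinit  # noqa: F401 -- creates a CUDA context on import
import pycuda.driver as cuda
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
with open("yolov8n_bach4.engine", "rb") as f, trt.Runtime(logger) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())
context = engine.create_execution_context()

# Allocate one host/device buffer pair per binding (fixed-shape engine assumed).
host_bufs, dev_bufs, bindings = [], [], []
for i in range(engine.num_bindings):
    shape = tuple(engine.get_binding_shape(i))
    dtype = trt.nptype(engine.get_binding_dtype(i))
    host = np.empty(shape, dtype=dtype)
    dev = cuda.mem_alloc(host.nbytes)
    host_bufs.append(host)
    dev_bufs.append(dev)
    bindings.append(int(dev))

# Binding 0 is assumed to be the image input; fill it with a preprocessed (N, 3, 640, 640) batch.
host_bufs[0][...] = np.random.rand(*host_bufs[0].shape).astype(host_bufs[0].dtype)

cuda.memcpy_htod(dev_bufs[0], host_bufs[0])
context.execute_v2(bindings)                 # synchronous execution
for i in range(1, engine.num_bindings):      # copy every output back to host
    cuda.memcpy_dtoh(host_bufs[i], dev_bufs[i])
    print(engine.get_binding_name(i), host_bufs[i].shape)
```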

Resource file list:

YOLOv8-TensorRT-main-2024.9.18.zip contains about 128 files
  1. YOLOv8-TensorRT-main-2024.9.18/
  2. YOLOv8-TensorRT-main-2024.9.18/.gitignore 1.82KB
  3. YOLOv8-TensorRT-main-2024.9.18/.idea/
  4. YOLOv8-TensorRT-main-2024.9.18/.idea/.gitignore 50B
  5. YOLOv8-TensorRT-main-2024.9.18/.idea/inspectionProfiles/
  6. YOLOv8-TensorRT-main-2024.9.18/.idea/inspectionProfiles/profiles_settings.xml 174B
  7. YOLOv8-TensorRT-main-2024.9.18/.idea/inspectionProfiles/Project_Default.xml 4.1KB
  8. YOLOv8-TensorRT-main-2024.9.18/.idea/misc.xml 292B
  9. YOLOv8-TensorRT-main-2024.9.18/.idea/modules.xml 299B
  10. YOLOv8-TensorRT-main-2024.9.18/.idea/workspace.xml 7.08KB
  11. YOLOv8-TensorRT-main-2024.9.18/.idea/YOLOv8-TensorRT-main.iml 329B
  12. YOLOv8-TensorRT-main-2024.9.18/.pre-commit-config.yaml 646B
  13. YOLOv8-TensorRT-main-2024.9.18/build.py 1.87KB
  14. YOLOv8-TensorRT-main-2024.9.18/cmd.txt 1.36KB
  15. YOLOv8-TensorRT-main-2024.9.18/config.py 2.62KB
  16. YOLOv8-TensorRT-main-2024.9.18/csrc/
  17. YOLOv8-TensorRT-main-2024.9.18/csrc/deepstream/
  18. YOLOv8-TensorRT-main-2024.9.18/csrc/deepstream/CMakeLists.txt 1.52KB
  19. YOLOv8-TensorRT-main-2024.9.18/csrc/deepstream/config_yoloV8.txt 3.06KB
  20. YOLOv8-TensorRT-main-2024.9.18/csrc/deepstream/custom_bbox_parser/
  21. YOLOv8-TensorRT-main-2024.9.18/csrc/deepstream/custom_bbox_parser/nvdsparsebbox_yoloV8.cpp 4.77KB
  22. YOLOv8-TensorRT-main-2024.9.18/csrc/deepstream/deepstream_app_config.txt 2.56KB
  23. YOLOv8-TensorRT-main-2024.9.18/csrc/deepstream/labels.txt 625B
  24. YOLOv8-TensorRT-main-2024.9.18/csrc/deepstream/README.md 2.08KB
  25. YOLOv8-TensorRT-main-2024.9.18/csrc/detect/
  26. YOLOv8-TensorRT-main-2024.9.18/csrc/detect/end2end/
  27. YOLOv8-TensorRT-main-2024.9.18/csrc/detect/end2end/CMakeLists.txt 1.55KB
  28. YOLOv8-TensorRT-main-2024.9.18/csrc/detect/end2end/include/
  29. YOLOv8-TensorRT-main-2024.9.18/csrc/detect/end2end/include/common.hpp 4.34KB
  30. YOLOv8-TensorRT-main-2024.9.18/csrc/detect/end2end/include/yolov8.hpp 9.79KB
  31. YOLOv8-TensorRT-main-2024.9.18/csrc/detect/end2end/main.cpp 5.45KB
  32. YOLOv8-TensorRT-main-2024.9.18/csrc/detect/normal/
  33. YOLOv8-TensorRT-main-2024.9.18/csrc/detect/normal/CMakeLists.txt 1.69KB
  34. YOLOv8-TensorRT-main-2024.9.18/csrc/detect/normal/include/
  35. YOLOv8-TensorRT-main-2024.9.18/csrc/detect/normal/include/common.hpp 4.34KB
  36. YOLOv8-TensorRT-main-2024.9.18/csrc/detect/normal/include/yolov8.hpp 11.19KB
  37. YOLOv8-TensorRT-main-2024.9.18/csrc/detect/normal/main.cpp 5.65KB
  38. YOLOv8-TensorRT-main-2024.9.18/csrc/jetson/
  39. YOLOv8-TensorRT-main-2024.9.18/csrc/jetson/detect/
  40. YOLOv8-TensorRT-main-2024.9.18/csrc/jetson/detect/CMakeLists.txt 1.53KB
  41. YOLOv8-TensorRT-main-2024.9.18/csrc/jetson/detect/include/
  42. YOLOv8-TensorRT-main-2024.9.18/csrc/jetson/detect/include/common.hpp 4.34KB
  43. YOLOv8-TensorRT-main-2024.9.18/csrc/jetson/detect/include/yolov8.hpp 9.75KB
  44. YOLOv8-TensorRT-main-2024.9.18/csrc/jetson/detect/main.cpp 5.41KB
  45. YOLOv8-TensorRT-main-2024.9.18/csrc/jetson/pose/
  46. YOLOv8-TensorRT-main-2024.9.18/csrc/jetson/pose/CMakeLists.txt 1.68KB
  47. YOLOv8-TensorRT-main-2024.9.18/csrc/jetson/pose/include/
  48. YOLOv8-TensorRT-main-2024.9.18/csrc/jetson/pose/include/common.hpp 4.37KB
  49. YOLOv8-TensorRT-main-2024.9.18/csrc/jetson/pose/include/yolov8-pose.hpp 12.65KB
  50. YOLOv8-TensorRT-main-2024.9.18/csrc/jetson/pose/main.cpp 6.81KB
  51. YOLOv8-TensorRT-main-2024.9.18/csrc/jetson/segment/
  52. YOLOv8-TensorRT-main-2024.9.18/csrc/jetson/segment/CMakeLists.txt 1.68KB
  53. YOLOv8-TensorRT-main-2024.9.18/csrc/jetson/segment/include/
  54. YOLOv8-TensorRT-main-2024.9.18/csrc/jetson/segment/include/common.hpp 4.37KB
  55. YOLOv8-TensorRT-main-2024.9.18/csrc/jetson/segment/include/yolov8-seg.hpp 12.7KB
  56. YOLOv8-TensorRT-main-2024.9.18/csrc/jetson/segment/main.cpp 6.17KB
  57. YOLOv8-TensorRT-main-2024.9.18/csrc/pose/
  58. YOLOv8-TensorRT-main-2024.9.18/csrc/pose/normal/
  59. YOLOv8-TensorRT-main-2024.9.18/csrc/pose/normal/CMakeLists.txt 1.7KB
  60. YOLOv8-TensorRT-main-2024.9.18/csrc/pose/normal/include/
  61. YOLOv8-TensorRT-main-2024.9.18/csrc/pose/normal/include/common.hpp 4.37KB
  62. YOLOv8-TensorRT-main-2024.9.18/csrc/pose/normal/include/yolov8-pose.hpp 12.69KB
  63. YOLOv8-TensorRT-main-2024.9.18/csrc/pose/normal/main.cpp 6.81KB
  64. YOLOv8-TensorRT-main-2024.9.18/csrc/segment/
  65. YOLOv8-TensorRT-main-2024.9.18/csrc/segment/normal/
  66. YOLOv8-TensorRT-main-2024.9.18/csrc/segment/normal/CMakeLists.txt 1.7KB
  67. YOLOv8-TensorRT-main-2024.9.18/csrc/segment/normal/include/
  68. YOLOv8-TensorRT-main-2024.9.18/csrc/segment/normal/include/common.hpp 4.37KB
  69. YOLOv8-TensorRT-main-2024.9.18/csrc/segment/normal/include/yolov8-seg.hpp 13.59KB
  70. YOLOv8-TensorRT-main-2024.9.18/csrc/segment/normal/main.cpp 6.17KB
  71. YOLOv8-TensorRT-main-2024.9.18/csrc/segment/simple/
  72. YOLOv8-TensorRT-main-2024.9.18/csrc/segment/simple/CMakeLists.txt 1.7KB
  73. YOLOv8-TensorRT-main-2024.9.18/csrc/segment/simple/include/
  74. YOLOv8-TensorRT-main-2024.9.18/csrc/segment/simple/include/common.hpp 4.37KB
  75. YOLOv8-TensorRT-main-2024.9.18/csrc/segment/simple/include/yolov8-seg.hpp 12.74KB
  76. YOLOv8-TensorRT-main-2024.9.18/csrc/segment/simple/main.cpp 6.17KB
  77. YOLOv8-TensorRT-main-2024.9.18/data/
  78. YOLOv8-TensorRT-main-2024.9.18/data/bus.jpg 476.01KB
  79. YOLOv8-TensorRT-main-2024.9.18/data/bus1.jpg 476.01KB
  80. YOLOv8-TensorRT-main-2024.9.18/data/zidane.jpg 164.99KB
  81. YOLOv8-TensorRT-main-2024.9.18/data/zidane1.jpg 164.99KB
  82. YOLOv8-TensorRT-main-2024.9.18/docs/
  83. YOLOv8-TensorRT-main-2024.9.18/docs/API-Build.md 719B
  84. YOLOv8-TensorRT-main-2024.9.18/docs/Jetson.md 4.66KB
  85. YOLOv8-TensorRT-main-2024.9.18/docs/Normal.md 2.09KB
  86. YOLOv8-TensorRT-main-2024.9.18/docs/Pose.md 2.89KB
  87. YOLOv8-TensorRT-main-2024.9.18/docs/Segment.md 6.22KB
  88. YOLOv8-TensorRT-main-2024.9.18/docs/star.md 172B
  89. YOLOv8-TensorRT-main-2024.9.18/export-det.py 3.06KB
  90. YOLOv8-TensorRT-main-2024.9.18/export-seg.py 2.25KB
  91. YOLOv8-TensorRT-main-2024.9.18/gen_pkl.py 1.28KB
  92. YOLOv8-TensorRT-main-2024.9.18/infer-det-bach1.py 2.83KB
  93. YOLOv8-TensorRT-main-2024.9.18/infer-det-bach4.py 5.62KB
  94. YOLOv8-TensorRT-main-2024.9.18/infer-det-without-torch.py 2.71KB
  95. YOLOv8-TensorRT-main-2024.9.18/infer-det.py 2.77KB
  96. YOLOv8-TensorRT-main-2024.9.18/infer-pose-without-torch.py 4.08KB
  97. YOLOv8-TensorRT-main-2024.9.18/infer-pose.py 4.04KB
  98. YOLOv8-TensorRT-main-2024.9.18/infer-seg-without-torch.py 3.68KB
  99. YOLOv8-TensorRT-main-2024.9.18/infer-seg.py 3.9KB
  100. YOLOv8-TensorRT-main-2024.9.18/LICENSE 1.04KB
  101. YOLOv8-TensorRT-main-2024.9.18/models/
  102. YOLOv8-TensorRT-main-2024.9.18/models/api.py 13.45KB
  103. YOLOv8-TensorRT-main-2024.9.18/models/common.py 6.26KB
  104. YOLOv8-TensorRT-main-2024.9.18/models/cudart_api.py 6.02KB
  105. YOLOv8-TensorRT-main-2024.9.18/models/engine.py 14.13KB
  106. YOLOv8-TensorRT-main-2024.9.18/models/pycuda_api.py 5.21KB
  107. YOLOv8-TensorRT-main-2024.9.18/models/torch_utils.py 3.47KB
  108. YOLOv8-TensorRT-main-2024.9.18/models/utils.py 9.94KB
  109. YOLOv8-TensorRT-main-2024.9.18/models/__init__.py 556B
  110. YOLOv8-TensorRT-main-2024.9.18/models/__pycache__/
  111. YOLOv8-TensorRT-main-2024.9.18/models/__pycache__/common.cpython-39.pyc 6.8KB
  112. YOLOv8-TensorRT-main-2024.9.18/models/__pycache__/engine.cpython-311.pyc 25.56KB
  113. YOLOv8-TensorRT-main-2024.9.18/models/__pycache__/engine.cpython-39.pyc 12.11KB
  114. YOLOv8-TensorRT-main-2024.9.18/models/__pycache__/torch_utils.cpython-39.pyc 2.91KB
  115. YOLOv8-TensorRT-main-2024.9.18/models/__pycache__/utils.cpython-39.pyc 7.47KB
  116. YOLOv8-TensorRT-main-2024.9.18/models/__pycache__/__init__.cpython-311.pyc 866B
  117. YOLOv8-TensorRT-main-2024.9.18/models/__pycache__/__init__.cpython-39.pyc 532B
  118. YOLOv8-TensorRT-main-2024.9.18/README.md 8.05KB
  119. YOLOv8-TensorRT-main-2024.9.18/requirements.txt 107B
  120. YOLOv8-TensorRT-main-2024.9.18/trt-profile.py 767B
  121. YOLOv8-TensorRT-main-2024.9.18/yolov8n.engine 8.73MB
  122. YOLOv8-TensorRT-main-2024.9.18/yolov8n.onnx 12.24MB
  123. YOLOv8-TensorRT-main-2024.9.18/yolov8n.pt 6.25MB
  124. YOLOv8-TensorRT-main-2024.9.18/yolov8n_bach4.engine 8.34MB
  125. YOLOv8-TensorRT-main-2024.9.18/yolov8n_bach4.onnx 12.62MB
  126. YOLOv8-TensorRT-main-2024.9.18/yolov8n_bach4.pt 6.25MB
  127. YOLOv8-TensorRT-main-2024.9.18/__pycache__/
  128. YOLOv8-TensorRT-main-2024.9.18/__pycache__/config.cpython-39.pyc 2.2KB