This project converts original AlexeyAB/darknet model weights & cfg files to the ONNX format.

`main.py` shows all the steps:
- Export darknet weights to ONNX format via PyTorch
- Run the inference, including preprocessing & postprocessing (see the sketch after this list)
- Visualize the result
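For reference, the inference step on an exported model can be sketched with onnxruntime alone. This is a minimal illustration, not the code in `main.py`: the 608x608 input size, RGB channel order, and [0, 1] scaling are assumptions, and the box decoding / NMS postprocessing done by this project is omitted.

```python
# Minimal sketch of ONNX inference with onnxruntime (not main.py's actual code).
# Assumptions: the exported model takes a 1x3x608x608 float32 RGB tensor in [0, 1];
# check your cfg for the real input size and main.py for the exact pre/postprocessing.
import cv2
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name

img = cv2.imread("data/dog.jpg")                                  # BGR, HWC
blob = cv2.cvtColor(cv2.resize(img, (608, 608)), cv2.COLOR_BGR2RGB)
blob = blob.astype(np.float32) / 255.0
blob = np.transpose(blob, (2, 0, 1))[np.newaxis, ...]             # HWC -> NCHW

outputs = session.run(None, {input_name: blob})                   # raw YOLO predictions
for out in outputs:
    print(out.shape)  # box decoding & NMS are applied in postprocessing
```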
Supported models:
- YOLOv4
- YOLOv3
- YOLOv4-csp (Scaled-YOLOv4)
- YOLOv4-tiny
Other models are not tested, but you can try them.
Use `data/dog.jpg` as the example:
YOLOv4
- Darknet

  ```
  bicycle: 92%      (left_x: 114  top_y: 128  width: 458  height: 299)
  dog: 98%          (left_x: 129  top_y: 225  width: 184  height: 317)
  truck: 92%        (left_x: 464  top_y: 77   width: 221  height: 93)
  pottedplant: 33%  (left_x: 681  top_y: 109  width: 37   height: 45)
  ```

- DarknetONNX

  ```
  bicycle: 92%      (left_x: 114  top_y: 127  width: 458  height: 299)
  dog: 98%          (left_x: 128  top_y: 224  width: 185  height: 317)
  truck: 92%        (left_x: 463  top_y: 76   width: 221  height: 93)
  pottedplant: 33%  (left_x: 681  top_y: 109  width: 36   height: 45)
  ```
More visualizations & an inference speed comparison can be found in docs/results/COMPARISON.md.
- torch >= 1.9.1 (verified: 1.9.1 ~ 1.13.1)
- opencv-python
- onnxruntime >= 1.9.0 (verified: 1.9.0 ~ 1.14.1)
- onnxmltools >= 1.10.0 (verified: 1.10.0 ~ 1.11.2)
- packaging
```bash
pip install -r requirements.txt
```
- Prepare pretrained model weights or your custom model weights:

  ```bash
  mkdir weights
  wget -O weights/yolov3.weights https://pjreddie.com/media/files/yolov3.weights
  wget -O weights/yolov4.weights https://github.com/AlexeyAB/darknet/releases/download/darknet_yolo_v3_optimal/yolov4.weights
  ```
- Convert Darknet model
  - Convert and visualize the result:

    ```bash
    # help
    python3 main.py -h

    # darknet cfg & weights
    python3 main.py --cfg cfg/yolov3.cfg --weight weights/yolov3.weights --img data/dog.jpg --names data/coco.names
    python3 main.py --cfg cfg/yolov4.cfg --weight weights/yolov4.weights --img data/dog.jpg --names data/coco.names
    python3 main.py --cfg cfg/yolov4-csp.cfg --weight weights/yolov4-csp.weights --img data/dog.jpg --names data/coco.names

    # custom cfg & weights
    python3 main.py --cfg cfg/yolov4-obj.cfg --weight weights/yolov4-obj.weights --img your.jpg --names your.names

    # it will show index if not specifying `--names`
    python3 main.py --cfg cfg/yolov4.cfg --weight weights/yolov4.weights --img data/dog.jpg
    ```
    Outputs `model.onnx` and `onnx_predictions.jpg`.

  - Only convert the model (use the standalone `darknetonnx.darknet` module):

    ```bash
    # help
    python3 -m darknetonnx.darknet -h

    # darknet yolov3
    python3 -m darknetonnx.darknet --cfg cfg/yolov3.cfg --weight weights/yolov3.weights

    # darknet yolov3 with float16
    python3 -m darknetonnx.darknet --cfg cfg/yolov3.cfg --weight weights/yolov3.weights --to-float16
    ```
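For reference, FP16 conversion of an already-exported model can be sketched with onnxmltools (a listed dependency). This is an illustration of the idea behind the float16 option, not necessarily how this repository implements it; the file names are placeholders.

```python
# Hedged sketch: convert an exported FP32 ONNX model to FP16 using onnxmltools.
# Illustrative only; not this repository's code. Paths are placeholders.
import onnx
from onnxmltools.utils.float16_converter import convert_float_to_float16

model_fp32 = onnx.load("model.onnx")                # FP32 model exported above
model_fp16 = convert_float_to_float16(model_fp32)   # cast weights/activations to float16
onnx.save(model_fp16, "model_fp16.onnx")            # placeholder output path
```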
- Create Python package
- (22/03/03) Support FP16 (half precision) conversion (`--to_float16`)
- (21/11/09) Support YOLOv3, YOLOv4, YOLOv4-csp, YOLOv4-tiny conversion
- pjreddie/darknet#558 (comment)
- WongKinYiu/ScaledYOLOv4#202 (comment)
- https://github.com/AlexeyAB/darknet
- https://github.com/Tianxiaomo/pytorch-YOLOv4
- https://github.com/Megvii-BaseDetection/YOLOX
`torch.onnx._export` deprecated the keyword argument `example_outputs` for torch > 1.10.1. The newest version of this repository has fixed the issue.
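A hedged sketch of the kind of version gate that avoids the problem, using `packaging` (already in the requirements) and the public `torch.onnx.export` for illustration; this is not the repository's actual export code, and `model`, `dummy_input`, and the opset version are placeholders.

```python
# Illustrative only: pass example_outputs to the ONNX export only on torch
# versions that still accept it, so newer torch (> 1.10.1) does not error.
import torch
from packaging import version

def export_to_onnx(model, dummy_input, onnx_path="model.onnx"):
    kwargs = {}
    if version.parse(torch.__version__) <= version.parse("1.10.1"):
        # older torch accepted (and for scripted models required) example_outputs
        kwargs["example_outputs"] = model(dummy_input)
    torch.onnx.export(model, dummy_input, onnx_path, opset_version=11, **kwargs)
```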