I have a yolo11n-cls model. I first exported it to ONNX myself, then converted it to an RKNN model with the following script:
import sys
from rknn.api import RKNN

DATASET_PATH = './datasets/coco128'
DEFAULT_RKNN_PATH = './cls.rknn'
DEFAULT_QUANT = False

def parse_arg():
    if len(sys.argv) < 3:
        print("Usage: python3 {} onnx_model_path [platform] [dtype(optional)] [output_rknn_path(optional)]".format(sys.argv[0]))
        print("       platform choose from [rk3562,rk3566,rk3568,rk3588,rk3576,rk1808,rv1109,rv1126]")
        print("       dtype choose from [i8, fp] for [rk3562,rk3566,rk3568,rk3588,rk3576]")
        print("       dtype choose from [u8, fp] for [rk1808,rv1109,rv1126]")
        exit(1)

    model_path = sys.argv[1]
    platform = sys.argv[2]

    do_quant = DEFAULT_QUANT
    if len(sys.argv) > 3:
        model_type = sys.argv[3]
        if model_type not in ['i8', 'u8', 'fp']:
            print("ERROR: Invalid model type: {}".format(model_type))
            exit(1)
        elif model_type in ['i8', 'u8']:
            do_quant = True
        else:
            do_quant = False

    if len(sys.argv) > 4:
        output_path = sys.argv[4]
    else:
        output_path = DEFAULT_RKNN_PATH

    return model_path, platform, do_quant, output_path

if __name__ == '__main__':
    model_path, platform, do_quant, output_path = parse_arg()

    # Create RKNN object
    rknn = RKNN(verbose=False)

    # Pre-process config
    print('--> Config model')
    rknn.config(mean_values=[[0, 0, 0]], std_values=[[255, 255, 255]], target_platform=platform)
    print('done')

    # Load model
    print('--> Loading model')
    ret = rknn.load_onnx(model=model_path, inputs=['images'], input_size_list=[[1, 3, 256, 256]])
    if ret != 0:
        print('Load model failed!')
        exit(ret)
    print('done')

    # Build model
    print('--> Building model')
    ret = rknn.build(do_quantization=do_quant, dataset=DATASET_PATH)
    if ret != 0:
        print('Build model failed!')
        exit(ret)
    print('done')

    # Export rknn model
    print('--> Export rknn model')
    ret = rknn.export_rknn(output_path)
    if ret != 0:
        print('Export rknn model failed!')
        exit(ret)
    print('done')

    # Release
    rknn.release()
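For reference, assuming the script above is saved as convert.py (the filename and arguments below are my assumptions, not taken from the original report), it would be invoked roughly like this:

    # hypothetical invocation: ONNX path, platform and output path are examples only
    python3 convert.py ./yolo11n-cls.onnx rk3588 fp ./cls.rknn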
However, when I run inference with the exported RKNN model, it detects nothing and cannot classify anything.
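For context, here is a minimal on-device inference sketch for a classification model, assuming rknn_toolkit_lite2 (RKNNLite) is used on the board and the model was converted with a 1x3x256x256 input as in the script above. The file paths, image size and top-k handling are illustrative assumptions, not the reporter's actual code:

    # Minimal sketch, assuming rknn_toolkit_lite2 on the target board.
    # Paths and input size are illustrative assumptions, not from the original report.
    import cv2
    import numpy as np
    from rknnlite.api import RKNNLite

    rknn_lite = RKNNLite()
    ret = rknn_lite.load_rknn('./cls.rknn')
    assert ret == 0, 'load_rknn failed'
    ret = rknn_lite.init_runtime()
    assert ret == 0, 'init_runtime failed'

    # Preprocess: BGR -> RGB, resize to the size used at conversion time (256x256 here).
    img = cv2.imread('./test.jpg')               # hypothetical test image
    img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
    img = cv2.resize(img, (256, 256))
    img = np.expand_dims(img, 0)                 # NHWC uint8; mean/std from rknn.config are applied by the runtime

    outputs = rknn_lite.inference(inputs=[img])

    # A -cls model outputs class scores rather than detection boxes,
    # so postprocessing is a softmax/argmax, not a detection decode.
    scores = np.squeeze(outputs[0])
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()
    top5 = probs.argsort()[::-1][:5]
    for i in top5:
        print('class {}: {:.4f}'.format(i, probs[i]))

    rknn_lite.release()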