Replies: 1 comment
- Hey @arthurvb, did you succeed in running the tiny / medium models as well?
-
Dear All,
  To save you time, here are some notes on how I got Olive to work for `openai/whisper-large-v3`:

  1. Add `"save_as_external_data": true` to the json (e.g. `whisper_cpu_int8.json`), both in the conversion config and in the prepost config.
  2. In `olive-env/lib/python3.10/site-packages/olive/passes/onnx/insert_beam_search.py`,
  I removed the check of the ONNX model by changing the last line to `return model_proto_to_olive_model(combined_model, output_model_path, config, False)`. Otherwise the process stopped with `The model does not have an ir_version set properly.`
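  For step 1, a minimal sketch of where the flag would sit in the json. Only `"save_as_external_data": true` is from the notes above; the surrounding keys and the `type` values are assumptions based on the layout of the Olive whisper example configs, so adapt them to your actual file:

  ```json
  {
    "passes": {
      "conversion": {
        "type": "OnnxConversion",
        "config": {
          "save_as_external_data": true
        }
      },
      "prepost": {
        "type": "AppendPrePostProcessingOps",
        "config": {
          "save_as_external_data": true
        }
      }
    }
  }
  ```

  The flag is needed because the large-v3 weights exceed the 2 GB protobuf limit, so the model tensors have to be stored as external data files next to the `.onnx` file.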
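  Step 2 as a diff sketch against `insert_beam_search.py`. Only the replacement line is from the notes above; the shape of the original line (without the fourth argument) and its indentation are assumptions, so locate the actual last line of the method in your installed version:

  ```diff
  -        return model_proto_to_olive_model(combined_model, output_model_path, config)
  +        return model_proto_to_olive_model(combined_model, output_model_path, config, False)
  ```

  Note this edits an installed package in place, so the change will be lost if you reinstall or upgrade Olive, and it skips a validation check rather than fixing the missing `ir_version` itself.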
Hopefully this will help some of you,
Arthur