Replies: 4 comments 1 reply
-
Hi @andife, yes, it's possible to export the predictor PyTorch model trained with Ludwig to a static ONNX graph.
Then import the onnx library and load the model from the exported path.
You can then perform inference on the exported ONNX graph, passing pre-processed inputs in NumPy format:
-
Hi, it seems like there is an issue with the Ray backend, which is the default for training. You should be able to install it with `pip install ray`.
But if you are getting errors due to Windows or other reasons, you can update your config to add a backend of type `local`.
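For reference, a minimal config sketch switching to the local backend; only the `backend: type: local` setting comes from this thread, and the feature entries are placeholders:

```yaml
# minimal sketch: force the local backend instead of Ray
# (input/output features are placeholders, not from the thread)
input_features:
  - name: image_path
    type: image
output_features:
  - name: label
    type: category
backend:
  type: local
```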
-
I now use Ubuntu directly with the backend type local, but I am still not making progress.
Is the model saved in, for example, "results/api_experiment_run_1/model" a LudwigModel that has not been converted to TorchScript? Furthermore, I am even more interested in exporting an image classification model (so MNIST might be an even better example for me), but there I currently get the same error. Has anyone else observed this with the keys?
-
Hi @andife, please make sure your config is well-formed YAML. It seems like the training process is not able to start because the inputs are not being recognised in your model.
-
Hello, I'm new to Ludwig.
Can someone provide a code fragment showing how to export the final model to ONNX?
Thank you,
Andreas