-
Starting from 0.9.0, you can load from a modelDir via the ModelZoo. You can refer to our doc here: http://docs.djl.ai/master/docs/load_model.html. Many ways to load are supported: http, hdfs, s3, or simply local file paths. If you are looking to load the model directly, you can do the following:
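A minimal sketch of that, assuming a local `model/` directory containing `my_model.pt`; the model name, engine, and `NoopTranslator` below are illustrative assumptions:

```java
import java.nio.file.Paths;

import ai.djl.inference.Predictor;
import ai.djl.ndarray.NDList;
import ai.djl.repository.zoo.Criteria;
import ai.djl.repository.zoo.ModelZoo;
import ai.djl.repository.zoo.ZooModel;
import ai.djl.translate.NoopTranslator;

public class LoadLocalModel {
    public static void main(String[] args) throws Exception {
        // Describe the model with Criteria; engine, model name and translator
        // here are illustrative assumptions, not required values.
        Criteria<NDList, NDList> criteria = Criteria.builder()
                .setTypes(NDList.class, NDList.class)
                .optModelPath(Paths.get("model"))   // local directory containing my_model.pt
                .optModelName("my_model")           // file name without the .pt suffix
                .optEngine("PyTorch")
                .optTranslator(new NoopTranslator())
                .build();

        // Load the model and create a predictor for inference.
        try (ZooModel<NDList, NDList> model = ModelZoo.loadModel(criteria);
             Predictor<NDList, NDList> predictor = model.newPredictor()) {
            // NDList output = predictor.predict(input);
        }
    }
}
```

The same builder also accepts `optModelUrls(...)` for http, hdfs, s3, and jar URLs, as described in the linked doc.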
Which is very similar to creating a new instance of the model. There seems to be no way to bypass setting the model name.
-
@csxiang18 If you put the model in a resource file, most likely you will pack the model into a jar file when you deploy it. In this case, you have to use a jar:// URL to load the model: http://docs.djl.ai/docs/load_model.html#load-model-from-a-url
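For instance, if the model ends up inside the jar at `/model/my_model.pt` (an assumption based on the question below), the jar URL form could look like this, reusing the imports and translator from the sketch above; depending on the DJL version the artifact may need to be packaged as an archive, as described in the linked doc:

```java
// Same Criteria as in the sketch above, but the URL points inside the jar
// instead of at the local file system. The /model/my_model.pt location is an
// assumption based on the question.
Criteria<NDList, NDList> criteria = Criteria.builder()
        .setTypes(NDList.class, NDList.class)
        .optModelUrls("jar:///model/my_model.pt")
        .optModelName("my_model")
        .optEngine("PyTorch")
        .optTranslator(new NoopTranslator())
        .build();

ZooModel<NDList, NDList> model = ModelZoo.loadModel(criteria);
```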
-
Thanks for answering this question. As it's my first time trying DJL, I may not be using some of the APIs correctly. In fact, I want to pack the model into the jar file as a lib to deploy, so that others can use it. Maybe I can state the problem more clearly.
I want to provide a lib, written like this, so that others can call the function like this:
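A hypothetical sketch of such a lib; the class name, method name, and `NoopTranslator` below are placeholders rather than the original code:

```java
import java.nio.file.Path;
import java.nio.file.Paths;

import ai.djl.Model;
import ai.djl.inference.Predictor;
import ai.djl.ndarray.NDList;
import ai.djl.translate.NoopTranslator;

// Hypothetical reconstruction of the lib; names and translator are placeholders.
public class MyModelLib {

    private final Predictor<NDList, NDList> predictor;

    public MyModelLib() throws Exception {
        // Resolve the model directory from the resources folder on the classpath.
        // This works from exploded classes, but once the lib is packed into a jar,
        // getResource() returns a jar: URL that cannot be turned into a file Path.
        Path modelDir = Paths.get(MyModelLib.class.getResource("/model").toURI());
        Model model = Model.newInstance("my_model");
        model.load(modelDir, "my_model");
        this.predictor = model.newPredictor(new NoopTranslator());
    }

    public NDList predict(NDList features) throws Exception {
        return predictor.predict(features);
    }
}

// Callers would then use it roughly like:
//   MyModelLib lib = new MyModelLib();
//   NDList output = lib.predict(features);
```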
The above code works fine while I am writing the lib. However, when I package the lib, it can no longer load the model correctly. Another question: the doc says that the predictor is not thread-safe, so is the above the correct way to predict features?
-
@csxiang18 If you package your classes in a jar file, the URL returned from the resource lookup points inside the jar and is not a plain file path. To resolve this issue, you can use the Criteria API to load your model as I posted earlier. The API takes care of both the jar and unjarred cases.
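A small sketch of why the file-path approach breaks once the model sits inside a jar (the resource path is an assumption):

```java
import java.net.URL;
import java.nio.file.Paths;

public class ResourceUrlDemo {
    public static void main(String[] args) throws Exception {
        URL url = ResourceUrlDemo.class.getResource("/model/my_model.pt");

        // From exploded classes: file:/.../classes/model/my_model.pt
        // From the packaged jar:  jar:file:/.../mylib.jar!/model/my_model.pt
        System.out.println(url);

        // For the jar: URL this throws FileSystemNotFoundException, because the
        // jar entry is not a regular file on disk.
        System.out.println(Paths.get(url.toURI()));
    }
}
```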
-
Question
I've already exported a PyTorch model into a .pt file, for example:
model/my_model.pt
This model directory is under the resources directory. When I want to load the model, I do it this way:
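Roughly like this (a sketch of the loading code, not the exact snippet; assumes `ai.djl.Model` and `java.nio.file.Paths` are imported):

```java
// A guess at the current loading code; the directory layout is assumed.
Model ptModel = Model.newInstance("my_model");
ptModel.load(Paths.get("model"));  // looks for model/my_model.pt, using the model name as prefix
```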
Is there any other way to load the .pt file directly, like this?
ptModel.load("model/my_model.pt")