Parameters and meta data #21
Each model has two methods, dump_parameters and load_parameters, which are defined by the user and can store whatever the user wants. After training, dump_parameters is called and the parameters are stored under the file "params". When we build an inference job, two services start: one is the predictor, the other is the inference service. The predictor is an HTTP server that receives images from users and produces them to Kafka; the inference service acts as the consumer, which loads the model class from the DB (PostgreSQL) and then calls load_parameters to load the parameters we stored in "params".
1. The parameters are dumped into a file, named aaa.model for example, and we can dump all the meta data together with them. Whatever we dump, we must load it back in load_parameters with the same data structure.
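A minimal sketch of what point 1 suggests: bundle the weights and the meta data into one structure in dump_parameters, and restore it with the same structure in load_parameters. The method names and the file "params" come from the discussion above; the pickle format, the dict layout, and the ExampleModel class are assumptions for illustration only.

```python
import pickle

class ExampleModel:
    def __init__(self):
        self.weights = {"w": [0.1, 0.2]}            # trained weights (placeholder)
        self.index_to_label = {0: "cat", 1: "dog"}  # meta data from the dataset

    def dump_parameters(self, path):
        # Store weights and meta data together in a single dict.
        with open(path, "wb") as f:
            pickle.dump({"weights": self.weights,
                         "index_to_label": self.index_to_label}, f)

    def load_parameters(self, path):
        # Load with exactly the same structure used in dump_parameters.
        with open(path, "rb") as f:
            state = pickle.load(f)
        self.weights = state["weights"]
        self.index_to_label = state["index_to_label"]
```

Because both methods agree on one structure, the inference service does not need to know what meta data a particular model keeps.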
During training, the model may generate some meta data;
The training dataset may have some meta data, e.g., the mapping from the index to the label name;
This meta data is used during inference.
How do we pass it to the predict function?
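One possible answer to the question above, as a hedged sketch: since the inference service calls load_parameters before serving, the meta data can live as attributes on the model instance, and predict reads it from self. The attribute name index_to_label and the placeholder _forward method are assumptions, not part of the actual API.

```python
class ExampleModel:
    def __init__(self):
        # Assumed to be restored by load_parameters before any predict call.
        self.index_to_label = {0: "cat", 1: "dog"}

    def _forward(self, image):
        # Placeholder for the real network; returns a predicted class index.
        return 0

    def predict(self, images):
        # Use the meta data loaded earlier to map indices to label names.
        return [self.index_to_label[self._forward(img)] for img in images]
```

With this pattern, no extra arguments need to be threaded through the predictor or Kafka: the meta data travels with the dumped parameters and is available on self at predict time.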