Goal: Save a model as an MLEM model, without sample_data.
I plan to load MLEM model files in deployment like so:
from mlem.api import load
model = load('models/mlem-model')
To do this, I need to load existing models and then save() them as MLEM models.
save() creates a model binary and an MLEM metadata file:
models/
├── mlem-model
└── mlem-model.mlem
The models are already trained, so I don't pass sample_data as a parameter to save().
Code
from fastai.vision.all import *
import mlem

# Load the exported fastai Learner and its trained weights
model = load_learner('models/model.pkl', cpu=True)
model.load('models/weights.pth')

# Save as an MLEM model, intentionally without sample_data
mlem.api.save(model, 'models/mlem-model')  # , sample_data=df)
Traceback
(venv) me@laptop:~/BitBucket/project$ python3 tools/export_pickle_as_mlem.py
Traceback (most recent call last):
File "/home/me/BitBucket/project/tools/export_pickle_as_mlem.py", line 18, in <module>
mlem.api.save(model, 'models/my-model-mlem')
File "/home/me/miniconda3/envs/venv/lib/python3.10/site-packages/mlem/telemetry.py", line 50, in inner
return f(*args, **kwargs)
File "/home/me/miniconda3/envs/venv/lib/python3.10/site-packages/mlem/core/metadata.py", line 122, in save
meta = get_object_metadata(
File "/home/me/miniconda3/envs/venv/lib/python3.10/site-packages/mlem/core/metadata.py", line 53, in get_object_metadata
return MlemModel.from_obj(
File "/home/me/miniconda3/envs/venv/lib/python3.10/site-packages/mlem/core/objects.py", line 740, in from_obj
model_type = model_hook.process(model)
File "/home/me/miniconda3/envs/venv/lib/python3.10/site-packages/mlem/contrib/callable.py", line 213, in process
s = Signature.from_method(
File "/home/me/miniconda3/envs/venv/lib/python3.10/site-packages/mlem/core/model.py", line 236, in from_method
name=override_name or method.__name__,
File "/home/me/miniconda3/envs/venv/lib/python3.10/site-packages/fastcore/basics.py", line 497, in __getattr__
raise AttributeError(k)
AttributeError: __name__. Did you mean: '__ne__'?
FastAI is very hard to integrate with because its authors are very "creative". You can try exporting to torch, or provide a callable instead of the learner (something like learner.predict, or a custom function you declare).
I will try to investigate it in the meantime.
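A rough, untested sketch of that callable workaround (the wrapper function and its return value are placeholders, and I haven't verified that MLEM can pickle the closure over the learner):

from fastai.vision.all import load_learner
from mlem.api import save

learner = load_learner('models/model.pkl', cpu=True)
learner.load('models/weights.pth')

# A plain function wrapping learner.predict: MLEM introspects this
# callable's own signature instead of the patched fastai Learner,
# which should avoid the __name__ lookup that fails in the traceback.
def predict(item):
    pred_class, pred_idx, probs = learner.predict(item)
    return str(pred_class)

save(predict, 'models/mlem-model')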
Sorry @mike0sv - I didn't get your point about the screenshot. Do you mean it's an example of the "creative" code? Do we plan to ever integrate with FastAI then? 🤔
@danielbellsa - sorry - based on what Mike said, it looks like we can't just support that for now. Did you use some workaround, or decide to skip using MLEM in this case?
Btw, about sample_data: to serve a model with FastAPI or Streamlit (and deploy it somewhere), you'll have to pass sample_data so that MLEM can figure out the data schema to build an API for your model.
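For example, a minimal sketch assuming a pandas DataFrame input (substitute whatever your model actually takes):

import pandas as pd
from mlem.api import save

# A small sample with the same columns and dtypes the model expects at inference time
sample = pd.DataFrame({"feature_a": [0.1], "feature_b": [1]})

save(model, 'models/mlem-model', sample_data=sample)

The inferred schema is stored in the .mlem metadata file, so the generated FastAPI or Streamlit app knows what payload to accept.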