Hi! Thanks so much for developing this tool for model merging!
It seems that the tensor names are hardcoded in https://github.com/arcee-ai/mergekit/tree/main/mergekit/_data/architectures (for Mixtral it is defined in https://github.com/arcee-ai/mergekit/blob/main/mergekit/architecture.py#L282), and a function get_architecture_info (mergekit/mergekit/architecture.py, line 358 in 57e7d14) is used to map a model's config to one of those predefined architectures.
Just wondering: can we directly read the parameter metadata in "pytorch_model.bin.index.json" or "model.safetensors.index.json"? Otherwise we cannot merge models built on our own customized model architectures.
Thanks!
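
For concreteness, this is roughly the kind of lookup I mean (plain Python, not mergekit code; the path below is just a placeholder):

```python
import json

# Hypothetical checkpoint directory; "model.safetensors.index.json" and
# "pytorch_model.bin.index.json" share the same layout:
# {"metadata": {...}, "weight_map": {"tensor.name": "shard file", ...}}
with open("path/to/model/model.safetensors.index.json") as f:
    index = json.load(f)

# The parameter names are just the keys of weight_map; the values tell
# you which shard each tensor lives in.
weight_map = index["weight_map"]
for name in sorted(weight_map):
    print(name, "->", weight_map[name])
```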
Have a look at the architecture-agnostic branch; it's still WIP but should work for you! By the way, it'll work much more efficiently if the model is stored as safetensors rather than as a PyTorch .bin.
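
(Just to illustrate the safetensors point, and not the actual code on that branch: the tensor names and shapes in a shard can be listed lazily without loading the weights. The shard filename below is a placeholder.)

```python
from safetensors import safe_open

# Placeholder shard filename. safe_open memory-maps the file, so tensor
# names, shapes and dtypes can be listed without loading any weights,
# unlike a pickled pytorch_model.bin, which torch.load has to
# deserialize in full.
with safe_open("path/to/model/model-00001-of-00002.safetensors", framework="pt") as f:
    for name in f.keys():
        ts = f.get_slice(name)  # lazy view of the tensor
        print(name, ts.get_shape(), ts.get_dtype())
    # f.get_tensor(name) would materialize an individual tensor on demand
```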