[Bug]: 'weight' must be 2-D in upsert. #3516
Comments
@DavidYanAnDe, thanks for raising this. I have a feeling this issue is related to the embedding model and/or its config. I noticed you are using SentenceTransformers. What model are you using with ST?
@tazarov Just all-MiniLM-L6-v2, because there was an error with the default onnxruntime, so I switched to sentence_transformer_ef = embedding_functions.SentenceTransformerEmbeddingFunction(model_name="all-MiniLM-L6-v2")
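For context, a minimal sketch of how this embedding function is typically wired into a Chroma collection and exercised by upsert (the collection name, ids, and documents here are illustrative, not from the original report):

```python
# Minimal sketch, assuming chromadb 0.6.x and sentence-transformers are installed.
import chromadb
from chromadb.utils import embedding_functions

sentence_transformer_ef = embedding_functions.SentenceTransformerEmbeddingFunction(
    model_name="all-MiniLM-L6-v2"
)

client = chromadb.Client()
collection = client.get_or_create_collection(
    name="docs",  # illustrative collection name
    embedding_function=sentence_transformer_ef,
)

# upsert invokes the embedding function, which runs the ST model's forward pass;
# this is the step that fails in the reported setup.
collection.upsert(ids=["doc-1"], documents=["example text"])
```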
Interesting. Can you tell me which version of ST you have?
I know all-MiniLM-L6-v2 is a BERT transformer, so could some setting in DeepSpeed influence it during training, even though it doesn't participate in the training itself?
@tazarov sentence-transformers is 3.3.1 |
The reason why I said there is a conflict between DeepSpeed and chromadb/ST is that once I remove "--deepspeed" from the config, it works normally. Sadly, none of us can escape DeepSpeed when training MLLMs. -.-
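One hedged guess at the mechanism: under ZeRO stage 3, DeepSpeed partitions module parameters and leaves them as flattened placeholders until they are explicitly gathered, and torch.nn.functional.embedding rejects a weight that is not 2-D. If the SentenceTransformer model gets caught up in that partitioning, a sketch like the following may work around it by gathering parameters before encoding (untested against this exact setup; GatheredParameters is a no-op for non-partitioned parameters):

```python
# Sketch of a possible workaround, assuming the error comes from ZeRO stage-3
# parameter partitioning flattening the embedding weight.
import deepspeed
from sentence_transformers import SentenceTransformer

st_model = SentenceTransformer("all-MiniLM-L6-v2")

texts = ["example document"]  # illustrative input
# Temporarily reassemble the full (2-D) weights before the forward pass.
with deepspeed.zero.GatheredParameters(list(st_model.parameters())):
    embeddings = st_model.encode(texts)
```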
What happened?
When I use DeepSpeed stage 3 and upsert into a chromadb collection, it outputs the error below, but once I remove DeepSpeed it works correctly. Is there some conflict between DeepSpeed and chromadb?
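For illustration, the error class in the title can be reproduced directly: PyTorch's F.embedding requires a 2-D weight, so a flattened weight (as a partitioned parameter might appear) raises the same RuntimeError. The shapes below are illustrative:

```python
# Minimal illustration of the error: F.embedding checks that weight.dim() == 2.
import torch
import torch.nn.functional as F

ids = torch.tensor([0, 1, 2])
flat_weight = torch.randn(30522 * 384)  # 1-D, as a partitioned param might appear

try:
    F.embedding(ids, flat_weight)
except RuntimeError as e:
    print(e)  # 'weight' must be 2-D
```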
Versions
chromadb 0.6.3, python 3.10.16
Relevant log output