Example Using Gemma Pre-trained Embedding? #156
Great suggestion. I know it's possible (we've experimented with it in the past); including an example in the docs is a good idea. If anyone wants to write something up, feel free to open a PR!
I would be happy to write something up, except that I don't yet understand how to properly generate embeddings using the Gemma model. I find it handy to load the Universal Sentence Encoder as a Keras layer and output embeddings this way:
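Roughly like this (a minimal sketch, assuming `tensorflow_hub` and the standard `universal-sentence-encoder/4` handle on TF Hub):

```python
import tensorflow as tf
import tensorflow_hub as hub

# Load the Universal Sentence Encoder as a Keras layer.
use_layer = hub.KerasLayer(
    "https://tfhub.dev/google/universal-sentence-encoder/4",
    trainable=False,
)

sentences = tf.constant(["What a great movie!", "Terrible, would not watch again."])
embeddings = use_layer(sentences)
print(embeddings.shape)  # (2, 512) -- one fixed-size vector per sentence
```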
I tried to do something similar with the Gemma model in Keras Hub, but the shape of the embeddings it outputs is different:
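This is roughly what I tried (a sketch using the `keras_hub` Gemma classes; `"gemma_2b_en"` is just an example preset):

```python
import keras_hub

preprocessor = keras_hub.models.GemmaCausalLMPreprocessor.from_preset("gemma_2b_en")
backbone = keras_hub.models.GemmaBackbone.from_preset("gemma_2b_en")

# The causal-LM preprocessor returns (features, labels, sample_weights);
# the features dict contains "token_ids" and "padding_mask".
x, _, _ = preprocessor(["What a great movie!", "Terrible, would not watch again."])

sequence_output = backbone(x)
print(sequence_output.shape)  # (2, sequence_length, hidden_dim) -- one vector per token, not per sentence
```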
I'm not sure how I should reshape the Gemma embeddings, or whether I should add some sort of pooling layer on top of them.
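For example, would masked mean pooling over the sequence dimension be the right approach? Continuing from the snippet above (this is just a guess, written with `keras.ops`):

```python
from keras import ops

# Masked mean pooling: average the per-token vectors, ignoring padding positions.
mask = ops.cast(ops.expand_dims(x["padding_mask"], -1), sequence_output.dtype)
summed = ops.sum(sequence_output * mask, axis=1)
counts = ops.maximum(ops.sum(mask, axis=1), 1.0)
sentence_embeddings = summed / counts
print(sentence_embeddings.shape)  # (2, hidden_dim)
```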
The pre-trained embeddings example using the Universal Sentence Encoder is very helpful. I'm wondering how to do the same thing, but with a Gemma model loaded from Keras Hub.