Adding Support for PaliGemma-3B (or rather any other model) #63

Open
srinikrish22 opened this issue Jan 10, 2025 · 0 comments

Comments

@srinikrish22

First of all, thanks a lot for all the work that has gone into making our Jetson devices an AI powerhouse.

I was wondering whether it would be possible to add support for new models or model families. A rough guideline for how support for new model architectures can be added (or at least attempted) via NanoLLM would be very helpful. NanoLLM is clearly one of the fastest acceleration frameworks available for Jetson devices. Is there at least an outline for getting started with adding support for new architectures? I would like to try out some new families of VLM models on my Jetson device, starting with PaliGemma-3B; a rough sketch of the kind of workload I have in mind is below.
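For reference, this is roughly the baseline I would like to accelerate. It is a minimal sketch using the Hugging Face transformers PaliGemma classes rather than any NanoLLM API, and the checkpoint ID, prompt, and image URL are only example placeholders:

```python
# Minimal PaliGemma-3B baseline with Hugging Face transformers (>= 4.41).
# This is only a sketch of the workload I want to run; it does not use NanoLLM.
import requests
import torch
from PIL import Image
from transformers import AutoProcessor, PaliGemmaForConditionalGeneration

model_id = "google/paligemma-3b-mix-224"  # example checkpoint (gated on the HF Hub)

processor = AutoProcessor.from_pretrained(model_id)
model = PaliGemmaForConditionalGeneration.from_pretrained(
    model_id, torch_dtype=torch.float16
).to("cuda")

# Example image and prompt; replace with your own inputs.
# Depending on the transformers version, the prompt may need an explicit
# "<image>" token prepended.
url = "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/tasks/car.jpg"
image = Image.open(requests.get(url, stream=True).raw)
prompt = "describe this image"

inputs = processor(text=prompt, images=image, return_tensors="pt").to("cuda")
with torch.inference_mode():
    output = model.generate(**inputs, max_new_tokens=64)

print(processor.decode(output[0], skip_special_tokens=True))
```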

Thanks.
