Hi, thank you for your awesome work. While trying to reproduce your work on my local machine, I was wondering whether I can use local models (e.g. Llama3.2-1B/Llama3.1-8B downloaded from HF)?
Hi, thank you for your interest! Yes, you can use local models from Hugging Face; you'll just need to add the Hugging Face generation code to the generate_message function.
Our current code accepts different model types (Claude, Mistral, Llama, GPT) through the Amazon Bedrock API. For local models, you'll need to replace the Bedrock-specific API calls with Hugging Face's generation function and make sure you format the prompts according to the chat template expected by your specific Llama model version. Let me know if you have further questions!
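A minimal sketch of what such a replacement could look like, using the transformers library. Note this is an illustration under assumptions, not the repo's actual code: the generate_message signature, the model ID, and the message format are all hypothetical placeholders you'd adapt to your setup. apply_chat_template takes care of the Llama-3 chat format.

```python
from functools import lru_cache

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: any local path or HF hub ID for an instruct-tuned model works here.
MODEL_ID = "meta-llama/Llama-3.2-1B-Instruct"


@lru_cache(maxsize=1)
def _load_model():
    # Load tokenizer and model once, lazily, on first call.
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
    )
    return tokenizer, model


def generate_message(system_prompt: str, user_prompt: str,
                     max_new_tokens: int = 512) -> str:
    """Hypothetical local replacement for the Bedrock-backed generate_message."""
    tokenizer, model = _load_model()
    messages = [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]
    # apply_chat_template formats the conversation per the model's chat template.
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(
        input_ids, max_new_tokens=max_new_tokens, do_sample=False
    )
    # Strip the prompt tokens so only the newly generated reply is returned.
    return tokenizer.decode(
        output[0][input_ids.shape[-1]:], skip_special_tokens=True
    )
```

Loading is cached and deferred so the module imports without touching the network; the first generate_message call downloads/loads the weights.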