I see you recently added support for Llama — great work! I noticed that the weight migration strength used in the demo notebook is 0.85, which is far from the 0.5 value used in the rest of the code and in the paper. Why does Llama use such a large value?