Prompt format for llama3 #125
Unanswered

tannisroot asked this question in Q&A
-
If you turn on debug logging for the component, it will output whatever is being sent to the backend. Add this to your

Also, I'm hoping to get the official Llama 3 system prompt added to the next release soon.
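As a sketch of what enabling debug logging might look like, assuming this is a Home Assistant custom component and using Home Assistant's standard `logger` integration (the domain name below is a guess for illustration, not taken from this thread):

```yaml
# Hypothetical example: turn on debug logging for the component via
# Home Assistant's logger integration in configuration.yaml.
# The domain "custom_components.llama_conversation" is an assumption --
# check the component's own documentation for the real name.
logger:
  default: warning
  logs:
    custom_components.llama_conversation: debug
```

With this in place, every prompt sent to the backend should appear in the Home Assistant log at debug level.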
1 reply
-
I'm currently playing with llama3:instruct, and in the logs I'm seeing this:
Model response did not end on a stop token (unfinished sentence)
I'm using the Alpaca prompt format. Is this expected behavior with that format, or am I using the wrong prompt format for llama3?
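For reference, the two formats differ quite a bit, which would explain the "did not end on a stop token" warning: a Llama 3 instruct model is trained to end its turns with `<|eot_id|>`, not with anything the Alpaca format expects. A minimal sketch of the difference (the helper functions are illustrative; the Llama 3 template follows Meta's published model card):

```python
def alpaca_prompt(instruction: str) -> str:
    # Alpaca-style format (what the log message above was produced with).
    return f"### Instruction:\n{instruction}\n\n### Response:\n"

def llama3_prompt(system: str, user: str) -> str:
    # Native Llama 3 instruct template with its special tokens.
    # The model ends its reply with <|eot_id|>, so the backend must
    # treat <|eot_id|> as a stop token or generation won't stop cleanly.
    return (
        "<|begin_of_text|>"
        f"<|start_header_id|>system<|end_header_id|>\n\n{system}<|eot_id|>"
        f"<|start_header_id|>user<|end_header_id|>\n\n{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

print(llama3_prompt("You are a helpful assistant.", "Turn on the lights."))
```

If the backend is prompting with the Alpaca format, the model never sees its native template and is unlikely to emit the stop token the sampler is waiting for, so the warning is expected in that setup rather than a bug.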