
{% if add_generation_prompt %} [FIXED] #1284

Open
giuliabaldini opened this issue Nov 13, 2024 · 3 comments
Labels
fixed - pending confirmation Fixed, waiting for confirmation from poster

Comments

@giuliabaldini
Contributor

giuliabaldini commented Nov 13, 2024

Hi there,

if I run my usual code after the Qwen 2.5 commit, I get multiple errors. The first one is:

jinja2.exceptions.TemplateSyntaxError: Encountered unknown tag 'endfor'. Jinja was looking for the following tags: 'elif' or 'else' or 'endif'. The innermost block that needs to be closed is 'if'.

which is probably caused by the change in this line. Once I fix that, I still get:

RuntimeError: Unsloth: The tokenizer `OpenMeditron/Meditron3-8B`
does not have a {% if add_generation_prompt %} for generation purposes.
Please file a bug report immediately - thanks!

Any ideas?

Best,
Giulia
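For background, apply_chat_template works by rendering the tokenizer's Jinja chat template over the message list, which is why a malformed template surfaces as a jinja2 error. A minimal sketch of that rendering step, using an illustrative template and role tokens (not Meditron's actual ones):

```python
from jinja2 import Environment

# Illustrative chat template (not Meditron's actual one): loop over the
# messages, then append an assistant header only when generation is requested.
chat_template = (
    "{% for message in messages %}"
    "<|{{ message['role'] }}|>{{ message['content'] }}"
    "{% endfor %}"
    "{% if add_generation_prompt %}<|assistant|>{% endif %}"
)

chat = [{"role": "user", "content": "Hello"}]
template = Environment().from_string(chat_template)
print(template.render(messages=chat, add_generation_prompt=True))
# <|user|>Hello<|assistant|>
```

The {% if add_generation_prompt %} branch is what Unsloth's check below is looking for: without it, the template cannot emit the assistant header that inference needs.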

@giuliabaldini giuliabaldini changed the title Cannot run old code after Qwen 2.5 update Jinja error after Qwen 2.5 update Nov 13, 2024
@xizhangmable

I have a similar issue when running

train_ds = train_ds.map(lambda x: {"training_prompt": tokenizer.apply_chat_template(x["chat"], tokenize=False, add_generation_prompt=False)})

TemplateSyntaxError: Encountered unknown tag 'endfor'. Jinja was looking for the following tags: 'elif' or 'else' or 'endif'. The innermost block that needs to be closed is 'if'.
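The error above can be reproduced in isolation with any template whose {% endfor %} arrives while an inner {% if %} is still open. A minimal sketch (the template is illustrative, not the one Unsloth generated):

```python
from jinja2 import Environment, exceptions

# A template whose {% endfor %} appears while an inner {% if %} is still
# open -- the same shape of mistake behind the error above.
broken_template = (
    "{% for message in messages %}"
    "{% if message['role'] == 'user' %}{{ message['content'] }}"
    "{% endfor %}"
)

try:
    Environment().from_string(broken_template)
except exceptions.TemplateSyntaxError as err:
    error_message = str(err)

print(error_message)
# Encountered unknown tag 'endfor'. Jinja was looking for ... 'endif'. ...
```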

@danielhanchen danielhanchen changed the title Jinja error after Qwen 2.5 update [FIXED] {% if add_generation_prompt %} Error Nov 14, 2024
@danielhanchen danielhanchen changed the title [FIXED] {% if add_generation_prompt %} Error [FIXED] {% if add_generation_prompt %} Nov 14, 2024
@danielhanchen
Contributor

danielhanchen commented Nov 14, 2024

Apologies, just fixed @giuliabaldini @xizhangmable - thanks for reporting! Please update Unsloth on local machines via pip install --upgrade --no-cache-dir --no-deps unsloth

For Colab and Kaggle, just refresh!

@danielhanchen danielhanchen added the fixed - pending confirmation Fixed, waiting for confirmation from poster label Nov 14, 2024
@danielhanchen danielhanchen pinned this issue Nov 14, 2024
@danielhanchen danielhanchen changed the title [FIXED] {% if add_generation_prompt %} {% if add_generation_prompt %} [FIXED] Nov 14, 2024
@GreenBogDes

I am still getting this error in Google Colab in a new session. Rolling back to a2f8db3 fixed it for me.

RuntimeError Traceback (most recent call last)
in <cell line: 19>()
     17 # ] # More models at https://huggingface.co/unsloth
     18
---> 19 model, tokenizer = FastLanguageModel.from_pretrained(
     20     model_name = "mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated",
     21     max_seq_length = max_seq_length,

3 frames
/usr/local/lib/python3.10/dist-packages/unsloth/tokenizer_utils.py in fix_chat_template(tokenizer)
    656 if "{% if add_generation_prompt %}" not in new_chat_template and \
    657    "{%- if add_generation_prompt %}" not in new_chat_template:
--> 658     raise RuntimeError(
    659         f"Unsloth: The tokenizer {tokenizer.name_or_path}\n"
    660         "does not have a {% if add_generation_prompt %} for generation purposes.\n"

RuntimeError: Unsloth: The tokenizer mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated
does not have a {% if add_generation_prompt %} for generation purposes.
Please file a bug report immediately - thanks!
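The raising code visible in the traceback can be sketched as a standalone check (the function name below is mine for illustration, not Unsloth's API):

```python
def has_generation_prompt_branch(chat_template: str) -> bool:
    # Mirrors the condition shown in the traceback from
    # unsloth/tokenizer_utils.py fix_chat_template: either spelling of the
    # Jinja tag (with or without whitespace trimming) satisfies the check.
    return (
        "{% if add_generation_prompt %}" in chat_template
        or "{%- if add_generation_prompt %}" in chat_template
    )

# A template using the whitespace-trimming "{%-" variant still passes:
template = "{%- if add_generation_prompt %}<|assistant|>{%- endif %}"
print(has_generation_prompt_branch(template))  # True
```

Since both variants are accepted, the RuntimeError fires only when the tokenizer's chat template has no add_generation_prompt branch at all.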
