GGML_ASSERT: /Users/runner/work/node-llama-cpp/node-llama-cpp/llama/llama.cpp/llama.cpp:5052: n_tokens <= n_batch #94

Closed. Answered by giladgd.
jparismorgan asked this question in Q&A.

There's currently an issue with prompts that are longer than the batchSize; it will be fixed as part of #85.
For a workaround in the meantime, see #76.
