Something went wrong: Telegram server says - Bad Request: message is too long #79
Comments
For me, ollama deepseek gives an answer like this:
and the […]
I don't get how it's related, but I had the same error with the llama3 model.
It's related because of […]. Also, your Telegram bot ID is written at the last line (last chars) of your logs.
Yeah, but that message in the example isn't that long, is it? And thanks for the heads up :D
No, it's not, but try without mentioning the bot, the […]
Yeah, without mentioning it the bot doesn't answer at all. It was working a week ago and I don't know what changed, but the bot is in a group and it doesn't respond unless it's mentioned.
I added a pagination function to split the LLM reply into pages of fewer than 4096 characters, as that is the maximum Telegram message size.
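For illustration, here is a minimal sketch of that kind of pagination. This is not the actual patch from this thread; the function name and the aiogram-style `bot.send_message` call in the usage comment are assumptions.

```python
# Sketch: split a long LLM reply into Telegram-sized pages.
# Telegram rejects messages over 4096 characters with
# "Bad Request: message is too long".

TELEGRAM_MAX_LEN = 4096

def paginate(text: str, limit: int = TELEGRAM_MAX_LEN) -> list[str]:
    """Split `text` into chunks of at most `limit` characters,
    preferring to break on newlines so paragraphs stay intact."""
    pages = []
    while len(text) > limit:
        # Break at the last newline inside the limit; hard-cut if none.
        cut = text.rfind("\n", 0, limit)
        if cut <= 0:
            cut = limit
        pages.append(text[:cut])
        text = text[cut:].lstrip("\n")
    if text:
        pages.append(text)
    return pages

# Usage with a hypothetical aiogram-style bot:
# for page in paginate(llm_reply):
#     await bot.send_message(chat_id, page)
```

Breaking on newlines rather than mid-line keeps code blocks and sentences readable across pages, though a hard cut is still needed for a single very long line.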
Could we try your version? It seems that long replies from models like deepseek-r1-72b do cause sending Telegram messages to fail.
It's not really mergeable. There are complications when, say, […]
Use telegramify-markdown from https://github.com/sudoskys/telegramify-markdown
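For reference, a minimal sketch of how that library is typically used, based on its README (the `markdownify` call is per that repo's docs; treat the surrounding send call as an assumption). Note it mainly fixes markdown escaping errors; the 4096-character limit still needs chunking such as the pagination above.

```python
# Sketch assuming telegramify-markdown's documented markdownify() API:
# it converts raw markdown into Telegram-safe MarkdownV2, escaping the
# characters that otherwise trigger Bad Request formatting errors.
import telegramify_markdown

converted = telegramify_markdown.markdownify(llm_reply)
# await bot.send_message(chat_id, converted, parse_mode="MarkdownV2")
```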
Hello, I have this running with the deepseek-r1:1.5b model. I wonder whether this model is limited to such message lengths, or is my setup wrong?