Thank you for this cool project. The bot works just fine with local models, but when I try to chat with an OpenAI model, like gpt-4o, I get no answer. What else do I have to configure to make this work?
Thanks
Yes, and it worked as expected with local models; only models added to open-webui via API, like the OpenAI models, did not work.
I tried the fork from @sebaxakerhtc: there I see the whole model list from open-webui and can select these models.
But unfortunately, models with assigned knowledge and functions do not use those tools when I ask questions through the Telegram bot. So I am back at square one, because that was my use case: having my knowledge base available via Telegram. :-)