
How to call an openAI model? #3

Open · dydimos opened this issue Feb 8, 2025 · 3 comments

dydimos commented Feb 8, 2025

Hello,

Thank you for this cool project. The bot is working just fine with local models, but when I try to chat with an OpenAI model, like gpt-4o, I get no answer. What else do I have to configure to make this work?

Thanks

Sid-Sun (Owner) commented Feb 9, 2025

Have you configured the API key and endpoint?
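
If not, here is a minimal sketch (assuming the official `openai` Python client; the key and endpoint values below are placeholders) to confirm the key and endpoint work outside the bot:

```python
from openai import OpenAI

# Placeholder credentials: substitute the values the bot is configured with.
client = OpenAI(
    api_key="sk-...",
    base_url="https://api.openai.com/v1",
)

# One-off chat completion against gpt-4o to confirm the key and endpoint work.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)
```

If this prints a reply but the bot still stays silent, the problem is likely in the bot's configuration rather than in the credentials themselves.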

dydimos (Author) commented Feb 9, 2025

> Have you configured the API key and endpoint?

Yes, and it worked as expected with local models; only the models added to open-webui via API, like the OpenAI models, did not work.

I tried the fork from @sebaxakerhtc; there I can see the whole model list from open-webui and select those models.

But unfortunately, the models with assigned knowledge and functions do not use those tools when I ask questions through the Telegram bot. So I am back at square one, because my use case was to have my knowledge with me via Telegram. :-)

Sid-Sun (Owner) commented Feb 10, 2025

You need to add the model list to the configuration manually.
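
For reference, a hedged sketch (the exact keys in this bot's config file are not shown here; the API key and base URL below are placeholders, and an open-webui base URL would go in place of api.openai.com if the bot talks to open-webui) to list the model IDs an OpenAI-compatible endpoint exposes; those IDs are what go into the configured model list:

```python
from openai import OpenAI

# Placeholder values: point base_url at api.openai.com or at an
# OpenAI-compatible gateway, whichever the bot is configured against.
client = OpenAI(
    api_key="sk-...",
    base_url="https://api.openai.com/v1",
)

# Print the model IDs the endpoint reports; these are the names
# that would be added to the bot's model list.
for model in client.models.list():
    print(model.id)
```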
