
Ollama/LiteLLM integration: Make api base configurable #474

Closed
moppman opened this issue Feb 10, 2024 · 1 comment
moppman commented Feb 10, 2024

Hi,

#463 enables Ollama for ShellGPT, which is awesome.

Would you consider making the API base URL configurable in .sgptrc? LiteLLM defaults to api_base="http://localhost:11434" here, so it would be nice to be able to use ShellGPT with non-local Ollama endpoints as well.

Thanks!
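As a rough sketch of the requested behavior (not actual ShellGPT code): the app could read the base URL from .sgptrc's KEY=VALUE lines and fall back to LiteLLM's default when the key is absent. The key name API_BASE_URL below is hypothetical; the project may pick a different one.

```python
# Hypothetical sketch: resolve the Ollama API base from a .sgptrc-style
# config, falling back to LiteLLM's default of http://localhost:11434.
# API_BASE_URL is an assumed key name, not an existing ShellGPT option.
DEFAULT_API_BASE = "http://localhost:11434"


def parse_sgptrc(text: str) -> dict:
    """Parse simple KEY=VALUE lines, ignoring blanks and comments."""
    config = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        config[key.strip()] = value.strip()
    return config


def resolve_api_base(config: dict) -> str:
    """Return the configured API base, or LiteLLM's default if unset."""
    return config.get("API_BASE_URL", DEFAULT_API_BASE)
```

With this in place, a remote endpoint would just be one line in .sgptrc (e.g. `API_BASE_URL=http://gpu-box:11434`), and omitting the line would preserve today's localhost behavior.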

TheR1D (Owner) commented Feb 11, 2024

Hi @moppman, thank you for reporting this. We will address it in upcoming updates. We already have an issue related to OPENAI_BASE_URL (#473), so I will close this one as a duplicate.

TheR1D closed this as not planned (duplicate) Feb 11, 2024