Generic Open AI API compatible provider. (Deepseek, Helicone, LiteLLM etc) #885
Comments
Second this. I'm trying to use Perplexity AI, which has an OpenAI-compatible API, but I'm not able to configure it.
you can set
@willemsFEB @242816 would it make sense to have openai, but then a separate "openai compatible" provider (separate choices)? The latter would ask for more config. Thoughts?
That makes sense. You may also want to add things like the Azure API version there as an optional param.
+1 for an openai-compatible provider in the UI
+1 for an openai-compatible API. I think it would make more sense to have separate configurations for openai and openai-compatible APIs. This way, you can have more control over the API version of the OpenAI-compatible provider, which may be necessary to work around edge cases of mismatching API versions.
+1 for an openai-compatible provider in the UI 🙏
+1 for an openai-compatible provider in the UI. Our institution is starting to evaluate LiteLLM, which uses an OpenAI-compatible API with its own API key. This will likely become more prevalent as businesses like LiteLLM spring up to offer cloud and on-prem ways to host open-source models.
@michaelneale thanks for the suggestion, so I've set I'm able to get further, however, now goose is appending Any suggestions how I can bypass this?
Most inference providers support the OpenAI API.
So this morning I wanted to use Helicone to track goose calls.
With Aider I would use this https://aider.chat/docs/llms/openai-compat.html
I tried to use the existing open ai provider https://github.com/block/goose/blob/main/crates/goose/src/providers/openai.rs
This didn't work, as it's too tightly coupled to OpenAI.
So a generic provider should allow me to point the HOST, MODEL, and KEY at any provider I want.
Requirements
A new provider with the ability to set the following:
OPENAI_API_BASE=
OPENAI_API_KEY=
OPENAI_API_MODEL=
Note: the BASE needs to be everything up to and including the /v1, e.g. https://oai.helicione.ai/324324-234234-24324/v1
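For concreteness, the requirements above could be exercised like this. This is only a sketch: the host, key, and model values are placeholders, and the variable names simply follow the proposal in this issue (goose may ultimately choose different ones):

```shell
# Hypothetical configuration for a generic OpenAI-compatible provider.
# BASE is everything up to and including /v1, per the note above.
export OPENAI_API_BASE="https://api.example.com/v1"   # placeholder host
export OPENAI_API_KEY="sk-example"                    # placeholder key
export OPENAI_API_MODEL="deepseek-chat"               # placeholder model

# A provider reading these variables would then send requests to
# ${OPENAI_API_BASE}/chat/completions with the key in the Authorization header.
echo "base=${OPENAI_API_BASE} model=${OPENAI_API_MODEL}"
```

The same three variables cover Deepseek, Helicone, LiteLLM, and similar gateways, since they all expose the standard OpenAI endpoint layout under their own base URL.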