
Generic Open AI API compatible provider. (Deepseek, Helicone, LiteLLM etc) #885

242816 opened this issue Jan 29, 2025 · 9 comments
Labels
enhancement New feature or request

Comments

@242816

242816 commented Jan 29, 2025

Most inference providers support the OpenAI API.

So this morning I wanted to use Helicone to track goose calls.

With Aider I would use this https://aider.chat/docs/llms/openai-compat.html

I tried to use the existing open ai provider https://github.com/block/goose/blob/main/crates/goose/src/providers/openai.rs

This didn't work, as it's too tightly tied to OpenAI.

So a generic provider should let me point the HOST, MODEL, and KEY at any provider I want.

Requirements

A new provider with the ability to set the following:

OPENAI_API_BASE=
OPENAI_API_KEY=
OPENAI_API_MODEL=

Note: the BASE needs to include everything up to and including the /v1, e.g. https://oai.helicione.ai/324324-234234-24324/v1
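A sketch of what such a configuration could look like in a shell profile. The variable names come from the requirements above; the values are placeholders, not working credentials or a confirmed goose feature:

```shell
# Hypothetical configuration for a generic OpenAI-compatible provider.
# Variable names follow the proposal above; the values are placeholders.

# Everything up to and including /v1 (here a Helicone-style gateway URL):
export OPENAI_API_BASE="https://oai.helicone.ai/<your-gateway-id>/v1"

# The provider's own API key:
export OPENAI_API_KEY="sk-placeholder"

# Whatever model name the provider routes:
export OPENAI_API_MODEL="deepseek-chat"
```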

@willemsFEB

Seconding this. I'm trying to use Perplexity AI, which has an OpenAI-compatible API, but I'm not able to configure it.

@michaelneale (Collaborator)

You can set export GOOST_HOST=... in ~/.zshrc for now, until the UI has support for it. It's also hard to know how well things support the OpenAI API (it can't just be chat).

@michaelneale (Collaborator)

@willemsFEB @242816 would it make sense to have openai, but then a separate "openai compatible" provider (separate choices)? The latter would ask for more config. Thoughts?

@digitalbuddha

That makes sense. You may also want to add things like the Azure API version there as an optional param.

@dailydaniel

+1 for an OpenAI-compatible provider in the UI.

@gzuuus

gzuuus commented Jan 29, 2025

+1 for an OpenAI-compatible API. I think it would make more sense to have separate configurations for OpenAI and OpenAI-compatible APIs. This way you have more control over the API version of the OpenAI-compatible provider, which may be necessary to work around edge cases of mismatched API versions.

@JBMedeiros

+1 for an OpenAI-compatible provider in the UI 🙏

@dmlond

dmlond commented Jan 29, 2025

+1 for an OpenAI-compatible provider in the UI. Our institution is starting to evaluate LiteLLM, which uses an OpenAI-compatible API with its own API key. This will likely become more prevalent as businesses like LiteLLM spring up to offer cloud and on-prem ways to host open-source models.

@salman1993 added the enhancement (New feature or request) label Jan 29, 2025
@willemsFEB

@michaelneale thanks for the suggestion, so I've set OPENAI_HOST=https://api.perplexity.ai (you mention GOOST_HOST, but wouldn't that be for the Google Gemini API?)

I'm able to get further; however, goose is now appending v1/chat/completions to the end of the URL, which is invalid for Perplexity, as it expects https://api.perplexity.ai/chat/completions, i.e. without the v1.

Any suggestions how I can bypass this?
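One way a generic provider could sidestep the hard-coded suffix is to treat the configured base URL as authoritative and only append the endpoint path. A hypothetical Rust sketch of that idea (the function name and behavior are assumptions for illustration, not goose's actual code):

```rust
// Hypothetical sketch: build the chat-completions URL from a configurable
// base instead of always appending "v1/chat/completions". The base would
// come from something like OPENAI_HOST / OPENAI_API_BASE, and may or may
// not already contain a /v1 segment.
fn chat_completions_url(base: &str) -> String {
    // Strip any trailing slash so we don't produce "//chat/completions".
    let trimmed = base.trim_end_matches('/');
    format!("{}/chat/completions", trimmed)
}

fn main() {
    // A Helicone-style base that already includes /v1:
    assert_eq!(
        chat_completions_url("https://oai.helicone.ai/v1/"),
        "https://oai.helicone.ai/v1/chat/completions"
    );
    // Perplexity expects no /v1 prefix at all:
    assert_eq!(
        chat_completions_url("https://api.perplexity.ai"),
        "https://api.perplexity.ai/chat/completions"
    );
    println!("ok");
}
```

With this shape, Perplexity users would set the base without /v1 and OpenAI/Helicone users would include it, which matches the requirement above that the BASE be everything up to and including /v1.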
