PR: Add Support for Groq's Hosted Models via Groq's OpenAI Compatibility API #683
base: master
Conversation
Codecov Report

All modified and coverable lines are covered by tests ✅

```
@@            Coverage Diff             @@
##           master     #683      +/-   ##
==========================================
+ Coverage   98.44%   98.46%   +0.01%
==========================================
  Files          24       24
  Lines        1353     1364      +11
==========================================
+ Hits         1332     1343      +11
  Misses         15       15
  Partials        6        6
==========================================
```

View full report in Codecov by Sentry.
Force-pushed from 092df75 to 34fe4d0 (Compare)
- Add Groq model options via Groq's OpenAI API compatibility
- Update documentation accordingly
- Fix example
- Fix golangci-lint complaints
- Fix comment typo
- Fix usage example comments
- Remove silly test.mp3
- Add test for chat completions endpoint using groq client config
@jackspirou thank you for the PR! I don't think Groq needs any special treatment, as there are multiple services with OpenAI-compatible APIs now. It requires a minimal four-line change to the basic example we have in the README to make it work with Groq (or any other service like Ollama, for that matter). I'm totally open to expanding our README with such an example, though!

```diff
--- basic.go	2024-03-15 14:40:43
+++ groq.go	2024-03-15 14:40:29
@@ -7,11 +7,14 @@
 )

 func main() {
-	client := openai.NewClient("your token")
+	config := openai.DefaultConfig("token")
+	config.BaseURL = "https://api.groq.com/openai/v1"
+	client := openai.NewClientWithConfig(config)
+
 	resp, err := client.CreateChatCompletion(
 		context.Background(),
 		openai.ChatCompletionRequest{
-			Model: openai.GPT3Dot5Turbo,
+			Model: "mixtral-8x7b-32768",
 			Messages: []openai.ChatCompletionMessage{
 				{
 					Role: openai.ChatMessageRoleUser,
```
@sashabaranov - Thanks for the response, and thanks for the example! I couldn't see the forest for the trees here. I agree with your assessment, given the example you shared. I'll update this PR to take a stab at a simple README update per your advice. One side note - it might be nice to update the pattern of: I think it might have given me some hints to see the forest for the trees in this case. Maybe it's something you've already considered.
was this released?
Fixes #682