Ollama integration 🦙 #461

Closed
TheR1D opened this issue Jan 30, 2024 · 2 comments · Fixed by #463

TheR1D (Owner) commented Jan 30, 2024

Implement an option to integrate Ollama with ShellGPT. These changes should make it easy to switch LLM backends, allowing users to toggle between OpenAI and Ollama. Since Ollama's response format differs slightly from OpenAI's, we can use the Ollama Python Library.

With multiple LLM backends, dependencies specific to a particular backend should be declared as Python package extras. This way, users can install ShellGPT with the default OpenAI client, e.g., pip install shell-gpt, or with a specific backend, e.g., pip install shell-gpt[ollama].

Since Ollama supports multiple open-source models, we need to identify which model performs best for ShellGPT use cases. Based on my research, mistral:7b-instruct outperforms the llama2 models in shell command generation tasks.

UPD: It seems it would be much easier to integrate Ollama using LiteLLM. This would also enable ShellGPT to work with the Azure OpenAI API.

@TheR1D added the enhancement label Jan 30, 2024
@TheR1D added this to the LocalLLMs milestone Jan 30, 2024
@TheR1D self-assigned this Jan 30, 2024
rcspam commented Jan 30, 2024

Bravo! It would be cool to integrate the Mistral AI API too.

TheR1D (Owner, Author) commented Feb 2, 2024

I would greatly appreciate any assistance with testing the Ollama Integration PR #463, especially on WSL environments.
