
[Ollama AI] Connecting to an external server #15381

Open
alefeans opened this issue Nov 15, 2024 · 1 comment
Labels

  • extension: raycast-ollama: Issues related to the raycast-ollama extension
  • extension: Issues related to one of the extensions in the Store
  • feature request: New feature or improvement

Comments

@alefeans

Extension

https://www.raycast.com/massimiliano_pasquini/raycast-ollama

Description

Hi, I really appreciate the quality of this extension!

However, I've noticed that it requires the Ollama server to be running locally. Since I run Ollama on an external server, it would be great to have an option to configure the extension to connect to a different host and port.

Thank you for considering this suggestion!
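For context, Ollama itself already supports serving on a non-local address (its server honors the OLLAMA_HOST environment variable, default port 11434), so on the extension side this would mostly mean building the API base URL from a user preference instead of hardcoding localhost. A minimal sketch of that idea, using a hypothetical `ollamaBaseUrl` helper and `OLLAMA_DEFAULT_HOST` constant (illustrative names, not the extension's actual code):

```typescript
// Hypothetical sketch: derive the Ollama API base URL from a user-configurable
// host preference, falling back to the standard local default.
// Names here are illustrative, not taken from the raycast-ollama source.
const OLLAMA_DEFAULT_HOST = "http://127.0.0.1:11434";

function ollamaBaseUrl(hostPreference?: string): string {
  // Fall back to the local default when no preference is set.
  const host = hostPreference?.trim() || OLLAMA_DEFAULT_HOST;
  // Accept values given with or without an explicit scheme.
  return host.startsWith("http") ? host : `http://${host}`;
}

// Example: pointing at an external server on the LAN.
console.log(ollamaBaseUrl("192.168.1.50:11434")); // http://192.168.1.50:11434
console.log(ollamaBaseUrl()); // http://127.0.0.1:11434
```

Every request the extension makes (e.g. to `/api/generate` or `/api/tags`) could then be prefixed with this base URL, so a single preference field would cover both host and port.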

Who will benefit from this feature?

No response

Anything else?

No response

@alefeans alefeans added extension Issues related to one of the extensions in the Store feature request New feature or improvement labels Nov 15, 2024
@raycastbot raycastbot added the extension: raycast-ollama Issues related to the raycast-ollama extension label Nov 15, 2024
@raycastbot
Collaborator

Thank you for opening this issue!

🔔 @MassimilianoPasquini97 you might want to have a look.

💡 Author and Contributors commands

The author and contributors of massimiliano_pasquini/raycast-ollama can trigger bot actions by commenting:

  • @raycastbot close this issue Closes the issue.
  • @raycastbot close as not planned Closes the issue as not planned.
  • @raycastbot rename this issue to "Awesome new title" Renames the issue.
  • @raycastbot reopen this issue Reopens the issue.
  • @raycastbot assign me Assigns yourself to the issue.
  • @raycastbot good first issue Adds the "Good first issue" label to the issue.
  • @raycastbot keep this issue open Makes sure the issue won't go stale and is kept open by the bot.

Projects
None yet
Development

No branches or pull requests

2 participants