Expose in the UI usage of custom models running locally (ollama, llamafile, vllm...) #858

Status: Open · 1 task done
Labels: enhancement (New feature or request), frontend

ividal (Contributor) commented on Feb 12, 2025:

Motivation

#770 adds support (via the API) for launching any model via ollama, llamafile, or vllm. Users would then have to launch experiments via curl.
Ideally, they should also be able to select these models via the UI.

Alternatives

Access via curl, Python requests, and the SDK is possible after #770.
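
For context, here is a minimal sketch of what programmatic access might look like with Python requests, assuming a local deployment; the endpoint path, payload fields, and model URI below are illustrative assumptions, not the actual API surface introduced by #770:

```python
import requests

# Hypothetical sketch: launch an experiment against a locally hosted model.
# The endpoint path and payload fields are illustrative assumptions.
API_URL = "http://localhost:8000/api/v1/experiments"  # assumed local deployment

payload = {
    "name": "summarization-eval",
    # A model served locally, e.g. by ollama; the URI is an assumption.
    "model": "http://localhost:11434/api/generate",
    "dataset_id": "d34db33f",  # placeholder dataset identifier
}

response = requests.post(API_URL, json=payload, timeout=30)
response.raise_for_status()
print(response.json())
```

Exposing the same selection in the UI would remove the need for users to hand-craft requests like this one.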

Contribution

I can contribute by extending the UI guide with an example once the feature is available.

Have you searched for similar issues before submitting this one?

  • Yes, I have searched for similar issues