
hardcoding models #1

Open
vigsterkr opened this issue Nov 3, 2024 · 0 comments

Comments

@vigsterkr

First of all, kudos for PhotoPrism! I think it is a great idea to factor out the ML part into a microservice, but I'm really wondering whether it is a good idea to develop an ML server from scratch. Looking at the current code, it hard-codes the models, and that does not scale. There are plenty of mature ML serving frameworks out there; TF has its own TensorFlow Serving, but that of course comes with its own problem: it ties you to a specific backend, namely TF.
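
To illustrate the point about hard-coding, here is a minimal sketch (not taken from the current code) of a config-driven model registry, so that adding or swapping a model becomes a config change instead of a code change. The `models.yaml` file name and the registry layout are assumptions made for the example.

```python
# Minimal sketch: read the model registry from a config file at startup
# instead of hard-coding model names and paths in the server code.
# The "models.yaml" layout below is an illustrative assumption.

import yaml  # pip install pyyaml


def load_model_registry(path: str = "models.yaml") -> dict:
    """Read a model registry from a YAML file.

    Expected layout (illustrative):
        models:
          nsfw:
            backend: onnx
            path: models/nsfw.onnx
          face:
            backend: tensorflow
            path: models/facenet
    """
    with open(path, "r", encoding="utf-8") as f:
        config = yaml.safe_load(f)
    return config.get("models", {})


def resolve_model(registry: dict, name: str) -> dict:
    """Look up a model entry by name so new models need no code changes."""
    try:
        return registry[name]
    except KeyError:
        raise ValueError(f"model '{name}' is not configured") from None


if __name__ == "__main__":
    registry = load_model_registry()
    for name, entry in registry.items():
        print(f"{name}: backend={entry['backend']}, path={entry['path']}")
```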

https://github.com/roboflow/inference is one backend-agnostic option that comes to mind, and here is a fairly good list of further options: https://github.com/topics/inference-server
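
For comparison, delegating to an existing inference server typically reduces the microservice to a thin HTTP client. The endpoint path and payload shape below are purely hypothetical placeholders, not the actual Roboflow API; the point is only that the model choice becomes a request parameter rather than code.

```python
# Hypothetical sketch of calling an external, backend-agnostic inference server.
# The URL, endpoint, and JSON payload shape are illustrative assumptions only.

import base64

import requests


def classify(image_path: str, model_name: str,
             server_url: str = "http://localhost:9001") -> dict:
    """Send an image to an inference server and return its JSON response.

    The model to use is a parameter of the request, so the calling service
    never needs to hard-code or load models itself.
    """
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("ascii")

    response = requests.post(
        f"{server_url}/infer",  # hypothetical endpoint
        json={"model": model_name, "image": image_b64},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    print(classify("example.jpg", "nsfw"))
```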
