Why is the docker image so large? #282
-
I read through the code a little. I think the framework itself should only require very few dependencies. As for the pipelines, I'd suggest letting each pipeline provide its own requirements.txt alongside the .py file. This should improve extensibility while reducing the default install size.
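For illustration, here is a minimal sketch of what that could look like, assuming a hypothetical loader function and a requirements.txt placed next to each pipeline's .py file (neither is the project's actual API):

```python
# Hypothetical sketch: install a pipeline's own requirements.txt (sitting next to
# its .py file) at load time, so the base image only ships framework dependencies.
# The function name and file layout are assumptions, not this repo's real loader.
import subprocess
import sys
from pathlib import Path


def install_pipeline_requirements(pipeline_path: str) -> None:
    """Install the requirements.txt next to a pipeline's .py file, if one exists."""
    requirements = Path(pipeline_path).with_name("requirements.txt")
    if requirements.exists():
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install", "-r", str(requirements)]
        )


# Example: a pipeline shipped as pipelines/my_pipeline/pipeline.py
#          plus pipelines/my_pipeline/requirements.txt
# install_pipeline_requirements("pipelines/my_pipeline/pipeline.py")
```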
-
Is it as simple as removing all of the ML libraries from the requirements.txt used in the main build? If that breaks things, then we would need to set up an environment variable and add conditionals everywhere those libraries are referenced. I'd be up for doing this in a new branch or fork so we have a super lightweight image to start off with.
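As an illustration of the environment-variable-plus-conditionals idea, here is a hedged sketch; the `PIPELINES_MINIMAL` variable name and the `torch` example are assumptions, not existing flags in this repo:

```python
# Hypothetical sketch of guarding heavy ML imports behind an environment variable,
# so a lightweight image can skip them. PIPELINES_MINIMAL is an assumed name.
import os

MINIMAL_BUILD = os.getenv("PIPELINES_MINIMAL", "false").lower() == "true"

if not MINIMAL_BUILD:
    import torch  # only installed in the full image
else:
    torch = None  # lightweight image: ML-dependent features are disabled


def embed(texts):
    """Example feature that needs the heavy dependency."""
    if torch is None:
        raise RuntimeError(
            "This feature requires the full image (built without PIPELINES_MINIMAL)."
        )
    # ... embedding code using torch would go here ...
```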
-
I created a PR here: #300.
-
The pipelines' Docker image is currently sitting at 4.37 GB, which is far too large for what it does.
That may be fine on desktops with terabytes of storage, but it makes deploying this alongside the web UI on SBCs harder (when updating, two versions consume over 20 GB).
Please help investigate why this is the case. Are all of the Python dependencies really needed for this project?
Thanks.
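One way to start that investigation is to measure which installed Python packages take the most space inside the container. The sketch below is only illustrative; the script name and the way you run it in the image are assumptions:

```python
# Rough sketch: list the largest top-level packages in site-packages, which
# usually points straight at the heavy ML dependencies inflating the image.
# Could be run inside the container, e.g. `docker run --rm <image> python find_heavy_deps.py`
# (image and script names are assumptions).
import site
from pathlib import Path


def dir_size(path: Path) -> int:
    """Total size in bytes of all files under a directory."""
    return sum(f.stat().st_size for f in path.rglob("*") if f.is_file())


def main() -> None:
    sizes = {}
    for sp in site.getsitepackages():
        sp_path = Path(sp)
        if not sp_path.is_dir():
            continue
        for entry in sp_path.iterdir():
            if entry.is_dir() and not entry.name.endswith(".dist-info"):
                sizes[entry.name] = sizes.get(entry.name, 0) + dir_size(entry)
    # Print the 20 largest packages, biggest first.
    for name, size in sorted(sizes.items(), key=lambda kv: kv[1], reverse=True)[:20]:
        print(f"{size / 1024 / 1024:8.1f} MB  {name}")


if __name__ == "__main__":
    main()
```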