Llama.cpp whl is not downloaded - Pip returned an error while installing the wheel! #202
Comments
+1
This should be fixed in v0.3.5
I've completely uninstalled and reinstalled (via HACS) v0.3.5, but still get the error (it also points to 0.3.4):
Same for me, still getting the same error as @FlorianShepherd. I also completely reinstalled the extension.
FYI - I managed to get it to work by following @acon96's fallback option: https://github.com/acon96/home-llm/blob/develop/docs/Backend%20Configuration.md#wheels (manually copy the WHL file to the HA folder and run the installer). The assistant works great (apart from my CPU being too slow) - awesome work Acon!
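The fallback linked above amounts to downloading the prebuilt wheel yourself and placing it where the integration's installer looks for it. A minimal sketch of that process, assuming the wheel filename from the log in this thread and a `/config/custom_components/llama_conversation/` target folder (both are assumptions; check the linked docs for the exact paths and versions for your install):

```shell
# Build the expected wheel filename; the version and Python tag below are
# assumptions taken from the error log in this thread -- adjust as needed.
LLAMA_CPP_VERSION=0.2.87
PY_TAG=cp312
WHEEL="llama_cpp_python-${LLAMA_CPP_VERSION}-${PY_TAG}-${PY_TAG}-musllinux_1_2_x86_64.whl"
URL="https://github.com/acon96/home-llm/releases/download/v0.3.5/${WHEEL}"
echo "$WHEEL"
# On a machine with internet access, download the release asset:
#   wget "$URL"
# Then copy it into the integration folder (hypothetical path) and retry setup:
#   cp "$WHEEL" /config/custom_components/llama_conversation/
```

If the `wget` step returns a 404, the asset is missing from that release, which matches the error reported below.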
I messed up the last release and just re-published it. Sometimes GitHub handles re-releases weirdly, so if that doesn't work I can just push a fully new version.
I guess we'll need a new release then. :( |
I had to go back to v0.3.3, let it download the wheel, and then update to v0.3.5.
did not work for me, unfortunately |
Describe the bug
From the Select Backend form, select Llama.cpp (HuggingFace).
Getting error:
"Pip returned an error while installing the wheel! Please check the Home Assistant logs for more details."
LogViewer of HASS:
2024-08-13 13:00:55.258 ERROR (SyncWorker_8) [homeassistant.util.package] Unable to install package https://github.com/acon96/home-llm/releases/download/v0.3.4/llama_cpp_python-0.2.87-cp312-cp312-musllinux_1_2_x86_64.whl: ERROR: HTTP error 404 while getting https://github.com/acon96/home-llm/releases/download/v0.3.4/llama_cpp_python-0.2.87-cp312-cp312-musllinux_1_2_x86_64.whl
(it looks like the file does not exist)
Expected behavior
The backend is selected and Llama.cpp is installed.