
v0.5.9

Released by @likelovewant on 12 Feb 06:48 (commit 0e97670)
  • Detailed Installation Guide: please refer to the wiki guide.
    Demo Release Version: this release's OllamaSetup.exe is built with ROCm 6.1.2 (HIP SDK 6.1.2); make sure to replace the libs with the matching ROCmlibs for 6.1.2.

Tip

ROCm 5.7 version: upstream llama.cpp has broken the ROCm 5.7 build; building against HIP SDK 5.7 now requires updating from clang17 to clang19 in the build process. More details in the wiki.

ollama-windows-amd64-rocm5.7z (built for gfx803, gfx900:xnack-, gfx1103 (test only))

ROCmlibs for 5.7 available at ROCmlibs for 5.7

ROCmlibs for 6.1.2 available at ROCmlibs for 6.1.2

Supported targets: gfx1010:xnack-, gfx1011, gfx1012:xnack-, gfx1030, gfx1031, gfx1032, gfx1034, gfx1035, gfx1100, gfx1101, gfx1103, gfx1150
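
If you are unsure which gfx target your card corresponds to, a quick way to start is to list the installed AMD GPUs and look the model name up against the support list above (a minimal PowerShell sketch; the model-name-to-gfx mapping itself comes from the wiki, not from this command):

```powershell
# List AMD GPUs on this machine; match the reported model name against the
# support list above (for example, an RX 6600 corresponds to gfx1032).
Get-CimInstance Win32_VideoController |
    Where-Object { $_.Name -match 'AMD|Radeon' } |
    Select-Object Name, DriverVersion
```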

Note

1. Install OllamaSetup.exe from this release first.
2. Unzip ollama-windows-amd64.7z and replace the libs in C:\Users\usrname\AppData\Local\Programs\Ollama\lib\ollama.
3. In your Ollama program's ROCm folder (e.g. C:\Users\usrname\AppData\Local\Programs\Ollama\lib\ollama), replace rocblas.dll and the rocblas/library folder with the ones matching your GPU architecture, taken from the correct ROCmlibs for 6.1.2 (or ROCmlibs for 5.7 if you use ollama-windows-amd64-rocm5.7z).

Alternatively, you may skip step 1: remove any Ollama clients already on your machine, use the 7z package with steps 2 and 3, and start the server by running ./ollama serve.
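
A minimal PowerShell sketch of steps 2 and 3, assuming the default install path and that the archives have already been extracted to the hypothetical folders .\ollama-windows-amd64 and .\rocmlibs (adjust the paths to your setup; the exact layout inside the archives may differ):

```powershell
# Assumed locations; adjust to where Ollama is installed and where you extracted the archives.
$ollamaLib  = "$env:LOCALAPPDATA\Programs\Ollama\lib\ollama"   # default install location
$packageLib = ".\ollama-windows-amd64\lib\ollama"              # from ollama-windows-amd64.7z (assumed layout)
$rocmLibs   = ".\rocmlibs"                                     # unpacked ROCmlibs matching your GPU (assumed layout)

# Step 2: replace the libs installed by OllamaSetup.exe with the ones from the 7z package.
Copy-Item "$packageLib\*" $ollamaLib -Recurse -Force

# Step 3: overwrite rocblas.dll and the rocblas\library folder with the architecture-specific ones.
Copy-Item "$rocmLibs\rocblas.dll" "$ollamaLib\rocblas.dll" -Force
Remove-Item "$ollamaLib\rocblas\library" -Recurse -Force -ErrorAction SilentlyContinue
Copy-Item "$rocmLibs\library" "$ollamaLib\rocblas\library" -Recurse -Force
```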

If the error log shows amdgpu is not supported (supported types: [gfx1030 gfx1100 gfx1101 gfx1102 gfx906]), it means you have missed one of the steps above. Please re-check them, and if needed also replace the rocmlibs in your HIP SDK installation.
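
One way to verify whether the replacement in step 3 actually took effect is to look at which gfx targets the installed rocblas library files are built for; a hedged sketch, again assuming the default install path and that the rocblas/library folder sits directly under lib\ollama:

```powershell
# The Tensile files under rocblas\library are named per gfx target; listing the
# distinct gfx identifiers shows which GPU architectures this rocblas build can serve.
$library = "$env:LOCALAPPDATA\Programs\Ollama\lib\ollama\rocblas\library"
Get-ChildItem $library -File |
    ForEach-Object { [regex]::Matches($_.Name, 'gfx[0-9a-f]+').Value } |
    Sort-Object -Unique
```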

Full Changelog: v0.5.8...v0.5.9
For a complete list of changes and bug fixes, please check the ollama changelog:
ollama/releases