
Bug: --v2 flag produces SIGSEGV fault with default browser interaction #678

Open
Eudox67 opened this issue Jan 16, 2025 · 0 comments
Eudox67 commented Jan 16, 2025

Contact Details

[email protected]

What happened?

When running a llamafile (gemma-2 9B, Qwen2.5-instruct-14B, ...) with the --v2 flag, the server crashes with a SIGSEGV (reported by signals.cpp) as soon as a prompt is submitted from the default new browser interface at 127.0.0.1:8080. The crash is not limited to the new browser UI: the same SIGSEGV occurs when sending the curl example provided in the readme.md file.
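
For reference, the request used outside the browser was roughly the following, adapted from the curl example in readme.md (the model name, prompt, and any JSON fields other than "messages" are placeholders; only the endpoint and method match the POST /v1/chat/completions seen in the log below):

# illustrative request shape; adjust the payload to taste
curl http://127.0.0.1:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "gemma-2-9b-it",
        "messages": [{"role": "user", "content": "Hello"}]
      }'

Submitting a prompt through the browser UI at 127.0.0.1:8080 produces the identical crash shown in the log output below.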

Version

llamafile v0.9.0

What operating system are you seeing the problem on?

Linux

Relevant log output

~/Code/llm$ ./gemma-2-9b-it-Q8_0-f16.llamafile -ngl 9999 --server --v2 --ctx-size 2048 --gpu AMD
import_cuda_impl: initializing gpu module...
link_cuda_dso: note: dynamically linking /home/trevor/.llamafile/v/0.9.0/ggml-rocm.so
ggml_cuda_link: HIP driver version 6.2.41134
ggml_cuda_link: HIP runtime version is 6.2.41134
ggml_cuda_link: welcome to ROCm SDK with hipBLAS
link_cuda_dso: GPU support loaded
2025-01-16T12:11:47.174687 llamafile/server/listen.cpp:41 server listen http://127.0.0.1:8080
2025-01-16T12:11:47.174882 llamafile/server/worker.cpp:143  warning: gpu mode disables pledge security
2025-01-16T12:11:57.027734 llamafile/server/client.cpp:679 43400 GET /
2025-01-16T12:11:57.125990 llamafile/server/client.cpp:679 43400 GET /chatbot.css
2025-01-16T12:11:57.126626 llamafile/server/client.cpp:679 43400 GET /highlight.css
2025-01-16T12:11:57.127172 llamafile/server/client.cpp:679 43400 GET /clipboard.js
2025-01-16T12:11:57.127695 llamafile/server/client.cpp:679 43412 GET /ctype.js
2025-01-16T12:11:57.127856 llamafile/server/client.cpp:679 43400 GET /highlight.js
2025-01-16T12:11:57.128140 llamafile/server/client.cpp:679 43438 GET /highlight_txt.js
2025-01-16T12:11:57.128269 llamafile/server/client.cpp:679 43426 GET /highlight_markdown.js
2025-01-16T12:11:57.128763 llamafile/server/client.cpp:679 43412 GET /highlight_cpp.js
2025-01-16T12:11:57.129580 llamafile/server/client.cpp:679 43400 GET /highlight_c.js
2025-01-16T12:11:57.129739 llamafile/server/client.cpp:679 43412 GET /highlight_d.js
2025-01-16T12:11:57.129856 llamafile/server/client.cpp:679 43438 GET /highlight_cxx.js
2025-01-16T12:11:57.130008 llamafile/server/client.cpp:679 43426 GET /highlight_ada.js
2025-01-16T12:11:57.130576 llamafile/server/client.cpp:679 43456 GET /highlight_asm.js
2025-01-16T12:11:57.130666 llamafile/server/client.cpp:679 43446 GET /highlight_basic.js
2025-01-16T12:11:57.131055 llamafile/server/client.cpp:679 43412 GET /highlight_cobol.js
2025-01-16T12:11:57.131133 llamafile/server/client.cpp:679 43438 GET /highlight_csharp.js
2025-01-16T12:11:57.132162 llamafile/server/client.cpp:679 43400 GET /highlight_forth.js
2025-01-16T12:11:57.132242 llamafile/server/client.cpp:679 43412 GET /highlight_fortran.js
2025-01-16T12:11:57.132321 llamafile/server/client.cpp:679 43426 GET /highlight_go.js
2025-01-16T12:11:57.132431 llamafile/server/client.cpp:679 43456 GET /highlight_haskell.js
2025-01-16T12:11:57.132550 llamafile/server/client.cpp:679 43446 GET /highlight_js.js
2025-01-16T12:11:57.133315 llamafile/server/client.cpp:679 43400 GET /highlight_css.js
2025-01-16T12:11:57.133383 llamafile/server/client.cpp:679 43412 GET /highlight_php.js
2025-01-16T12:11:57.133457 llamafile/server/client.cpp:679 43438 GET /highlight_html.js
2025-01-16T12:11:57.133625 llamafile/server/client.cpp:679 43426 GET /highlight_java.js
2025-01-16T12:11:57.133637 llamafile/server/client.cpp:679 43456 GET /highlight_julia.js
2025-01-16T12:11:57.134255 llamafile/server/client.cpp:679 43400 GET /highlight_kotlin.js
2025-01-16T12:11:57.134318 llamafile/server/client.cpp:679 43412 GET /highlight_ld.js
2025-01-16T12:11:57.134397 llamafile/server/client.cpp:679 43438 GET /highlight_lisp.js
2025-01-16T12:11:57.134471 llamafile/server/client.cpp:679 43446 GET /highlight_lua.js
2025-01-16T12:11:57.135155 llamafile/server/client.cpp:679 43412 GET /highlight_m4.js
2025-01-16T12:11:57.135238 llamafile/server/client.cpp:679 43426 GET /highlight_make.js
2025-01-16T12:11:57.135305 llamafile/server/client.cpp:679 43456 GET /highlight_matlab.js
2025-01-16T12:11:57.135799 llamafile/server/client.cpp:679 43400 GET /highlight_ocaml.js
2025-01-16T12:11:57.135856 llamafile/server/client.cpp:679 43412 GET /highlight_pascal.js
2025-01-16T12:11:57.135959 llamafile/server/client.cpp:679 43438 GET /highlight_perl.js
2025-01-16T12:11:57.136082 llamafile/server/client.cpp:679 43446 GET /highlight_python.js
2025-01-16T12:11:57.137128 llamafile/server/client.cpp:679 43400 GET /highlight_r.js
2025-01-16T12:11:57.137186 llamafile/server/client.cpp:679 43412 GET /highlight_ruby.js
2025-01-16T12:11:57.137251 llamafile/server/client.cpp:679 43438 GET /highlight_rust.js
2025-01-16T12:11:57.137318 llamafile/server/client.cpp:679 43426 GET /highlight_scala.js
2025-01-16T12:11:57.137378 llamafile/server/client.cpp:679 43456 GET /highlight_shell.js
2025-01-16T12:11:57.138021 llamafile/server/client.cpp:679 43446 GET /highlight_sql.js
2025-01-16T12:11:57.138140 llamafile/server/client.cpp:679 43412 GET /highlight_swift.js
2025-01-16T12:11:57.138256 llamafile/server/client.cpp:679 43438 GET /highlight_tcl.js
2025-01-16T12:11:57.139396 llamafile/server/client.cpp:679 43400 GET /highlight_tex.js
2025-01-16T12:11:57.139486 llamafile/server/client.cpp:679 43412 GET /highlight_typescript.js
2025-01-16T12:11:57.139554 llamafile/server/client.cpp:679 43456 GET /highlight_zig.js
2025-01-16T12:11:57.140043 llamafile/server/client.cpp:679 43446 GET /highlight_cmake.js
2025-01-16T12:11:57.140118 llamafile/server/client.cpp:679 43426 GET /render_markdown.js
2025-01-16T12:11:57.140674 llamafile/server/client.cpp:679 43400 GET /chatbot.js
2025-01-16T12:11:57.172332 llamafile/server/client.cpp:679 43400 GET /chatbot.png
2025-01-16T12:11:57.172506 llamafile/server/client.cpp:679 43412 GET /gear.svg
2025-01-16T12:11:57.172567 llamafile/server/client.cpp:679 43426 GET /redo.svg
2025-01-16T12:11:57.172643 llamafile/server/client.cpp:679 43446 GET /upload.svg
2025-01-16T12:11:57.173020 llamafile/server/client.cpp:679 43456 GET /close.svg
2025-01-16T12:11:57.193628 llamafile/server/client.cpp:679 43400 GET /flagz
2025-01-16T12:11:57.203410 llamafile/server/client.cpp:679 43400 GET /favicon.ico
2025-01-16T12:12:15.614236 llamafile/server/client.cpp:679 43400 POST /v1/chat/completions
2025-01-16T12:12:15.614840 llamafile/server/slots.cpp:132 43400 acquired slot #0 with score 28
2025-01-16T12:12:15.616587 llamafile/server/signals.cpp:54 43400 crashed SIGSEGV
2025-01-16T12:12:15.616609 llamafile/server/signals.cpp:57 43400 crashed SIGSEGV at 618 bt 7759eba9deb5 7759a6d84bb5 DANGEROUS 7757d08283e8
2025-01-16T12:12:15.616618 llamafile/server/slots.cpp:146 43400 relinquishing slot #0


Linux information

System:
  Kernel: 6.8.0-51-generic arch: x86_64 bits: 64 compiler: gcc v: 13.3.0 clocksource: tsc
  Desktop: Cinnamon v: 6.2.9 tk: GTK v: 3.24.41 wm: Muffin v: 6.2.0 vt: 7 dm: LightDM v: 1.30.0
    Distro: Linux Mint 22 Wilma base: Ubuntu 24.04 noble
Machine:
  Type: Desktop System: Micro-Star product: MS-7B79 v: 2.0 serial: <superuser required>
  Mobo: Micro-Star model: X470 GAMING PLUS (MS-7B79) v: 2.0 serial: <superuser required>
    uuid: <superuser required> UEFI: American Megatrends v: A.30 date: 04/23/2018
CPU:
  Info: 8-core model: AMD Ryzen 7 2700X bits: 64 type: MT MCP smt: enabled arch: Zen+ rev: 2 cache:
    L1: 768 KiB L2: 4 MiB L3: 16 MiB
  Speed (MHz): avg: 2179 high: 2200 min/max: 2200/3700 boost: enabled cores: 1: 2200 2: 2200
    3: 2199 4: 2200 5: 2200 6: 1887 7: 2200 8: 2200 9: 2198 10: 2200 11: 2199 12: 2200 13: 2199
    14: 2200 15: 2196 16: 2193 bogomips: 118401
  Flags: avx avx2 ht lm nx pae sse sse2 sse3 sse4_1 sse4_2 sse4a ssse3 svm
Graphics:
  Device-1: AMD Navi 31 [Radeon RX 7900 XT/7900 XTX/7900M] driver: amdgpu v: kernel arch: RDNA-3
    pcie: speed: 16 GT/s lanes: 16 ports: active: DP-1,DP-3,HDMI-A-1 empty: DP-2,Writeback-1
    bus-ID: 1f:00.0 chip-ID: 1002:744c class-ID: 0300
  Device-2: Microdia Webcam Vitade AF driver: snd-usb-audio,uvcvideo type: USB rev: 2.0
    speed: 480 Mb/s lanes: 1 bus-ID: 7-4:2 chip-ID: 0c45:6366 class-ID: 0102 serial: <filter>
  Display: x11 server: X.Org v: 21.1.11 with: Xwayland v: 23.2.6 driver: X: loaded: amdgpu
    unloaded: fbdev,modesetting,radeon,vesa dri: radeonsi gpu: amdgpu display-ID: :0 screens: 1
  API: EGL v: 1.5 hw: drv: amd radeonsi platforms: device: 0 drv: radeonsi device: 1 drv: swrast
    surfaceless: drv: radeonsi x11: drv: radeonsi inactive: gbm,wayland
  API: OpenGL v: 4.6 compat-v: 4.5 vendor: amd mesa v: 24.0.9-0ubuntu0.3 glx-v: 1.4
    direct-render: yes renderer: Radeon RX 7900 XT (radeonsi navi31 LLVM 17.0.6 DRM 3.57
    6.8.0-51-generic) device-ID: 1002:744c
  API: Vulkan v: 1.3.275 layers: 3 surfaces: xcb,xlib device: 0 type: discrete-gpu driver: N/A
    device-ID: 1002:744c device: 1 type: cpu driver: N/A device-ID: 10005:0000
Info:
  Memory: total: 64 GiB note: est. available: 62.74 GiB used: 6.56 GiB (10.5%)
  Processes: 499 Power: uptime: 15h 6m states: freeze,mem,disk suspend: deep wakeups: 0
    hibernate: platform Init: systemd v: 255 target: graphical (5) default: graphical
  Compilers: gcc: 13.3.0 alt: 9/11/12 Client: Cinnamon v: 6.2.9 inxi: 3.3.34