Actions: ggml-org/llama.cpp

Showing runs from all workflows
103,421 workflow runs

Fix visual encoders with no CLS
EditorConfig Checker #22009: Pull request #11982 synchronize by alex-jw-brooks
February 20, 2025 21:14 · Action required · alex-jw-brooks:fix_no_cls_vencoders
Fix visual encoders with no CLS
CI #19610: Pull request #11982 synchronize by alex-jw-brooks
February 20, 2025 21:14 · Action required · alex-jw-brooks:fix_no_cls_vencoders
Fix visual encoders with no CLS
Pull Request Labeler #8352: Pull request #11982 synchronize by alex-jw-brooks
February 20, 2025 21:14 · 13s
CUDA: correct the lowest Maxwell supported by CUDA 12
CI #19609: Pull request #11984 opened by PureJourney
February 20, 2025 21:03 · Action required · PureJourney:master
CUDA: correct the lowest Maxwell supported by CUDA 12
EditorConfig Checker #22008: Pull request #11984 opened by PureJourney
February 20, 2025 21:03 · Action required · PureJourney:master
CUDA: correct the lowest Maxwell supported by CUDA 12
Server #11014: Pull request #11984 opened by PureJourney
February 20, 2025 21:03 · Action required · PureJourney:master
CUDA: correct the lowest Maxwell supported by CUDA 12
Pull Request Labeler #8351: Pull request #11984 opened by PureJourney
February 20, 2025 21:03 · 17s
Fix visual encoders with no CLS
EditorConfig Checker #22007: Pull request #11982 opened by alex-jw-brooks
February 20, 2025 19:44 · 27s · alex-jw-brooks:fix_no_cls_vencoders
Fix visual encoders with no CLS
Pull Request Labeler #8350: Pull request #11982 opened by alex-jw-brooks
February 20, 2025 19:44 · 17s
llama : refactor llama_kv_cache, llama_context and llm_build_context
CI #19607: Pull request #11213 synchronize by ggerganov
February 20, 2025 18:55 · 34m 47s · gg/llama-kv-cache
llama : refactor llama_kv_cache, llama_context and llm_build_context
Python Type-Check #1870: Pull request #11213 synchronize by ggerganov
February 20, 2025 18:55 · 1m 19s · gg/llama-kv-cache
llama : refactor llama_kv_cache, llama_context and llm_build_context
Server #11012: Pull request #11213 synchronize by ggerganov
February 20, 2025 18:55 · 8m 19s · gg/llama-kv-cache
llama : refactor llama_kv_cache, llama_context and llm_build_context
flake8 Lint #17487: Pull request #11213 synchronize by ggerganov
February 20, 2025 18:55 · 16s · gg/llama-kv-cache
llama : refactor llama_kv_cache, llama_context and llm_build_context
EditorConfig Checker #22006: Pull request #11213 synchronize by ggerganov
February 20, 2025 18:55 · 18s · gg/llama-kv-cache
llama : refactor llama_kv_cache, llama_context and llm_build_context
Pull Request Labeler #8349: Pull request #11213 synchronize by ggerganov
February 20, 2025 18:55 · 14s
server (webui): Fix Premature Submission During IME Conversion (#11971)
EditorConfig Checker #22005: Commit c392e50 pushed by ngxson
February 20, 2025 18:43 · 18s · master
server (webui): Fix Premature Submission During IME Conversion (#11971)
Server #11011: Commit c392e50 pushed by ngxson
February 20, 2025 18:43 · 12m 45s · master
server (webui): Fix Premature Submission During IME Conversion
EditorConfig Checker #22004: Pull request #11971 synchronize by ngxson
February 20, 2025 18:34 · 22s · mmngays:server-ime-composition
server (webui): Fix Premature Submission During IME Conversion
Pull Request Labeler #8348: Pull request #11971 synchronize by ngxson
February 20, 2025 18:34 · 11s
llama : refactor llama_kv_cache, llama_context and llm_build_context
Python Type-Check #1869: Pull request #11213 synchronize by ggerganov
February 20, 2025 18:01 · 1m 14s · gg/llama-kv-cache
llama : refactor llama_kv_cache, llama_context and llm_build_context
Server #11009: Pull request #11213 synchronize by ggerganov
February 20, 2025 18:01 · 8m 17s · gg/llama-kv-cache
llama : refactor llama_kv_cache, llama_context and llm_build_context
EditorConfig Checker #22003: Pull request #11213 synchronize by ggerganov
February 20, 2025 18:01 · 26s · gg/llama-kv-cache