Description
Name and Version
version: 5174 (5630406)
built with cc (Ubuntu 13.3.0-6ubuntu2~24.04) 13.3.0 for x86_64-linux-gnu
Operating systems
Linux
Which llama.cpp modules do you know to be affected?
llama-cli
Command line
llama-cli --version
Problem description & steps to reproduce
Summary
Pulling any of the moving Docker tags (full-vulkan, full, server, etc.) still returns build 5174 (5630406), which was published on 2025‑04‑24. Meanwhile, the Releases page has advanced to b5223 and beyond, so no Docker images have been published for roughly a week.
Steps to reproduce
# 1. Pull the latest image
docker pull ghcr.io/ggml-org/llama.cpp:full-vulkan
# 2. Check the build banner (bypasses tools.sh)
docker run --rm \
--entrypoint /app/llama-cli \
ghcr.io/ggml-org/llama.cpp:full-vulkan \
--version
Output:
version: 5174 (56304069)
built with cc (Ubuntu 13.3.0-6ubuntu2~24.04) 13.3.0 for x86_64-linux-gnu
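To double-check which digest the moving tag actually resolves to on the local machine, the image metadata can be inspected as well. This is a minimal sketch assuming the image has already been pulled with the command above; the stale digest value is the one listed under Actual below.

# 3. (Optional) Show the digest the moving tag resolves to locally
docker inspect --format '{{index .RepoDigests 0}}' \
  ghcr.io/ggml-org/llama.cpp:full-vulkan
# Currently prints ghcr.io/ggml-org/llama.cpp@sha256:23c3ec7e46c7… (the 24 Apr image)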
Expected
The nightly Publish Docker image workflow pushes a new digest for every numbered release, so pulling full-vulkan (or any moving tag) tracks the current release (today: b522x).
Actual
- All moving tags (full, full-vulkan, server, etc.) resolve to digest sha256:23c3ec7e46c7… → build 5174.
- The Publish Docker image workflow has failed for every run after #14926 (24 Apr) with the same buildx/CMake error, so no new images are uploaded (see the command sketch after this list).
- Make archives workflow continues to succeed, producing release tags up to b5223.
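The failing runs can also be listed from the command line. A sketch, assuming the workflow is named "Publish Docker image" exactly as it appears in the Actions tab and that the GitHub CLI (gh) is installed and authenticated:

# List recent runs of the Docker publishing workflow
gh run list --repo ggml-org/llama.cpp \
  --workflow "Publish Docker image" --limit 10
# Every run after 2025-04-24 shows "failure" in the status column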
Evidence
- Releases page: latest tag is b5223, dated today.
- Actions → Publish Docker image: all runs after 2025‑04‑24 are red.
- Local pull still shows 24 Apr digest & build banner (commands above).
Why it matters
Many users install llama.cpp exclusively via Docker. Right now they are stuck on a week‑old build (missing recent fixes and features) unless they build the image locally.
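For reference, a local build is possible as a stopgap. This is only a sketch, assuming the Vulkan image is still defined by .devops/vulkan.Dockerfile and that it exposes a full build stage (both taken from the repository layout, not verified against the current tree):

# Workaround: build the full Vulkan image locally from the current source
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
docker build --target full -t local/llama.cpp:full-vulkan \
  -f .devops/vulkan.Dockerfile .
# Re-running the --version check against local/llama.cpp:full-vulkan
# should then report the current build instead of 5174

This obviously does not help users who rely on the published ghcr.io tags, which is the point of this report.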
First Bad Commit
No response