Base Package: mingw-w64-llama.cpp

Description:
Library and tools for running inference with Meta's LLaMA model (and derivatives) in C/C++ (mingw-w64)
Base Group(s):
-
Homepage:
https://github.com/ggml-org/llama.cpp
Repository:
https://github.com/ggml-org/llama.cpp
License(s):
MIT
Version:
1~b7972-1 (1~b8054-1 in git)
External:
Anitya: llama.cpp
AUR: b7376
Repology: llama.cpp
Vulnerabilities:
Not enough metadata for vulnerability reporting

Binary Packages:
ucrt64
clang64
clangarm64
mingw64
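Each binary package above can be installed from an MSYS2 shell with pacman. The exact package names below are an assumption derived from MSYS2's standard environment prefixes for the base package mingw-w64-llama.cpp; this is a setup fragment, not run here:

```shell
# Assumed MSYS2 package names per environment (standard prefix convention):
pacman -S mingw-w64-ucrt-x86_64-llama.cpp     # ucrt64
pacman -S mingw-w64-clang-x86_64-llama.cpp    # clang64
pacman -S mingw-w64-clang-aarch64-llama.cpp   # clangarm64
pacman -S mingw-w64-x86_64-llama.cpp          # mingw64
```

Pick the command matching the MSYS2 environment you are running (shown in the shell prompt); installing a package built for a different environment will not put its binaries on that environment's PATH.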
Last Update: 2026-02-16 01:07:18