Issues: ggerganov/llama.cpp
Issues list
#10993 · Misc. bug: The cache_prompt parameter not working properly · [bug-unconfirmed] · opened Dec 27, 2024 by feikiss
#10985 · Misc. bug: llama-qwen2vl-cli: ignores --log* options · [bug-unconfirmed] · opened Dec 26, 2024 by l29ah
#10984 · Misc. bug: Buffer offset is not aligned on macOS / Intel / Vulkan · [bug-unconfirmed] · opened Dec 26, 2024 by soerenkampschroer
#10982 · Research: Performance differences between Metal (macOS) and Vulkan (Linux) · opened Dec 26, 2024 by asahilina
#10981 · Feature Request: add DeepSeek-v3 support · [enhancement] · opened Dec 26, 2024 by RodriMora · 4 tasks done
#10978 · Compile bug: undefined reference to std::filesystem · [bug-unconfirmed] · opened Dec 26, 2024 by Clauszy
#10977 · Misc. bug: Large performance regression since version b4365 · [bug-unconfirmed] · opened Dec 25, 2024 by GlasslessPizza
#10976 · Misc. bug: Unsupported op "CPY" / Segmentation fault on Metal · [bug-unconfirmed] · opened Dec 25, 2024 by firelex
#10973 · Eval bug: Out of Memory Error with Qwen2-VL on Windows · [bug-unconfirmed] · opened Dec 25, 2024 by AmineM24
#10969 · Compile bug: Converting the Model to Llama.cpp GGUF · [bug-unconfirmed] · opened Dec 24, 2024 by ErdemYavuz55
#10966 · Misc. bug: Vulkan backend with 7900XTX has severe performance dropoff at some batch sizes · [bug-unconfirmed] · opened Dec 24, 2024 by Mushoz
#10965 · Feature Request: Quantified model support · [enhancement] · opened Dec 24, 2024 by wyhfc123 · 4 tasks done
#10956 · Eval bug: qwen2-vl-2B on jetson agx orin; try to allocate 655521.35 MB memory · [bug-unconfirmed] · opened Dec 23, 2024 by chaoszzz123
#10947 · Misc. bug: #if doesn't have #else which leads to a broken binary · [bug-unconfirmed] · opened Dec 22, 2024 by yurivict
#10946 · Feature Request: Use IndexedDB for server web UI · [enhancement, good first issue, server/webui] · opened Dec 22, 2024 by ngxson · 4 tasks done
#10933 · Compile bug: Emulated Linux ARM64 CPU build fails · [bug, build] · opened Dec 21, 2024 by SamuelTallet
#10932 · examples : add configuration presets · [documentation, enhancement, examples, good first issue, help wanted] · opened Dec 21, 2024 by ggerganov · 1 of 6 tasks
#10929 · Misc. bug: All llama executables exit immediately without console output · [bug-unconfirmed] · opened Dec 21, 2024 by Ikaron
#10922 · Compile bug: iOS version able to build not not able to run · [bug-unconfirmed] · opened Dec 20, 2024 by Animaxx
#10921 · Eval bug: gte-Qwen2 produces non-homogenous embedding vectors · [bug-unconfirmed] · opened Dec 20, 2024 by bringfido-adams
#10920 · Misc. bug: llama-server throws "Unsupported param: tools" · [bug-unconfirmed] · opened Dec 20, 2024 by hsm207