Insights: abetlen/llama-cpp-python
Overview
- 0 Active issues
- 1 Merged pull request
- 1 Open pull request
- 0 Closed issues
- 0 New issues
7 Releases published by 1 person
- v0.3.13-metal, published Jul 15, 2025
- v0.3.13-cu124, published Jul 15, 2025
- v0.3.13-cu121, published Jul 15, 2025
- v0.3.13-cu122, published Jul 15, 2025
- v0.3.13-cu123, published Jul 15, 2025
- v0.3.14-metal, published Jul 18, 2025
- v0.3.14-cu124, published Jul 18, 2025
1 Pull request merged by 1 person
- Better chat format for Qwen2.5-VL (#2040), merged Jul 15, 2025
1 Pull request opened by 1 person
- Actually create a random seed when using seed = -1 on load (#2042), opened Jul 16, 2025
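The open pull request above concerns the common sentinel convention where seed = -1 means "pick a random seed at load time". A minimal sketch of that idea follows; `resolve_seed` is a hypothetical helper, not llama-cpp-python's actual code, and the `LLAMA_DEFAULT_SEED` value of 0xFFFFFFFF is assumed from llama.cpp's header.

```python
import random

# Assumed from llama.cpp's llama.h; treated as a "random seed" sentinel.
LLAMA_DEFAULT_SEED = 0xFFFFFFFF

def resolve_seed(seed: int) -> int:
    """Map sentinel seeds (-1 or LLAMA_DEFAULT_SEED) to a fresh random
    32-bit seed; pass any other seed through unchanged."""
    if seed == -1 or seed == LLAMA_DEFAULT_SEED:
        # Draw an actual random value instead of forwarding the sentinel,
        # which would otherwise yield a deterministic reply chain.
        return random.getrandbits(32)
    return seed
```

Without this mapping, forwarding the sentinel unchanged makes every run deterministic, which is also the behavior described in issue #1809 below.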
9 Unresolved conversations
Sometimes conversations happen on old items that aren’t yet closed. Here is a list of all the Issues and Pull Requests with unresolved conversations.
- Segmentation fault (core dumped) appearing randomly (#2005), commented on Jul 12, 2025 • 0 new comments
- Setting seed to -1 (random) or using default LLAMA_DEFAULT_SEED generates a deterministic reply chain (#1809), commented on Jul 12, 2025 • 0 new comments
- Can't install llama-cpp-python with HIPBLAS/ROCm on Windows (#1489), commented on Jul 15, 2025 • 0 new comments
- Generate answer from embedding vectors (#1897), commented on Jul 16, 2025 • 0 new comments
- Windows11:ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (llama-cpp-python) (#2035), commented on Jul 17, 2025 • 0 new comments
- FileNotFoundError: Shared library with base name 'llama' not found (#568), commented on Jul 18, 2025 • 0 new comments
- Inferencing Flan-T5 - GGML_ASSERT error (#2038), commented on Jul 18, 2025 • 0 new comments
- Add support for XTC and DRY samplers (#1843), commented on Jul 17, 2025 • 0 new comments
- ARM Runners support CUDA SBSA (#2039), commented on Jul 18, 2025 • 0 new comments