Tags: JamePeng/llama-cpp-python

v0.3.14-cu126-AVX2-win-20250724

Update Submodule vendor/llama.cpp f0d4d17..a86f52b

v0.3.14-cu126-AVX2-linux-20250723

Update Submodule vendor/llama.cpp f0d4d17..a86f52b

v0.3.14-cu124-AVX2-win-20250724

Update Submodule vendor/llama.cpp f0d4d17..a86f52b

v0.3.14-cu124-AVX2-linux-20250723

Update Submodule vendor/llama.cpp f0d4d17..a86f52b

v0.3.13-cu126-AVX2-win-20250717

fix memory_seq_rm crash bug

v0.3.13-cu126-AVX2-linux-20250717

fix memory_seq_rm crash bug

v0.3.13-cu124-AVX2-win-20250717

fix memory_seq_rm crash bug

v0.3.13-cu124-AVX2-linux-20250717

fix memory_seq_rm crash bug

v0.3.12-cu126-AVX2-win-20250714

try to use the logit_bias instead of logit_processors in test_llama

v0.3.12-cu126-AVX2-linux-20250714

try to use the logit_bias instead of logit_processors in test_llama