Commit 626003c

Merge branch 'main' of github.com:abetlen/llama_cpp_python into main
2 parents: f5c2f99 + 1a13d76

File tree: 3 files changed (+12 −6 lines)

README.md

Lines changed: 7 additions & 1 deletion
@@ -26,6 +26,12 @@ pip install llama-cpp-python
 The above command will attempt to install the package and build `llama.cpp` from source.
 This is the recommended installation method as it ensures that `llama.cpp` is built with the available optimizations for your system.
 
+Note: If you are using an Apple Silicon (M1) Mac, make sure you have installed a version of Python that supports the arm64 architecture. For example:
+```
+wget https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-MacOSX-arm64.sh
+bash Miniforge3-MacOSX-arm64.sh
+```
+Otherwise, the installation will build the x86 version of `llama.cpp`, which will be 10x slower on an Apple Silicon (M1) Mac.
 
 ### Installation with OpenBLAS / cuBLAS / CLBlast
 
@@ -120,7 +126,7 @@ Below is a short example demonstrating how to use the low-level API to tokenize
 >>> ctx = llama_cpp.llama_init_from_file(b"./models/7b/ggml-model.bin", params)
 >>> max_tokens = params.n_ctx
 # use ctypes arrays for array params
->>> tokens = (llama_cppp.llama_token * int(max_tokens))()
+>>> tokens = (llama_cpp.llama_token * int(max_tokens))()
 >>> n_tokens = llama_cpp.llama_tokenize(ctx, b"Q: Name the planets in the solar system? A: ", tokens, max_tokens, add_bos=llama_cpp.c_bool(True))
 >>> llama_cpp.llama_free(ctx)
 ```
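As a side note on the corrected line: once `llama_cpp.llama_tokenize` has filled the ctypes array, the resulting token ids can be read back with ordinary indexing. A minimal sketch, not part of the commit, assuming the session above has run and `n_tokens` came back non-negative:

```python
# Copy the first n_tokens entries of the ctypes array into a plain list.
# A negative n_tokens would mean the buffer was too small for the prompt.
token_ids = [tokens[i] for i in range(n_tokens)]
print(token_ids)
```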

poetry.lock

Lines changed: 4 additions & 4 deletions
Some generated files are not rendered by default.

pyproject.toml

Lines changed: 1 addition & 1 deletion
@@ -22,7 +22,7 @@ black = "^23.3.0"
 twine = "^4.0.2"
 mkdocs = "^1.4.3"
 mkdocstrings = {extras = ["python"], version = "^0.21.2"}
-mkdocs-material = "^9.1.11"
+mkdocs-material = "^9.1.12"
 pytest = "^7.3.1"
 httpx = "^0.24.0"
