Error calling llama_kv_cache_clear in llama.py with 0.3.10 #2037

@davidmezzetti

Description

@davidmezzetti

Relevant part of stack trace below.

  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/llama_cpp/llama.py", line 1108, in embed
    decode_batch(s_batch)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/llama_cpp/llama.py", line 1044, in decode_batch
    llama_cpp.llama_kv_cache_clear(self._ctx.ctx)
AttributeError: module 'llama_cpp.llama_cpp' has no attribute 'llama_kv_cache_clear'. Did you mean: 'llama_kv_cache_p'?

The offending calls appear to be at lines 1044 and 1115 of llama.py.
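One way to work around a renamed binding symbol like this is to resolve the function dynamically instead of referencing it by a hard-coded name. The sketch below is an assumption-laden workaround, not a fix from the maintainers: the alternative names in the candidate list are guesses at what a given llama_cpp version might expose, and `resolve_kv_cache_clear` is a hypothetical helper name.

```python
import types

def resolve_kv_cache_clear(mod):
    """Return the first callable KV-cache-clear symbol found on `mod`.

    The candidate names below are assumptions; the binding may expose the
    clear function under a different name depending on the installed version.
    """
    candidates = (
        "llama_kv_cache_clear",  # name used by older llama_cpp releases
        "llama_kv_self_clear",   # possible renamed symbol (assumption)
        "llama_memory_clear",    # possible renamed symbol (assumption)
    )
    for name in candidates:
        fn = getattr(mod, name, None)
        if callable(fn):
            return fn
    raise AttributeError("no KV-cache clear symbol found on module")

# Usage sketch with a stand-in module object; with the real binding you
# would pass `llama_cpp.llama_cpp` and call the result on `self._ctx.ctx`.
fake_mod = types.SimpleNamespace(llama_kv_self_clear=lambda ctx: "cleared")
clear_fn = resolve_kv_cache_clear(fake_mod)
```

Resolving the symbol once at import time (rather than on every `decode_batch` call) would keep the hot path cheap while still tolerating the rename.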
