
Commit aa87f55

new model and function calling example
1 parent f5e4c6b commit aa87f55

File tree

1 file changed

+57
-0
lines changed


README.md

Lines changed: 57 additions & 0 deletions
@@ -531,6 +531,63 @@ llm.create_chat_completion(
)
```

Another model that supports function calling with its default chat template is [Mistral-7B-Instruct-v0.3-SOTA-GGUF](https://huggingface.co/CISCai/Mistral-7B-Instruct-v0.3-SOTA-GGUF), with full `tool_calls` support:

```python
from llama_cpp import Llama

llm = Llama(
    model_path="path/to/mistral/mistral-v3-model.gguf"
)
llm.create_chat_completion(
    messages = [
        {
            "role": "user",
            "content": "What's the weather like in Oslo?"
        },
        { # The tool_calls below come from the response to the message above, with tool_choice specified
            "role": "assistant",
            "content": None,
            "tool_calls": [
                {
                    "id": "call__0_get_current_weather_cmpl-...",
                    "type": "function",
                    "function": {
                        "name": "get_current_weather",
                        "arguments": '{ "location": "Oslo, NO", "unit": "celsius" }'
                    }
                }
            ]
        },
        { # The tool_call_id matches the id in tool_calls; content is the result of the function call you made, as a string
            "role": "tool",
            "content": "20",
            "tool_call_id": "call__0_get_current_weather_cmpl-..."
        }
    ],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA"
                    },
                    "unit": {
                        "type": "string",
                        "enum": [ "celsius", "fahrenheit" ]
                    }
                },
                "required": [ "location" ]
            }
        }
    }]
)
```
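To close the loop in the example above, the hard-coded `tool` message would normally be produced by executing the call the model requested. A minimal sketch, assuming a response shaped like `create_chat_completion` output; the `run_tool_call` helper and the stub `get_current_weather` are hypothetical, not part of the library:

```python
import json

# Hypothetical response, shaped like create_chat_completion output with tool_calls
response = {
    "choices": [{
        "message": {
            "role": "assistant",
            "content": None,
            "tool_calls": [{
                "id": "call__0_get_current_weather_cmpl-...",
                "type": "function",
                "function": {
                    "name": "get_current_weather",
                    "arguments": '{ "location": "Oslo, NO", "unit": "celsius" }',
                },
            }],
        }
    }]
}

def get_current_weather(location, unit="celsius"):
    # Stand-in implementation for illustration only
    return 20

def run_tool_call(tool_call, available_functions):
    """Dispatch one tool call to a local function and build the 'tool' message."""
    name = tool_call["function"]["name"]
    args = json.loads(tool_call["function"]["arguments"])
    result = available_functions[name](**args)
    return {
        "role": "tool",
        "content": str(result),  # tool message content must be a string
        "tool_call_id": tool_call["id"],
    }

tool_call = response["choices"][0]["message"]["tool_calls"][0]
tool_message = run_tool_call(tool_call, {"get_current_weather": get_current_weather})
```

The resulting `tool_message` is what gets appended to `messages` before calling `create_chat_completion` again so the model can phrase the final answer.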
#### Built-in function calling

The high-level API supports OpenAI-compatible function and tool calling. This is possible through the chat format of the `functionary` pre-trained models or through the generic `chatml-function-calling` chat format.
