)
```

Another model that supports function calling with its default chat template is [Mistral-7B-Instruct-v0.3-SOTA-GGUF](https://huggingface.co/CISCai/Mistral-7B-Instruct-v0.3-SOTA-GGUF), with full `tool_calls` support:

```python
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA"
                    },
                    "unit": {
                        "type": "string",
                        "enum": [ "celsius", "fahrenheit" ]
                    }
                },
                "required": [ "location" ]
            }
        }
    }]
)
```
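The excerpt above begins partway through the tool definition. Filled out, a complete `tools` entry in the OpenAI function-calling schema looks roughly like this sketch; the function name `get_current_weather` is an illustrative placeholder chosen to match the description, not taken from the original:

```python
import json

# Sketch of a complete OpenAI-style tools list. The function name is a
# hypothetical placeholder; the parameter schema matches the excerpt above.
tools = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",  # illustrative name, not from the original
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA"
                },
                "unit": {
                    "type": "string",
                    "enum": ["celsius", "fahrenheit"]
                }
            },
            "required": ["location"]
        }
    }
}]

# The definition is plain JSON, so it can be serialized and inspected directly.
schema = tools[0]["function"]["parameters"]
print(json.dumps(schema, indent=2))
```

The `parameters` value is an ordinary JSON Schema object, which is what lets the model constrain its generated arguments to the declared types and enums.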
#### Built-in function calling
The high-level API supports OpenAI-compatible function and tool calling. This is possible either through the `functionary` pre-trained models' chat format or through the generic `chatml-function-calling` chat format.
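When a model decides to call a tool, the completion comes back in the OpenAI-compatible response shape, with the arguments encoded as a JSON string. A minimal sketch of unpacking such a response, assuming the standard `tool_calls` layout (the payload values here are mocked for illustration, not produced by a real model run):

```python
import json

# Mocked response in the OpenAI-compatible shape; the values are
# illustrative, not actual model output.
response = {
    "choices": [{
        "message": {
            "role": "assistant",
            "content": None,
            "tool_calls": [{
                "id": "call_0",
                "type": "function",
                "function": {
                    "name": "get_current_weather",
                    "arguments": '{"location": "San Francisco, CA", "unit": "celsius"}'
                }
            }]
        }
    }]
}

# "arguments" arrives as a JSON string and must be parsed before
# dispatching to the actual function implementation.
call = response["choices"][0]["message"]["tool_calls"][0]["function"]
args = json.loads(call["arguments"])
print(call["name"], args["location"])
```

Note that a single assistant message may contain several entries in `tool_calls`, so dispatch code should iterate over the list rather than assume one call.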