mirror of
https://github.com/Hopiu/llm.git
synced 2026-03-17 05:00:25 +00:00
Add supports_schema to extra-openai-models (#819)
Support for structured output was added recently, but custom OpenAI-compatible models had no way to enable it via a `supports_schema` property in the `extra-openai-models.yaml` config file.
This commit is contained in:
parent
6c9a8efb50
commit
de87d37c28
2 changed files with 5 additions and 1 deletion
@@ -47,6 +47,8 @@ Add `completion: true` if the model is a completion model that uses a `/completi
If a model does not support streaming, add `can_stream: false` to disable the streaming option.
If a model supports structured output via JSON schemas, you can add `supports_schema: true` to support this feature.
Having configured the model like this, run `llm models` to check that it installed correctly. You can then run prompts against it like so:
```bash
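For context, a hypothetical `extra-openai-models.yaml` entry using the new flag might look like this (the `model_id`, `model_name`, and `api_base` values here are made-up examples, not from the commit):

```yaml
- model_id: my-local-model
  model_name: my-local-model
  api_base: "http://localhost:8000/v1"
  supports_schema: true
```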
@@ -69,4 +71,4 @@ Some providers such as [openrouter.ai](https://openrouter.ai/docs) may require t
  headers:
    HTTP-Referer: "https://llm.datasette.io/"
    X-Title: LLM
```
```
@@ -135,6 +135,8 @@ def register_models(register):
    kwargs = {}
    if extra_model.get("can_stream") is False:
        kwargs["can_stream"] = False
    if extra_model.get("supports_schema") is True:
        kwargs["supports_schema"] = True
    if extra_model.get("completion"):
        klass = Completion
    else:
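The registration code above forwards only flags that are explicitly set in the config, so absent keys leave the model class defaults untouched. A minimal standalone sketch of that pattern (the `build_kwargs` helper name is ours, not part of the codebase):

```python
def build_kwargs(extra_model: dict) -> dict:
    """Collect keyword arguments from a parsed extra-openai-models.yaml entry.

    Only explicitly configured flags are forwarded; missing keys are
    simply omitted so the model class keeps its own defaults.
    """
    kwargs = {}
    # `is False` / `is True` mean a key that is absent (None) is ignored.
    if extra_model.get("can_stream") is False:
        kwargs["can_stream"] = False
    if extra_model.get("supports_schema") is True:
        kwargs["supports_schema"] = True
    return kwargs


print(build_kwargs({"supports_schema": True}))  # {'supports_schema': True}
print(build_kwargs({}))  # {}
```

Note the deliberate identity checks: a config entry that omits `supports_schema` entirely behaves differently from one that sets it to `false` only in intent, but both result in the key not being passed through.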