Add supports_schema to extra-openai-models (#819)

Support for structured output was added recently, but custom
OpenAI-compatible models defined in the `extra-openai-models.yaml`
config file did not yet honor the `supports_schema` property.
adaitche 2025-03-22 00:59:34 +01:00 committed by GitHub
parent 6c9a8efb50
commit de87d37c28
2 changed files with 5 additions and 1 deletions


@@ -47,6 +47,8 @@ Add `completion: true` if the model is a completion model that uses a `/completi
If a model does not support streaming, add `can_stream: false` to disable the streaming option.
If a model supports structured output via JSON schemas, you can add `supports_schema: true` to support this feature.
Having configured the model like this, run `llm models` to check that it installed correctly. You can then run prompts against it like so:
```bash
@@ -69,4 +71,4 @@ Some providers such as [openrouter.ai](https://openrouter.ai/docs) may require t
headers:
HTTP-Referer: "https://llm.datasette.io/"
X-Title: LLM
```
```
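With this change, a config entry can opt into schema support directly. A sketch of what an `extra-openai-models.yaml` entry might look like (the model ID, name, and base URL here are illustrative, not from the commit):

```yaml
- model_id: my-custom-model
  model_name: my-custom-model
  api_base: "http://localhost:8000/v1"
  supports_schema: true
```

Note that `supports_schema: true` should be a YAML boolean, not a quoted string, to match the strict check in the registration code.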


@@ -135,6 +135,8 @@ def register_models(register):
kwargs = {}
if extra_model.get("can_stream") is False:
kwargs["can_stream"] = False
if extra_model.get("supports_schema") is True:
kwargs["supports_schema"] = True
if extra_model.get("completion"):
klass = Completion
else:
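The hunk above collects per-model keyword arguments using strict identity checks (`is False`, `is True`), so only genuine YAML booleans take effect; a string like `"true"` would be silently ignored. A minimal sketch of that pattern, using a hypothetical `build_kwargs` helper standing in for the inline logic:

```python
def build_kwargs(extra_model: dict) -> dict:
    """Collect model kwargs the way the diff above does (hypothetical helper)."""
    kwargs = {}
    # Strict identity checks: only real booleans count, so a quoted
    # "true" or "false" in the YAML file is silently ignored.
    if extra_model.get("can_stream") is False:
        kwargs["can_stream"] = False
    if extra_model.get("supports_schema") is True:
        kwargs["supports_schema"] = True
    return kwargs

print(build_kwargs({"can_stream": False, "supports_schema": True}))
# → {'can_stream': False, 'supports_schema': True}
```

This keeps the registration call clean: options absent from the config file simply never appear in `kwargs`, so the model class falls back to its defaults.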