Commit graph

16 commits

Author SHA1 Message Date
Simon Willison
fe7a1f0ee7 Tweaked header 2025-05-27 13:06:12 -07:00
Simon Willison
218bd10d6d Include dataclasses in plugin tool docs
Refs #1000

Refs https://github.com/simonw/llm/issues/997#issuecomment-2873497310
2025-05-26 10:05:10 -07:00
Simon Willison
35d460c5e3 How to add tool support to model plugins, closes #1000 2025-05-26 09:57:04 -07:00
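The tool-support docs referenced above cover exposing plain Python functions (and dataclasses) as tools a model can call. A self-contained sketch of the general pattern — deriving a tool description from a function's signature and docstring. The `tool_description` helper and its output shape are my own illustration, not llm's actual API:

```python
import inspect

def multiply(a: int, b: int) -> int:
    "Multiply two integers."
    return a * b

def tool_description(fn):
    """Derive a tool description from a plain function: the general shape
    tool-calling plugins expose to a model (field names here are my own)."""
    sig = inspect.signature(fn)
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn),
        # Map each parameter to the name of its type annotation
        "parameters": {name: p.annotation.__name__ for name, p in sig.parameters.items()},
    }

print(tool_description(multiply))
```

Introspecting the function means the tool's schema stays in sync with its implementation, which is why dataclasses and annotated functions fit this docs section naturally.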
Simon Willison
88b806ae1a Got multi-tool OpenAI chat working, in no-stream mode too
Refs #1017, #1019
2025-05-13 17:19:30 -07:00
Simon Willison
9a39af82cd Tip about lazy loading dependencies, closes #949
!stable-docs
2025-04-23 10:55:13 -07:00
Simon Willison
090e971bf4 Model feature list for advanced plugins documentation
!stable-docs
2025-03-19 21:43:17 -07:00
Simon Willison
8d32b71ef1 Renamed build_json_schema to schema_dsl 2025-02-27 10:22:29 -08:00
Simon Willison
62c90dd472 llm prompt --schema X option and model.prompt(..., schema=) parameter (#777)
Refs #776

* Implemented new llm prompt --schema and model.prompt(schema=)
* Log schema to responses.schema_id and schemas table
* Include schema in llm logs Markdown output
* Test for schema=pydantic_model
* Initial --schema CLI documentation
* Python docs for schema=
* Advanced plugin docs on schemas
2025-02-26 16:58:28 -08:00
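The `--schema`/`schema=` feature in #777 accepts a JSON schema (or a Pydantic model) constraining the model's output. A minimal stdlib sketch of the kind of schema involved — the `name`/`age` fields are my own illustration, not from the commit:

```python
import json

# A JSON schema of the kind `llm prompt --schema` accepts: it describes
# an object with typed, required properties for the model's output.
schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "age": {"type": "integer"},
    },
    "required": ["name", "age"],
}

# On the CLI the schema travels as a JSON string.
schema_json = json.dumps(schema)
print(schema_json)
```

Per the commit's bullet list, the same schema would also be logged to the `schemas` table and referenced via `responses.schema_id`.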
Simon Willison
6c6b100f3e KeyModel and AsyncKeyModel classes for models that take keys (#753)
* New KeyModel and AsyncKeyModel classes for models that take keys - closes #744
* llm prompt --key now uses new mechanism, including for async
* use new key mechanism in llm chat command
* Python API tests for llm.KeyModel and llm.AsyncKeyModel
* Python API docs for prompt(... key="")
* Mention await model.prompt() takes other parameters, reorg sections
* Better title for the model tutorial
* Docs on writing model plugins that take a key
2025-02-16 14:38:51 -08:00
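The key mechanism in #753 lets a key reach the model from several places — an explicit `--key`/`key=` argument, a stored key, or an environment variable. A stand-alone sketch of that precedence, with names of my own (this is not llm's actual implementation):

```python
import os

def resolve_key(explicit_key=None, stored_key=None, env_var="ILLUSTRATIVE_KEY"):
    """Pick an API key the way a KeyModel-style class might:
    explicit argument first, then a stored key, then the environment."""
    if explicit_key:
        return explicit_key
    if stored_key:
        return stored_key
    return os.environ.get(env_var)

# An explicit key always wins over a stored one:
print(resolve_key(explicit_key="sk-explicit", stored_key="sk-stored"))  # prints "sk-explicit"
```

Centralizing this resolution in one place is what lets `llm prompt --key` and `llm chat` share the same mechanism, as the commit's bullets describe.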
Simon Willison
e449fd4f46 Typo fix
!stable-docs
2025-01-22 22:17:07 -08:00
Simon Willison
cfb10f4afd Log input tokens, output tokens and token details (#642)
* Store input_tokens, output_tokens, token_details on Response, closes #610
* llm prompt -u/--usage option
* llm logs -u/--usage option
* Docs on tracking token usage in plugins
* OpenAI default plugin logs usage
2024-11-19 20:21:59 -08:00
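The token accounting in #642 stores three values per response. A small stdlib sketch of that record — the field names mirror the commit (`input_tokens`, `output_tokens`, `token_details`), but the dataclass itself and its `total` property are my own illustration:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Usage:
    # The three values #642 stores on a Response:
    input_tokens: Optional[int] = None
    output_tokens: Optional[int] = None
    token_details: dict = field(default_factory=dict)  # provider-specific extras

    @property
    def total(self) -> int:
        # Treat missing counts as zero so partial data still sums cleanly.
        return (self.input_tokens or 0) + (self.output_tokens or 0)

usage = Usage(input_tokens=12, output_tokens=34, token_details={"cached": 5})
print(usage.total)  # prints 46
```

The `-u/--usage` options mentioned in the bullets surface exactly this kind of record on the command line.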
Simon Willison
cf172cc70a response.text_or_raise() workaround
Closes https://github.com/simonw/llm/issues/632
2024-11-14 15:08:41 -08:00
Simon Willison
ba75c674cb llm.get_async_model(), llm.AsyncModel base class and OpenAI async models (#613)
- https://github.com/simonw/llm/issues/507#issuecomment-2458639308

* register_model is now async aware

Refs https://github.com/simonw/llm/issues/507#issuecomment-2458658134

* Refactor Chat and AsyncChat to use _Shared base class

Refs https://github.com/simonw/llm/issues/507#issuecomment-2458692338

* fixed function name

* Fix for infinite loop

* Applied Black

* Ran cog

* Applied Black

* Add Response.from_row() classmethod back again

It does not matter that this is a blocking call, since it is a classmethod

* Made mypy happy with llm/models.py

* mypy fixes for openai_models.py

I am unhappy with this, had to duplicate some code.

* First test for AsyncModel

* Still have not quite got this working

* Fix for not loading plugins during tests, refs #626

* audio/wav not audio/wave, refs #603

* Black and mypy and ruff all happy

* Refactor to avoid generics

* Removed obsolete response() method

* Support text = await async_mock_model.prompt("hello")

* Initial docs for llm.get_async_model() and await model.prompt()

Refs #507

* Initial async model plugin creation docs

* duration_ms ANY to pass test

* llm models --async option

Refs https://github.com/simonw/llm/pull/613#issuecomment-2474724406

* Removed obsolete TypeVars

* Expanded register_models() docs for async

* await model.prompt() now returns AsyncResponse

Refs https://github.com/simonw/llm/pull/613#issuecomment-2475157822

---------

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2024-11-13 17:51:00 -08:00
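One bullet in #613 adds support for `text = await async_mock_model.prompt("hello")`. A self-contained asyncio sketch of that calling pattern, using a mock model of my own rather than llm's real classes (per a later bullet, the real `await model.prompt()` returns an `AsyncResponse`, not a string):

```python
import asyncio

class AsyncMockModel:
    """A stand-in for an async model: prompt() is awaitable and,
    for this mock, resolves directly to the response text."""

    async def prompt(self, text: str) -> str:
        await asyncio.sleep(0)  # yield to the event loop, as real I/O would
        return f"echo: {text}"

async def main() -> str:
    async_mock_model = AsyncMockModel()
    # The pattern the PR's tests exercise: await the prompt, get text back.
    text = await async_mock_model.prompt("hello")
    return text

print(asyncio.run(main()))  # prints "echo: hello"
```

Having an awaitable mock like this is what makes async model plugins testable without a network call, which is presumably why the PR adds one.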
Simon Willison
7520671176 audio/wav not audio/wave, refs #603 2024-11-12 21:43:07 -08:00
Simon Willison
0cc4072bcd Support attachments without prompts, closes #611 2024-11-05 21:27:18 -08:00
Simon Willison
1126393ba1 Docs for writing models that accept attachments, refs #587 2024-10-28 15:41:34 -07:00