Commit graph

85 commits

Author SHA1 Message Date
Simon Willison
fe7a1f0ee7 Tweaked header 2025-05-27 13:06:12 -07:00
Simon Willison
e23e13e6c7 Test for async toolbox, docs for toolboxes in general
Closes #1090, refs #997
2025-05-26 10:23:03 -07:00
Simon Willison
218bd10d6d Include dataclasses in plugin tool docs
Refs #1000

Refs https://github.com/simonw/llm/issues/997#issuecomment-2873497310
2025-05-26 10:05:10 -07:00
Simon Willison
35d460c5e3 How to add tool support to model plugins, closes #1000 2025-05-26 09:57:04 -07:00
Simon Willison
7eb8acb767 llm.get_key() is now a documented utility, closes #1094
Refs #1093, https://github.com/simonw/llm-tools-datasette/issues/2
2025-05-26 09:39:40 -07:00
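The commit above documents `llm.get_key()`. The pattern it exposes — an explicit key wins, then a stored alias, then an environment variable — can be sketched as a standalone function. Everything below (`stored_keys`, the signature) is an illustrative assumption, not the library's actual implementation.

```python
import os

def get_key(explicit_key=None, alias=None, env_var=None, stored_keys=None):
    """Toy model of key-resolution precedence: explicit value first,
    then a key stored under an alias, then an environment variable.
    The real llm.get_key() signature and storage (keys.json) differ."""
    if explicit_key:
        return explicit_key
    if stored_keys and alias in stored_keys:
        return stored_keys[alias]
    if env_var:
        return os.environ.get(env_var)
    return None

# A stored alias is used when no explicit key is given:
print(get_key(alias="datasette", stored_keys={"datasette": "dtok"}))
```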
Simon Willison
bb336d33a0
Toolbox class for class-based tool collections (#1086)
* Toolbox class for class-based tool collections

Refs #1059, #1058, #1057
2025-05-25 22:42:52 -07:00
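A class-based tool collection of the kind this commit describes can be modeled with simple introspection: public methods of a class become named tools. This is a hedged sketch of the idea, not the actual `llm.Toolbox` implementation; the `tools()` helper and `Memory` class are made up for illustration.

```python
import inspect

class Toolbox:
    """Toy base class: every public method is exposed as a named tool."""

    def tools(self):
        # Collect bound methods, skipping private names and this helper
        return {
            name: method
            for name, method in inspect.getmembers(self, inspect.ismethod)
            if not name.startswith("_") and name != "tools"
        }

class Memory(Toolbox):
    def __init__(self):
        self._store = {}

    def set(self, key: str, value: str):
        "Store a value under a key."
        self._store[key] = value

    def get(self, key: str) -> str:
        "Retrieve a previously stored value."
        return self._store.get(key, "")

mem = Memory()
print(sorted(mem.tools()))  # ['get', 'set']
```

Grouping tools on a class this way gives them shared state (`self._store`), which is the main thing a bag of standalone functions cannot do.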
Benjamin Kirkbride
ccc52265e9
fix typo in tutorial-model-plugin.md (#1043) 2025-05-23 23:33:40 -07:00
Simon Willison
88b806ae1a Got multi-tool OpenAI chat working, in no-stream mode too
Refs #1017, #1019
2025-05-13 17:19:30 -07:00
Simon Willison
4abd6e0faf Made a start on tools.md docs, refs #997
Also documented register_tools() plugin hook, refs #991
2025-05-13 17:19:30 -07:00
Simon Willison
cb1ea231dc
Example docstrings for fragment loaders
!stable-docs
2025-05-08 19:05:50 -07:00
Simon Willison
f07655715e
llm-video-frames
!stable-docs

Refs https://github.com/simonw/llm-video-frames/issues/2
2025-05-04 20:47:14 -07:00
Samuel Dion-Girardeau
00e5ee6b5a
Add llm-fragments-pypi to plugin directory (#929)
As the name suggests, this plugin gets PyPI packages as fragments.
Namely the project metadata and description.

See README for more usage examples:
https://github.com/samueldg/llm-fragments-pypi?tab=readme-ov-file#usage
2025-05-04 15:34:52 -07:00
Simon Willison
e02863c1ca
Fragment plugins can now optionally return attachments (#974)
Closes #972
2025-05-04 14:50:27 -07:00
Simon Willison
9a39af82cd Tip about lazy loading dependencies, closes #949
!stable-docs
2025-04-23 10:55:13 -07:00
Simon Willison
bf622a27cc
Fabric plugin now uses fabric:
https://github.com/simonw/llm-templates-fabric/issues/2

!stable-docs
2025-04-07 22:23:30 -07:00
Simon Willison
fd6b2d786a
Fragments and template loaders
!stable-docs

Refs #809, #886
2025-04-07 22:12:26 -07:00
Simon Willison
a571a4e948
register_fragment_loaders() hook (#886)
* Docs and shape of register_fragment_loaders hook, refs #863
* Update docs for fragment loaders returning a list of FragmentString
* Support multiple fragments with same content, closes #888
* Call the pm.hook.register_fragment_loaders hook
* Test for register_fragment_loaders hook
* Rename FragmentString to Fragment

Closes #863
2025-04-06 17:03:34 -07:00
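The `register_fragment_loaders()` hook above lets a plugin claim a prefix such as `pypi:` or `fabric:` and resolve the rest of the string itself. The dispatch it enables can be sketched with a plain registry; the names here are illustrative stand-ins, not llm's internals.

```python
# Toy registry mirroring the prefix:argument dispatch that
# register_fragment_loaders() enables (illustrative, not llm's code).
LOADERS = {}

def register_fragment_loader(prefix, loader):
    LOADERS[prefix] = loader

def resolve_fragment(spec):
    # "pypi:httpx" -> prefix "pypi", argument "httpx"
    prefix, _, argument = spec.partition(":")
    if prefix not in LOADERS:
        raise KeyError(f"No fragment loader registered for {prefix!r}")
    return LOADERS[prefix](argument)

register_fragment_loader("upper", lambda arg: arg.upper())
print(resolve_fragment("upper:hello"))  # HELLO
```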
Simon Willison
6c9a8efb50
register_template_loaders plugin hook, closes #809
* Moved templates CLI commands next to each other
* llm templates loaders command
* Template loader tests
* Documentation for template loaders
2025-03-21 16:46:44 -07:00
Simon Willison
090e971bf4 Model feature list for advanced plugins documentation
!stable-docs
2025-03-19 21:43:17 -07:00
Simon Willison
8d32b71ef1 Renamed build_json_schema to schema_dsl 2025-02-27 10:22:29 -08:00
Simon Willison
62c90dd472
llm prompt --schema X option and model.prompt(..., schema=) parameter (#777)
Refs #776

* Implemented new llm prompt --schema and model.prompt(schema=)
* Log schema to responses.schema_id and schemas table
* Include schema in llm logs Markdown output
* Test for schema=pydantic_model
* Initial --schema CLI documentation
* Python docs for schema=
* Advanced plugin docs on schemas
2025-02-26 16:58:28 -08:00
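The `--schema` work above (and the `schema_dsl` rename a commit earlier) centers on turning a concise field list into a JSON schema. A toy parser shows the shape of that transformation; the real `schema_dsl` supports more (descriptions, newlines, multi-object mode), so treat this as a sketch under assumed syntax.

```python
def schema_dsl(spec):
    """Toy concise-schema parser: "name, age int" -> JSON schema dict.
    Illustrative only; llm's actual DSL is richer."""
    types = {"int": "integer", "float": "number", "bool": "boolean", "str": "string"}
    properties = {}
    for field in spec.split(","):
        parts = field.strip().split()
        name = parts[0]
        # Default to string when no type keyword is given
        type_ = types.get(parts[1], "string") if len(parts) > 1 else "string"
        properties[name] = {"type": type_}
    return {
        "type": "object",
        "properties": properties,
        "required": list(properties),
    }

print(schema_dsl("name, age int"))
```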
Simon Willison
e46cb7e761 Update docs to no longer mention PaLM
!stable-docs
2025-02-16 22:37:00 -08:00
Simon Willison
64f9f2ef52 Promote llm-mlx in changelog and plugin directory
!stable-docs
2025-02-16 22:29:32 -08:00
Simon Willison
6c6b100f3e
KeyModel and AsyncKeyModel classes for models that take keys (#753)
* New KeyModel and AsyncKeyModel classes for models that take keys - closes #744
* llm prompt --key now uses new mechanism, including for async
* use new key mechanism in llm chat command
* Python API tests for llm.KeyModel and llm.AsyncKeyModel
* Python API docs for prompt(... key="")
* Mention await model.prompt() takes other parameters, reorg sections
* Better title for the model tutorial
* Docs on writing model plugins that take a key
2025-02-16 14:38:51 -08:00
Simon Willison
21df241443 llm-claude-3 is now called llm-anthropic
Refs https://github.com/simonw/llm-claude-3/issues/31

!stable-docs
2025-02-01 22:08:19 -08:00
Simon Willison
e449fd4f46
Typo fix
!stable-docs
2025-01-22 22:17:07 -08:00
Ryan Patterson
59983740e6
Update directory.md (#666) 2025-01-18 14:52:51 -08:00
abrasumente
e1388b27fe
Add llm-deepseek plugin (#517) 2025-01-11 18:56:34 -08:00
Steven Weaver
2b6b00641c
Update tutorial-model-plugin.md (#685)
pydantic.org -> pydantic.dev
2025-01-11 12:05:05 -08:00
watany
1c61b5addd
doc(plugin): adding AmazonBedrock (#698) 2025-01-10 16:42:39 -08:00
Arjan Mossel
4f4f9bc07d
Add llm-venice to plugin directory (#699) 2025-01-10 16:41:21 -08:00
Simon Willison
cfb10f4afd
Log input tokens, output tokens and token details (#642)
* Store input_tokens, output_tokens, token_details on Response, closes #610
* llm prompt -u/--usage option
* llm logs -u/--usage option
* Docs on tracking token usage in plugins
* OpenAI default plugin logs usage
2024-11-19 20:21:59 -08:00
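Tracking token usage in a plugin, as this commit describes, boils down to recording the provider's counts on the response object once they are known. A minimal model of that shape (the class and method names here are assumptions for illustration, not llm's exact API):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Usage:
    input_tokens: Optional[int] = None
    output_tokens: Optional[int] = None
    details: dict = field(default_factory=dict)

@dataclass
class Response:
    text: str = ""
    usage: Usage = field(default_factory=Usage)

    def set_usage(self, input=None, output=None, details=None):
        # A plugin calls this once the provider reports its counts
        self.usage = Usage(input, output, details or {})

r = Response(text="hi")
r.set_usage(input=12, output=3, details={"cached": 4})
print(r.usage.input_tokens, r.usage.output_tokens)
```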
Simon Willison
cf172cc70a response.text_or_raise() workaround
Closes https://github.com/simonw/llm/issues/632
2024-11-14 15:08:41 -08:00
Simon Willison
ba75c674cb
llm.get_async_model(), llm.AsyncModel base class and OpenAI async models (#613)
- https://github.com/simonw/llm/issues/507#issuecomment-2458639308

* register_model is now async aware

Refs https://github.com/simonw/llm/issues/507#issuecomment-2458658134

* Refactor Chat and AsyncChat to use _Shared base class

Refs https://github.com/simonw/llm/issues/507#issuecomment-2458692338

* fixed function name

* Fix for infinite loop

* Applied Black

* Ran cog

* Applied Black

* Add Response.from_row() classmethod back again

It does not matter that this is a blocking call, since it is a classmethod

* Made mypy happy with llm/models.py

* mypy fixes for openai_models.py

I am unhappy with this, had to duplicate some code.

* First test for AsyncModel

* Still have not quite got this working

* Fix for not loading plugins during tests, refs #626

* audio/wav not audio/wave, refs #603

* Black and mypy and ruff all happy

* Refactor to avoid generics

* Removed obsolete response() method

* Support text = await async_mock_model.prompt("hello")

* Initial docs for llm.get_async_model() and await model.prompt()

Refs #507

* Initial async model plugin creation docs

* duration_ms ANY to pass test

* llm models --async option

Refs https://github.com/simonw/llm/pull/613#issuecomment-2474724406

* Removed obsolete TypeVars

* Expanded register_models() docs for async

* await model.prompt() now returns AsyncResponse

Refs https://github.com/simonw/llm/pull/613#issuecomment-2475157822

---------

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2024-11-13 17:51:00 -08:00
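The async work in this PR culminates in the `await model.prompt()` shape, where the awaited call returns a response whose text is itself awaitable. A self-contained toy version shows that calling convention; `AsyncDemoModel` and `AsyncResponse` are invented names, not llm's classes.

```python
import asyncio

class AsyncResponse:
    def __init__(self, text):
        self._text = text

    async def text(self):
        return self._text

class AsyncDemoModel:
    """Toy async model mirroring the await model.prompt() shape
    described above (names assumed, not llm's implementation)."""

    async def prompt(self, text):
        await asyncio.sleep(0)  # stand-in for a network call
        return AsyncResponse(f"echo: {text}")

async def main():
    model = AsyncDemoModel()
    response = await model.prompt("hello")
    print(await response.text())

asyncio.run(main())
```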
Hiepler
5a984d0c87
docs: add llm-grok (#629)
Adds `llm-grok` for the xAI API (https://github.com/Hiepler/llm-grok) to the plugin directory.

!stable-docs
2024-11-13 17:21:04 -08:00
Simon Willison
7520671176 audio/wav not audio/wave, refs #603 2024-11-12 21:43:07 -08:00
Simon Willison
0cc4072bcd Support attachments without prompts, closes #611 2024-11-05 21:27:18 -08:00
Simon Willison
fe1e09706f
llm-lambda-labs
!stable-docs
2024-11-04 10:26:02 -08:00
Simon Willison
1126393ba1 Docs for writing models that accept attachments, refs #587 2024-10-28 15:41:34 -07:00
Simon Willison
7e6031e382
llm-gguf, llm-jq
!stable-docs
2024-10-26 22:44:06 -07:00
Kian-Meng Ang
50520c7c1c
Fix typos (#567)
Found via `codespell -H -L wit,thre`

!stable-docs
2024-09-08 08:44:43 -07:00
Simon Willison
ab1cc4fd1f Release 0.14
Refs #404, #431, #470, #490, #491
2024-05-13 13:26:48 -07:00
Fabian Labat
6cdc29c8d6
Update directory.md (#486)
* Update directory.md

Added support for Bedrock Llama 3
2024-05-13 13:01:33 -07:00
Simon Willison
3cc588f247 List llm-llamafile in plugins directory, closes #470 2024-05-13 12:55:22 -07:00
Simon Willison
04915e95f8
llm-groq
!stable-docs
2024-04-21 20:33:23 -07:00
Simon Willison
2a9b6113f5
llm-perplexity
Refs https://github.com/hex/llm-perplexity/issues/2

!stable-docs
2024-04-21 16:18:37 -07:00
Simon Willison
99a2836638
llm-fireworks
Refs https://github.com/simonw/llm-fireworks/issues/1

!stable-docs
2024-04-18 17:20:09 -07:00
Simon Willison
9ad9ac68dc
llm-reka in plugin directory
!stable-docs
2024-04-17 19:38:41 -07:00
Simon Willison
12e027d3e4
llm-command-r
!stable-docs

Refs https://github.com/simonw/llm-command-r/issues/1
2024-04-04 07:41:03 -07:00
Simon Willison
008efae86a
llm-cmd
!stable-docs

Refs https://github.com/simonw/llm-cmd/issues/1
2024-03-26 08:58:48 -07:00