Simon Willison
b8b030fc58
Release 0.22
Refs #737, #742, #744, #745, #748, #752
2025-02-16 20:34:48 -08:00
Simon Willison
c053e214ec
include token usage information, not get - refs #756
2025-02-16 15:18:04 -08:00
Simon Willison
53d6ecdd59
Documentation for logs --short, refs #756
2025-02-16 15:16:51 -08:00
Simon Willison
6c6b100f3e
KeyModel and AsyncKeyModel classes for models that take keys (#753)
* New KeyModel and AsyncKeyModel classes for models that take keys - closes #744
* llm prompt --key now uses new mechanism, including for async
* use new key mechanism in llm chat command
* Python API tests for llm.KeyModel and llm.AsyncKeyModel
* Python API docs for prompt(... key="")
* Mention await model.prompt() takes other parameters, reorg sections
* Better title for the model tutorial
* Docs on writing model plugins that take a key
2025-02-16 14:38:51 -08:00
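The KeyModel commit above describes models that resolve an API key at prompt time. A minimal sketch of that resolution pattern, using entirely hypothetical names (DemoKeyModel, resolve_key, DEMO_API_KEY) rather than llm's actual plugin API, assuming the usual precedence of explicit key, then environment variable, then stored keys:

```python
import os

# Illustrative sketch only: hypothetical class, not the llm plugin API.
# Assumed precedence: explicit key argument > environment variable > stored keys.
class DemoKeyModel:
    needs_key = "demo"            # name used to look up stored keys
    key_env_var = "DEMO_API_KEY"  # environment-variable fallback

    def __init__(self, stored_keys=None):
        self.stored_keys = stored_keys or {}

    def resolve_key(self, key=None):
        if key:  # explicit key passed as prompt(..., key=...)
            return key
        env_value = os.environ.get(self.key_env_var)
        if env_value:
            return env_value
        stored = self.stored_keys.get(self.needs_key)
        if stored:
            return stored
        raise LookupError(f"No key found for {self.needs_key!r}")

    def prompt(self, text, key=None):
        resolved = self.resolve_key(key)
        # A real model would call an API here; this just echoes.
        return f"[called with key {resolved[:4]}...] {text}"
```

With this shape, both the CLI `--key` option and the Python `key=` parameter can funnel through the same resolution step.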
Simon Willison
8611d9203c
Updated docs with new chatgpt-4o-latest model, refs #752
2025-02-15 17:46:07 -08:00
Simon Willison
747d92ea4f
Docs for multiple -q option, closes #748
2025-02-13 16:01:02 -08:00
Simon Willison
31e900e9e1
llm aliases set -q option, refs #749
2025-02-13 15:49:47 -08:00
Simon Willison
20c18a716d
-q multiple option for llm models and llm embed-models
Refs #748
2025-02-13 15:35:18 -08:00
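The multiple `-q` option above filters model listings by several query terms at once. A sketch of the assumed semantics (every term must match, case-insensitively, as a substring of the model name; the real implementation may differ):

```python
# Assumed -q semantics: a model matches only if every query term is a
# case-insensitive substring of its name.
def filter_models(names, queries):
    lowered = [q.lower() for q in queries]
    return [name for name in names if all(q in name.lower() for q in lowered)]
```

So something like `llm models -q 4o -q mini` would keep only names containing both terms.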
Simon Willison
9a1374b447
llm embed-multi --prepend option (#746)
* llm embed-multi --prepend option
Closes #745
2025-02-12 15:19:18 -08:00
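A sketch of what `--prepend` presumably does before embedding (hypothetical helper name; some embedding models expect a fixed prefix such as "passage: " or "search_document: " on each input):

```python
# Assumed --prepend behavior: add a fixed prefix to each item's text
# before it is sent to the embedding model.
def prepare_for_embedding(items, prepend=""):
    return [(item_id, prepend + text) for item_id, text in items]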
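A sketch of what `--prepend` presumably does before embedding (hypothetical helper name; some embedding models expect a fixed prefix such as "passage: " or "search_document: " on each input):

```python
# Assumed --prepend behavior: add a fixed prefix to each item's text
# before it is sent to the embedding model.
def prepare_for_embedding(items, prepend=""):
    return [(item_id, prepend + text) for item_id, text in items]
```

The stored IDs stay unchanged; only the text handed to the model gets the prefix.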
Simon Willison
f67c21522b
Docs for response.json() and response.usage()
!stable-docs
2025-02-11 08:35:27 -08:00
Simon Willison
41d64a8f12
llm logs --prompts option (#737)
Closes #736
2025-02-02 12:03:01 -08:00
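A `--prompts`-style listing typically shows each logged prompt shortened to a fixed width. A sketch under that assumption (hypothetical helper; the cutoff length and ellipsis style are guesses, not llm's actual behavior):

```python
# Assumed truncation: keep short prompts whole, cut long ones and
# end them with "..." so the total stays within the limit.
def truncate_prompt(prompt, limit=100):
    if len(prompt) <= limit:
        return prompt
    return prompt[: limit - 3] + "..."
```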
Simon Willison
21df241443
llm-claude-3 is now called llm-anthropic
Refs https://github.com/simonw/llm-claude-3/issues/31
!stable-docs
2025-02-01 22:08:19 -08:00
Simon Willison
f8dcc67455
Release 0.21
Refs #717, #728
2025-01-31 12:35:10 -08:00
Simon Willison
eb0e1e761b
o3-mini and reasoning_effort option, refs #728
2025-01-31 12:14:02 -08:00
Simon Willison
656d8fa3c4
--xl/--extract-last flag for prompt and log list commands (#718)
Closes #717
2025-01-24 10:52:46 -08:00
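The extract flags pull fenced code blocks out of a response: `-x/--extract` the first block, `--xl/--extract-last` the last. A sketch of that extraction (assumed regex and assumed fallback of returning the text unchanged when no fence is found; llm's real implementation may differ):

```python
import re

# Matches ``` fences, tolerating a language tag after the opening fence.
FENCE = re.compile(r"```[^\n]*\n(.*?)```", re.DOTALL)

def extract(text, last=False):
    blocks = FENCE.findall(text)
    if not blocks:
        return text  # assumed fallback: no fence, return text unchanged
    return blocks[-1] if last else blocks[0]
```

The non-greedy `(.*?)` stops at the nearest closing fence, so multiple blocks in one response stay separate.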
Simon Willison
e449fd4f46
Typo fix
!stable-docs
2025-01-22 22:17:07 -08:00
Simon Willison
3e88628602
uv tool upgrade llm, refs #702
!stable-docs
2025-01-22 21:08:16 -08:00
Simon Willison
bf10f63d3d
Mention gpt-4o-mini-audio-preview too #677
!stable-docs
2025-01-22 21:06:12 -08:00
Simon Willison
eb996baeab
Documentation for model.attachment_types, closes #705
2025-01-22 20:46:28 -08:00
Simon Willison
2b9a1bbc50
Fixed broken link
2025-01-22 20:39:01 -08:00
Simon Willison
dc127d2a87
Release 0.20
Refs #654, #676, #677, #681, #688, #690, #700, #702, #709
2025-01-22 20:36:10 -08:00
Simon Willison
57d3baac42
Update embedding model names in docs, refs #654
Also ran Black.
2025-01-22 20:35:17 -08:00
Ryan Patterson
59983740e6
Update directory.md (#666)
2025-01-18 14:52:51 -08:00
abrasumente
e1388b27fe
Add llm-deepseek plugin (#517)
2025-01-11 18:56:34 -08:00
Steven Weaver
2b6b00641c
Update tutorial-model-plugin.md (#685)
pydantic.org -> pydantic.dev
2025-01-11 12:05:05 -08:00
Amjith Ramanujam
e3c104b136
Show the default model when listing all available models. (#688)
2025-01-11 12:04:39 -08:00
Simon Willison
1d75792f9b
More uv/uvx tips, closes #702
Refs #690
2025-01-11 10:06:32 -08:00
Ariel Marcus
d964d02e90
Add installation docs with uv (#690)
2025-01-11 09:57:10 -08:00
watany
1c61b5addd
doc(plugin): adding AmazonBedrock (#698)
2025-01-10 16:42:39 -08:00
Arjan Mossel
4f4f9bc07d
Add llm-venice to plugin directory (#699)
2025-01-10 16:41:21 -08:00
Simon Willison
6baf1f7d83
o1
Closes #676
2025-01-10 15:57:06 -08:00
Csaba Henk
88a8cfd9e4
llm logs -x/--extract option (#693)
* llm logs -x/--extract option
* Update docs/help.md for llm logs -x
* Added test for llm logs -x/--extract, refs #693
* llm logs -xr behaves same as llm logs -x
* -x/--extract in llm logging docs
Co-authored-by: Simon Willison <swillison@gmail.com>
2025-01-10 15:53:04 -08:00
Simon Willison
b452effa09
llm models -q/--query option, closes #700
2025-01-09 11:37:33 -08:00
Simon Willison
000e984def
--extract support for templates, closes #681
2024-12-19 07:16:48 -08:00
Simon Willison
67d4a99645
llm prompt -x/--extract option, closes #681
2024-12-19 06:40:05 -08:00
Simon Willison
6305b86026
gpt-4o-mini-audio-preview, closes #677
2024-12-17 20:28:57 -08:00
Simon Willison
8898584ba6
New OpenAI audio models, closes #677
2024-12-17 11:14:42 -08:00
Simon Willison
b8e8052229
Release 0.19.1
Refs #667
2024-12-05 13:47:28 -08:00
Simon Willison
e78fea17df
Fragment hash on 0.19 release
!stable-docs
2024-12-01 16:09:55 -08:00
Simon Willison
c018104083
Release 0.19
Refs #495, #610, #640, #641, #644, #653
2024-12-01 15:58:27 -08:00
Simon Willison
f9af563df5
response.on_done() mechanism, closes #653
2024-12-01 15:47:23 -08:00
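The `response.on_done()` commit adds a completion callback. A sketch of the usual shape of such a mechanism (hypothetical DemoResponse class, not llm's internals): callbacks registered before the response finishes are queued and run at completion; callbacks registered afterwards fire immediately.

```python
# Illustrative sketch of an on_done() callback mechanism.
class DemoResponse:
    def __init__(self):
        self._done = False
        self._callbacks = []
        self.text = None

    def on_done(self, callback):
        if self._done:
            callback(self)           # already finished: fire immediately
        else:
            self._callbacks.append(callback)

    def _finish(self, text):
        # A real response would set this after streaming completes.
        self.text = text
        self._done = True
        for callback in self._callbacks:
            callback(self)
```

This shape lets logging hooks attach to a response without blocking on its completion.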
Simon Willison
335b3e635a
Release 0.19a2
Refs #640
2024-11-20 20:12:43 -08:00
Simon Willison
c52cfee881
llm.get_models() and llm.get_async_models(), closes #640
2024-11-20 20:09:06 -08:00
Simon Willison
845322e970
Release 0.19a1
Refs #644
2024-11-19 21:28:01 -08:00
Simon Willison
02852fe1a5
Release 0.19a0
Refs #610, #641
2024-11-19 20:23:54 -08:00
Simon Willison
cfb10f4afd
Log input tokens, output tokens and token details (#642)
* Store input_tokens, output_tokens, token_details on Response, closes #610
* llm prompt -u/--usage option
* llm logs -u/--usage option
* Docs on tracking token usage in plugins
* OpenAI default plugin logs usage
2024-11-19 20:21:59 -08:00
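The usage-logging commit above stores input_tokens, output_tokens, and token_details on each response. A sketch of that bookkeeping using those field names (the class, its summary() method, and the output format are illustrative assumptions, not llm's actual -u/--usage output):

```python
import json

# Hypothetical usage record modeled on the fields named in the commit.
class DemoUsage:
    def __init__(self, input_tokens, output_tokens, token_details=None):
        self.input_tokens = input_tokens
        self.output_tokens = output_tokens
        # token_details carries provider-specific extras (e.g. cached tokens),
        # serialized as JSON for storage.
        self.token_details = token_details or {}

    def summary(self):
        return (f"Input: {self.input_tokens}, "
                f"Output: {self.output_tokens}, "
                f"Details: {json.dumps(self.token_details, sort_keys=True)}")
```

Keeping details as a free-form dict lets each provider plugin log whatever extra breakdown its API reports.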
Simon Willison
a6d62b7ec9
Release 0.18
Refs #507, #600, #603, #608, #611, #612, #614
2024-11-17 12:31:48 -08:00
Simon Willison
73823012ca
Release 0.18a1
Refs #632
2024-11-14 15:10:39 -08:00
Simon Willison
cf172cc70a
response.text_or_raise() workaround
Closes https://github.com/simonw/llm/issues/632
2024-11-14 15:08:41 -08:00
Simon Willison
041730d8b2
Release 0.18a0
Refs #507, #599, #600, #603, #608, #611, #612, #613, #614, #615, #616, #621, #622, #623, #626, #629
2024-11-13 17:55:28 -08:00