Commit graph

455 commits

Author SHA1 Message Date
Rahim Nathwani
07ccfbcee5
Improved docs for extra-openai-models.yaml (#957)
- Mention mandatory model_name field 
- Document supports_schema option
2025-05-04 10:30:37 -07:00
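The commit above documents two fields in extra-openai-models.yaml. A minimal sketch of an entry using them, for illustration only: the model_name and supports_schema keys come from these commits, while the other field names and values (model_id, api_base, the localhost URL) are assumptions about a typical OpenAI-compatible endpoint, not taken from this log.

```yaml
# Hypothetical extra-openai-models.yaml entry (field names other than
# model_name and supports_schema are assumed, not from this commit log)
- model_id: my-local-model
  model_name: gpt-3.5-turbo   # mandatory: the name sent to the API
  api_base: "http://localhost:8000/v1"
  supports_schema: true        # enable structured-output support
```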
Abizer Lokhandwala
0b37123a38
Add GPT-4.1 model family to default OpenAI plugin (#965)
* openai: add gpt-4.1 models
* Refactor and run cog

---------

Co-authored-by: Simon Willison <swillison@gmail.com>
2025-05-04 10:27:12 -07:00
Benedikt Willi
7dcfa31143
Add '!edit' command to modify prompts in chat sessions using default editor (#969)
* Add '!edit' command to modify prompts in chat sessions
2025-05-04 10:19:48 -07:00
Simon Willison
9a39af82cd Tip about lazy loading dependencies, closes #949
!stable-docs
2025-04-23 10:55:13 -07:00
Simon Willison
963d85325d
Files not file 2025-04-23 07:52:17 -07:00
Simon Willison
fa34d7d452 Match example output to reality
Refs https://github.com/simonw/llm-fragments-github/issues/4

Refs #941
2025-04-20 08:01:58 -07:00
Simon Willison
c9f64096c9 llm fragments loaders, closes #941 2025-04-20 07:56:27 -07:00
Simon Willison
e78e1fceb2 LLM_MODEL and LLM_EMBEDDING_MODEL environment vars, closes #932 2025-04-19 20:41:24 -07:00
Simon Willison
3f25bb8bc9 Document difference between templates and fragments, closes #918
!stable-docs
2025-04-13 20:50:26 -07:00
Simon Willison
6273bc79ff Release 0.25a0
Refs #887, #903, #904, #908
2025-04-10 17:28:36 -07:00
Simon Willison
54f54efcbe
Convert from setup.py to pyproject.toml (#908)
* Build package as part of tests, upload as artifact
* Only stash artifact for ubuntu-latest Python 3.13

Closes #907
2025-04-10 16:57:53 -07:00
guspix
0cc26b3d4f
Allow escaping a dollar sign in templates by adding another dollar sign (#876)
* Allow escaping a dollar sign in templates by adding another dollar sign
* Docs for escaping $$ in template, refs #876, #904

Closes #904

---------

Co-authored-by: Simon Willison <swillison@gmail.com>
2025-04-08 20:33:39 -07:00
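The escaping rule in the commit above resembles the behaviour of Python's string.Template, where a doubled dollar sign renders as a literal dollar; whether LLM implements its templates on top of string.Template is an assumption here, so this is just a sketch of the substitution semantics, not LLM's actual code path:

```python
from string import Template

# Sketch of the #876/#904 escaping rule: "$name" is a placeholder,
# while "$$" escapes to a single literal "$".
template = Template("Pay $$5 to $name")
print(template.substitute(name="Alice"))
# Pay $5 to Alice
```

With this rule, a template author can write literal dollar amounts without the template engine treating them as variables.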
Steve Morin
4894b13414
llm models --options now shows keys and environment variables
Closes #903
2025-04-08 20:15:56 -07:00
Simon Willison
e3b8371c58 Release 0.24.2
Refs #901, #902
2025-04-08 20:00:29 -07:00
Simon Willison
7b46c17976 Release 0.24.1
Refs #879, #897
2025-04-08 13:40:04 -07:00
Simon Willison
56d7f2f13a llm prompt -t filepath.yaml support, closes #897 2025-04-08 13:36:48 -07:00
Simon Willison
bf622a27cc
Fabric plugin now uses fabric:
https://github.com/simonw/llm-templates-fabric/issues/2

!stable-docs
2025-04-07 22:23:30 -07:00
Simon Willison
fd6b2d786a
Fragments and template loaders
!stable-docs

Refs #809, #886
2025-04-07 22:12:26 -07:00
Simon Willison
0fbbe6a054 llm logs backup command, closes #879 2025-04-07 21:45:52 -07:00
Simon Willison
63be6ef51d Update background on this project links
Refs #882

!stable-docs
2025-04-07 12:17:19 -07:00
Simon Willison
af14a9780c
Link to annotated release notes
!stable-docs
2025-04-07 10:47:17 -07:00
Simon Willison
7ad1ddab62 Release 0.24
Refs #617, #759, #809, #817, #819, #824, #825, #826, #829, #834, #835, #841, #843, #845, #853, #856, #857, #858, #880, #881, #886

Closes #882
2025-04-07 08:36:22 -07:00
Simon Willison
631835ef71 If template has no $input variable prompt is still concatenated, closes #878 2025-04-07 07:31:28 -07:00
Simon Willison
d0255a1eda llm fragments --aliases option, closes #891 2025-04-06 22:30:27 -07:00
Simon Willison
b011ed8352 Multiple llm logs -f are now ANDed together, closes #889 2025-04-06 22:22:30 -07:00
Simon Willison
d4a54bb7a8 Added missing reference ID, refs #884 2025-04-06 22:13:10 -07:00
Simon Willison
de0ae8118d Documentation for fragments: and system_fragments:, closes #884 2025-04-06 22:12:05 -07:00
Simon Willison
b02ab1d54b Fragments page in docs, closes #885 2025-04-06 22:07:08 -07:00
Simon Willison
05d1a363a0 Release notes for 0.24a1, refs #890 2025-04-06 17:42:17 -07:00
Simon Willison
1aad73168d Reorder docs, rename Prompt templates to Templates 2025-04-06 17:05:49 -07:00
Simon Willison
a571a4e948
register_fragment_loaders() hook (#886)
* Docs and shape of register_fragment_loaders hook, refs #863
* Update docs for fragment loaders returning a list of FragmentString
* Support multiple fragments with same content, closes #888
* Call the pm.hook.register_fragment_loaders hook
* Test for register_fragment_loaders hook
* Rename FragmentString to Fragment

Closes #863
2025-04-06 17:03:34 -07:00
Simon Willison
3de33be74f Support multiple fragments with same content, closes #888 2025-04-06 16:26:57 -07:00
Simon Willison
ac49075129 llm logs -e/--expand option, closes #881 2025-04-06 00:25:28 -07:00
Simon Willison
b7f54028e2 Better help for --at option 2025-04-05 17:29:10 -07:00
Simon Willison
f3e02b6de6 Fix for broken markdown in docs, closes #860 2025-04-05 17:24:31 -07:00
Simon Willison
f740a5cbbd
Fragments (#859)
* WIP fragments: schema plus reading but not yet writing, refs #617
* Unique index on fragments.alias, refs #617
* Fragments are now persisted, added basic CLI commands
* Fragment aliases work now, refs #617
* Improved help for -f/--fragment
* Support fragment hash as well
* Documentation for fragments
* Better non-JSON display of llm fragments list
* llm fragments -q search option
* _truncate_string is now truncate_string
* Use condense_json to avoid duplicate data in JSON in DB, refs #617
* Follow up to 3 redirects for fragments
* Python API docs for fragments= and system_fragments=
* Fragment aliases cannot contain a : - this is to ensure we can add custom fragment loaders later on, refs https://github.com/simonw/llm/pull/859#issuecomment-2761534692
* Use template fragments when running prompts
* llm fragments show command plus llm fragments group tests
* Tests for fragments family of commands
* Test for --save with fragments
* Add fragments tables to docs/logging.md
* Slightly better llm fragments --help
* Handle fragments in past conversations correctly
* Hint at llm prompt --help in llm --help, closes #868
* llm logs -f filter plus show fragments in llm logs --json
* Include prompt and system fragments in llm logs -s
* llm logs markdown fragment output and tests, refs #617
2025-04-05 17:22:37 -07:00
Simon Willison
70e0799821 Hint at llm prompt --help in llm --help, closes #868 2025-03-29 21:00:41 -07:00
Simon Willison
f641b89882 llm similar -p/--plain option, closes #853 2025-03-28 00:36:08 -07:00
Simon Willison
5b2c611c82 llm prompt -d/--database option, closes #858 2025-03-28 00:20:31 -07:00
Simon Willison
7e7ccdc19a Hide -p/--path in favor of standard -d/--database, closes #857
Spotted while working on #853
2025-03-28 00:11:01 -07:00
Simon Willison
9a24605996 Allow -t to take a URL to a template, closes #856 2025-03-27 20:36:58 -07:00
Simon Willison
3f6bccf87d Link to two more blog entries
!stable-docs
2025-03-25 19:30:48 -07:00
Simon Willison
22175414f0 Extra OpenAI docs including mention of PDFs, closes #834 2025-03-25 19:30:42 -07:00
Simon Willison
468b0551ee
llm models options commands for setting default model options
Closes #829
2025-03-22 18:28:45 -07:00
Simon Willison
1ad7bbd32a
Ability to store options in templates (#845)
* llm prompt --save option support, closes #830
* Fix for templates with just a system prompt, closes #844
* Tests for options from template, refs #830
* Test and bug fix for --save with options, refs #830
* Docs for template options support, refs #830
2025-03-22 17:24:02 -07:00
giuli007
51db7afddb
Support vision and audio for extra-openai-models.yaml (#843)
Add a vision option to enable OpenAI-compatible
models to receive image and audio attachments
2025-03-22 16:14:18 -07:00
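The vision option above would sit alongside the other extra-openai-models.yaml fields. A hedged sketch of such an entry: only the vision key is taken from this commit, and the remaining field names and values are illustrative assumptions.

```yaml
# Hypothetical entry enabling attachments (only "vision" is from #843)
- model_id: my-multimodal-model
  model_name: gpt-4o
  api_base: "http://localhost:8000/v1"
  vision: true   # allow image (and audio) attachments
```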
Simon Willison
99cd2aa148
Improved OpenAI model docs
Refs #839, closes #840
2025-03-21 18:31:20 -07:00
adaitche
de87d37c28
Add supports_schema to extra-openai-models (#819)
Support for structured output was added recently, but custom
OpenAI-compatible models didn't honour the `supports_schema` property
in the config file `extra-openai-models.yaml`.
2025-03-21 16:59:34 -07:00
Simon Willison
6c9a8efb50
register_template_loaders plugin hook, closes #809
* Moved templates CLI commands next to each other
* llm templates loaders command
* Template loader tests
* Documentation for template loaders
2025-03-21 16:46:44 -07:00
Simon Willison
3541415db4 llm prompt -q X -q Y option, closes #841 2025-03-21 15:17:16 -07:00