Commit graph

147 commits

Author SHA1 Message Date
Simon Willison
902cf6010c Fixed type hint on Prompt 2023-07-07 20:38:53 -07:00
Simon Willison
5e1b528ebb Snappier tutorial title 2023-07-07 08:13:45 -07:00
Simon Willison
40f3892e19 Fix link to Gist in tutorial 2023-07-06 21:11:05 -07:00
Simon Willison
94b13ab872 types-click 2023-07-06 20:45:10 -07:00
Simon Willison
c1e3cbf2e9 Detailed tutorial on writing plugins 2023-07-06 20:38:12 -07:00
Simon Willison
3d5c9b2d3b Read prompt after validating options
This means that if you do this:

    llm -m markov -o length -1

You will see an error message rather than have the command hang
waiting for a prompt to be entered on stdin.
2023-07-06 19:57:04 -07:00
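The behavior described in this commit can be sketched roughly as follows. This is a hypothetical illustration (the `validate_options` and `run_prompt` names are invented, not the actual llm implementation): validate option values first, so a bad value like `-o length -1` raises immediately instead of blocking on stdin.

```python
import sys

def validate_options(options: dict) -> dict:
    # Fail fast on invalid option values, before any stdin read.
    length = options.get("length", 20)
    if not isinstance(length, int) or length < 1:
        raise ValueError(f"length must be a positive integer, got {length!r}")
    return options

def run_prompt(options: dict) -> str:
    validate_options(options)  # raises here for -o length -1
    return sys.stdin.read()    # only reached once options are valid
```

With this ordering, `run_prompt({"length": -1})` raises a ValueError up front rather than hanging while it waits for a prompt on stdin.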
Simon Willison
04568115b8 Options base class is now llm.Options not Model.Options 2023-07-05 20:47:36 -07:00
Simon Willison
b4248df72a Markov plugin now lives in llm-markov repo 2023-07-05 20:46:53 -07:00
Simon Willison
4611bff412 iter_prompt() now takes prompt 2023-07-05 20:46:17 -07:00
Simon Willison
c08344f986 llm logs now decodes JSON for prompt_json etc 2023-07-05 18:31:38 -07:00
Simon Willison
b88906d459 Default __str__ method for models 2023-07-05 18:26:16 -07:00
Simon Willison
6ef6b343a9 Improved how keys work, execute() now has default implementation 2023-07-05 18:25:57 -07:00
Simon Willison
f193468f76 llm install -e/--editable option 2023-07-05 17:58:19 -07:00
Simon Willison
78c93b9e23 Model.execute() now defaults to using self.Response 2023-07-05 16:57:10 -07:00
Simon Willison
23886e04ae Better error message display
Refs https://github.com/pydantic/pydantic/issues/6441
2023-07-05 16:36:38 -07:00
Simon Willison
bc8dad1e33 stream defaults to True on prompt() method 2023-07-04 08:14:00 -07:00
Simon Willison
49e4d688c6 Removed .stream() method in favor of .prompt(stream=False) 2023-07-04 07:50:31 -07:00
Simon Willison
3136948408 Moved things into inner classes, log_message is now defined on base Response 2023-07-03 21:25:19 -07:00
Simon Willison
de81cc9a9e Test messages logged in new format 2023-07-03 08:12:04 -07:00
Simon Willison
307af2474c Fix column order in logs 2023-07-03 07:29:58 -07:00
Simon Willison
345ad0d2dc Implemented new logs database schema 2023-07-03 07:27:47 -07:00
Simon Willison
b1c51df3f1 New LogMessage design, plus Response.json() method 2023-07-03 06:46:51 -07:00
Simon Willison
61dd8afc60 Drop the debug field from the logs, combine chunks from stream 2023-07-03 06:39:54 -07:00
Simon Willison
84b99f8baf -o/--option, implemented for OpenAI models - closes #63 2023-07-02 17:42:22 -07:00
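A repeatable `-o`/`--option` flag taking key/value pairs can be sketched with the standard library. This is an illustrative argparse sketch only (the real llm CLI is built on click, and these names are assumptions): each `-o KEY VALUE` pair is appended to a list and collected into a dict.

```python
import argparse

# Repeatable -o/--option flag: nargs=2 consumes a key and a value,
# action="append" collects every occurrence into a list of pairs.
parser = argparse.ArgumentParser()
parser.add_argument(
    "-o", "--option",
    nargs=2, action="append", default=[],
    metavar=("KEY", "VALUE"), dest="options",
)
args = parser.parse_args(["-o", "temperature", "0.5", "-o", "length", "20"])
print(dict(args.options))  # {'temperature': '0.5', 'length': '20'}
```

Values arrive as strings here; a real implementation would coerce and validate them against the model's declared options.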
Simon Willison
e485eadf14 just fix command 2023-07-02 17:40:45 -07:00
Simon Willison
52add96ec1 Rough initial version of new logging, to log2 table 2023-07-02 16:35:36 -07:00
Simon Willison
4a9f7f4908 Lint using Ruff, refs #78 2023-07-02 12:41:40 -07:00
Simon Willison
9eeebfa020 type stubs for PyYAML and requests, refs #77 2023-07-02 12:39:37 -07:00
Simon Willison
3ed86e403d Added mypy, plus some fixes to make it happy - refs #77 2023-07-02 12:36:22 -07:00
Simon Willison
7054c673e1 Better type hint for iter_prompt method
The Generator type is only useful if the method uses the send() mechanism

Declaring it an Iterator gives people more flexibility for how they implement it.
2023-07-02 11:18:31 -07:00
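The distinction above can be shown in a short sketch (the function names are illustrative, not the actual llm API): an `Iterator[str]` return hint is satisfied both by a generator function and by a method that returns any plain iterator, whereas a `Generator[...]` hint only adds value when callers rely on `.send()`.

```python
from typing import Iterator

def iter_prompt_as_generator() -> Iterator[str]:
    # A generator function satisfies Iterator[str]...
    yield "Hello "
    yield "world"

def iter_prompt_as_iterator() -> Iterator[str]:
    # ...and so does returning an iterator directly, which a
    # stricter Generator[str, None, None] hint would not encourage.
    return iter(["Hello ", "world"])

print("".join(iter_prompt_as_generator()))  # Hello world
print("".join(iter_prompt_as_iterator()))   # Hello world
```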
Simon Willison
9a180e65a8 llm models default command, plus refactored env variables
Closes #76
Closes #31
2023-07-01 14:01:29 -07:00
Simon Willison
f3f38c0c02 Initial experimental response.reply() method 2023-07-01 13:29:54 -07:00
Simon Willison
664fca165b .stream is now in base Response class 2023-07-01 13:29:27 -07:00
Simon Willison
183b647351 Removed PaLM 2 vertex model
It lives here now: https://github.com/simonw/llm-palm

Refs #20
2023-07-01 13:27:59 -07:00
Simon Willison
9d914bf9cc Include cogapp in pip install -e '.[test]' 2023-07-01 11:56:50 -07:00
Simon Willison
feff460496 Fix for pydantic warning, refs #74 2023-07-01 11:56:35 -07:00
Simon Willison
9ac120fae4 Upgrade to pydantic 2 using bump-pydantic, refs #74 2023-07-01 11:53:54 -07:00
Simon Willison
d26ed84939 Don't include tests/ in the package 2023-07-01 11:45:00 -07:00
Simon Willison
0f266d6657 Fix for missing package bug 2023-07-01 11:36:58 -07:00
Simon Willison
7485e0646e Use pip install -e '.[test]' 2023-07-01 11:32:38 -07:00
Simon Willison
9afc758cd7 Pass model to the Response 2023-07-01 11:29:41 -07:00
Simon Willison
6ef52172b0 Fixed timezone related test failure 2023-07-01 11:14:25 -07:00
Simon Willison
4d304d99e1 Fixed import error in llm openai models 2023-07-01 11:12:06 -07:00
Simon Willison
1b3f14fe89 Move default plugins into llm/default_plugins 2023-07-01 11:10:30 -07:00
Simon Willison
c4513068fb Disabled DB logging test for the moment 2023-07-01 11:07:10 -07:00
Simon Willison
14ce371007 Fix plugins tests to account for default plugins 2023-07-01 11:06:28 -07:00
Simon Willison
8f7450bd74 Fixed a test 2023-07-01 11:03:16 -07:00
Simon Willison
714b867e92 -s shortcut for --system, closes #69 2023-07-01 10:46:20 -07:00
Simon Willison
b9dd0a34db Ran cog for llm openai --help, refs #70 2023-07-01 10:40:11 -07:00
Simon Willison
c679d4d99e llm openai models command, closes #70 2023-07-01 10:39:24 -07:00