Release 0.25

Refs #876, #887, #903, #904, #907, #908, #916, #918, #925, #929, #932, #933, #941, #945, #949, #950, #957, #965, #968, #969, #972, #973, #974, #975, #976
This commit is contained in:
Simon Willison 2025-05-04 20:23:30 -07:00
parent 8e68c5e2d9
commit 8df839383f
2 changed files with 15 additions and 1 deletion


@@ -1,5 +1,19 @@
# Changelog
(v0_25)=
## 0.25 (2025-05-04)
- New plugin feature: {ref}`plugin-hooks-register-fragment-loaders` plugins can now return a mixture of fragments and attachments. The [llm-video-frames](https://github.com/simonw/llm-video-frames) plugin is the first to take advantage of this mechanism. [#972](https://github.com/simonw/llm/issues/972)
- New OpenAI models: `gpt-4.1`, `gpt-4.1-mini`, `gpt-4.1-nano`, `o3`, `o4-mini`. [#945](https://github.com/simonw/llm/issues/945), [#965](https://github.com/simonw/llm/issues/965), [#976](https://github.com/simonw/llm/issues/976)
- New environment variables: `LLM_MODEL` and `LLM_EMBEDDING_MODEL` for setting the model to use without needing to specify `-m model_id` every time. [#932](https://github.com/simonw/llm/issues/932)
- New command: `llm fragments loaders`, to list all currently available fragment loader prefixes provided by plugins. [#941](https://github.com/simonw/llm/issues/941)
- `llm fragments` command now shows fragments ordered by the date they were first used. [#973](https://github.com/simonw/llm/issues/973)
- `llm chat` now includes a `!edit` command for editing a prompt using your default terminal text editor. Thanks, [Benedikt Willi](https://github.com/Hopiu). [#969](https://github.com/simonw/llm/pull/969)
- Allow `-t` and `--system` to be used at the same time. [#916](https://github.com/simonw/llm/issues/916)
- Fixed a bug where accessing a model via its alias would fail to respect any default options set for that model. [#968](https://github.com/simonw/llm/issues/968)
- Improved documentation for {ref}`extra-openai-models.yaml <openai-extra-models>`. Thanks, [Rahim Nathwani](https://github.com/rahimnathwani) and [Dan Guido](https://github.com/dguido). [#950](https://github.com/simonw/llm/pull/950), [#957](https://github.com/simonw/llm/pull/957)
- `llm -c/--continue` now works correctly with the `-d/--database` option. `llm chat` now accepts that `-d/--database` option. Thanks, [Sukhbinder Singh](https://github.com/sukhbinder). [#933](https://github.com/simonw/llm/issues/933)
(v0_25a0)=
## 0.25a0 (2025-04-10)
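The new `LLM_MODEL` / `LLM_EMBEDDING_MODEL` environment variables and the `llm fragments loaders` command from the changelog above might be used like this. A minimal sketch: it assumes llm 0.25 is installed, so the `llm` invocations themselves are shown as comments, and the model IDs are illustrative.

```shell
# Set default models once, instead of passing -m model_id on every call
export LLM_MODEL=gpt-4.1-mini           # default chat/completion model
export LLM_EMBEDDING_MODEL=3-small      # illustrative embedding model ID
echo "Default model: $LLM_MODEL"

# llm "Tell me a joke"        # would run against gpt-4.1-mini, no -m needed
# llm fragments loaders       # would list fragment loader prefixes from plugins
```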


@@ -1,6 +1,6 @@
[project]
name = "llm"
-version = "0.25a0"
+version = "0.25"
description = "CLI utility and Python library for interacting with Large Language Models from organizations like OpenAI, Anthropic and Gemini plus local models installed on your own machine."
readme = { file = "README.md", content-type = "text/markdown" }
authors = [