mirror of
https://github.com/Hopiu/llm.git
synced 2026-04-10 16:21:01 +00:00
Release 0.27
Refs #1009, #1014, #1049, #1053, #1088, #1104, #1110, #1111, #1114, #1117, #1134, #1137, #1148, #1150, #1177, #1228, #1229, #1232
This commit is contained in:
parent 5204a11f33
commit 4292fe9d21
3 changed files with 37 additions and 1 deletion
@@ -1,5 +1,33 @@
# Changelog
(v0_27)=
## 0.27 (2025-08-11)
This release adds support for the new **GPT-5 family** of models from OpenAI. It also enhances tool calling in a number of ways, including allowing {ref}`templates <prompt-templates>` to bundle pre-configured tools.
### New features
- New models: `gpt-5`, `gpt-5-mini` and `gpt-5-nano`. [#1229](https://github.com/simonw/llm/issues/1229)
- LLM {ref}`templates <prompt-templates>` can now include a list of tools. These can be named tools from plugins or arbitrary Python function blocks, see {ref}`Tools in templates <prompt-templates-tools>`. [#1009](https://github.com/simonw/llm/issues/1009)
- Tools {ref}`can now return attachments <python-api-tools-attachments>`, for models that support features such as image input. [#1014](https://github.com/simonw/llm/issues/1014)
- New methods on the `Toolbox` class: `.add_tool()`, `.prepare()` and `.prepare_async()`, described in {ref}`Dynamic toolboxes <python-api-tools-dynamic>`. [#1111](https://github.com/simonw/llm/issues/1111)
- New `model.conversation(before_call=x, after_call=y)` attributes for registering callback functions to run before and after tool calls. See {ref}`tool debugging hooks <python-api-tools-debug-hooks>` for details. [#1088](https://github.com/simonw/llm/issues/1088)
- Some model providers can serve different models from the same configured URL - [llm-llama-server](https://github.com/simonw/llm-llama-server) for example. Plugins for these providers can now record to the LLM logs the ID of the model that actually handled the request, using the `response.set_resolved_model(model_id)` method. [#1117](https://github.com/simonw/llm/issues/1117)
- Raising `llm.CancelToolCall` now only cancels the current tool call, passing an error back to the model and allowing it to continue. [#1148](https://github.com/simonw/llm/issues/1148)
- New `-l/--latest` option for `llm logs -q searchterm` for searching logs ordered by date (most recent first) instead of the default relevance search. [#1177](https://github.com/simonw/llm/issues/1177)
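The tool-calling features above (`before_call`/`after_call` hooks, and `llm.CancelToolCall` cancelling only the current call) share a common dispatch pattern. Here is a minimal pure-Python sketch of that pattern; the `ToolCall`, `CancelToolCall` and `dispatch` names are illustrative stand-ins, not the llm library's own internals:

```python
from dataclasses import dataclass


@dataclass
class ToolCall:
    """Stand-in for a model-requested tool invocation."""
    name: str
    arguments: dict


class CancelToolCall(Exception):
    """Cancels only the current tool call, not the whole conversation."""


def dispatch(tool_call, tools, before_call=None, after_call=None):
    """Run one tool call, with optional before/after hooks around it.

    If CancelToolCall is raised, an error string is passed back as the
    result so the model can see it and continue.
    """
    tool = tools[tool_call.name]
    try:
        if before_call:
            before_call(tool, tool_call)
        result = tool(**tool_call.arguments)
    except CancelToolCall as ex:
        result = f"Tool call cancelled: {ex}"
    if after_call:
        after_call(tool, tool_call, result)
    return result


def multiply(a, b):
    return a * b


events = []
result = dispatch(
    ToolCall("multiply", {"a": 6, "b": 7}),
    {"multiply": multiply},
    before_call=lambda tool, call: events.append(("before", call.name)),
    after_call=lambda tool, call, res: events.append(("after", res)),
)
print(result)   # 42
print(events)   # [('before', 'multiply'), ('after', 42)]
```

A hook that raises `CancelToolCall` skips only that one invocation, which is the per-call behaviour the release notes describe.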
### Bug fixes and documentation
- The `register_embedding_models` hook is [now documented](https://llm.datasette.io/en/stable/plugins/plugin-hooks.html#register-embedding-models-register). [#1049](https://github.com/simonw/llm/issues/1049)
- Show visible stack trace for `llm templates show invalid-template-name`. [#1053](https://github.com/simonw/llm/issues/1053)
- Handle invalid tool names more gracefully in `llm chat`. [#1104](https://github.com/simonw/llm/issues/1104)
- Add a {ref}`Tool plugins <plugin-directory-tools>` section to the plugin directory. [#1110](https://github.com/simonw/llm/issues/1110)
- Error on `register(Klass)` if the passed class is not a subclass of `Toolbox`. [#1114](https://github.com/simonw/llm/issues/1114)
- Add `-h` as a shortcut for `--help` on all `llm` CLI commands. [#1134](https://github.com/simonw/llm/issues/1134)
- Add missing `dataclasses` to advanced model plugins docs. [#1137](https://github.com/simonw/llm/issues/1137)
- Fixed a bug where `llm logs -T llm_version "version" --async` incorrectly recorded a single log entry when it should have recorded two. [#1150](https://github.com/simonw/llm/issues/1150)
- All extra OpenAI model keys in `extra-openai-models.yaml` are {ref}`now documented <openai-compatible-models>`. [#1228](https://github.com/simonw/llm/issues/1228)
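The `response.set_resolved_model()` feature described under New features above can be pictured with a small stand-in class. This is an illustration of the logging behaviour only, not the library's actual `Response` implementation, and the model IDs are hypothetical:

```python
class Response:
    """Illustrative stand-in for a response object that records which
    model actually handled a request behind a shared provider URL."""

    def __init__(self, requested_model):
        self.model = requested_model   # the configured alias
        self.resolved_model = None     # filled in by the plugin

    def set_resolved_model(self, model_id):
        self.resolved_model = model_id

    def log_entry(self):
        # Prefer the resolved ID when the provider reported one
        return {"model": self.resolved_model or self.model}


resp = Response("llama-server")
resp.set_resolved_model("llama-3.1-8b-instruct")
print(resp.log_entry())  # {'model': 'llama-3.1-8b-instruct'}
```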
(v0_26)=
## 0.26 (2025-05-27)
@@ -3,6 +3,7 @@
The following plugins are available for LLM. Here's {ref}`how to install them <installing-plugins>`.
(plugin-directory-local-models)=
## Local models
These plugins all help you run LLMs directly on your own computer:
@@ -15,6 +16,7 @@ These plugins all help you run LLMs directly on your own computer:
- **[llm-gpt4all](https://github.com/simonw/llm-gpt4all)** adds support for various models released by the [GPT4All](https://gpt4all.io/) project that are optimized to run locally on your own machine. These models include versions of Vicuna, Orca, Falcon and MPT - here's [a full list of models](https://observablehq.com/@simonw/gpt4all-models).
- **[llm-mpt30b](https://github.com/simonw/llm-mpt30b)** adds support for the [MPT-30B](https://huggingface.co/mosaicml/mpt-30b) local model.
(plugin-directory-remote-apis)=
## Remote APIs
These plugins can be used to interact with remotely hosted models via their API:
@@ -42,6 +44,7 @@ These plugins can be used to interact with remotely hosted models via their API:
If an API model host provides an OpenAI-compatible API you can also [configure LLM to talk to it](https://llm.datasette.io/en/stable/other-models.html#openai-compatible-models) without needing an extra plugin.
(plugin-directory-tools)=
## Tools
The following plugins add new {ref}`tools <tools>` that can be used by models:
@@ -53,6 +56,7 @@ The following plugins add new {ref}`tools <tools>` that can be used by models:
- **[llm-tools-exa](https://github.com/daturkel/llm-tools-exa)** by Dan Turkel can perform web searches and question-answering using [exa.ai](https://exa.ai/).
- **[llm-tools-rag](https://github.com/daturkel/llm-tools-rag)** by Dan Turkel can perform searches over your LLM embedding collections for simple RAG.
(plugin-directory-loaders)=
## Fragments and template loaders
{ref}`LLM 0.24 <v0_24>` introduced support for plugins that define `-f prefix:value` or `-t prefix:value` custom loaders for fragments and templates.
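A `-f prefix:value` or `-t prefix:value` argument resolves by splitting on the first colon and dispatching to whichever loader registered that prefix. Here is a minimal sketch of that resolution step; the registry and the loader function are illustrative stand-ins, not llm's internals:

```python
def resolve(argument, loaders):
    """Split 'prefix:value' on the first colon and dispatch to the
    registered loader for that prefix."""
    prefix, _, value = argument.partition(":")
    if prefix not in loaders:
        raise KeyError(f"No loader registered for prefix {prefix!r}")
    return loaders[prefix](value)


# Hypothetical loader: a real plugin would fetch and convert the URL
loaders = {"site": lambda url: f"markdown for {url}"}
print(resolve("site:https://example.com", loaders))
# markdown for https://example.com
```

Splitting only on the first colon matters because the value itself often contains colons, as in `site:https://example.com`.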
@@ -67,6 +71,7 @@ The following plugins add new {ref}`tools <tools>` that can be used by models:
- **[llm-fragments-site-text](https://github.com/daturkel/llm-fragments-site-text)** by Dan Turkel converts websites to markdown with [Trafilatura](https://trafilatura.readthedocs.io/en/latest/) to use as fragments: `llm -f site:https://example.com "summarize this"`.
- **[llm-fragments-reader](https://github.com/simonw/llm-fragments-reader)** runs a URL through the Jina Reader API: `llm -f 'reader:https://simonwillison.net/tags/jina/' summary`.
(plugin-directory-embeddings)=
## Embedding models
{ref}`Embedding models <embeddings>` are models that can be used to generate and store embedding vectors for text.
@@ -76,6 +81,7 @@ The following plugins add new {ref}`tools <tools>` that can be used by models:
- **[llm-embed-jina](https://github.com/simonw/llm-embed-jina)** provides Jina AI's [8K text embedding models](https://jina.ai/news/jina-ai-launches-worlds-first-open-source-8k-text-embedding-rivaling-openai/).
- **[llm-embed-onnx](https://github.com/simonw/llm-embed-onnx)** provides seven embedding models that can be executed using the ONNX model framework.
(plugin-directory-commands)=
## Extra commands
- **[llm-cmd](https://github.com/simonw/llm-cmd)** accepts a prompt for a shell command, runs that prompt and populates the result in your shell so you can review it, edit it and then hit `<enter>` to execute or `ctrl+c` to cancel.
@@ -84,6 +90,7 @@ The following plugins add new {ref}`tools <tools>` that can be used by models:
- **[llm-cluster](https://github.com/simonw/llm-cluster)** adds a `llm cluster` command for calculating clusters for a collection of embeddings. Calculated clusters can then be passed to a Large Language Model to generate a summary description.
- **[llm-jq](https://github.com/simonw/llm-jq)** lets you pipe in JSON data and a prompt describing a `jq` program, then executes the generated program against the JSON.
(plugin-directory-fun)=
## Just for fun
- **[llm-markov](https://github.com/simonw/llm-markov)** adds a simple model that generates output using a [Markov chain](https://en.wikipedia.org/wiki/Markov_chain). This example is used in the tutorial [Writing a plugin to support a new model](https://llm.datasette.io/en/latest/plugins/tutorial-model-plugin.html).
@@ -1,6 +1,6 @@
[project]
name = "llm"
-version = "0.26"
+version = "0.27"
description = "CLI utility and Python library for interacting with Large Language Models from organizations like OpenAI, Anthropic and Gemini plus local models installed on your own machine."
readme = { file = "README.md", content-type = "text/markdown" }
authors = [
@@ -18,6 +18,7 @@ classifiers = [
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Topic :: Scientific/Engineering :: Artificial Intelligence",
"Topic :: Text Processing :: Linguistic",
"Topic :: Utilities",