Mirror of https://github.com/Hopiu/llm.git, synced 2026-05-12 07:43:11 +00:00

parent cfb10f4afd
commit 02852fe1a5

2 changed files with 9 additions and 1 deletion
@@ -1,5 +1,13 @@
 # Changelog
 
+(v0_19a0)=
+## 0.19a0 (2024-11-19)
+
+- Tokens used by a response are now logged to new `input_tokens` and `output_tokens` integer columns and a `token_details` JSON string column, for the default OpenAI models and models from other plugins that {ref}`implement this feature <advanced-model-plugins-usage>`. [#610](https://github.com/simonw/llm/issues/610)
+- `llm prompt` now takes a `-u/--usage` flag to display token usage at the end of the response.
+- `llm logs -u/--usage` shows token usage information for logged responses.
+- `llm prompt ... --async` responses are now logged to the database. [#641](https://github.com/simonw/llm/issues/641)
+
 (v0_18)=
 ## 0.18 (2024-11-17)
 
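The shape of the new token-usage columns can be illustrated with a small `sqlite3` sketch. Note this is a hypothetical stand-in table, not llm's actual logs schema (which this diff does not show): the diff only tells us the response log gains `input_tokens` and `output_tokens` integer columns plus a `token_details` JSON string column.

```python
import json
import sqlite3

# Hypothetical stand-in for the responses log table; the real schema
# used by llm may name and arrange columns differently.
db = sqlite3.connect(":memory:")
db.execute(
    """CREATE TABLE responses (
        id INTEGER PRIMARY KEY,
        model TEXT,
        input_tokens INTEGER,   -- new integer column per the changelog
        output_tokens INTEGER,  -- new integer column per the changelog
        token_details TEXT      -- new JSON string column per the changelog
    )"""
)
db.execute(
    "INSERT INTO responses (model, input_tokens, output_tokens, token_details)"
    " VALUES (?, ?, ?, ?)",
    ("gpt-4o-mini", 12, 48, json.dumps({"cached_tokens": 0})),
)

# A usage report like `llm logs -u/--usage` could be derived from these
# columns, e.g. summing input and output tokens per response.
row = db.execute(
    "SELECT input_tokens, output_tokens, token_details FROM responses"
).fetchone()
total = row[0] + row[1]
details = json.loads(row[2])
print(total)    # 60
print(details)  # {'cached_tokens': 0}
```

Storing `token_details` as a JSON string keeps provider-specific breakdowns (such as cached-token counts) flexible while the integer columns stay queryable with plain SQL.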
setup.py (2 changes)

@@ -1,7 +1,7 @@
 from setuptools import setup, find_packages
 import os
 
-VERSION = "0.18"
+VERSION = "0.19a0"
 
 
 def get_long_description():