llm/setup.py
Simon Willison f740a5cbbd
Fragments (#859)
* WIP fragments: schema plus reading but not yet writing, refs #617
* Unique index on fragments.alias, refs #617
* Fragments are now persisted, added basic CLI commands
* Fragment aliases work now, refs #617
* Improved help for -f/--fragment
* Support fragment hash as well
* Documentation for fragments
* Better non-JSON display of llm fragments list
* llm fragments -q search option
* _truncate_string is now truncate_string
* Use condense_json to avoid duplicate data in JSON in DB, refs #617
* Follow up to 3 redirects for fragments
* Python API docs for fragments= and system_fragments=
* Fragment aliases cannot contain a : - this is to ensure we can add custom fragment loaders later on, refs https://github.com/simonw/llm/pull/859#issuecomment-2761534692
* Use template fragments when running prompts
* llm fragments show command plus llm fragments group tests
* Tests for fragments family of commands
* Test for --save with fragments
* Add fragments tables to docs/logging.md
* Slightly better llm fragments --help
* Handle fragments in past conversations correctly
* Hint at llm prompt --help in llm --help, closes #868
* llm logs -f filter plus show fragments in llm logs --json
* Include prompt and system fragments in llm logs -s
* llm logs markdown fragment output and tests, refs #617
2025-04-05 17:22:37 -07:00
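One of the commits above notes that fragment aliases cannot contain a `:`, reserving the `prefix:value` form for custom fragment loaders. A hypothetical sketch of that rule (illustrative only, not the actual llm implementation):

```python
# Hypothetical helper illustrating the alias rule from the commit message:
# aliases may not contain ":", so "prefix:value" can later be routed to
# custom fragment loaders instead.
def validate_fragment_alias(alias: str) -> str:
    if ":" in alias:
        raise ValueError(f"invalid fragment alias {alias!r}: ':' is not allowed")
    return alias
```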

from setuptools import setup, find_packages
import os

VERSION = "0.24a0"


def get_long_description():
    with open(
        os.path.join(os.path.dirname(os.path.abspath(__file__)), "README.md"),
        encoding="utf8",
    ) as fp:
        return fp.read()


setup(
    name="llm",
    description=(
        "CLI utility and Python library for interacting with Large Language Models from "
        "organizations like OpenAI, Anthropic and Gemini plus local models installed on your own machine."
    ),
    long_description=get_long_description(),
    long_description_content_type="text/markdown",
    author="Simon Willison",
    url="https://github.com/simonw/llm",
    project_urls={
        "Documentation": "https://llm.datasette.io/",
        "Issues": "https://github.com/simonw/llm/issues",
        "CI": "https://github.com/simonw/llm/actions",
        "Changelog": "https://github.com/simonw/llm/releases",
    },
    license="Apache License, Version 2.0",
    version=VERSION,
    packages=find_packages(),
    entry_points="""
        [console_scripts]
        llm=llm.cli:cli
    """,
    install_requires=[
        "click",
        "condense-json>=0.1.2",
        "openai>=1.55.3",
        "click-default-group>=1.2.3",
        "sqlite-utils>=3.37",
        "sqlite-migrate>=0.1a2",
        "pydantic>=2.0.0",
        "PyYAML",
        "pluggy",
        "python-ulid",
        "setuptools",
        "pip",
        "pyreadline3; sys_platform == 'win32'",
        "puremagic",
    ],
    extras_require={
        "test": [
            "pytest",
            "numpy",
            "pytest-httpx>=0.33.0",
            "pytest-asyncio",
            "cogapp",
            "mypy>=1.10.0",
            "black>=25.1.0",
            "ruff",
            "types-click",
            "types-PyYAML",
            "types-setuptools",
        ]
    },
    python_requires=">=3.9",
)
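The `entry_points` argument above is an INI-style string; under `[console_scripts]`, each key becomes a command installed on PATH and the value names the `module:callable` the generated script invokes. A minimal sketch of that parsing (illustrative only, not the real setuptools machinery), using the stdlib `configparser`:

```python
import configparser

# The same INI-style string passed to setup(entry_points=...) above.
ENTRY_POINTS = """
[console_scripts]
llm=llm.cli:cli
"""

parser = configparser.ConfigParser()
parser.read_string(ENTRY_POINTS)

# Key "llm" is the command name; the value is the "module:callable"
# the installed console script will import and call.
scripts = dict(parser["console_scripts"])
print(scripts)  # {'llm': 'llm.cli:cli'}
```

So installing this package exposes an `llm` command that dispatches to `llm.cli:cli`.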