LLM


A CLI utility and Python library for interacting with Large Language Models, including OpenAI, PaLM, and local models installed on your own machine.

Full documentation: llm.datasette.io

Installation

Install this tool using pip:

pip install llm

Or using Homebrew:

brew install llm

Detailed installation instructions.

Getting started

If you have an OpenAI API key you can get started using the OpenAI models right away.

As an alternative to OpenAI, you can install plugins to access models by other providers, including models that can be installed and run on your own device.

Save your OpenAI API key like this:

llm keys set openai

This will prompt you for your key like so:

llm keys set openai
Enter key: <paste here>

Now that you've saved a key you can run a prompt like this:

llm "Five cute names for a pet penguin"
1. Waddles
2. Pebbles
3. Bubbles
4. Flappy
5. Chilly

Read the usage instructions for more.

Using a system prompt

You can use the -s/--system option to set a system prompt, providing instructions for processing other input to the tool.

To describe how the code in a file works, try this:

cat mycode.py | llm -s "Explain this code"

Help

For help, run:

llm --help

You can also use:

python -m llm --help