# LLM

[PyPI](https://pypi.org/project/llm/)
[Documentation](https://llm.datasette.io/)
[Changelog](https://llm.datasette.io/en/stable/changelog.html)
[Tests](https://github.com/simonw/llm/actions?query=workflow%3ATest)
[License](https://github.com/simonw/llm/blob/main/LICENSE)
[Discord](https://datasette.io/discord-llm)
[Homebrew](https://formulae.brew.sh/formula/llm)
A CLI utility and Python library for interacting with Large Language Models, both via remote APIs and with models that can be installed and run on your own machine.

[Run prompts from the command-line](https://llm.datasette.io/en/stable/usage.html#executing-a-prompt), [store the results in SQLite](https://llm.datasette.io/en/stable/logging.html), [generate embeddings](https://llm.datasette.io/en/stable/embeddings/index.html) and more.
Consult the **[LLM plugins directory](https://llm.datasette.io/en/stable/plugins/directory.html)** for plugins that provide access to remote and local models.
Full documentation: **[llm.datasette.io](https://llm.datasette.io/)**
Background on this project:
- [llm, ttok and strip-tags—CLI tools for working with ChatGPT and other LLMs](https://simonwillison.net/2023/May/18/cli-tools-for-llms/)
- [The LLM CLI tool now supports self-hosted language models via plugins](https://simonwillison.net/2023/Jul/12/llm/)
- [Accessing Llama 2 from the command-line with the llm-replicate plugin](https://simonwillison.net/2023/Jul/18/accessing-llama-2/)
- [Run Llama 2 on your own Mac using LLM and Homebrew](https://simonwillison.net/2023/Aug/1/llama-2-mac/)
- [Catching up on the weird world of LLMs](https://simonwillison.net/2023/Aug/3/weird-world-of-llms/)
- [LLM now provides tools for working with embeddings](https://simonwillison.net/2023/Sep/4/llm-embeddings/)
- [Build an image search engine with llm-clip, chat with models with llm chat](https://simonwillison.net/2023/Sep/12/llm-clip-and-chat/)
- [Many options for running Mistral models in your terminal using LLM](https://simonwillison.net/2023/Dec/18/mistral/)
## Installation

Install this tool using `pip`:
```bash
pip install llm
```
Or using [Homebrew](https://brew.sh/):
```bash
brew install llm
```
[Detailed installation instructions](https://llm.datasette.io/en/stable/setup.html).
## Getting started

If you have an [OpenAI API key](https://platform.openai.com/api-keys) you can get started using the OpenAI models right away.

As an alternative to OpenAI, you can [install plugins](https://llm.datasette.io/en/stable/plugins/installing-plugins.html) to access models from other providers, including models that can be installed and run on your own device.
Save your OpenAI API key like this:
```bash
llm keys set openai
```
This will prompt you for your key like so:
```
Enter key: <paste here>
```
Now that you've saved a key you can run a prompt like this:
```bash
llm "Five cute names for a pet penguin"
```
```
1. Waddles
2. Pebbles
3. Bubbles
4. Flappy
5. Chilly
```
Read the [usage instructions](https://llm.datasette.io/en/stable/usage.html) for more.
## Installing a model that runs on your own machine
[LLM plugins](https://llm.datasette.io/en/stable/plugins/index.html) can add support for alternative models, including models that run on your own machine.
To download and run Mistral 7B Instruct locally, you can install the [llm-gpt4all](https://github.com/simonw/llm-gpt4all) plugin:
```bash
llm install llm-gpt4all
```
Then run this command to see which models it makes available:
```bash
llm models
```
```
gpt4all: all-MiniLM-L6-v2-f16 - SBert, 43.76MB download, needs 1GB RAM
gpt4all: orca-mini-3b-gguf2-q4_0 - Mini Orca (Small), 1.84GB download, needs 4GB RAM
gpt4all: mistral-7b-instruct-v0 - Mistral Instruct, 3.83GB download, needs 8GB RAM
...
```
Each model file will be downloaded once the first time you use it. Try Mistral out like this:
```bash
llm -m mistral-7b-instruct-v0 'difference between a pelican and a walrus'
```
You can also start a chat session with the model using the `llm chat` command:
```bash
llm chat -m mistral-7b-instruct-v0
```
```
Chatting with mistral-7b-instruct-v0
Type 'exit' or 'quit' to exit
Type '!multi' to enter multiple lines, then '!end' to finish
>
```
## Using a system prompt

You can use the `-s/--system` option to set a system prompt, providing instructions for processing other input to the tool.

To describe how the code in a file works, try this:
```bash
cat mycode.py | llm -s "Explain this code"
```
## Help

For help, run:
```bash
llm --help
```
You can also use:
```bash
python -m llm --help
```