# LLM

[PyPI](https://pypi.org/project/llm/) | [Documentation](https://llm.datasette.io/) | [Changelog](https://llm.datasette.io/en/stable/changelog.html) | [Tests](https://github.com/simonw/llm/actions?query=workflow%3ATest) | [License](https://github.com/simonw/llm/blob/main/LICENSE) | [Discord](https://datasette.io/discord-llm) | [Homebrew](https://formulae.brew.sh/formula/llm)

A CLI utility and Python library for interacting with Large Language Models, both via remote APIs and via models that can be installed and run on your own machine.

[Run prompts from the command-line](https://llm.datasette.io/en/stable/usage.html#executing-a-prompt), [store the results in SQLite](https://llm.datasette.io/en/stable/logging.html), [generate embeddings](https://llm.datasette.io/en/stable/embeddings/index.html) and more.

Full documentation: **[llm.datasette.io](https://llm.datasette.io/)**

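The SQLite logs mentioned above can be explored with any SQLite client; run `llm logs path` to find the real database file. The sketch below uses an in-memory stand-in instead — the `responses` table and its columns here are a simplified assumption for illustration, not llm's actual schema:

```python
import sqlite3

# Build a tiny in-memory stand-in for llm's log database.
# The real file's location is printed by `llm logs path`; this
# simplified `responses` schema is an assumption, not the real one.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE responses (model TEXT, prompt TEXT, response TEXT)")
conn.execute(
    "INSERT INTO responses VALUES (?, ?, ?)",
    ("gpt-3.5-turbo", "Five cute names for a pet penguin", "1. Waddles ..."),
)

# The kind of query you could run against the real log database:
for model, prompt in conn.execute("SELECT model, prompt FROM responses"):
    print(f"{model}: {prompt}")
```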
Background on this project:

- [llm, ttok and strip-tags—CLI tools for working with ChatGPT and other LLMs](https://simonwillison.net/2023/May/18/cli-tools-for-llms/)
- [The LLM CLI tool now supports self-hosted language models via plugins](https://simonwillison.net/2023/Jul/12/llm/)
- [Accessing Llama 2 from the command-line with the llm-replicate plugin](https://simonwillison.net/2023/Jul/18/accessing-llama-2/)
- [Run Llama 2 on your own Mac using LLM and Homebrew](https://simonwillison.net/2023/Aug/1/llama-2-mac/)
- [Catching up on the weird world of LLMs](https://simonwillison.net/2023/Aug/3/weird-world-of-llms/)
- [LLM now provides tools for working with embeddings](https://simonwillison.net/2023/Sep/4/llm-embeddings/)
- [Build an image search engine with llm-clip, chat with models with llm chat](https://simonwillison.net/2023/Sep/12/llm-clip-and-chat/)

## Installation

Install this tool using `pip`:
```bash
pip install llm
```
Or using [Homebrew](https://brew.sh/):
```bash
brew install llm
```
[Detailed installation instructions](https://llm.datasette.io/en/stable/setup.html).

## Getting started

If you have an [OpenAI API key](https://platform.openai.com/account/api-keys) you can get started using the OpenAI models right away.

As an alternative to OpenAI, you can [install plugins](https://llm.datasette.io/en/stable/plugins/installing-plugins.html) to access models by other providers, including models that can be installed and run on your own device.

Save your OpenAI API key like this:

```bash
llm keys set openai
```
This will prompt you for your key like so:
```
Enter key: <paste here>
```
Now that you've saved a key you can run a prompt like this:
```bash
llm "Five cute names for a pet penguin"
```
```
1. Waddles
2. Pebbles
3. Bubbles
4. Flappy
5. Chilly
```
Read the [usage instructions](https://llm.datasette.io/en/stable/usage.html) for more.

## Installing a model that runs on your own machine

[LLM plugins](https://llm.datasette.io/en/stable/plugins/index.html) can add support for alternative models, including models that run on your own machine.

To download and run Llama 2 13B locally, you can install the [llm-mlc](https://github.com/simonw/llm-mlc) plugin:
```bash
llm install llm-mlc
llm mlc pip install --pre --force-reinstall \
  mlc-ai-nightly \
  mlc-chat-nightly \
  -f https://mlc.ai/wheels
llm mlc setup
```
Then download the 15GB Llama 2 13B model like this:
```bash
llm mlc download-model Llama-2-13b-chat --alias llama2
```
And run a prompt through it:
```bash
llm -m llama2 'difference between a llama and an alpaca'
```
You can also start a chat session with the model using the `llm chat` command:
```bash
llm chat -m llama2
```
```
Chatting with mlc-chat-Llama-2-13b-chat-hf-q4f16_1
Type 'exit' or 'quit' to exit
Type '!multi' to enter multiple lines, then '!end' to finish
>
```

## Using a system prompt

You can use the `-s/--system` option to set a system prompt, providing instructions for processing other input to the tool.

To describe how the code in a file works, try this:

```bash
cat mycode.py | llm -s "Explain this code"
```

## Help

For help, run:
```bash
llm --help
```
You can also use:
```bash
python -m llm --help
```