Move plugin directory into LLM repo, refs #173

Simon Willison 2023-08-20 22:17:13 -07:00
parent 5c657b35d9
commit 6155f12c9c
6 changed files with 59 additions and 4 deletions


@@ -57,11 +57,11 @@ maxdepth: 3
 setup
 usage
 other-models
-plugins/index
 aliases
 python-api
 templates
 logging
+plugins/index
 help
 contributing
 changelog


@@ -20,7 +20,7 @@ llm -m ggml-vicuna-7b-1 'What is the capital of France?'
 ```
 The model will be downloaded and cached the first time you use it.
-Check the **[llm-plugins](https://github.com/simonw/llm-plugins)** repository for the latest list of available plugins for other models.
+Check the {ref}`plugin directory <plugin-directory>` for the latest list of available plugins for other models.
 (openai-extra-models)=
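The `(openai-extra-models)=` anchor above opens the section on registering additional OpenAI-compatible models, which the plugin directory later points to as the no-plugin-needed option. Per the LLM documentation, such models are declared in an `extra-openai-models.yaml` file in the LLM configuration directory; a minimal sketch, where the model ID and URL are placeholder values:

```yaml
# Hypothetical entry: model_id is the name you would pass to `llm -m`,
# api_base points at any OpenAI-compatible endpoint.
- model_id: my-local-model
  model_name: my-local-model
  api_base: "http://localhost:8000/v1"
```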

docs/plugins/directory.md (new file, 27 additions)

@@ -0,0 +1,27 @@
+(plugin-directory)=
+# Plugin directory
+The following plugins are available for LLM. Here's {ref}`how to install them <installing-plugins>`.
+## Local models
+These plugins all help you run LLMs directly on your own computer:
+- **[llm-mlc](https://github.com/simonw/llm-mlc)** can run local models released by the [MLC project](https://mlc.ai/mlc-llm/), including models that can take advantage of the GPU on Apple Silicon M1/M2 devices.
+- **[llm-llama-cpp](https://github.com/simonw/llm-llama-cpp)** uses [llama.cpp](https://github.com/ggerganov/llama.cpp) to run models published in the GGML format.
+- **[llm-gpt4all](https://github.com/simonw/llm-gpt4all)** adds support for various models released by the [GPT4All](https://gpt4all.io/) project that are optimized to run locally on your own machine. These models include versions of Vicuna, Orca, Falcon and MPT - here's [a full list of models](https://observablehq.com/@simonw/gpt4all-models).
+- **[llm-mpt30b](https://github.com/simonw/llm-mpt30b)** adds support for the [MPT-30B](https://huggingface.co/mosaicml/mpt-30b) local model.
+## Remote APIs
+These plugins can be used to interact with remotely hosted models via their API:
+- **[llm-palm](https://github.com/simonw/llm-palm)** adds support for Google's [PaLM 2 model](https://developers.generativeai.google/).
+- **[llm-replicate](https://github.com/simonw/llm-replicate)** adds support for remote models hosted on [Replicate](https://replicate.com/), including Llama 2 from Meta AI.
+- **[llm-claude](https://github.com/tomviner/llm-claude)** by Tom Viner adds support for Claude and Claude Instant by Anthropic.
+If an API model host provides an OpenAI-compatible API you can also [configure LLM to talk to it](https://llm.datasette.io/en/stable/other-models.html#openai-compatible-models) without needing an extra plugin.
+## Just for fun
+- **[llm-markov](https://github.com/simonw/llm-markov)** adds a simple model that generates output using a [Markov chain](https://en.wikipedia.org/wiki/Markov_chain). This example is used in the tutorial [Writing a plugin to support a new model](https://llm.datasette.io/en/latest/plugins/tutorial-model-plugin.html).
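The llm-markov plugin listed under "Just for fun" generates output with a Markov chain: a table mapping each word to the words that follow it, walked at random. The underlying technique fits in a few lines; this is a standalone sketch of the idea, not the plugin's actual code:

```python
import random


def train(text):
    """Build a Markov transition table: each word maps to the list of
    words that immediately follow it in the training text."""
    words = text.split()
    transitions = {}
    for current, following in zip(words, words[1:]):
        transitions.setdefault(current, []).append(following)
    return transitions


def generate(transitions, first_word, length=10, seed=None):
    """Random-walk the table, starting from first_word, until we have
    `length` words or reach a word with no known successors."""
    rng = random.Random(seed)
    word = first_word
    output = [word]
    for _ in range(length - 1):
        choices = transitions.get(word)
        if not choices:
            break
        word = rng.choice(choices)
        output.append(word)
    return " ".join(output)
```

Because successors are stored with repetition, frequent word pairs in the training text are proportionally more likely to be emitted, which is all a first-order Markov text generator needs.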


@@ -5,7 +5,7 @@ LLM plugins can enhance LLM by making alternative Large Language Models availabl
 Plugins can also add new commands to the `llm` CLI tool.
-The [llm-plugins](https://github.com/simonw/llm-plugins) repository describes available plugins that you can install and use.
+The {ref}`plugin directory <plugin-directory>` lists available plugins that you can install and use.
 {ref}`tutorial-model-plugin` describes how to build a new plugin in detail.
@@ -14,6 +14,7 @@ The [llm-plugins](https://github.com/simonw/llm-plugins) repository describes av
 maxdepth: 3
 ---
 installing-plugins
+directory
 plugin-hooks
 tutorial-model-plugin
 plugin-utilities


@@ -3,7 +3,7 @@
 Plugins must be installed in the same virtual environment as LLM itself.
-You can find names of plugins to install in the [llm-plugins](https://github.com/simonw/llm-plugins) repository.
+You can find names of plugins to install in the {ref}`plugin directory <plugin-directory>`
 Use the `llm install` command (a thin wrapper around `pip install`) to install plugins in the correct environment:
 ```bash
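The "thin wrapper around `pip install`" description in the hunk above is the key design point: by invoking pip through the running interpreter rather than whatever `pip` happens to be on `PATH`, the wrapper guarantees plugins land in the same virtual environment as LLM itself. A minimal sketch of that pattern, with illustrative function names rather than LLM's actual internals:

```python
import subprocess
import sys


def pip_install_command(*packages):
    """Build a pip invocation bound to the currently running interpreter,
    so the install targets this interpreter's (virtual) environment."""
    return [sys.executable, "-m", "pip", "install", *packages]


def install_plugins(*packages):
    """Run the install; check=True raises CalledProcessError if pip fails."""
    return subprocess.run(pip_install_command(*packages), check=True)
```

`sys.executable -m pip` is the standard way to avoid installing into the wrong environment when several Pythons are present on a machine.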

docs/plugins/plugins.md (new file, 27 additions)

@@ -0,0 +1,27 @@
+(plugin-directory)=
+# Plugin directory
+The following plugins are available for LLM. Here's {ref}`how to install them <installing-plugins>`.
+## Local models
+These plugins all help you run LLMs directly on your own computer:
+- **[llm-mlc](https://github.com/simonw/llm-mlc)** can run local models released by the [MLC project](https://mlc.ai/mlc-llm/), including models that can take advantage of the GPU on Apple Silicon M1/M2 devices.
+- **[llm-llama-cpp](https://github.com/simonw/llm-llama-cpp)** uses [llama.cpp](https://github.com/ggerganov/llama.cpp) to run models published in the GGML format.
+- **[llm-gpt4all](https://github.com/simonw/llm-gpt4all)** adds support for various models released by the [GPT4All](https://gpt4all.io/) project that are optimized to run locally on your own machine. These models include versions of Vicuna, Orca, Falcon and MPT - here's [a full list of models](https://observablehq.com/@simonw/gpt4all-models).
+- **[llm-mpt30b](https://github.com/simonw/llm-mpt30b)** adds support for the [MPT-30B](https://huggingface.co/mosaicml/mpt-30b) local model.
+## Remote APIs
+These plugins can be used to interact with remotely hosted models via their API:
+- **[llm-palm](https://github.com/simonw/llm-palm)** adds support for Google's [PaLM 2 model](https://developers.generativeai.google/).
+- **[llm-replicate](https://github.com/simonw/llm-replicate)** adds support for remote models hosted on [Replicate](https://replicate.com/), including Llama 2 from Meta AI.
+- **[llm-claude](https://github.com/tomviner/llm-claude)** by Tom Viner adds support for Claude and Claude Instant by Anthropic.
+If an API model host provides an OpenAI-compatible API you can also [configure LLM to talk to it](https://llm.datasette.io/en/stable/other-models.html#openai-compatible-models) without needing an extra plugin.
+## Just for fun
+- **[llm-markov](https://github.com/simonw/llm-markov)** adds a simple model that generates output using a [Markov chain](https://en.wikipedia.org/wiki/Markov_chain). This example is used in the tutorial [Writing a plugin to support a new model](https://llm.datasette.io/en/latest/plugins/tutorial-model-plugin.html).