(plugin-directory)=

# Plugin directory

The following plugins are available for LLM. Here's {ref}`how to install them <installing-plugins>`.

## Local models

These plugins all help you run LLMs directly on your own computer:

- **llm-mlc** can run local models released by the MLC project, including models that can take advantage of the GPU on Apple Silicon M1/M2 devices.
- **llm-llama-cpp** uses llama.cpp to run models published in the GGML format.
- **llm-gpt4all** adds support for various models released by the GPT4All project that are optimized to run locally on your own machine. These models include versions of Vicuna, Orca, Falcon and MPT - here's a full list of models.
- **llm-mpt30b** adds support for the MPT-30B local model.

## Remote APIs

These plugins can be used to interact with remotely hosted models via their API:

If an API model host provides an OpenAI-compatible API, you can also configure LLM to talk to it without needing an extra plugin.
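As a sketch of what that configuration looks like: LLM reads extra OpenAI-compatible model definitions from an `extra-openai-models.yaml` file in its configuration directory (the path reported by `llm logs path` is in the same directory). The model name and URL below are placeholder values, not a real endpoint:

```yaml
# extra-openai-models.yaml - registers a model served from an
# OpenAI-compatible endpoint so it can be used as `llm -m my-hosted-model`.
- model_id: my-hosted-model        # name you will pass to `llm -m`
  model_name: provider-model-name  # model name the remote API expects
  api_base: "https://example.com/v1"  # base URL of the compatible endpoint
```

Once saved, the model shows up in `llm models` alongside the built-in ones.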

## Just for fun