Mirror of https://github.com/Hopiu/llm.git, synced 2026-03-16 20:50:25 +00:00
llm-claude-3 is now called llm-anthropic
Refs https://github.com/simonw/llm-claude-3/issues/31 !stable-docs
parent deb8bc3b4f
commit 21df241443
3 changed files with 4 additions and 5 deletions
@@ -21,8 +21,7 @@ These plugins can be used to interact with remotely hosted models via their API:
 - **[llm-mistral](https://github.com/simonw/llm-mistral)** adds support for [Mistral AI](https://mistral.ai/)'s language and embedding models.
 - **[llm-gemini](https://github.com/simonw/llm-gemini)** adds support for Google's [Gemini](https://ai.google.dev/docs) models.
 - **[llm-claude](https://github.com/tomviner/llm-claude)** by Tom Viner adds support for Claude 2.1 and Claude Instant 2.1 by Anthropic.
-- **[llm-claude-3](https://github.com/simonw/llm-claude-3)** supports Anthropic's [Claude 3 family](https://www.anthropic.com/news/claude-3-family) of models.
+- **[llm-anthropic](https://github.com/simonw/llm-anthropic)** supports Anthropic's [Claude 3 family](https://www.anthropic.com/news/claude-3-family), [3.5 Sonnet](https://www.anthropic.com/news/claude-3-5-sonnet) and beyond.
 - **[llm-command-r](https://github.com/simonw/llm-command-r)** supports Cohere's Command R and [Command R Plus](https://txt.cohere.com/command-r-plus-microsoft-azure/) API models.
 - **[llm-reka](https://github.com/simonw/llm-reka)** supports the [Reka](https://www.reka.ai/) family of models via their API.
 - **[llm-perplexity](https://github.com/hex/llm-perplexity)** by Alexandru Geana supports the [Perplexity Labs](https://docs.perplexity.ai/) API models, including `llama-3-sonar-large-32k-online` which can search for things online and `llama-3-70b-instruct`.
@@ -94,10 +94,10 @@ print(model.prompt("Names for otters", temperature=0.2))
 ### Models from plugins
 
-Any models you have installed as plugins will also be available through this mechanism, for example to use Anthropic's Claude 3.5 Sonnet model with [llm-claude-3](https://github.com/simonw/llm-claude-3):
+Any models you have installed as plugins will also be available through this mechanism, for example to use Anthropic's Claude 3.5 Sonnet model with [llm-anthropic](https://github.com/simonw/llm-anthropic):
 
 ```bash
-pip install llm-claude-3
+pip install llm-anthropic
 ```
 Then in your Python code:
 ```python
@@ -56,7 +56,7 @@ This will install and run LLM using a temporary virtual environment.
 You can use the `--with` option to add extra plugins. To use Anthropic's models, for example:
 ```bash
 export ANTHROPIC_API_KEY='...'
-uvx --with llm-claude-3 llm -m claude-3.5-haiku 'fun facts about skunks'
+uvx --with llm-anthropic llm -m claude-3.5-haiku 'fun facts about skunks'
 ```
 All of the usual LLM commands will work with `uvx llm`. Here's how to set your OpenAI key without needing an environment variable for example:
 ```bash
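The docs changed above describe using a plugin-provided model from Python via LLM's documented API (`llm.get_model()` plus `model.prompt()`). A minimal sketch of that pattern, written defensively so it explains what is missing rather than crashing — the model ID `claude-3.5-sonnet` is an assumption based on the llm-anthropic plugin's naming:

```python
def otter_names() -> str:
    """Ask a plugin-provided model for otter names, or say what's missing."""
    try:
        import llm  # the LLM library itself
    except ImportError:
        return "install llm first: pip install llm"
    try:
        # Plugin models resolve by name; ID assumed from llm-anthropic
        model = llm.get_model("claude-3.5-sonnet")
    except llm.UnknownModelError:
        return "install the plugin: llm install llm-anthropic"
    # .text() forces execution of the prompt and returns the response string
    return model.prompt("Names for otters", temperature=0.2).text()


print(otter_names())
```

Because the old `llm-claude-3` package no longer registers these models, code written this way surfaces the rename as a clear "install the plugin" message instead of a stack trace.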