From 21df2414436dcf298603d01a2bb9e79af69e69c2 Mon Sep 17 00:00:00 2001
From: Simon Willison
Date: Sat, 1 Feb 2025 22:08:19 -0800
Subject: [PATCH] llm-claude-3 is now called llm-anthropic

Refs https://github.com/simonw/llm-claude-3/issues/31

!stable-docs
---
 docs/plugins/directory.md | 3 +--
 docs/python-api.md        | 4 ++--
 docs/setup.md             | 2 +-
 3 files changed, 4 insertions(+), 5 deletions(-)

diff --git a/docs/plugins/directory.md b/docs/plugins/directory.md
index a80b1b0..e1e10a6 100644
--- a/docs/plugins/directory.md
+++ b/docs/plugins/directory.md
@@ -21,8 +21,7 @@ These plugins can be used to interact with remotely hosted models via their API:
 
 - **[llm-mistral](https://github.com/simonw/llm-mistral)** adds support for [Mistral AI](https://mistral.ai/)'s language and embedding models.
 - **[llm-gemini](https://github.com/simonw/llm-gemini)** adds support for Google's [Gemini](https://ai.google.dev/docs) models.
-- **[llm-claude](https://github.com/tomviner/llm-claude)** by Tom Viner adds support for Claude 2.1 and Claude Instant 2.1 by Anthropic.
-- **[llm-claude-3](https://github.com/simonw/llm-claude-3)** supports Anthropic's [Claude 3 family](https://www.anthropic.com/news/claude-3-family) of models.
+- **[llm-anthropic](https://github.com/simonw/llm-anthropic)** supports Anthropic's [Claude 3 family](https://www.anthropic.com/news/claude-3-family), [3.5 Sonnet](https://www.anthropic.com/news/claude-3-5-sonnet) and beyond.
 - **[llm-command-r](https://github.com/simonw/llm-command-r)** supports Cohere's Command R and [Command R Plus](https://txt.cohere.com/command-r-plus-microsoft-azure/) API models.
 - **[llm-reka](https://github.com/simonw/llm-reka)** supports the [Reka](https://www.reka.ai/) family of models via their API.
 - **[llm-perplexity](https://github.com/hex/llm-perplexity)** by Alexandru Geana supports the [Perplexity Labs](https://docs.perplexity.ai/) API models, including `llama-3-sonar-large-32k-online` which can search for things online and `llama-3-70b-instruct`.
diff --git a/docs/python-api.md b/docs/python-api.md
index 49aff02..6a7bcfa 100644
--- a/docs/python-api.md
+++ b/docs/python-api.md
@@ -94,10 +94,10 @@ print(model.prompt("Names for otters", temperature=0.2))
 
 ### Models from plugins
 
-Any models you have installed as plugins will also be available through this mechanism, for example to use Anthropic's Claude 3.5 Sonnet model with [llm-claude-3](https://github.com/simonw/llm-claude-3):
+Any models you have installed as plugins will also be available through this mechanism, for example to use Anthropic's Claude 3.5 Sonnet model with [llm-anthropic](https://github.com/simonw/llm-anthropic):
 
 ```bash
-pip install llm-claude-3
+pip install llm-anthropic
 ```
 Then in your Python code:
 ```python
diff --git a/docs/setup.md b/docs/setup.md
index 9218c50..72801ba 100644
--- a/docs/setup.md
+++ b/docs/setup.md
@@ -56,7 +56,7 @@ This will install and run LLM using a temporary virtual environment.
 You can use the `--with` option to add extra plugins. To use Anthropic's models, for example:
 ```bash
 export ANTHROPIC_API_KEY='...'
-uvx --with llm-claude-3 llm -m claude-3.5-haiku 'fun facts about skunks'
+uvx --with llm-anthropic llm -m claude-3.5-haiku 'fun facts about skunks'
 ```
 All of the usual LLM commands will work with `uvx llm`. Here's how to set your OpenAI key without needing an environment variable for example:
 ```bash