(aliases)=
# Model aliases
LLM supports model aliases, which allow you to refer to a model by a short name instead of its full ID.
## Listing aliases
To list current aliases, run this:

```bash
llm aliases
```
Example output:

```
3.5        : gpt-3.5-turbo
chatgpt    : gpt-3.5-turbo
chatgpt-16k: gpt-3.5-turbo-16k
3.5-16k    : gpt-3.5-turbo-16k
4          : gpt-4
gpt4       : gpt-4
4-32k      : gpt-4-32k
```
Add `--json` to get that list back as JSON:

```bash
llm aliases list --json
```
Example output:

```json
{
    "3.5": "gpt-3.5-turbo",
    "chatgpt": "gpt-3.5-turbo",
    "chatgpt-16k": "gpt-3.5-turbo-16k",
    "3.5-16k": "gpt-3.5-turbo-16k",
    "4": "gpt-4",
    "gpt4": "gpt-4",
    "4-32k": "gpt-4-32k"
}
```
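The JSON form is convenient for scripting. As a sketch (the mapping below is copied from the example output above; your installed aliases may differ), you could load that output in Python and invert it to list every alias that points at a given model:

```python
import json

# Example output of `llm aliases list --json` (your aliases may differ)
raw = """
{
    "3.5": "gpt-3.5-turbo",
    "chatgpt": "gpt-3.5-turbo",
    "chatgpt-16k": "gpt-3.5-turbo-16k",
    "3.5-16k": "gpt-3.5-turbo-16k",
    "4": "gpt-4",
    "gpt4": "gpt-4",
    "4-32k": "gpt-4-32k"
}
"""

aliases = json.loads(raw)

# Invert the alias -> model mapping into model -> [aliases]
by_model: dict[str, list[str]] = {}
for alias, model_id in aliases.items():
    by_model.setdefault(model_id, []).append(alias)

print(by_model["gpt-4"])  # ['4', 'gpt4']
```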
## Adding a new alias
The `llm aliases set <alias> <model-id>` command can be used to add a new alias:

```bash
llm aliases set turbo gpt-3.5-turbo-16k
```
Now you can run the `gpt-3.5-turbo-16k` model using the `turbo` alias like this:

```bash
llm -m turbo 'An epic Greek-style saga about a cheesecake that builds a SQL database from scratch'
```
## Removing an alias
The `llm aliases remove <alias>` command will remove the specified alias:

```bash
llm aliases remove turbo
```
This can also be used to remove aliases that were registered by plugins, as opposed to those added with the `llm aliases set` command.
## Viewing the aliases file
Aliases are stored in an `aliases.json` file in the LLM configuration directory.

To see the path to that file, run this:

```bash
llm aliases path
```

To view the content of that file, run this:

```bash
cat "$(llm aliases path)"
```
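Because the file is plain JSON, it can in principle be edited programmatically, though `llm aliases set` is the supported route. As a hedged sketch, the hypothetical `set_alias` helper below adds one entry to an `aliases.json` file; it is demonstrated against a throwaway temporary directory rather than the real configuration directory (which you would locate with `llm aliases path`):

```python
import json
from pathlib import Path
from tempfile import TemporaryDirectory

def set_alias(aliases_path: Path, alias: str, model_id: str) -> None:
    """Add or update one alias in an aliases.json file."""
    if aliases_path.exists():
        aliases = json.loads(aliases_path.read_text())
    else:
        aliases = {}
    aliases[alias] = model_id
    aliases_path.write_text(json.dumps(aliases, indent=4))

# Demonstrate against a temporary directory, not the real config directory
with TemporaryDirectory() as config_dir:
    path = Path(config_dir) / "aliases.json"
    set_alias(path, "turbo", "gpt-3.5-turbo-16k")
    print(json.loads(path.read_text()))  # {'turbo': 'gpt-3.5-turbo-16k'}
```

Note that edits made this way take effect the next time `llm` reads the file, since the CLI loads `aliases.json` on each invocation.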