Mirror of https://github.com/Hopiu/llm.git, synced 2026-04-23 14:34:46 +00:00
* New KeyModel and AsyncKeyModel classes for models that take keys - closes #744
* llm prompt --key now uses the new mechanism, including for async
* Use the new key mechanism in the llm chat command
* Python API tests for llm.KeyModel and llm.AsyncKeyModel
* Python API docs for prompt(... key="")
* Mention that await model.prompt() takes other parameters, reorganize sections
* Better title for the model tutorial
* Docs on writing model plugins that take a key
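The commit above describes model classes that accept an API key, resolved either from an explicit `key=` argument or from stored/environment configuration. The sketch below is a simplified, hypothetical illustration of that pattern; it is not the actual llm implementation, and the class attributes (`needs_key`, `key_env_var`) and resolution order shown here are assumptions for illustration only.

```python
import os

# Hypothetical sketch of a model class that takes an API key,
# in the spirit of the KeyModel pattern from the commit above.
# Assumed resolution order: explicit key argument, then an env var.

class KeyModel:
    needs_key = "example"            # logical key name (assumption)
    key_env_var = "EXAMPLE_API_KEY"  # fallback environment variable

    def resolve_key(self, key=None):
        """Return the key to use, or raise if none is available."""
        if key:
            return key
        env_value = os.environ.get(self.key_env_var)
        if env_value:
            return env_value
        raise RuntimeError(f"No key found for {self.needs_key}")

    def prompt(self, text, key=None):
        # A real model would call its provider's API here; we just echo
        # enough of the resolved key to show which one was picked up.
        resolved = self.resolve_key(key)
        return f"[{self.needs_key}:{resolved[:4]}...] {text}"

model = KeyModel()
print(model.prompt("Hello", key="sk-demo-1234"))  # prints "[example:sk-d...] Hello"
```

An async variant (mirroring AsyncKeyModel) would follow the same shape with an `async def prompt(...)` that awaits the provider call; the key-resolution logic itself can stay synchronous.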
| Name |
|---|
| conftest.py |
| test-llm-load-plugins.sh |
| test_aliases.py |
| test_async.py |
| test_attachments.py |
| test_chat.py |
| test_cli_openai_models.py |
| test_embed.py |
| test_embed_cli.py |
| test_encode_decode.py |
| test_keys.py |
| test_llm.py |
| test_migrate.py |
| test_plugins.py |
| test_templates.py |
| test_utils.py |