| Name | Last commit message | Last commit date |
| --- | --- | --- |
| conftest.py | llm prompt --schema X option and model.prompt(..., schema=) parameter (#777) | 2025-02-26 16:58:28 -08:00 |
| test-llm-load-plugins.sh | Test using test-llm-load-plugins.sh | 2024-01-25 17:44:34 -08:00 |
| test_aliases.py | llm aliases set -q option, refs #749 | 2025-02-13 15:49:47 -08:00 |
| test_async.py | Test for async model conversations Python API, refs #742 | 2025-02-16 19:48:25 -08:00 |
| test_attachments.py | Special case treat audio/wave as audio/wav, closes #603 | 2024-11-07 17:13:54 -08:00 |
| test_chat.py | llm prompt --schema X option and model.prompt(..., schema=) parameter (#777) | 2025-02-26 16:58:28 -08:00 |
| test_cli_openai_models.py | Fix for UTC warnings | 2024-12-12 14:57:23 -08:00 |
| test_embed.py | batch_size= argument to embed_multi(), refs #273 | 2023-09-13 16:24:04 -07:00 |
| test_embed_cli.py | llm prompt --schema X option and model.prompt(..., schema=) parameter (#777) | 2025-02-26 16:58:28 -08:00 |
| test_encode_decode.py | NumPy decoding docs, plus extra tests for llm.encode/decode | 2023-09-14 14:01:47 -07:00 |
| test_keys.py | llm keys get command, refs #623 | 2024-11-11 09:47:13 -08:00 |
| test_llm.py | llm logs --schema, --data, --data-array and --data-key options (#785) | 2025-02-26 21:51:08 -08:00 |
| test_llm_logs.py | llm logs --schema, --data, --data-array and --data-key options (#785) | 2025-02-26 21:51:08 -08:00 |
| test_migrate.py | llm prompt --schema X option and model.prompt(..., schema=) parameter (#777) | 2025-02-26 16:58:28 -08:00 |
| test_plugins.py | Moved iter_prompt from Response to Model, moved a lot of other stuff | 2023-07-10 07:45:11 -07:00 |
| test_templates.py | Schema template --save --schema support | 2025-02-27 07:19:15 -08:00 |
| test_utils.py | --xl/--extract-last flag for prompt and log list commands (#718) | 2025-01-24 10:52:46 -08:00 |