llm/tests (directory last updated 2024-11-20 20:09:06 -08:00)
| File | Last commit | Date |
|------|-------------|------|
| conftest.py | response.usage() and await aresponse.usage(), closes #644 | 2024-11-19 21:25:37 -08:00 |
| test-llm-load-plugins.sh | Test using test-llm-load-plugins.sh | 2024-01-25 17:44:34 -08:00 |
| test_aliases.py | Make tests robust against extra plugins, closes #258 | 2023-09-10 11:21:04 -07:00 |
| test_async.py | response.usage() and await aresponse.usage(), closes #644 | 2024-11-19 21:25:37 -08:00 |
| test_attachments.py | Special case treat audio/wave as audio/wav, closes #603 | 2024-11-07 17:13:54 -08:00 |
| test_chat.py | response.usage() and await aresponse.usage(), closes #644 | 2024-11-19 21:25:37 -08:00 |
| test_cli_openai_models.py | Log input tokens, output tokens and token details (#642) | 2024-11-19 20:21:59 -08:00 |
| test_embed.py | batch_size= argument to embed_multi(), refs #273 | 2023-09-13 16:24:04 -07:00 |
| test_embed_cli.py | Windows readline fix, plus run CI against macOS and Windows | 2024-01-26 16:24:58 -08:00 |
| test_encode_decode.py | NumPy decoding docs, plus extra tests for llm.encode/decode | 2023-09-14 14:01:47 -07:00 |
| test_keys.py | llm keys get command, refs #623 | 2024-11-11 09:47:13 -08:00 |
| test_llm.py | llm.get_models() and llm.get_async_models(), closes #640 | 2024-11-20 20:09:06 -08:00 |
| test_migrate.py | Log input tokens, output tokens and token details (#642) | 2024-11-19 20:21:59 -08:00 |
| test_plugins.py | Moved iter_prompt from Response to Model, moved a lot of other stuff | 2023-07-10 07:45:11 -07:00 |
| test_templates.py | Upgrade to pytest-httpx>=0.33.0 | 2024-10-28 15:41:34 -07:00 |
| test_utils.py | Log input tokens, output tokens and token details (#642) | 2024-11-19 20:21:59 -08:00 |
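The commit messages above reference the library APIs these tests exercise: `llm.get_models()` / `llm.get_async_models()` (#640, tested in test_llm.py) and `response.usage()` (#644, tested in conftest.py, test_async.py, and test_chat.py). A minimal sketch of how they fit together; the model name and prompt text here are illustrative assumptions, not taken from the listing:

```python
import llm

# llm.get_models() returns the registered model instances (closes #640).
for model in llm.get_models():
    print(model.model_id)

# Illustrative model name; any configured model works here.
model = llm.get_model("gpt-4o-mini")
response = model.prompt("Two names for a pet pelican")
print(response.text())

# response.usage() reports input/output token counts (closes #644);
# the async equivalent is `await aresponse.usage()`.
print(response.usage())
```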