llm/tests
Simon Willison ba75c674cb
llm.get_async_model(), llm.AsyncModel base class and OpenAI async models (#613)
- https://github.com/simonw/llm/issues/507#issuecomment-2458639308

* register_model is now async-aware

Refs https://github.com/simonw/llm/issues/507#issuecomment-2458658134

* Refactor Chat and AsyncChat to use _Shared base class

Refs https://github.com/simonw/llm/issues/507#issuecomment-2458692338

* Fixed function name

* Fix for infinite loop

* Applied Black

* Ran cog

* Applied Black

* Add Response.from_row() classmethod back again

It does not matter that this is a blocking call, since it is a classmethod

* Made mypy happy with llm/models.py

* mypy fixes for openai_models.py

I am unhappy with this, had to duplicate some code.

* First test for AsyncModel

* Still have not quite got this working

* Fix for not loading plugins during tests, refs #626

* audio/wav not audio/wave, refs #603

* Black and mypy and ruff all happy

* Refactor to avoid generics

* Removed obsolete response() method

* Support text = await async_mock_model.prompt("hello")

* Initial docs for llm.get_async_model() and await model.prompt()

Refs #507

* Initial async model plugin creation docs

* duration_ms ANY to pass test

* llm models --async option

Refs https://github.com/simonw/llm/pull/613#issuecomment-2474724406

* Removed obsolete TypeVars

* Expanded register_models() docs for async

* await model.prompt() now returns AsyncResponse

Refs https://github.com/simonw/llm/pull/613#issuecomment-2475157822

---------

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2024-11-13 17:51:00 -08:00
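The commit bullets above describe the shape of the new async API: `await model.prompt(...)` returns an `AsyncResponse`, and `text = await async_mock_model.prompt("hello")` is supported directly. A minimal self-contained sketch of that awaitable-response pattern, using only asyncio and hypothetical class bodies that mirror the commit's description (not the library's actual implementation):

```python
import asyncio


class AsyncResponse:
    """Awaitable response; awaiting it yields the text.

    Hypothetical sketch of "await model.prompt() now returns
    AsyncResponse" from the commit message above.
    """

    def __init__(self, prompt: str):
        self.prompt = prompt
        self._text = None

    async def text(self) -> str:
        if self._text is None:
            await asyncio.sleep(0)  # stand-in for a real network call
            self._text = f"echo: {self.prompt}"
        return self._text

    def __await__(self):
        # Makes `await model.prompt("hello")` work directly.
        return self.text().__await__()


class AsyncModel:
    """Hypothetical async model base class."""

    def prompt(self, prompt: str) -> AsyncResponse:
        # Returns immediately; work happens when the response is awaited.
        return AsyncResponse(prompt)


async def main() -> str:
    model = AsyncModel()
    # Both forms from the commit bullets:
    text = await model.prompt("hello")           # await the AsyncResponse
    same = await model.prompt("hello").text()    # or its .text() coroutine
    assert text == same
    return text


print(asyncio.run(main()))  # echo: hello
```

Implementing `__await__` to delegate to `.text()` is what lets the same object serve both the shorthand (`await response`) and the explicit (`await response.text()`) call sites exercised in test_async.py.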
conftest.py llm.get_async_model(), llm.AsyncModel base class and OpenAI async models (#613) 2024-11-13 17:51:00 -08:00
test-llm-load-plugins.sh Test using test-llm-load-plugins.sh 2024-01-25 17:44:34 -08:00
test_aliases.py Make tests robust against extra plugins, closes #258 2023-09-10 11:21:04 -07:00
test_async.py llm.get_async_model(), llm.AsyncModel base class and OpenAI async models (#613) 2024-11-13 17:51:00 -08:00
test_attachments.py Special case treat audio/wave as audio/wav, closes #603 2024-11-07 17:13:54 -08:00
test_chat.py llm.get_async_model(), llm.AsyncModel base class and OpenAI async models (#613) 2024-11-13 17:51:00 -08:00
test_cli_openai_models.py audio/wav not audio/wave, refs #603 2024-11-12 21:43:07 -08:00
test_embed.py batch_size= argument to embed_multi(), refs #273 2023-09-13 16:24:04 -07:00
test_embed_cli.py Windows readline fix, plus run CI against macOS and Windows 2024-01-26 16:24:58 -08:00
test_encode_decode.py NumPy decoding docs, plus extra tests for llm.encode/decode 2023-09-14 14:01:47 -07:00
test_keys.py llm keys get command, refs #623 2024-11-11 09:47:13 -08:00
test_llm.py llm.get_async_model(), llm.AsyncModel base class and OpenAI async models (#613) 2024-11-13 17:51:00 -08:00
test_migrate.py Binary embeddings (#254) 2023-09-11 18:58:44 -07:00
test_plugins.py Moved iter_prompt from Response to Model, moved a lot of other stuff 2023-07-10 07:45:11 -07:00
test_templates.py Upgrade to pytest-httpx>=0.33.0 2024-10-28 15:41:34 -07:00