mirror of https://github.com/Hopiu/llm.git
synced 2026-03-17 05:00:25 +00:00
- https://github.com/simonw/llm/issues/507#issuecomment-2458639308
* register_model is now async aware. Refs https://github.com/simonw/llm/issues/507#issuecomment-2458658134
* Refactor Chat and AsyncChat to use _Shared base class. Refs https://github.com/simonw/llm/issues/507#issuecomment-2458692338
* Fixed function name
* Fix for infinite loop
* Applied Black
* Ran cog
* Applied Black
* Add Response.from_row() classmethod back again. It does not matter that this is a blocking call, since it is a classmethod
* Made mypy happy with llm/models.py
* mypy fixes for openai_models.py. I am unhappy with this, had to duplicate some code.
* First test for AsyncModel
* Still have not quite got this working
* Fix for not loading plugins during tests, refs #626
* audio/wav not audio/wave, refs #603
* Black and mypy and ruff all happy
* Refactor to avoid generics
* Removed obsolete response() method
* Support text = await async_mock_model.prompt("hello")
* Initial docs for llm.get_async_model() and await model.prompt(). Refs #507
* Initial async model plugin creation docs
* duration_ms ANY to pass test
* llm models --async option. Refs https://github.com/simonw/llm/pull/613#issuecomment-2474724406
* Removed obsolete TypeVars
* Expanded register_models() docs for async
* await model.prompt() now returns AsyncResponse. Refs https://github.com/simonw/llm/pull/613#issuecomment-2475157822
---------
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
17 lines
527 B
Python
import llm
import pytest


@pytest.mark.asyncio
async def test_async_model(async_mock_model):
    gathered = []
    async_mock_model.enqueue(["hello world"])
    async for chunk in async_mock_model.prompt("hello"):
        gathered.append(chunk)
    assert gathered == ["hello world"]
    # Not as an iterator
    async_mock_model.enqueue(["hello world"])
    response = await async_mock_model.prompt("hello")
    text = await response.text()
    assert text == "hello world"
    assert isinstance(response, llm.AsyncResponse)
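The pattern this test exercises — the same `prompt()` result can be streamed with `async for` or resolved with `await` — works when the response object implements both the async-iterator and awaitable protocols. A minimal self-contained sketch of that dual protocol (`MockAsyncResponse` is a hypothetical illustration, not llm's actual `AsyncResponse` implementation):

```python
import asyncio


class MockAsyncResponse:
    """Hypothetical stand-in: an object that is awaitable *and* async-iterable."""

    def __init__(self, chunks):
        self._chunks = chunks

    def __aiter__(self):
        # `async for chunk in response` streams the chunks.
        return self._stream()

    async def _stream(self):
        for chunk in self._chunks:
            yield chunk

    def __await__(self):
        # `await response` resolves to the response object itself.
        return self._resolve().__await__()

    async def _resolve(self):
        return self

    async def text(self):
        # The full text is just the streamed chunks joined together.
        return "".join([chunk async for chunk in self])


async def main():
    # Streaming consumption
    gathered = [chunk async for chunk in MockAsyncResponse(["hello ", "world"])]
    assert gathered == ["hello ", "world"]
    # Not as an iterator: await the response, then read the full text
    response = await MockAsyncResponse(["hello ", "world"])
    assert await response.text() == "hello world"


asyncio.run(main())
```

Defining both `__aiter__` and `__await__` on one object is what lets the two call styles in the test coexist without separate streaming and non-streaming response classes.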