# Contributing
To contribute to this tool, first check out the code. Then create a new virtual environment:

```bash
cd llm
python -m venv venv
source venv/bin/activate
```
Or if you are using `pipenv`:

```bash
pipenv shell
```
Now install the dependencies and test dependencies:

```bash
pip install -e '.[test]'
```
To run the tests:

```bash
pytest
```
## Debugging tricks
The default OpenAI plugin has a debugging mechanism for showing the exact responses that came back from the OpenAI API.

Set the `LLM_OPENAI_SHOW_RESPONSES` environment variable like this:

```bash
LLM_OPENAI_SHOW_RESPONSES=1 llm -m chatgpt 'three word slogan for an otter-run bakery'
```

This will output the response (including streaming responses) to standard error, as shown in issue 286.
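The mechanism is essentially an environment-variable-gated dump to standard error. A minimal Python sketch of the idea follows; the function name and details here are hypothetical, not the plugin's actual internals:

```python
import json
import os
import sys


def maybe_show_response(response_data):
    # Hypothetical helper: when LLM_OPENAI_SHOW_RESPONSES is set,
    # dump the raw API response to stderr so it never mixes with
    # the model's output on stdout.
    if os.environ.get("LLM_OPENAI_SHOW_RESPONSES"):
        print(json.dumps(response_data, indent=2), file=sys.stderr)


if __name__ == "__main__":
    os.environ["LLM_OPENAI_SHOW_RESPONSES"] = "1"
    maybe_show_response({"model": "chatgpt", "choices": []})
```

Because the dump goes to stderr, you can capture it separately from the model's reply, for example with `2> responses.log`.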
## Documentation
Documentation for this project uses MyST: it is written in Markdown and rendered using Sphinx.

To build the documentation locally, run the following:

```bash
cd docs
pip install -r requirements.txt
make livehtml
```

This will start a live preview server, using sphinx-autobuild.
The CLI `--help` examples in the documentation are managed using Cog. Update those files like this:

```bash
just cog
```

You'll need Just installed to run this command.
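Cog works by executing Python code embedded between special comment markers and splicing whatever it prints into the file in place. A minimal, purely illustrative example of the pattern (not taken from this project's docs):

```markdown
<!-- [[[cog
import cog
cog.out("(generated --help output goes here)")
]]] -->
(generated --help output goes here)
<!-- [[[end]]] -->
```

Running the command above re-executes those blocks, so the text between `]]]` and `[[[end]]]` is regenerated rather than edited by hand.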
## Release process
To release a new version:

- Update `docs/changelog.md` with the new changes.
- Update the version number in `setup.py`.
- Create a GitHub release for the new version.
- Wait for the package to push to PyPI and then...
- Run the `regenerate.yaml` workflow to update the Homebrew tap to the latest version.