Contributing¶
Setup¶
Clone the git repository:
$ git clone https://gitlab.com/dalibo/pglift.git
$ cd pglift
Then, create a Python3 virtualenv and install the project:
$ python3 -m venv .venv --upgrade-deps
$ . .venv/bin/activate
(.venv) $ pip install -r requirements/dev.txt
Though not required, tox can be used to run all checks (lint, tests, etc.) needed in a development environment.
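For instance, all of tox's default environments can be launched at once (which checks are included depends on the project's tox configuration):
$ tox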
Linting, formatting, type-checking¶
The project uses ruff for linting and formatting and mypy for type-checking.
All these checks can be run with tox -e lint, or individually.
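For instance, the underlying tools can be invoked directly; the exact arguments live in the project's tox and tool configurations, so the invocations below are only illustrative:
(.venv) $ ruff check .           # lint
(.venv) $ ruff format --check .  # formatting
(.venv) $ mypy                   # type-checking, relying on the project's mypy configuration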
When creating new files in the source repository, take care to include the license and copyright information. This can be handled with the help of the reuse tool, e.g.:
$ reuse annotate \
--copyright="Dalibo" --year=2021 --license="GPL-3.0-or-later" \
<PATH>
For data files (even in tests), the --force-dot-license option may also be passed to the latter command.
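For example, to keep the license information in a separate .license file next to a data file:
$ reuse annotate \
    --copyright="Dalibo" --year=2021 --license="GPL-3.0-or-later" \
    --force-dot-license <PATH>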
Running tests¶
The test suite can be run either directly:
(.venv) $ pytest
or through tox:
$ tox [-e tests-doctest|tests-unit|tests-func|tests-expect]
The test suite is quite extensive and can take a long time to run. It is split into functional tests and unit tests (including doctests); the former require a real PostgreSQL instance (which will be set up automatically) while the latter do not. Each test suite gets a dedicated tox environment: tests-doctest, tests-unit, tests-func and tests-expect.
When working on a simple fix or change that would be covered by non-functional tests, one can run the following part of the test suite quickly:
(.venv) $ pytest lib/src tests/unit
or through tox:
$ tox -e tests-doctest -e tests-unit
Some unit tests use local files (in test/data) to compare actual results with their expectations. Often, when a mismatch is intentional (e.g. if interface models changed), it’s handy to write back the expected files: for this, pass the --write-changes option to the pytest invocation.
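For example, to update the expected files while running the unit tests:
(.venv) $ pytest tests/unit --write-changes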
By default, functional tests will not use systemd as a service manager / scheduler. In order to run tests with systemd, pass the --systemd option to the pytest command.
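For example (a sketch; the option can also be combined with a specific test path):
(.venv) $ pytest --systemd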
Still in functional tests, the PostgreSQL environment is guessed by inspecting the system for PostgreSQL binaries, using the most recent version available. If multiple versions of PostgreSQL are available, a specific version can be selected by passing the --pg-version=<version> option to the pytest command. Likewise, the --pg-auth option can be used to run tests with a specific authentication method.
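For example (the version and authentication method values below are purely illustrative):
# adjust to a PostgreSQL version and authentication method available on your system
(.venv) $ pytest --pg-version=16 --pg-auth=md5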
“Expect” tests (in the t directory) should be run with Prysk, not pytest.
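For instance, either directly with Prysk or through the dedicated tox environment:
(.venv) $ prysk t
$ tox -e tests-expect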
If your system uses a specific locale and your tests are failing because of assertion issues with translated messages, you can run the tests with LANG=C.
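For example:
(.venv) $ LANG=C pytest tests/unit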
Note
Tests automatically run in the regular CI don’t use systemd because of technical limitations. However, developers (with appropriate permissions) can launch CI jobs using the sourcehut builds service. Job manifests available in the .builds directory are run any time commits are pushed to the git.sr.ht/~dalibo/pglift repository.
If you run this test suite (based on Prysk and port-for) as a non-root user, you will probably run into a permission issue when writing to /etc/port-for.conf. As a workaround, you can create that file and change its ownership manually to the user running the tests.
$ sudo touch /etc/port-for.conf
$ sudo chown ${UID}:${GID} /etc/port-for.conf
Pre-commit hooks¶
Some checks (linting, typing, syntax checking, …) can be run for you before each git commit.
You just need to install the pre-commit hooks:
(.venv) $ pre-commit install
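Once installed, the hooks run on every commit; they can also be run manually on the whole tree:
(.venv) $ pre-commit run --all-files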
Working on documentation¶
To build the documentation in HTML format, run:
(.venv) $ make -C docs html
and open docs/_build/html/index.html
to browse the result.
Alternatively, keep the following command running:
(.venv) $ make -C docs serve
to get the documentation rebuilt automatically, along with a live-reloading Web browser.
Contributing changes¶
Make sure that lint and typing checks pass, as well as at least the unit tests.
If needed, create a news fragment using towncrier create <id>.<type>.rst [--edit], where <id> is a short description of the changes and <type> describes the type of changes, one of: feature, bugfix, removal, doc or misc.
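For example, a fragment for a bug fix could be created with (the <id> part below is illustrative):
(.venv) $ towncrier create fix-some-bug.bugfix.rst --edit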
Similarly, if changes affect the Ansible collection, create a changelog fragment as a .yaml file in the ansible/changelogs/fragments/ directory (refer to the Ansible development process documentation for details).
When committing changes with git, write one commit per logical change, try to follow the pre-existing style, and write a meaningful commit message (see https://commit.style/ for a quick guide).
Release workflow¶
Preparation¶
Prior to releasing, the dependencies for building pglift’s binary with PyOxidizer need to be pinned and compiled. This is done by:
- running tox -e pin, which commits the result if pyoxidizer/requirements.txt changed (see the sketch after this list);
- in that case, creating a merge request in which the buildbin job will run (along with, possibly, the tests-binary ones);
- proceeding with the next steps after the merge.
Release¶
Assuming we’re releasing version 1.2.3, the following steps should be followed:
Build the changelog:
$ towncrier build --version=1.2.3
$ git commit -m "Prepare version 1.2.3"
Create an annotated git tag following the v<MAJOR>.<MINOR>.<PATCH> pattern:
$ git tag v1.2.3 -a [-s] -m 'pglift v1.2.3' --edit
then edit the tag message to include a changelog since the latest release (as built in the previous step).
Push the tag to the main (upstream) repository:
$ git push --follow-tags
Soon after, the CI will build and upload the Python package to PyPI.
Ansible collection¶
If a release of the Ansible collection is needed, in the ansible/ directory:
- Bump the version number in galaxy.yml
- Generate the changelog, using antsibull-changelog, by running antsibull-changelog release
- Commit the result
- Make an annotated tag with the ansible/ prefix, e.g.:
$ git tag ansible/v0.10.0 -a [-s] -m 'pglift-ansible v0.10.0'
Push to the upstream repository; the CI should then handle publication to Ansible Galaxy.