First: if you're unsure or afraid of anything, ask for help! You can submit a work in progress (WIP) pull request, or file an issue with the parts you know. We'll do our best to guide you in the right direction, and let you know if there are guidelines we will need to follow. We want people to be able to participate without fear of doing the wrong thing.
Below are our expectations for contributors. Following these guidelines gives us the best opportunity to work with you, by making sure we have the things we need in order to make it happen. Doing your best to follow them will speed up our ability to merge PRs and respond to issues.
Both upload providers and backup targets were designed so that new ones can easily be added in the future.
- Python 3.13.
- Poetry https://python-poetry.org/.
- Docker and the docker compose plugin: https://docs.docker.com/get-docker/.
- Debian/Ubuntu are known to work.
- To work with databases natively, you need the `postgresql-client` and `mariadb-client` packages installed on your system; there are scripts in `scripts/` that can help.
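If you prefer not to use the helper scripts, the clients can for example be installed directly with `apt` on a Debian/Ubuntu system (package names shown are the usual Debian ones; they may differ on other distributions):

```shell
# Install the native database clients used for backups
# (alternative to the helper scripts in scripts/)
sudo apt update
sudo apt install -y postgresql-client mariadb-client
```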
- Install python dependencies: `poetry install`
- Install pre-commit hooks: `pre-commit install`
- Create the `.env` file: `cp .env.example .env`
- To run database backups, you will need `mariadb-client` and `postgresql-client` installed; there are dedicated scripts in the `scripts/` folder that can do that (or install them using `apt`, for example).
- Setup databases: `make docker_setup_up`
  Note, the file `docker/docker-compose.dbs.yml` is automatically updated weekly, always keeping the latest tag per release cycle, adding new releases and removing those after EOL.
- You can run ogion (`--single` here makes all backups immediately and then exits): `python -m ogion.main --single`
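Putting the setup steps above together, a typical first-time walkthrough might look like this (the clone URL is assumed from the project's GitHub releases page; adjust if you work from a fork):

```shell
# Assumed repository URL (see the releases link later in this document)
git clone https://github.com/rafsaf/ogion.git
cd ogion

poetry install                 # install python dependencies
poetry run pre-commit install  # install pre-commit hooks
cp .env.example .env           # create local .env file
make docker_setup_up           # start the test databases

# Run all backups once and exit
poetry run python -m ogion.main --single
```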
To play with the documentation, after dependencies are installed with poetry, `mkdocs serve` will start a development server.
The project has a handful of test cases which must pass for a contribution to be accepted. We also expect that you either create new test cases or modify existing ones in order to target your changes.
You can run all the test cases locally by invoking `pytest`.
Since the code is closely tied to the docker container environment, there is a setup to run pytest directly in the container:
- amd64: `make unit_tests`
- arm64 (slow, qemu): `OGION_ARCH=arm64 make unit_tests`
To run the tests on arm64, you will need to follow the steps in https://docs.docker.com/build/building/multi-platform/. Note, exactly the commands above are run inside the runners in the tests GitHub Action.
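Following the Docker multi-platform guide linked above, enabling arm64 emulation on an amd64 host usually comes down to registering QEMU binfmt handlers; a sketch (the `tonistiigi/binfmt` image is the one referenced in Docker's documentation):

```shell
# Register QEMU emulators so arm64 images can run on an amd64 host
docker run --privileged --rm tonistiigi/binfmt --install all

# Then run the arm64 unit tests through emulation (slow)
OGION_ARCH=arm64 make unit_tests
```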
The tests folder includes tests for the cloud provider integrations, using a fake GCS server, a MinIO instance and Azurite. For obvious reasons, though, this is only a 99% replacement for full end-to-end tests with real cloud storage credentials (and the costs they involve), so proceed with caution!
For `Debug` provider acceptance tests, the existing `.env.example` can be used as the `.env` file. To test other providers, you will need to change the `BACKUP_PROVIDER` environment variable accordingly.
Then `make acceptance_tests_amd64` (or, for arm64, `make acceptance_tests_arm64`) will build the docker image target `build` with your local `.env` file, then fire a single backup to the upload provider and exit.
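As a hypothetical sequence for testing a non-Debug provider (the concrete `BACKUP_PROVIDER` value and the credentials it needs are provider-specific and not shown here; check the project documentation):

```shell
cp .env.example .env
# Edit BACKUP_PROVIDER (and the provider's credentials) in .env first;
# the exact value format depends on the chosen provider.
make acceptance_tests_amd64   # builds the image and fires a single backup
```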
We expect that all code contributions have been formatted using `ruff format`. You can run `ruff format .` to format your code.
The project uses `ruff` for style, isort rules etc. This is also included in the `pre-commit` rules, but you can run `ruff check . --fix` manually.
We also expect passing `mypy` static code analysis, which can be run with `mypy .`.
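The formatting, lint and type checks above can be run together before opening a PR, for example:

```shell
ruff format .       # format the code
ruff check . --fix  # lint (style, isort rules etc.), auto-fixing where possible
mypy .              # static type analysis
```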
There is no dedicated commit message convention, just use descriptive commit messages and useful branch/PR names.
On every push to the `main` branch, an automated build creates the `dev` image tag. Then, usually after a minimum of 1-2 days, once it has propagated to some real-world ogion instances and is working just fine, a release is made manually using GitHub releases with an autogenerated changelog. After a new release and tag are created, for example `4.0`, a docker image with the same tag is built and uploaded to dockerhub.
See https://github.com/rafsaf/ogion/releases.
Mike is used for docs versioning: https://github.com/jimporter/mike

After a new release is triggered, a manual command needs to be run (e.g. for tag `0.1`) and then the generated commit pushed to origin:

```
mike deploy --push --update-aliases 0.1 latest
mike set-default --push 0.1
```
That is usually done simultaneously with release process.