data load tool (dlt)
data load tool (dlt) is a simple, open source Python library that makes data loading easy
- Automatically turn the JSON returned by any API into a live dataset stored wherever you want it
- `pip install python-dlt` and then `import dlt` to use it in your Python loading script
- The dlt library is licensed under the Apache License 2.0, so you can use it for free forever
Read more about it on the dlt Docs
semantic versioning
python-dlt follows semantic versioning with the MAJOR.MINOR.PATCH pattern. Currently we do pre-release versioning, with the major version being 0.
- a `minor` version change means breaking changes
- a `patch` version change means new features that should be backward compatible
- any suffix change, i.e. `a10` -> `a11`, is a patch
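The rules above can be sketched as a small predicate (illustrative only, not part of dlt; it assumes versions of the form MAJOR.MINOR.PATCH with an optional pre-release suffix):

```python
import re

# Under 0.x pre-release versioning, a minor bump is breaking, while a patch
# bump (including suffix-only changes such as a10 -> a11) is compatible.
def is_breaking(old: str, new: str) -> bool:
    """Return True if upgrading old -> new may break callers, per the rules above."""
    def significant_part(version: str):
        major, minor, patch, suffix = re.match(r"(\d+)\.(\d+)\.(\d+)(.*)", version).groups()
        # only major and minor matter for compatibility under this scheme
        return major, minor
    return significant_part(old) != significant_part(new)

print(is_breaking("0.1.5", "0.2.0"))        # minor bump -> True (breaking)
print(is_breaking("0.1.0a10", "0.1.0a11"))  # suffix change is a patch -> False
```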
development
python-dlt uses poetry to manage, build and version the package. It also uses make to automate tasks. To start:
```sh
make install-poetry  # will install poetry, to be run outside virtualenv
```
then
```sh
make dev  # will install all deps including dev
```
Executing `poetry shell` and working within it is very convenient at this stage.
python version
Use Python 3.8 for development; it is the lowest version supported by python-dlt. You'll need `distutils` and `venv`:
```sh
sudo apt-get install python3.8
sudo apt-get install python3.8-distutils
sudo apt install python3.8-venv
```
You may also use pyenv, as poetry suggests.
bumping version
Please use `poetry version prerelease` to bump the patch version and then `make build-library` to apply the changes. The source of truth for the version is pyproject.toml, and we use poetry to manage it.
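As an illustration of pyproject.toml being the source of the version, here is a naive stdlib-only reader; the function and its parsing are a sketch of ours, not part of dlt's tooling (assumes the standard `[tool.poetry]` layout):

```python
# Sketch: read the canonical version from pyproject.toml text.
# Naive line-based parse to stay stdlib-only; a real tool would use a TOML parser.
def read_poetry_version(pyproject_text: str) -> str:
    in_poetry_section = False
    for line in pyproject_text.splitlines():
        stripped = line.strip()
        if stripped.startswith("["):
            in_poetry_section = (stripped == "[tool.poetry]")
        elif in_poetry_section and stripped.startswith("version"):
            return stripped.split("=", 1)[1].strip().strip('"')
    raise ValueError("version not found")

sample = """
[tool.poetry]
name = "python-dlt"
version = "0.1.0a11"
"""
print(read_poetry_version(sample))  # 0.1.0a11
```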
testing and linting
python-dlt uses mypy and flake8 (with several plugins) for linting. We do not reorder imports or reformat code.
pytest is used as the test harness. `make test-common` will run tests of the common components and does not require any external resources.
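For illustration, a common-component test under pytest is just a plain function with assertions; the helper and test below are hypothetical, not taken from dlt's suite:

```python
# Hypothetical helper and pytest-style test; `make test-common` runs tests
# of this shape, which need no external resources.
def flatten(nested):
    """Flatten one level of nesting."""
    return [item for sub in nested for item in sub]

def test_flatten():
    assert flatten([[1, 2], [3]]) == [1, 2, 3]
    assert flatten([]) == []

# pytest would discover test_flatten automatically; call it here to demo
test_flatten()
```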
testing destinations
To test destinations, use `make test`. You will need the following external resources:
- a BigQuery project
- a Redshift cluster
- a Postgres instance. You can find a docker-compose file for a Postgres instance here.
See tests/.example.env for the expected environment variables, then create tests/.env from it. You configure the tests the same way you would configure the dlt pipeline.
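What loading tests/.env amounts to can be sketched as below; this is a hypothetical helper with a made-up variable name, not part of dlt's test harness (in practice you copy tests/.example.env and fill in real credentials):

```python
import os
import tempfile

def load_env(path: str) -> dict:
    """Parse simple KEY=VALUE lines and export them into os.environ."""
    values = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            # skip blanks, comments and malformed lines
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    os.environ.update(values)
    return values

# demo with a throwaway file standing in for tests/.env
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("# credentials for destination tests\nSOME_CREDENTIAL=secret\n")
    path = f.name
print(load_env(path)["SOME_CREDENTIAL"])  # secret
```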
We'll provide you with access to the resources above if you wish to test locally.
publishing
- Make sure that you are on the `devel` branch and have the newest code that passed all tests on CI.
- Verify the current version with `poetry version`.
- You'll need a PyPI access token; use `poetry config pypi-token.pypi your-api-token`, then run `make publish-library`.
- Make a release on GitHub, using the version and git tag as the release name.
contributing
To contribute via pull request:
- Create an issue with your idea for a feature, etc.
- Write your code and tests
- Lint your code with `make lint` and test the common modules with `make test-common`
- If you work on destination code, contact us to get access to the test destinations
- Create a pull request
