* child table column removed from parent
* A utility function that checks whether a column has the seen-null-first hint set (sketched after this group)
* Improved comments and docstrings, separate method in worker
* null column not inferred if exists as compound
* Column level x-normalizer cleaning moved outside of worker
* Test for empty column becoming compound
* Test clean_seen_null_first_hint
* split home and workspace render methods
* header row made DRY-er
* catch-all error handling in the home() cell
* local try-catch for broken traces
* e2e test for broken trace
* removes this
* shows navigation on pipeline attach error
---------
Co-authored-by: Marcin Rudolf <rudolfix@rudolfix.org>
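A minimal sketch of the seen-null-first utilities mentioned above, assuming the hint lives in a column-level `x-normalizer` mapping; the type alias and function layout are illustrative, not the actual dlt internals:

```python
from typing import Any, Dict

TColumnSchema = Dict[str, Any]  # simplified stand-in for dlt's column schema type


def has_seen_null_first(column: TColumnSchema) -> bool:
    """Check whether the column carries the seen-null-first normalizer hint."""
    return bool(column.get("x-normalizer", {}).get("seen-null-first", False))


def clean_seen_null_first_hint(column: TColumnSchema) -> None:
    """Remove the seen-null-first hint, dropping the x-normalizer section if it becomes empty."""
    hints = column.get("x-normalizer")
    if hints and "seen-null-first" in hints:
        del hints["seen-null-first"]
        if not hints:
            del column["x-normalizer"]
```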
* makes trace backward compat with 1.17.0 and earlier
* skips trace if any error in unpickle
* always saves merged pipeline trace to have consistent pipeline.last_trace property
* tests for past traces, broken traces and other improvements
* Redshift feature: include the STS session token in COPY CREDENTIALS. If aws_session_token is present, the session token is appended; the IAM_ROLE path and long-lived keys stay unchanged (see the sketch after this group)
---------
Co-authored-by: Tim Hable <thable@varengold.de>
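A hedged sketch of the CREDENTIALS clause construction the Redshift bullet describes; the function name and parameter handling are illustrative, while the `token=` key is Redshift's documented syntax for temporary STS credentials. The IAM_ROLE path builds a different clause and, as the bullet notes, is left untouched.

```python
from typing import Optional


def build_copy_credentials(
    aws_access_key_id: str,
    aws_secret_access_key: str,
    aws_session_token: Optional[str] = None,
) -> str:
    """Build the CREDENTIALS clause for a Redshift COPY statement."""
    creds = (
        f"aws_access_key_id={aws_access_key_id};"
        f"aws_secret_access_key={aws_secret_access_key}"
    )
    if aws_session_token:
        # temporary STS credentials additionally require the session token
        creds += f";token={aws_session_token}"
    return f"CREDENTIALS '{creds}'"
```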
* Initial commit
* HTML cleaned
* Summary moved to home section, migration badge added
* Load package status badges improved
* Test getting steps data, migrations count
* Various tests
* Fix in test
* Styles moved, improved UI
---------
Co-authored-by: Marcin Rudolf <rudolfix@rudolfix.org>
* make arrow_stream default return_type for connectorx backend
* formatting
* bump connectorx version
* return to arrow by default, keep arrow_stream support, add info message
* document arrow_stream corner cases in the docs
* add the test for connectorx arrow_stream return type
* fix formatting
* fix test typo
* fix the tests
* fix package version check, return original version constraint
* adds utils function to losslessly cast date64 to timestamp[us] (sketched after this group)
* cast date64 to timestamp for connectorx, update test
---------
Co-authored-by: ivasio <ivan@dlthub.com>
Co-authored-by: Marcin Rudolf <rudolfix@rudolfix.org>
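A sketch of the lossless date64 to timestamp[us] cast mentioned above, using pyarrow directly; the helper name is assumed and dlt's actual utility may differ:

```python
import pyarrow as pa


def cast_date64_to_timestamp_us(table: pa.Table) -> pa.Table:
    """Cast all date64 columns to timestamp[us].

    date64 stores milliseconds since epoch, so widening to microsecond
    timestamps loses no information.
    """
    new_fields = [
        field.with_type(pa.timestamp("us")) if pa.types.is_date64(field.type) else field
        for field in table.schema
    ]
    return table.cast(pa.schema(new_fields))
```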
Added a dropdown for profile selection in the dashboard interface and updated the layout to display profile and workspace information inline with pipeline selection.
* updated the sql databases configuration docs
* Updated the SQL database and table sources as well
* updated
* Updated
* Updated docstrings for defer_table_reflect parameter in SQL Database source.
* Updated
* updated template
* renamed pipeline; set dev_mode=True
* set refresh to drop everything (illustrated in the sketch after this group)
* lint fixes
* makes dataset_name optional in init scripts
* falls back to object __repr__ if Pipeline repr fails
* allows instantiating a pipeline on pipeline script import
* remove dev_mode kwarg
* fix unrelated docs notebook
---------
Co-authored-by: Marcin Rudolf <rudolfix@rudolfix.org>
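An illustrative version of the template settings mentioned above (deferred table reflection, dev_mode, and a full refresh); the connection string and table names are placeholders and this is a sketch, not the actual template:

```python
import dlt
from dlt.sources.sql_database import sql_database

source = sql_database(
    "mysql+pymysql://user:password@host/db",  # placeholder credentials
    table_names=["table_a", "table_b"],       # placeholder tables
    defer_table_reflect=True,                 # reflect table schemas lazily, at extract time
)

pipeline = dlt.pipeline(
    pipeline_name="sql_database_example",
    destination="duckdb",
    dataset_name="sql_data",
    dev_mode=True,  # write each run into a fresh dataset
)

# refresh="drop_sources" drops previously loaded tables and state before loading
info = pipeline.run(source, refresh="drop_sources")
print(info)
```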
* adds tools to generate api reference for workspace
* writes install, mcp, api reference and improves other docs in hub
* Apply suggestions from code review
Co-authored-by: Violetta Mishechkina <sansiositres@gmail.com>
* fixes free tier
---------
Co-authored-by: Violetta Mishechkina <sansiositres@gmail.com>
* Minor hub docs polishing
* fixes workflow setup so that certain steps are skipped when there are only docs changes
* Remove the duplicate content
* Fix build
---------
Co-authored-by: David Scharf <shrps@posteo.net>
* adds option in load that prevents draining the pool on signal
* adds runtime pipeline option to not intercept signals (sketched after this group)
* refactors signal module
* tests new cases
* describes signal handling in running in prod docs
* bumps dlt to 1.18.0
* fixes tests forked
* removes logging and buffered console output from signals
* adds retry count to load job metrics, generates started_at in init of runnable load job
* allows updating existing metrics in the load step
* finalized jobs require start and finish dates
* generates metrics in each job state and in each completed loop; does not complete the package if the pool is drained but jobs are left; adds detailed tests for metrics
* fixes remote metrics
* replaces event with a package-bound semaphore to complete load jobs early
* fixes dashboard on windows
* improves signals docs
* renames delayed_signals to intercepted_signals
* use dlt.Dataset query normalization in _DltBackend
* pass dlt SQL cursor to _DltBackend instead of return values
---------
Co-authored-by: Marcin Rudolf <rudolfix@rudolfix.org>
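A minimal sketch of the signal interception these bullets describe, assuming handlers that record the signal and let in-flight load jobs finish; this is not the actual dlt signals module, and only the `intercepted_signals` name is taken from the rename above:

```python
import signal
import threading

_signal_received = threading.Event()


def _intercept(signum, frame) -> None:
    # record the signal instead of raising, so running load jobs can drain gracefully
    _signal_received.set()


def install_signal_handlers(intercepted_signals: bool = True) -> None:
    """Install SIGINT/SIGTERM handlers, or leave default handling when interception is off."""
    if not intercepted_signals:
        return
    for sig in (signal.SIGINT, signal.SIGTERM):
        signal.signal(sig, _intercept)


def should_stop() -> bool:
    """Workers poll this between jobs to decide whether to stop picking up new work."""
    return _signal_received.is_set()
```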