forked from repo-mirrors/dbt-core
Compare commits
41 Commits
update-ind... ... macro-reso...
| Author | SHA1 | Date |
|---|---|---|
|  | 60f87411d5 |  |
|  | eb96e3deec |  |
|  | f68af070f3 |  |
|  | 7ad1accf2b |  |
|  | ed8f5d38e4 |  |
|  | 7ad6aa18da |  |
|  | 6796edd66e |  |
|  | e01eb30884 |  |
|  | ba53f053fd |  |
|  | b8de881ed3 |  |
|  | d7d5e2335c |  |
|  | 160d0db238 |  |
|  | 2cee8652a6 |  |
|  | 7f777f8a42 |  |
|  | 00f49206e9 |  |
|  | 1bca662883 |  |
|  | 41ac915949 |  |
|  | 373125ecb8 |  |
|  | 294ad82e50 |  |
|  | 12bd1e87fb |  |
|  | 8bad75c65b |  |
|  | 220f56d8d2 |  |
|  | 615ad1fe2d |  |
|  | 2ab0f7b26b |  |
|  | e56a5dae8b |  |
|  | 1d0a3e92c8 |  |
|  | 51b94b26cc |  |
|  | 4ee950427a |  |
|  | c4ff280436 |  |
|  | 1260782bd2 |  |
|  | 333120b111 |  |
|  | af916666a2 |  |
|  | 7de8930d1d |  |
|  | 200bcdcd9f |  |
|  | b9a603e3aa |  |
|  | 1a825484fb |  |
|  | f44d704801 |  |
|  | dbd02e54c2 |  |
|  | a89642a6f9 |  |
|  | c141148616 |  |
|  | 469a9aca06 |  |
@@ -1,5 +1,5 @@
 [bumpversion]
-current_version = 1.9.0a1
+current_version = 1.8.0a1
 parse = (?P<major>[\d]+) # major version number
 \.(?P<minor>[\d]+) # minor version number
 \.(?P<patch>[\d]+) # patch version number
@@ -35,3 +35,13 @@ first_value = 1
 [bumpversion:file:core/setup.py]
 
 [bumpversion:file:core/dbt/version.py]
+
+[bumpversion:file:plugins/postgres/setup.py]
+
+[bumpversion:file:plugins/postgres/dbt/adapters/postgres/__version__.py]
+
+[bumpversion:file:docker/Dockerfile]
+
+[bumpversion:file:tests/adapter/setup.py]
+
+[bumpversion:file:tests/adapter/dbt/tests/adapter/__version__.py]

@@ -1,6 +1,6 @@
 # dbt Core Changelog
 
-- This file provides a full account of all changes to `dbt-core`
+- This file provides a full account of all changes to `dbt-core` and `dbt-postgres`
 - Changes are listed under the (pre)release in which they first appear. Subsequent releases include changes from previous releases.
 - "Breaking changes" listed under a version may require action from end users or external maintainers when upgrading to that version.
 - Do not edit this file directly. This file is auto-generated using [changie](https://github.com/miniscruff/changie). For details on how to document a change, see [the contributing guide](https://github.com/dbt-labs/dbt-core/blob/main/CONTRIBUTING.md#adding-changelog-entry)
Changelog entry files under `.changes/unreleased/` added or removed by this comparison (each is a changie YAML entry; all fields are listed):

| File | Change | Kind | Body | Time | Author | Issue / PR |
|---|---|---|---|---|---|---|
|  | added | Breaking Changes | Remove adapter.get_compiler interface | 2023-11-27T11:47:57.443202-05:00 | michelleark | Issue "9148" |
|  | added | Breaking Changes | Move AdapterLogger to adapters folder | 2023-11-28T13:43:56.853925-08:00 | colin-rogers-dbt | Issue "9151" |
|  | added | Breaking Changes | move event manager setup back to core, remove ref to global EVENT_MANAGER and clean up event manager functions | 2023-11-30T13:53:48.645192-08:00 | colin-rogers-dbt | Issue "9150" |
| Dependencies-20231031-131954.yaml | added | Dependencies | Begin using DSI 0.4.x | 2023-10-31T13:19:54.750009-07:00 | QMalcolm peterallenwebb | PR "8892" |
| Dependencies-20231106-130051.yaml | added | Dependencies | Update typing-extensions version to >=4.4 | 2023-11-06T13:00:51.062386-08:00 | tlento | PR "9012" |
|  | removed | Dependencies | Remove logbook dependency | 2024-05-09T09:37:17.745129-05:00 | emmyoop | Issue "8027" |
| Docs-20231106-123157.yaml | added | Docs | fix get_custom_database docstring | 2023-11-06T12:31:57.525711Z | LeoTheGriff | Issue "9003" |
|  | removed | Docs | Fix rendering docs with saved queries | 2024-05-22T17:47:13.414938-04:00 | ChenyuLInx michelleark | Issue "10168" |
| Features-20230915-123733.yaml | added | Features | Allow adapters to include package logs in dbt standard logging | 2023-09-15T12:37:33.862862-07:00 | colin-rogers-dbt | Issue "7859" |
| Features-20231017-143620.yaml | added | Features | Add drop_schema_named macro | 2023-10-17T14:36:20.612289-07:00 | colin-rogers-dbt | Issue "8025" |
| Features-20231026-110821.yaml | added | Features | migrate utils to common and adapters folders | 2023-10-26T11:08:21.458709-07:00 | colin-rogers-dbt | Issue "8924" |
| Features-20231026-123556.yaml | added | Features | Move Agate helper client into common | 2023-10-26T12:35:56.538587-07:00 | MichelleArk | Issue "8926" |
| Features-20231026-123913.yaml | added | Features | remove usage of dbt.config.PartialProject from dbt/adapters | 2023-10-26T12:39:13.904116-07:00 | MichelleArk | Issue "8928" |
| Features-20231031-132022.yaml | added | Features | Add exports to SavedQuery spec | 2023-10-31T13:20:22.448158-07:00 | QMalcolm peterallenwebb | Issue "8892" |
| Features-20231107-135635.yaml | added | Features | Remove legacy logger | 2023-11-07T13:56:35.186648-08:00 | colin-rogers-dbt | Issue "8027" |
| Features-20231110-154255.yaml | added | Features | Support setting export configs hierarchically via saved query and project configs | 2023-11-10T15:42:55.042317-08:00 | QMalcolm | Issue "8956" |
|  | removed | Features | serialize inferred primary key | 2024-05-06T17:56:42.757673-05:00 | dave-connors-3 | Issue "9824" |
|  | removed | Features | Add unit_test: selection method | 2024-05-07T16:27:17.047585-04:00 | michelleark | Issue "10053" |
|  | removed | Fixes | Remove unused check_new method | 2023-06-01T20:41:57.556342+02:00 | kevinneville | Issue "7586" |
| Fixes-20231013-130943.yaml | added | Fixes | For packages installed with tarball method, fetch metadata to resolve nested dependencies | 2023-10-13T13:09:43.188308-04:00 | adamlopez | Issue "8621" |
| Fixes-20231016-163953.yaml | added | Fixes | Fix partial parsing not working for semantic model change | 2023-10-16T16:39:53.05058-07:00 | ChenyuLInx | Issue "8859" |
| Fixes-20231024-110151.yaml | added | Fixes | Handle unknown `type_code` for model contracts | 2023-10-24T11:01:51.980781-06:00 | dbeatty10 | Issues 8877 8353 |
| Fixes-20231024-145504.yaml | added | Fixes | Add back contract enforcement for temporary tables on postgres | 2023-10-24T14:55:04.051683-05:00 | emmyoop | Issue "8857" |
| Fixes-20231024-155400.yaml | added | Fixes | Rework get_catalog implementation to retain previous adapter interface semantics | 2023-10-24T15:54:00.628086-04:00 | peterallenwebb | Issue "8846" |
| Fixes-20231026-002536.yaml | added | Fixes | Add version to fqn when version==0 | 2023-10-26T00:25:36.259356-05:00 | aranke | Issue "8836" |
| Fixes-20231030-093734.yaml | added | Fixes | Fix cased comparison in catalog-retrieval function. | 2023-10-30T09:37:34.258612-04:00 | peterallenwebb | Issue "8939" |
| Fixes-20231031-005345.yaml | added | Fixes | Catalog queries now assign the correct type to materialized views | 2023-10-31T00:53:45.486203-04:00 | mikealfare | Issue "8864" |
| Fixes-20231031-144837.yaml | added | Fixes | Fix compilation exception running empty seed file and support new Integer agate data_type | 2023-10-31T14:48:37.774871-04:00 | gshank | Issue "8895" |
| Fixes-20231101-155824.yaml | added | Fixes | Make relation filtering None-tolerant for maximal flexibility across adapters. | 2023-11-01T15:58:24.552054-04:00 | peterallenwebb | Issue "8974" |
| Fixes-20231106-155933.yaml | added | Fixes | Update run_results.json from previous versions of dbt to support deferral and rerun from failure | 2023-11-06T15:59:33.677915-05:00 | jtcohen6 peterallenwebb | Issue "9010" |
| Fixes-20231107-092358.yaml | added | Fixes | Fix git repository with subdirectory for Deps | 2023-11-07T09:23:58.214271-08:00 | ChenyuLInx | Issue "9000" |
| Fixes-20231107-094130.yaml | added | Fixes | Use MANIFEST.in to recursively include all jinja templates; fixes issue where some templates were not included in the distribution | 2023-11-07T09:41:30.121733-05:00 | mikealfare | Issue "9016" |
| Fixes-20231113-114956.yaml | added | Fixes | Fix formatting of tarball information in packages-lock.yml | 2023-11-13T11:49:56.437007-08:00 | ChenyuLInx QMalcolm | Issue "9062" |
| Fixes-20231127-154310.yaml | added | Fixes | deps: Lock git packages to commit SHA during resolution | 2023-11-27T15:43:10.122069+01:00 | jtcohen6 | Issue "9050" |
| Fixes-20231127-154347.yaml | added | Fixes | deps: Use PackageRenderer to read package-lock.json | 2023-11-27T15:43:47.842423+01:00 | jtcohen6 | Issue "9127" |
| Fixes-20231128-155225.yaml | added | Fixes | Get sources working again in dbt docs generate | 2023-11-28T15:52:25.738256Z | aranke | Issue "9119" |
|  | removed | Fixes | Restore previous behavior for --favor-state: only favor defer_relation if not selected in current command | 2024-05-08T15:11:27.510912+02:00 | jtcohen6 | Issue "10107" |
|  | removed | Fixes | Unit test fixture (csv) returns null for empty value | 2024-05-09T09:14:11.772709-04:00 | michelleark | Issue "9881" |
|  | removed | Fixes | Fix json format log and --quiet for ls and jinja print by converting print call to fire events | 2024-05-16T15:39:13.896723-07:00 | ChenyuLInx | Issue "8756" |
|  | removed | Fixes | Add resource type to saved_query | 2024-05-16T22:35:10.287514-07:00 | ChenyuLInx | Issue "10168" |
|  | removed | Security | Explicitly bind to localhost in docs serve | 2024-05-22T09:45:40.748185-04:00 | ChenyuLInx michelleark | Issue "10209" |
| Under the Hood-20230831-164435.yaml | added | Under the Hood | Added more type annotations. | 2023-08-31T16:44:35.737954-04:00 | peterallenwebb | Issue "8537" |
| Under the Hood-20231026-184953.yaml | added | Under the Hood | Remove usage of dbt.include.global_project in dbt/adapters | 2023-10-26T18:49:53.36449-04:00 | michelleark | Issue "8925" |
| Under the Hood-20231027-140048.yaml | added | Under the Hood | Add a no-op runner for Saved Qeury | 2023-10-27T14:00:48.4755-07:00 | ChenyuLInx | Issue "8893" |
| Under the Hood-20231101-102758.yaml | added | Under the Hood | remove dbt.flags.MP_CONTEXT usage in dbt/adapters | 2023-11-01T10:27:58.790153-04:00 | michelleark | Issue "8967" |
| Under the Hood-20231101-173124.yaml | added | Under the Hood | Remove usage of dbt.flags.LOG_CACHE_EVENTS in dbt/adapters | 2023-11-01T17:31:24.974093-04:00 | michelleark | Issue "8969" |
| Under the Hood-20231103-195222.yaml | added | Under the Hood | Move CatalogRelationTypes test case to the shared test suite to be reused by adapter maintainers | 2023-11-03T19:52:22.694394-04:00 | mikealfare | Issue "8952" |
| Under the Hood-20231106-080422.yaml | added | Under the Hood | Treat SystemExit as an interrupt if raised during node execution. | 2023-11-06T08:04:22.022179-05:00 | benmosher | n/a |
| Under the Hood-20231106-105730.yaml | added | Under the Hood | Removing unused 'documentable' | 2023-11-06T10:57:30.694056-08:00 | QMalcolm | Issue "8871" |
| Under the Hood-20231107-135728.yaml | added | Under the Hood | Remove use of dbt/core exceptions in dbt/adapter | 2023-11-07T13:57:28.683727-08:00 | colin-rogers-dbt MichelleArk | Issue "8920" |
| Under the Hood-20231107-191546.yaml | added | Under the Hood | Cache dbt plugin modules to improve integration test performance | 2023-11-07T19:15:46.170151-05:00 | peterallenwebb | Issue "9029" |
| Under the Hood-20231111-175350.yaml | added | Under the Hood | Fix test_current_timestamp_matches_utc test; allow for MacOS runner system clock variance | 2023-11-11T17:53:50.098843-05:00 | mikealfare | Issue "9057" |
| Under the Hood-20231116-174251.yaml | added | Under the Hood | Remove usage of dbt.deprecations in dbt/adapters, enable core & adapter-specific event types and protos | 2023-11-16T17:42:51.005023-05:00 | michelleark | Issues 8927 8918 |
| Under the Hood-20231120-134735.yaml | added | Under the Hood | Clean up unused adaptor folders | 2023-11-20T13:47:35.923794-08:00 | ChenyuLInx | Issue "9123" |
| Under the Hood-20231120-183214.yaml | added | Under the Hood | Move column constraints into common/contracts, removing another dependency of adapters on core. | 2023-11-20T18:32:14.859503-05:00 | peterallenwebb | Issue "9024" |
| Under the Hood-20231128-170732.yaml | added | Under the Hood | Move dbt.semver to dbt.common.semver and update references. | 2023-11-28T17:07:32.172421-08:00 | versusfacit | Issue "9039" |
| Under the Hood-20231130-135432.yaml | added | Under the Hood | Move lowercase utils method to common | 2023-11-30T13:54:32.561673-08:00 | colin-rogers-dbt | Issue "9180" |
| Under the Hood-20231205-093544.yaml | added | Under the Hood | Remove usages of dbt.clients.jinja in dbt/adapters | 2023-12-05T09:35:44.845352+09:00 | michelleark | Issue "9205" |
| Under the Hood-20231205-120559.yaml | added | Under the Hood | Remove usage of dbt.contracts in dbt/adapters | 2023-12-05T12:05:59.936775+09:00 | michelleark | Issue "9208" |
| Under the Hood-20231205-165812.yaml | added | Under the Hood | Remove usage of dbt.contracts.graph.nodes.ResultNode in dbt/adapters | 2023-12-05T16:58:12.932172+09:00 | michelleark | Issue "9214" |
| Under the Hood-20231205-170725.yaml | added | Under the Hood | Introduce RelationConfig Protocol, consolidate Relation.create_from | 2023-12-05T17:07:25.33861+09:00 | michelleark | Issue "9215" |
|  | removed | Under the Hood | Clear error message for Private package in dbt-core | 2024-05-02T15:44:30.713097-07:00 | ChenyuLInx | Issue "10083" |
|  | removed | Under the Hood | Enable use of context in serialization | 2024-05-06T14:55:11.1812-04:00 | gshank | Issue "10093" |
|  | removed | Under the Hood | Make RSS high water mark measurement more accurate on Linux | 2024-05-19T15:59:46.700842315-04:00 | peterallenwebb | Issue "10177" |
@@ -31,7 +31,43 @@ kinds:
       - {{.Body}} ({{ range $index, $element := $IssueList }}{{if $index}}, {{end}}{{$element}}{{end}})
   - label: Under the Hood
   - label: Dependencies
+    changeFormat: |-
+      {{- $PRList := list }}
+      {{- $changes := splitList " " $.Custom.PR }}
+      {{- range $pullrequest := $changes }}
+      {{- $changeLink := "[#nbr](https://github.com/dbt-labs/dbt-core/pull/nbr)" | replace "nbr" $pullrequest }}
+      {{- $PRList = append $PRList $changeLink }}
+      {{- end -}}
+      - {{.Body}} ({{ range $index, $element := $PRList }}{{if $index}}, {{end}}{{$element}}{{end}})
+    skipGlobalChoices: true
+    additionalChoices:
+      - key: Author
+        label: GitHub Username(s) (separated by a single space if multiple)
+        type: string
+        minLength: 3
+      - key: PR
+        label: GitHub Pull Request Number (separated by a single space if multiple)
+        type: string
+        minLength: 1
   - label: Security
+    changeFormat: |-
+      {{- $PRList := list }}
+      {{- $changes := splitList " " $.Custom.PR }}
+      {{- range $pullrequest := $changes }}
+      {{- $changeLink := "[#nbr](https://github.com/dbt-labs/dbt-core/pull/nbr)" | replace "nbr" $pullrequest }}
+      {{- $PRList = append $PRList $changeLink }}
+      {{- end -}}
+      - {{.Body}} ({{ range $index, $element := $PRList }}{{if $index}}, {{end}}{{$element}}{{end}})
+    skipGlobalChoices: true
+    additionalChoices:
+      - key: Author
+        label: GitHub Username(s) (separated by a single space if multiple)
+        type: string
+        minLength: 3
+      - key: PR
+        label: GitHub Pull Request Number (separated by a single space if multiple)
+        type: string
+        minLength: 1
 
 newlines:
   afterChangelogHeader: 1
@@ -70,10 +106,18 @@ footerFormat: |
   {{- $changeList := splitList " " $change.Custom.Author }}
   {{- $IssueList := list }}
   {{- $changeLink := $change.Kind }}
-  {{- $changes := splitList " " $change.Custom.Issue }}
-  {{- range $issueNbr := $changes }}
-  {{- $changeLink := "[#nbr](https://github.com/dbt-labs/dbt-core/issues/nbr)" | replace "nbr" $issueNbr }}
-  {{- $IssueList = append $IssueList $changeLink }}
+  {{- if or (eq $change.Kind "Dependencies") (eq $change.Kind "Security") }}
+  {{- $changes := splitList " " $change.Custom.PR }}
+  {{- range $issueNbr := $changes }}
+  {{- $changeLink := "[#nbr](https://github.com/dbt-labs/dbt-core/pull/nbr)" | replace "nbr" $issueNbr }}
+  {{- $IssueList = append $IssueList $changeLink }}
+  {{- end -}}
+  {{- else }}
+  {{- $changes := splitList " " $change.Custom.Issue }}
+  {{- range $issueNbr := $changes }}
+  {{- $changeLink := "[#nbr](https://github.com/dbt-labs/dbt-core/issues/nbr)" | replace "nbr" $issueNbr }}
+  {{- $IssueList = append $IssueList $changeLink }}
+  {{- end -}}
   {{- end }}
   {{- /* check if this contributor has other changes associated with them already */}}
   {{- if hasKey $contributorDict $author }}
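To make the footer change above easier to follow, here is a rough Python rendition of the link selection it performs. This is an illustration only; the real logic is the Go template in the hunk above, and the function name and dictionary shape here are invented for the sketch.

```python
from typing import Dict, List


def contributor_links(kind: str, custom: Dict[str, str]) -> List[str]:
    """Mirror of the new footerFormat branch: Dependencies and Security changes
    link to the pull request(s) recorded under `custom: PR`, while every other
    kind links to the issue(s) recorded under `custom: Issue`."""
    if kind in ("Dependencies", "Security"):
        numbers = custom["PR"].split(" ")
        url = "https://github.com/dbt-labs/dbt-core/pull/{}"
    else:
        numbers = custom["Issue"].split(" ")
        url = "https://github.com/dbt-labs/dbt-core/issues/{}"
    return [f"[#{n}]({url.format(n)})" for n in numbers]


# A Dependencies entry with PR "8892" renders a pull-request link,
# while a Fixes entry with Issue "8621" renders an issue link.
print(contributor_links("Dependencies", {"PR": "8892"}))
print(contributor_links("Fixes", {"Issue": "8621"}))
```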
25 .github/CODEOWNERS (vendored)
@@ -13,6 +13,31 @@
 # the core team as a whole will be assigned
 * @dbt-labs/core-team
 
+### ADAPTERS
+
+# Adapter interface ("base" + "sql" adapter defaults, cache)
+/core/dbt/adapters @dbt-labs/core-adapters
+
+# Global project (default macros + materializations), starter project
+/core/dbt/include @dbt-labs/core-adapters
+
+# Postgres plugin
+/plugins/ @dbt-labs/core-adapters
+/plugins/postgres/setup.py @dbt-labs/core-adapters
+
+# Functional tests for adapter plugins
+/tests/adapter @dbt-labs/core-adapters
+
+### TESTS
+
+# Overlapping ownership for vast majority of unit + functional tests
+
+# Perf regression testing framework
+# This excludes the test project files itself since those aren't specific
+# framework changes (excluded by not setting an owner next to it- no owner)
+/performance @nathaniel-may
+/performance/projects
+
 ### ARTIFACTS
 
 /schemas/dbt @dbt-labs/cloud-artifacts
13 .github/ISSUE_TEMPLATE/implementation-ticket.yml (vendored)
@@ -30,16 +30,6 @@ body:
         What is the definition of done for this ticket? Include any relevant edge cases and/or test cases
     validations:
       required: true
-  - type: textarea
-    attributes:
-      label: Suggested Tests
-      description: |
-        Provide scenarios to test. Link to existing similar tests if appropriate.
-      placeholder: |
-        1. Test with no version specified in the schema file and use selection logic on a versioned model for a specific version. Expect pass.
-        2. Test with a version specified in the schema file that is no valid. Expect ParsingError.
-    validations:
-      required: true
   - type: textarea
     attributes:
       label: Impact to Other Teams
@@ -62,6 +52,7 @@ body:
     attributes:
       label: Context
      description: |
-        Provide the "why", motivation, and alternative approaches considered -- linking to previous refinement issues, spikes and documentation as appropriate
+        Provide the "why", motivation, and alternative approaches considered -- linking to previous refinement issues, spikes, Notion docs as appropriate
+    validations:
     validations:
       required: false
3 .github/_README.md (vendored)
@@ -47,8 +47,7 @@ ___
 
 ### How to re-run jobs
 
-- From the UI you can rerun from failure
-- You can retrigger the cla check by commenting on the PR with `@cla-bot check`
+- Some actions cannot be rerun in the GitHub UI. Namely the snyk checks and the cla check. Snyk checks are rerun by closing and reopening the PR. You can retrigger the cla check by commenting on the PR with `@cla-bot check`
 
 ___
 
21 .github/actions/latest-wrangler/action.yml (vendored)
@@ -1,21 +1,20 @@
-name: "GitHub package `latest` tag wrangler for containers"
-description: "Determines if the published image should include `latest` tags"
+name: "Github package 'latest' tag wrangler for containers"
+description: "Determines wether or not a given dbt container should be given a bare 'latest' tag (I.E. dbt-core:latest)"
 
 inputs:
   package_name:
-    description: "Package being published (i.e. `dbt-core`, `dbt-redshift`, etc.)"
+    description: "Package to check (I.E. dbt-core, dbt-redshift, etc)"
     required: true
   new_version:
-    description: "SemVer of the package being published (i.e. 1.7.2, 1.8.0a1, etc.)"
+    description: "Semver of the container being built (I.E. 1.0.4)"
     required: true
-  github_token:
-    description: "Auth token for GitHub (must have view packages scope)"
+  gh_token:
+    description: "Auth token for github (must have view packages scope)"
     required: true
 
 outputs:
-  tags:
-    description: "A list of tags to associate with this version"
+  latest:
+    description: "Wether or not built container should be tagged latest (bool)"
+  minor_latest:
+    description: "Wether or not built container should be tagged minor.latest (bool)"
 runs:
   using: "docker"
   image: "Dockerfile"
161 .github/actions/latest-wrangler/main.py (vendored)
@@ -1,71 +1,98 @@
 import os
-from packaging.version import Version, parse
-import requests
 import sys
-from typing import List
+import requests
+from distutils.util import strtobool
+from typing import Union
+from packaging.version import parse, Version
-
-
-def main():
-    package_name: str = os.environ["INPUT_PACKAGE_NAME"]
-    new_version: Version = parse(os.environ["INPUT_NEW_VERSION"])
-    github_token: str = os.environ["INPUT_GITHUB_TOKEN"]
-
-    response = _package_metadata(package_name, github_token)
-    published_versions = _published_versions(response)
-    new_version_tags = _new_version_tags(new_version, published_versions)
-    _register_tags(new_version_tags, package_name)
-
-
-def _package_metadata(package_name: str, github_token: str) -> requests.Response:
-    url = f"https://api.github.com/orgs/dbt-labs/packages/container/{package_name}/versions"
-    return requests.get(url, auth=("", github_token))
-
-
-def _published_versions(response: requests.Response) -> List[Version]:
-    package_metadata = response.json()
-    return [
-        parse(tag)
-        for version in package_metadata
-        for tag in version["metadata"]["container"]["tags"]
-        if "latest" not in tag
-    ]
-
-
-def _new_version_tags(new_version: Version, published_versions: List[Version]) -> List[str]:
-    # the package version is always a tag
-    tags = [str(new_version)]
-
-    # pre-releases don't get tagged with `latest`
-    if new_version.is_prerelease:
-        return tags
-
-    if new_version > max(published_versions):
-        tags.append("latest")
-
-    published_patches = [
-        version
-        for version in published_versions
-        if version.major == new_version.major and version.minor == new_version.minor
-    ]
-    if new_version > max(published_patches):
-        tags.append(f"{new_version.major}.{new_version.minor}.latest")
-
-    return tags
-
-
-def _register_tags(tags: List[str], package_name: str) -> None:
-    fully_qualified_tags = ",".join([f"ghcr.io/dbt-labs/{package_name}:{tag}" for tag in tags])
-    github_output = os.environ.get("GITHUB_OUTPUT")
-    with open(github_output, "at", encoding="utf-8") as gh_output:
-        gh_output.write(f"fully_qualified_tags={fully_qualified_tags}")
-
-
-def _validate_response(response: requests.Response) -> None:
-    message = response["message"]
-    if response.status_code != 200:
-        print(f"Call to GitHub API failed: {response.status_code} - {message}")
-        sys.exit(1)
-
 
 if __name__ == "__main__":
-    main()
+
+    # get inputs
+    package = os.environ["INPUT_PACKAGE"]
+    new_version = parse(os.environ["INPUT_NEW_VERSION"])
+    gh_token = os.environ["INPUT_GH_TOKEN"]
+    halt_on_missing = strtobool(os.environ.get("INPUT_HALT_ON_MISSING", "False"))
+
+    # get package metadata from github
+    package_request = requests.get(
+        f"https://api.github.com/orgs/dbt-labs/packages/container/{package}/versions",
+        auth=("", gh_token),
+    )
+    package_meta = package_request.json()
+
+    # Log info if we don't get a 200
+    if package_request.status_code != 200:
+        print(f"Call to GH API failed: {package_request.status_code} {package_meta['message']}")
+
+    # Make an early exit if there is no matching package in github
+    if package_request.status_code == 404:
+        if halt_on_missing:
+            sys.exit(1)
+        # everything is the latest if the package doesn't exist
+        github_output = os.environ.get("GITHUB_OUTPUT")
+        with open(github_output, "at", encoding="utf-8") as gh_output:
+            gh_output.write("latest=True")
+            gh_output.write("minor_latest=True")
+        sys.exit(0)
+
+    # TODO: verify package meta is "correct"
+    # https://github.com/dbt-labs/dbt-core/issues/4640
+
+    # map versions and tags
+    version_tag_map = {
+        version["id"]: version["metadata"]["container"]["tags"] for version in package_meta
+    }
+
+    # is pre-release
+    pre_rel = True if any(x in str(new_version) for x in ["a", "b", "rc"]) else False
+
+    # semver of current latest
+    for version, tags in version_tag_map.items():
+        if "latest" in tags:
+            # N.B. This seems counterintuitive, but we expect any version tagged
+            # 'latest' to have exactly three associated tags:
+            # latest, major.minor.latest, and major.minor.patch.
+            # Subtracting everything that contains the string 'latest' gets us
+            # the major.minor.patch which is what's needed for comparison.
+            current_latest = parse([tag for tag in tags if "latest" not in tag][0])
+        else:
+            current_latest = False
+
+    # semver of current_minor_latest
+    for version, tags in version_tag_map.items():
+        if f"{new_version.major}.{new_version.minor}.latest" in tags:
+            # Similar to above, only now we expect exactly two tags:
+            # major.minor.patch and major.minor.latest
+            current_minor_latest = parse([tag for tag in tags if "latest" not in tag][0])
+        else:
+            current_minor_latest = False
+
+    def is_latest(
+        pre_rel: bool, new_version: Version, remote_latest: Union[bool, Version]
+    ) -> bool:
+        """Determine if a given contaier should be tagged 'latest' based on:
+        - it's pre-release status
+        - it's version
+        - the version of a previously identified container tagged 'latest'
+
+        :param pre_rel: Wether or not the version of the new container is a pre-release
+        :param new_version: The version of the new container
+        :param remote_latest: The version of the previously identified container that's
+            already tagged latest or False
+        """
+        # is a pre-release = not latest
+        if pre_rel:
+            return False
+        # + no latest tag found = is latest
+        if not remote_latest:
+            return True
+        # + if remote version is lower than current = is latest, else not latest
+        return True if remote_latest <= new_version else False
+
+    latest = is_latest(pre_rel, new_version, current_latest)
+    minor_latest = is_latest(pre_rel, new_version, current_minor_latest)
+
+    github_output = os.environ.get("GITHUB_OUTPUT")
+    with open(github_output, "at", encoding="utf-8") as gh_output:
+        gh_output.write(f"latest={latest}")
+        gh_output.write(f"minor_latest={minor_latest}")
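For context, the tag-selection helper removed on the `-` side of the hunk above (`_new_version_tags`) can be exercised on its own. A small self-contained sketch, with the helper copied verbatim from the diff and made-up version numbers for illustration:

```python
from typing import List

from packaging.version import Version, parse


def _new_version_tags(new_version: Version, published_versions: List[Version]) -> List[str]:
    # the package version is always a tag
    tags = [str(new_version)]

    # pre-releases don't get tagged with `latest`
    if new_version.is_prerelease:
        return tags

    if new_version > max(published_versions):
        tags.append("latest")

    published_patches = [
        version
        for version in published_versions
        if version.major == new_version.major and version.minor == new_version.minor
    ]
    if new_version > max(published_patches):
        tags.append(f"{new_version.major}.{new_version.minor}.latest")

    return tags


published = [parse(v) for v in ["1.7.4", "1.8.0"]]

# A prerelease only gets its own version tag.
print(_new_version_tags(parse("1.9.0a1"), published))  # ['1.9.0a1']

# A patch on an older minor gets the minor `.latest` tag but not bare `latest`.
print(_new_version_tags(parse("1.7.5"), published))    # ['1.7.5', '1.7.latest']

# A new highest version gets both.
print(_new_version_tags(parse("1.8.1"), published))    # ['1.8.1', 'latest', '1.8.latest']
```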
5 .github/dependabot.yml (vendored)
@@ -11,6 +11,11 @@ updates:
     schedule:
       interval: "daily"
     rebase-strategy: "disabled"
+  - package-ecosystem: "pip"
+    directory: "/plugins/postgres"
+    schedule:
+      interval: "daily"
+    rebase-strategy: "disabled"
 
   # docker dependencies
   - package-ecosystem: "docker"
2 .github/workflows/backport.yml (vendored)
@@ -35,6 +35,6 @@ jobs:
       github.event.pull_request.merged
       && contains(github.event.label.name, 'backport')
     steps:
-      - uses: tibdex/backport@v2.0.4
+      - uses: tibdex/backport@v2.0.3
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
4 .github/workflows/bot-changelog.yml (vendored)
@@ -41,6 +41,8 @@ jobs:
       include:
         - label: "dependencies"
           changie_kind: "Dependencies"
+        - label: "snyk"
+          changie_kind: "Security"
     runs-on: ubuntu-latest
 
     steps:
@@ -56,4 +58,4 @@ jobs:
         commit_message: "Add automated changelog yaml from template for bot PR"
         changie_kind: ${{ matrix.changie_kind }}
         label: ${{ matrix.label }}
-        custom_changelog_string: "custom:\n  Author: ${{ github.event.pull_request.user.login }}\n  Issue: ${{ github.event.pull_request.number }}"
+        custom_changelog_string: "custom:\n  Author: ${{ github.event.pull_request.user.login }}\n  PR: ${{ github.event.pull_request.number }}"
2 .github/workflows/changelog-existence.yml (vendored)
@@ -19,8 +19,6 @@ name: Check Changelog Entry
 on:
   pull_request_target:
     types: [opened, reopened, labeled, unlabeled, synchronize]
-    paths-ignore: ['.changes/**', '.github/**', 'tests/**', '**.md', '**.yml']
-
   workflow_dispatch:
 
 defaults:
41 .github/workflows/check-artifact-changes.yml (vendored)
@@ -1,41 +0,0 @@
-name: Check Artifact Changes
-
-on:
-  pull_request:
-    types: [ opened, reopened, labeled, unlabeled, synchronize ]
-    paths-ignore: [ '.changes/**', '.github/**', 'tests/**', '**.md', '**.yml' ]
-
-  workflow_dispatch:
-
-jobs:
-  check-artifact-changes:
-    runs-on: ubuntu-latest
-    if: ${{ !contains(github.event.pull_request.labels.*.name, 'artifact_minor_upgrade') }}
-    steps:
-      - name: Checkout code
-        uses: actions/checkout@v4
-        with:
-          fetch-depth: 0
-
-      - name: Check for changes in core/dbt/artifacts
-        # https://github.com/marketplace/actions/paths-changes-filter
-        uses: dorny/paths-filter@v3
-        id: check_artifact_changes
-        with:
-          filters: |
-            artifacts_changed:
-              - 'core/dbt/artifacts/**'
-          list-files: shell
-
-      - name: Fail CI if artifacts have changed
-        if: steps.check_artifact_changes.outputs.artifacts_changed == 'true'
-        run: |
-          echo "CI failure: Artifact changes checked in core/dbt/artifacts directory."
-          echo "Files changed: ${{ steps.check_artifact_changes.outputs.artifacts_changed_files }}"
-          echo "To bypass this check, confirm that the change is not breaking (https://github.com/dbt-labs/dbt-core/blob/main/core/dbt/artifacts/README.md#breaking-changes) and add the 'artifact_minor_upgrade' label to the PR."
-          exit 1
-
-      - name: CI check passed
-        if: steps.check_artifact_changes.outputs.artifacts_changed == 'false'
-        run: |
-          echo "No prohibited artifact changes found in core/dbt/artifacts. CI check passed."
39 .github/workflows/community-label.yml (vendored)
@@ -1,39 +0,0 @@
-# **what?**
-# Label a PR with a `community` label when a PR is opened by a user outside core/adapters
-
-# **why?**
-# To streamline triage and ensure that community contributions are recognized and prioritized
-
-# **when?**
-# When a PR is opened, not in draft or moved from draft to ready for review
-
-
-name: Label community PRs
-
-on:
-  # have to use pull_request_target since community PRs come from forks
-  pull_request_target:
-    types: [opened, ready_for_review]
-
-defaults:
-  run:
-    shell: bash
-
-permissions:
-  pull-requests: write # labels PRs
-  contents: read # reads team membership
-
-jobs:
-  open_issues:
-    # If this PR already has the community label, no need to relabel it
-    # If this PR is opened and not draft, determine if it needs to be labeled
-    # if the PR is converted out of draft, determine if it needs to be labeled
-    if: |
-      (!contains(github.event.pull_request.labels.*.name, 'community') &&
-      (github.event.action == 'opened' && github.event.pull_request.draft == false ) ||
-      github.event.action == 'ready_for_review' )
-    uses: dbt-labs/actions/.github/workflows/label-community.yml@main
-    with:
-      github_team: 'core-group'
-      label: 'community'
-    secrets: inherit
26 .github/workflows/docs-issue.yml (vendored)
@@ -1,18 +1,19 @@
 # **what?**
-# Open an issue in docs.getdbt.com when an issue is labeled `user docs` and closed as completed
+# Open an issue in docs.getdbt.com when a PR is labeled `user docs`
 
 # **why?**
 # To reduce barriers for keeping docs up to date
 
 # **when?**
-# When an issue is labeled `user docs` and is closed as completed. Can be labeled before or after the issue is closed.
+# When a PR is labeled `user docs` and is merged. Runs on pull_request_target to run off the workflow already merged,
+# not the workflow that existed on the PR branch. This allows old PRs to get comments.
 
 
-name: Open issues in docs.getdbt.com repo when an issue is labeled
-run-name: "Open an issue in docs.getdbt.com for issue #${{ github.event.issue.number }}"
+name: Open issues in docs.getdbt.com repo when a PR is labeled
+run-name: "Open an issue in docs.getdbt.com for PR #${{ github.event.pull_request.number }}"
 
 on:
-  issues:
+  pull_request_target:
     types: [labeled, closed]
 
 defaults:
@@ -20,22 +21,23 @@ defaults:
     shell: bash
 
 permissions:
-  issues: write # comments on issues
+  issues: write # opens new issues
+  pull-requests: write # comments on PRs
+
 
 jobs:
   open_issues:
-    # we only want to run this when the issue is closed as completed and the label `user docs` has been assigned.
-    # If this logic does not exist in this workflow, it runs the
+    # we only want to run this when the PR has been merged or the label in the labeled event is `user docs`. Otherwise it runs the
     # risk of duplicaton of issues being created due to merge and label both triggering this workflow to run and neither having
     # generating the comment before the other runs. This lives here instead of the shared workflow because this is where we
     # decide if it should run or not.
     if: |
-      (github.event.issue.state == 'closed' &&
-      github.event.issue.state_reason == 'completed' &&
-      contains( github.event.issue.labels.*.name, 'user docs'))
+      (github.event.pull_request.merged == true) &&
+      ((github.event.action == 'closed' && contains( github.event.pull_request.labels.*.name, 'user docs')) ||
+      (github.event.action == 'labeled' && github.event.label.name == 'user docs'))
     uses: dbt-labs/actions/.github/workflows/open-issue-in-repo.yml@main
     with:
       issue_repository: "dbt-labs/docs.getdbt.com"
-      issue_title: "Docs Changes Needed from ${{ github.event.repository.name }} Issue #${{ github.event.issue.number }}"
+      issue_title: "Docs Changes Needed from ${{ github.event.repository.name }} PR #${{ github.event.pull_request.number }}"
      issue_body: "At a minimum, update body to include a link to the page on docs.getdbt.com requiring updates and what part(s) of the page you would like to see updated."
    secrets: inherit
26 .github/workflows/jira-creation.yml (vendored, new file)
@@ -0,0 +1,26 @@
+# **what?**
+# Mirrors issues into Jira. Includes the information: title,
+# GitHub Issue ID and URL
+
+# **why?**
+# Jira is our tool for tracking and we need to see these issues in there
+
+# **when?**
+# On issue creation or when an issue is labeled `Jira`
+
+name: Jira Issue Creation
+
+on:
+  issues:
+    types: [opened, labeled]
+
+permissions:
+  issues: write
+
+jobs:
+  call-creation-action:
+    uses: dbt-labs/actions/.github/workflows/jira-creation-actions.yml@main
+    secrets:
+      JIRA_BASE_URL: ${{ secrets.JIRA_BASE_URL }}
+      JIRA_USER_EMAIL: ${{ secrets.JIRA_USER_EMAIL }}
+      JIRA_API_TOKEN: ${{ secrets.JIRA_API_TOKEN }}

26 .github/workflows/jira-label.yml (vendored, new file)
@@ -0,0 +1,26 @@
+# **what?**
+# Calls mirroring Jira label Action. Includes adding a new label
+# to an existing issue or removing a label as well
+
+# **why?**
+# Jira is our tool for tracking and we need to see these labels in there
+
+# **when?**
+# On labels being added or removed from issues
+
+name: Jira Label Mirroring
+
+on:
+  issues:
+    types: [labeled, unlabeled]
+
+permissions:
+  issues: read
+
+jobs:
+  call-label-action:
+    uses: dbt-labs/actions/.github/workflows/jira-label-actions.yml@main
+    secrets:
+      JIRA_BASE_URL: ${{ secrets.JIRA_BASE_URL }}
+      JIRA_USER_EMAIL: ${{ secrets.JIRA_USER_EMAIL }}
+      JIRA_API_TOKEN: ${{ secrets.JIRA_API_TOKEN }}

27 .github/workflows/jira-transition.yml (vendored, new file)
@@ -0,0 +1,27 @@
+# **what?**
+# Transition a Jira issue to a new state
+# Only supports these GitHub Issue transitions:
+# closed, deleted, reopened
+
+# **why?**
+# Jira needs to be kept up-to-date
+
+# **when?**
+# On issue closing, deletion, reopened
+
+name: Jira Issue Transition
+
+on:
+  issues:
+    types: [closed, deleted, reopened]
+
+# no special access is needed
+permissions: read-all
+
+jobs:
+  call-transition-action:
+    uses: dbt-labs/actions/.github/workflows/jira-transition-actions.yml@main
+    secrets:
+      JIRA_BASE_URL: ${{ secrets.JIRA_BASE_URL }}
+      JIRA_USER_EMAIL: ${{ secrets.JIRA_USER_EMAIL }}
+      JIRA_API_TOKEN: ${{ secrets.JIRA_API_TOKEN }}
.github/workflows/main.yml (vendored) — 48 changed lines

@@ -47,10 +47,10 @@ jobs:

     steps:
       - name: Check out the repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@v3

       - name: Set up Python
-        uses: actions/setup-python@v5
+        uses: actions/setup-python@v4
         with:
           python-version: '3.8'

@@ -74,17 +74,17 @@ jobs:
     strategy:
       fail-fast: false
       matrix:
-        python-version: [ "3.8", "3.9", "3.10", "3.11", "3.12" ]
+        python-version: ["3.8", "3.9", "3.10", "3.11"]

     env:
       TOXENV: "unit"

     steps:
       - name: Check out the repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@v3

       - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v5
+        uses: actions/setup-python@v4
         with:
           python-version: ${{ matrix.python-version }}

@@ -95,12 +95,8 @@ jobs:
           python -m pip install tox
           tox --version

-      - name: Run unit tests
-        uses: nick-fields/retry@v3
-        with:
-          timeout_minutes: 10
-          max_attempts: 3
-          command: tox -e unit
+      - name: Run tox
+        run: tox

       - name: Get current date
         if: always()
@@ -111,7 +107,7 @@ jobs:

       - name: Upload Unit Test Coverage to Codecov
         if: ${{ matrix.python-version == '3.11' }}
-        uses: codecov/codecov-action@v4
+        uses: codecov/codecov-action@v3
         with:
           token: ${{ secrets.CODECOV_TOKEN }}
           flags: unit
@@ -139,7 +135,7 @@ jobs:
       - name: generate include
         id: generate-include
         run: |
-          INCLUDE=('"python-version":"3.8","os":"windows-latest"' '"python-version":"3.8","os":"macos-12"' )
+          INCLUDE=('"python-version":"3.8","os":"windows-latest"' '"python-version":"3.8","os":"macos-latest"' )
          INCLUDE_GROUPS="["
          for include in ${INCLUDE[@]}; do
            for group in $(seq 1 ${{ env.PYTHON_INTEGRATION_TEST_WORKERS }}); do
@@ -161,7 +157,7 @@ jobs:
     strategy:
       fail-fast: false
       matrix:
-        python-version: [ "3.8", "3.9", "3.10", "3.11", "3.12" ]
+        python-version: ["3.8", "3.9", "3.10", "3.11"]
         os: [ubuntu-20.04]
         split-group: ${{ fromJson(needs.integration-metadata.outputs.split-groups) }}
         include: ${{ fromJson(needs.integration-metadata.outputs.include) }}
@@ -179,10 +175,10 @@ jobs:

     steps:
       - name: Check out the repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@v3

       - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v5
+        uses: actions/setup-python@v4
         with:
           python-version: ${{ matrix.python-version }}

@@ -205,12 +201,8 @@ jobs:
           python -m pip install tox
           tox --version

-      - name: Run integration tests
-        uses: nick-fields/retry@v3
-        with:
-          timeout_minutes: 30
-          max_attempts: 3
-          command: tox -- --ddtrace
+      - name: Run tests
+        run: tox -- --ddtrace
         env:
           PYTEST_ADDOPTS: ${{ format('--splits {0} --group {1}', env.PYTHON_INTEGRATION_TEST_WORKERS, matrix.split-group) }}

@@ -221,15 +213,15 @@ jobs:
           CURRENT_DATE=$(date +'%Y-%m-%dT%H_%M_%S') # no colons allowed for artifacts
           echo "date=$CURRENT_DATE" >> $GITHUB_OUTPUT

-      - uses: actions/upload-artifact@v4
+      - uses: actions/upload-artifact@v3
         if: always()
         with:
-          name: logs_${{ matrix.python-version }}_${{ matrix.os }}_${{ matrix.split-group }}_${{ steps.date.outputs.date }}
+          name: logs_${{ matrix.python-version }}_${{ matrix.os }}_${{ steps.date.outputs.date }}
           path: ./logs

       - name: Upload Integration Test Coverage to Codecov
         if: ${{ matrix.python-version == '3.11' }}
-        uses: codecov/codecov-action@v4
+        uses: codecov/codecov-action@v3
         with:
           token: ${{ secrets.CODECOV_TOKEN }}
           flags: integration
@@ -258,10 +250,10 @@ jobs:

     steps:
       - name: Check out the repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@v3

       - name: Set up Python
-        uses: actions/setup-python@v5
+        uses: actions/setup-python@v4
         with:
           python-version: '3.8'

@@ -296,7 +288,7 @@ jobs:
       - name: Install source distributions
         # ignore dbt-1.0.0, which intentionally raises an error when installed from source
         run: |
-          find ./dist/*.gz -maxdepth 1 -type f | xargs python -m pip install --force-reinstall --find-links=dist/
+          find ./dist/dbt-[a-z]*.gz -maxdepth 1 -type f | xargs python -m pip install --force-reinstall --find-links=dist/

       - name: Check source distributions
         run: |
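The base branch wraps the `tox` invocations above in `nick-fields/retry`, while the head branch calls `tox` directly. A minimal sketch of reproducing the same runs locally, assuming the repository's `tox.ini` defines the `unit` and default integration environments the workflow uses:

```sh
python -m pip install tox
tox -e unit            # the unit-test env the workflow retries up to 3 times on the base branch
tox -- --ddtrace       # the integration invocation used by the head branch's "Run tests" step
```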
.github/workflows/model_performance.yml (vendored) — 20 changed lines

@@ -48,7 +48,7 @@ jobs:
       # explicitly checkout the performance runner from main regardless of which
       # version we are modeling.
       - name: Checkout
-        uses: actions/checkout@v4
+        uses: actions/checkout@v3
         with:
           ref: main

@@ -87,12 +87,12 @@ jobs:
       # explicitly checkout the performance runner from main regardless of which
       # version we are modeling.
       - name: Checkout
-        uses: actions/checkout@v4
+        uses: actions/checkout@v3
         with:
           ref: main

       # attempts to access a previously cached runner
-      - uses: actions/cache@v4
+      - uses: actions/cache@v3
         id: cache
         with:
           path: ${{ env.RUNNER_CACHE_PATH }}
@@ -148,7 +148,7 @@ jobs:
          echo "release_branch: ${{ needs.set-variables.outputs.release_branch }}"

       - name: Setup Python
-        uses: actions/setup-python@v5
+        uses: actions/setup-python@v4
         with:
           python-version: "3.8"

@@ -160,13 +160,13 @@ jobs:

       # explicitly checkout main to get the latest project definitions
       - name: Checkout
-        uses: actions/checkout@v4
+        uses: actions/checkout@v3
         with:
           ref: main

       # this was built in the previous job so it will be there.
       - name: Fetch Runner
-        uses: actions/cache@v4
+        uses: actions/cache@v3
         id: cache
         with:
           path: ${{ env.RUNNER_CACHE_PATH }}
@@ -195,7 +195,7 @@ jobs:
       - name: '[DEBUG] ls baseline directory after run'
         run: ls -R performance/baselines/

-      - uses: actions/upload-artifact@v4
+      - uses: actions/upload-artifact@v3
         with:
           name: baseline
           path: performance/baselines/${{ needs.set-variables.outputs.release_id }}/
@@ -225,7 +225,7 @@ jobs:
          echo "release_branch: ${{ needs.set-variables.outputs.release_branch }}"

       - name: Checkout
-        uses: actions/checkout@v4
+        uses: actions/checkout@v3
         with:
           ref: ${{ matrix.base-branch }}

@@ -235,7 +235,7 @@ jobs:
           git push origin ${{ matrix.target-branch }}
           git branch --set-upstream-to=origin/${{ matrix.target-branch }} ${{ matrix.target-branch }}

-      - uses: actions/download-artifact@v4
+      - uses: actions/download-artifact@v3
         with:
           name: baseline
           path: performance/baselines/${{ needs.set-variables.outputs.release_id }}
@@ -253,7 +253,7 @@ jobs:
           push: 'origin origin/${{ matrix.target-branch }}'

       - name: Create Pull Request
-        uses: peter-evans/create-pull-request@v6
+        uses: peter-evans/create-pull-request@v5
         with:
           author: 'Github Build Bot <buildbot@fishtownanalytics.com>'
           base: ${{ matrix.base-branch }}
.github/workflows/nightly-release.yml (vendored) — 16 changed lines

@@ -20,7 +20,6 @@ on:

 permissions:
   contents: write # this is the permission that allows creating a new release
-  packages: write # this is the permission that allows Docker release

 defaults:
   run:
@@ -34,15 +33,22 @@ jobs:
     runs-on: ubuntu-latest

     outputs:
+      commit_sha: ${{ steps.resolve-commit-sha.outputs.release_commit }}
       version_number: ${{ steps.nightly-release-version.outputs.number }}
       release_branch: ${{ steps.release-branch.outputs.name }}

     steps:
       - name: "Checkout ${{ github.repository }} Branch ${{ env.RELEASE_BRANCH }}"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v3
         with:
           ref: ${{ env.RELEASE_BRANCH }}

+      - name: "Resolve Commit To Release"
+        id: resolve-commit-sha
+        run: |
+          commit_sha=$(git rev-parse HEAD)
+          echo "release_commit=$commit_sha" >> $GITHUB_OUTPUT
+
       - name: "Get Current Version Number"
         id: version-number-sources
         run: |
@@ -82,6 +88,7 @@ jobs:
     steps:
       - name: "[DEBUG] Log Outputs"
         run: |
+          echo commit_sha    : ${{ needs.aggregate-release-data.outputs.commit_sha }}
          echo version_number: ${{ needs.aggregate-release-data.outputs.version_number }}
          echo release_branch: ${{ needs.aggregate-release-data.outputs.release_branch }}

@@ -90,8 +97,13 @@ jobs:

     uses: ./.github/workflows/release.yml
     with:
+      sha: ${{ needs.aggregate-release-data.outputs.commit_sha }}
       target_branch: ${{ needs.aggregate-release-data.outputs.release_branch }}
       version_number: ${{ needs.aggregate-release-data.outputs.version_number }}
+      build_script_path: "scripts/build-dist.sh"
+      env_setup_script_path: "scripts/env-setup.sh"
+      s3_bucket_name: "core-team-artifacts"
+      package_test_command: "dbt --version"
       test_run: true
       nightly_release: true
     secrets: inherit
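The added "Resolve Commit To Release" step pins the nightly release to a specific commit. A sketch of what it does, using the same commands as the step above (`$GITHUB_OUTPUT` is provided by the Actions runner):

```sh
commit_sha=$(git rev-parse HEAD)                       # commit currently checked out on the release branch
echo "release_commit=$commit_sha" >> "$GITHUB_OUTPUT"  # exposed downstream as needs.<job>.outputs.commit_sha
```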
.github/workflows/release-docker.yml (vendored, new file) — 118 lines

# **what?**
# This workflow will generate a series of docker images for dbt and push them to the github container registry

# **why?**
# Docker images for dbt are used in a number of important places throughout the dbt ecosystem. This is how we keep those images up-to-date.

# **when?**
# This is triggered manually

# **next steps**
# - build this into the release workflow (or conversly, break out the different release methods into their own workflow files)

name: Docker release

permissions:
  packages: write

on:
  workflow_dispatch:
    inputs:
      package:
        description: The package to release. _One_ of [dbt-core, dbt-redshift, dbt-bigquery, dbt-snowflake, dbt-spark, dbt-postgres]
        required: true
      version_number:
        description: The release version number (i.e. 1.0.0b1). Do not include `latest` tags or a leading `v`!
        required: true

jobs:
  get_version_meta:
    name: Get version meta
    runs-on: ubuntu-latest
    outputs:
      major: ${{ steps.version.outputs.major }}
      minor: ${{ steps.version.outputs.minor }}
      patch: ${{ steps.version.outputs.patch }}
      latest: ${{ steps.latest.outputs.latest }}
      minor_latest: ${{ steps.latest.outputs.minor_latest }}
    steps:
      - uses: actions/checkout@v3
      - name: Split version
        id: version
        run: |
          IFS="." read -r MAJOR MINOR PATCH <<< ${{ github.event.inputs.version_number }}
          echo "major=$MAJOR" >> $GITHUB_OUTPUT
          echo "minor=$MINOR" >> $GITHUB_OUTPUT
          echo "patch=$PATCH" >> $GITHUB_OUTPUT

      - name: Is pkg 'latest'
        id: latest
        uses: ./.github/actions/latest-wrangler
        with:
          package: ${{ github.event.inputs.package }}
          new_version: ${{ github.event.inputs.version_number }}
          gh_token: ${{ secrets.GITHUB_TOKEN }}
          halt_on_missing: False

  setup_image_builder:
    name: Set up docker image builder
    runs-on: ubuntu-latest
    needs: [get_version_meta]
    steps:
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2

  build_and_push:
    name: Build images and push to GHCR
    runs-on: ubuntu-latest
    needs: [setup_image_builder, get_version_meta]
    steps:
      - name: Get docker build arg
        id: build_arg
        run: |
          BUILD_ARG_NAME=$(echo ${{ github.event.inputs.package }} | sed 's/\-/_/g')
          BUILD_ARG_VALUE=$(echo ${{ github.event.inputs.package }} | sed 's/postgres/core/g')
          echo "build_arg_name=$BUILD_ARG_NAME" >> $GITHUB_OUTPUT
          echo "build_arg_value=$BUILD_ARG_VALUE" >> $GITHUB_OUTPUT

      - name: Log in to the GHCR
        uses: docker/login-action@v2
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

      - name: Build and push MAJOR.MINOR.PATCH tag
        uses: docker/build-push-action@v5
        with:
          file: docker/Dockerfile
          push: True
          target: ${{ github.event.inputs.package }}
          build-args: |
            ${{ steps.build_arg.outputs.build_arg_name }}_ref=${{ steps.build_arg.outputs.build_arg_value }}@v${{ github.event.inputs.version_number }}
          tags: |
            ghcr.io/dbt-labs/${{ github.event.inputs.package }}:${{ github.event.inputs.version_number }}

      - name: Build and push MINOR.latest tag
        uses: docker/build-push-action@v5
        if: ${{ needs.get_version_meta.outputs.minor_latest == 'True' }}
        with:
          file: docker/Dockerfile
          push: True
          target: ${{ github.event.inputs.package }}
          build-args: |
            ${{ steps.build_arg.outputs.build_arg_name }}_ref=${{ steps.build_arg.outputs.build_arg_value }}@v${{ github.event.inputs.version_number }}
          tags: |
            ghcr.io/dbt-labs/${{ github.event.inputs.package }}:${{ needs.get_version_meta.outputs.major }}.${{ needs.get_version_meta.outputs.minor }}.latest

      - name: Build and push latest tag
        uses: docker/build-push-action@v5
        if: ${{ needs.get_version_meta.outputs.latest == 'True' }}
        with:
          file: docker/Dockerfile
          push: True
          target: ${{ github.event.inputs.package }}
          build-args: |
            ${{ steps.build_arg.outputs.build_arg_name }}_ref=${{ steps.build_arg.outputs.build_arg_value }}@v${{ github.event.inputs.version_number }}
          tags: |
            ghcr.io/dbt-labs/${{ github.event.inputs.package }}:latest
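The "Get docker build arg" step derives both the build-arg name and its value from the selected package with two `sed` substitutions. A worked example for `package=dbt-postgres` (other packages follow the same pattern):

```sh
package="dbt-postgres"
BUILD_ARG_NAME=$(echo "$package" | sed 's/\-/_/g')            # dbt_postgres  (dashes -> underscores)
BUILD_ARG_VALUE=$(echo "$package" | sed 's/postgres/core/g')  # dbt-core      (dbt-postgres builds from the dbt-core ref)
echo "build_arg_name=$BUILD_ARG_NAME"
echo "build_arg_value=$BUILD_ARG_VALUE"
```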
.github/workflows/release.yml (vendored) — 148 changed lines

@@ -7,7 +7,6 @@
 # - run unit and integration tests against given commit;
 # - build and package that SHA;
 # - release it to GitHub and PyPI with that specific build;
-# - release it to Docker
 #
 # **why?**
 # Ensure an automated and tested release process
@@ -15,12 +14,15 @@
 # **when?**
 # This workflow can be run manually on demand or can be called by other workflows

-name: "Release to GitHub, PyPI & Docker"
-run-name: "Release ${{ inputs.version_number }} to GitHub, PyPI & Docker"
+name: Release to GitHub and PyPI

 on:
   workflow_dispatch:
     inputs:
+      sha:
+        description: "The last commit sha in the release"
+        type: string
+        required: true
       target_branch:
         description: "The branch to release from"
         type: string
@@ -29,6 +31,26 @@ on:
         description: "The release version number (i.e. 1.0.0b1)"
         type: string
         required: true
+      build_script_path:
+        description: "Build script path"
+        type: string
+        default: "scripts/build-dist.sh"
+        required: true
+      env_setup_script_path:
+        description: "Environment setup script path"
+        type: string
+        default: "scripts/env-setup.sh"
+        required: false
+      s3_bucket_name:
+        description: "AWS S3 bucket name"
+        type: string
+        default: "core-team-artifacts"
+        required: true
+      package_test_command:
+        description: "Package test command"
+        type: string
+        default: "dbt --version"
+        required: true
       test_run:
         description: "Test run (Publish release as draft)"
         type: boolean
@@ -39,13 +61,12 @@ on:
         type: boolean
         default: false
         required: false
-      only_docker:
-        description: "Only release Docker image, skip GitHub & PyPI"
-        type: boolean
-        default: false
-        required: false
   workflow_call:
     inputs:
+      sha:
+        description: "The last commit sha in the release"
+        type: string
+        required: true
       target_branch:
         description: "The branch to release from"
         type: string
@@ -54,6 +75,26 @@ on:
         description: "The release version number (i.e. 1.0.0b1)"
         type: string
         required: true
+      build_script_path:
+        description: "Build script path"
+        type: string
+        default: "scripts/build-dist.sh"
+        required: true
+      env_setup_script_path:
+        description: "Environment setup script path"
+        type: string
+        default: "scripts/env-setup.sh"
+        required: false
+      s3_bucket_name:
+        description: "AWS S3 bucket name"
+        type: string
+        default: "core-team-artifacts"
+        required: true
+      package_test_command:
+        description: "Package test command"
+        type: string
+        default: "dbt --version"
+        required: true
       test_run:
         description: "Test run (Publish release as draft)"
         type: boolean
@@ -73,47 +114,32 @@ defaults:
     shell: bash

 jobs:
-  job-setup:
+  log-inputs:
     name: Log Inputs
     runs-on: ubuntu-latest
-    outputs:
-      starting_sha: ${{ steps.set_sha.outputs.starting_sha }}
     steps:
       - name: "[DEBUG] Print Variables"
         run: |
-          echo Inputs
+          echo The last commit sha in the release: ${{ inputs.sha }}
           echo The branch to release from: ${{ inputs.target_branch }}
           echo The release version number: ${{ inputs.version_number }}
+          echo Build script path: ${{ inputs.build_script_path }}
+          echo Environment setup script path: ${{ inputs.env_setup_script_path }}
+          echo AWS S3 bucket name: ${{ inputs.s3_bucket_name }}
+          echo Package test command: ${{ inputs.package_test_command }}
           echo Test run: ${{ inputs.test_run }}
           echo Nightly release: ${{ inputs.nightly_release }}
-          echo Only Docker: ${{ inputs.only_docker }}
-
-      - name: "Checkout target branch"
-        uses: actions/checkout@v4
-        with:
-          ref: ${{ inputs.target_branch }}
-
-      # release-prep.yml really shouldn't take in the sha but since core + all adapters
-      # depend on it now this workaround lets us not input it manually with risk of error.
-      # The changes always get merged into the head so we can't use a specific commit for
-      # releases anyways.
-      - name: "Capture sha"
-        id: set_sha
-        run: |
-          echo "starting_sha=$(git rev-parse HEAD)" >> $GITHUB_OUTPUT

   bump-version-generate-changelog:
     name: Bump package version, Generate changelog
-    needs: [job-setup]
-    if: ${{ !inputs.only_docker }}

     uses: dbt-labs/dbt-release/.github/workflows/release-prep.yml@main

     with:
-      sha: ${{ needs.job-setup.outputs.starting_sha }}
+      sha: ${{ inputs.sha }}
       version_number: ${{ inputs.version_number }}
       target_branch: ${{ inputs.target_branch }}
-      env_setup_script_path: "scripts/env-setup.sh"
+      env_setup_script_path: ${{ inputs.env_setup_script_path }}
       test_run: ${{ inputs.test_run }}
       nightly_release: ${{ inputs.nightly_release }}

@@ -121,7 +147,7 @@ jobs:

   log-outputs-bump-version-generate-changelog:
     name: "[Log output] Bump package version, Generate changelog"
-    if: ${{ !failure() && !cancelled() && !inputs.only_docker }}
+    if: ${{ !failure() && !cancelled() }}

     needs: [bump-version-generate-changelog]

@@ -135,8 +161,8 @@ jobs:

   build-test-package:
     name: Build, Test, Package
-    if: ${{ !failure() && !cancelled() && !inputs.only_docker }}
-    needs: [job-setup, bump-version-generate-changelog]
+    if: ${{ !failure() && !cancelled() }}
+    needs: [bump-version-generate-changelog]

     uses: dbt-labs/dbt-release/.github/workflows/build.yml@main

@@ -144,9 +170,9 @@ jobs:
       sha: ${{ needs.bump-version-generate-changelog.outputs.final_sha }}
       version_number: ${{ inputs.version_number }}
       changelog_path: ${{ needs.bump-version-generate-changelog.outputs.changelog_path }}
-      build_script_path: "scripts/build-dist.sh"
-      s3_bucket_name: "core-team-artifacts"
-      package_test_command: "dbt --version"
+      build_script_path: ${{ inputs.build_script_path }}
+      s3_bucket_name: ${{ inputs.s3_bucket_name }}
+      package_test_command: ${{ inputs.package_test_command }}
       test_run: ${{ inputs.test_run }}
       nightly_release: ${{ inputs.nightly_release }}

@@ -156,7 +182,7 @@ jobs:

   github-release:
     name: GitHub Release
-    if: ${{ !failure() && !cancelled() && !inputs.only_docker }}
+    if: ${{ !failure() && !cancelled() }}

     needs: [bump-version-generate-changelog, build-test-package]

@@ -183,51 +209,6 @@ jobs:
       PYPI_API_TOKEN: ${{ secrets.PYPI_API_TOKEN }}
       TEST_PYPI_API_TOKEN: ${{ secrets.TEST_PYPI_API_TOKEN }}

-  determine-docker-package:
-    # dbt-postgres exists within dbt-core for versions 1.7 and earlier but is a separate package for 1.8 and later.
-    # determine if we need to release dbt-core or both dbt-core and dbt-postgres
-    name: Determine Docker Package
-    if: ${{ !failure() && !cancelled() }}
-    runs-on: ubuntu-latest
-    needs: [pypi-release]
-    outputs:
-      matrix: ${{ steps.determine-docker-package.outputs.matrix }}
-    steps:
-      - name: "Audit Version And Parse Into Parts"
-        id: semver
-        uses: dbt-labs/actions/parse-semver@v1.1.0
-        with:
-          version: ${{ inputs.version_number }}
-
-      - name: "Determine Packages to Release"
-        id: determine-docker-package
-        run: |
-          if [ ${{ steps.semver.outputs.minor }} -ge 8 ]; then
-            json_output={\"package\":[\"dbt-core\"]}
-          else
-            json_output={\"package\":[\"dbt-core\",\"dbt-postgres\"]}
-          fi
-          echo "matrix=$json_output" >> $GITHUB_OUTPUT
-
-  docker-release:
-    name: "Docker Release for ${{ matrix.package }}"
-    needs: [determine-docker-package]
-    # We cannot release to docker on a test run because it uses the tag in GitHub as
-    # what we need to release but draft releases don't actually tag the commit so it
-    # finds nothing to release
-    if: ${{ !failure() && !cancelled() && (!inputs.test_run || inputs.only_docker) }}
-    strategy:
-      matrix: ${{fromJson(needs.determine-docker-package.outputs.matrix)}}
-
-    permissions:
-      packages: write
-
-    uses: dbt-labs/dbt-release/.github/workflows/release-docker.yml@main
-    with:
-      package: ${{ matrix.package }}
-      version_number: ${{ inputs.version_number }}
-      test_run: ${{ inputs.test_run }}
-
   slack-notification:
     name: Slack Notification
     if: ${{ failure() && (!inputs.test_run || inputs.nightly_release) }}
@@ -238,7 +219,6 @@ jobs:
         build-test-package,
         github-release,
         pypi-release,
-        docker-release,
       ]

     uses: dbt-labs/dbt-release/.github/workflows/slack-post-notification.yml@main
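The removed `determine-docker-package` job encodes the dbt-postgres split: for 1.8 and later only dbt-core is released to Docker from this repository, while 1.7 and earlier release both images. A small sketch of the branch it takes, with the minor version hard-coded for illustration:

```sh
minor=7   # e.g. parsed from version 1.7.5; try 8 for a 1.8.x release
if [ "$minor" -ge 8 ]; then
  json_output='{"package":["dbt-core"]}'
else
  json_output='{"package":["dbt-core","dbt-postgres"]}'
fi
echo "matrix=$json_output"   # consumed via fromJson() as the docker-release strategy matrix
```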
.github/workflows/schema-check.yml (vendored) — 58 changed lines

@@ -13,18 +13,20 @@
 name: Artifact Schema Check

 on:
-  pull_request:
-    types: [ opened, reopened, labeled, unlabeled, synchronize ]
-    paths-ignore: [ '.changes/**', '.github/**', 'tests/**', '**.md', '**.yml' ]
-
   workflow_dispatch:
+  pull_request: #TODO: remove before merging
+  push:
+    branches:
+      - "develop"
+      - "*.latest"
+      - "releases/*"

 # no special access is needed
 permissions: read-all

 env:
   LATEST_SCHEMA_PATH: ${{ github.workspace }}/new_schemas
-  SCHEMA_DIFF_ARTIFACT: ${{ github.workspace }}/schema_changes.txt
+  SCHEMA_DIFF_ARTIFACT: ${{ github.workspace }}//schema_schanges.txt
   DBT_REPO_DIRECTORY: ${{ github.workspace }}/dbt
   SCHEMA_REPO_DIRECTORY: ${{ github.workspace }}/schemas.getdbt.com

@@ -35,41 +37,24 @@ jobs:

     steps:
       - name: Set up Python
-        uses: actions/setup-python@v5
+        uses: actions/setup-python@v4
         with:
           python-version: 3.8

       - name: Checkout dbt repo
-        uses: actions/checkout@v4
+        uses: actions/checkout@v3
         with:
           path: ${{ env.DBT_REPO_DIRECTORY }}

-      - name: Check for changes in core/dbt/artifacts
-        # https://github.com/marketplace/actions/paths-changes-filter
-        uses: dorny/paths-filter@v3
-        id: check_artifact_changes
-        with:
-          filters: |
-            artifacts_changed:
-              - 'core/dbt/artifacts/**'
-          list-files: shell
-        working-directory: ${{ env.DBT_REPO_DIRECTORY }}
-
-      - name: Succeed if no artifacts have changed
-        if: steps.check_artifact_changes.outputs.artifacts_changed == 'false'
-        run: |
-          echo "No artifact changes found in core/dbt/artifacts. CI check passed."
-
       - name: Checkout schemas.getdbt.com repo
-        if: steps.check_artifact_changes.outputs.artifacts_changed == 'true'
-        uses: actions/checkout@v4
+        uses: actions/checkout@v3
         with:
           repository: dbt-labs/schemas.getdbt.com
           ref: 'main'
+          ssh-key: ${{ secrets.SCHEMA_SSH_PRIVATE_KEY }}
           path: ${{ env.SCHEMA_REPO_DIRECTORY }}

       - name: Generate current schema
-        if: steps.check_artifact_changes.outputs.artifacts_changed == 'true'
         run: |
           cd ${{ env.DBT_REPO_DIRECTORY }}
           python3 -m venv env
@@ -80,17 +65,26 @@ jobs:

       # Copy generated schema files into the schemas.getdbt.com repo
       # Do a git diff to find any changes
-      # Ignore any lines with date-like (yyyy-mm-dd) or version-like (x.y.z) changes
+      # Ignore any date or version changes though
       - name: Compare schemas
-        if: steps.check_artifact_changes.outputs.artifacts_changed == 'true'
         run: |
           cp -r ${{ env.LATEST_SCHEMA_PATH }}/dbt ${{ env.SCHEMA_REPO_DIRECTORY }}
           cd ${{ env.SCHEMA_REPO_DIRECTORY }}
-          git diff -I='*[0-9]{4}-[0-9]{2}-[0-9]{2}' -I='*[0-9]+\.[0-9]+\.[0-9]+' --exit-code > ${{ env.SCHEMA_DIFF_ARTIFACT }}
+          diff_results=$(git diff -I='*[0-9]{4}-(0[1-9]|1[0-2])-(0[1-9]|[1-2][0-9]|3[0-1])T' \
+          -I='*[0-9]{1}.[0-9]{2}.[0-9]{1}(rc[0-9]|b[0-9]| )' --compact-summary)
+          if [[ $(echo diff_results) ]]; then
+            echo $diff_results
+            echo "Schema changes detected!"
+            git diff -I='*[0-9]{4}-(0[1-9]|1[0-2])-(0[1-9]|[1-2][0-9]|3[0-1])T' \
+            -I='*[0-9]{1}.[0-9]{2}.[0-9]{1}(rc[0-9]|b[0-9]| )' > ${{ env.SCHEMA_DIFF_ARTIFACT }}
+            exit 1
+          else
+            echo "No schema changes detected"
+          fi

       - name: Upload schema diff
-        uses: actions/upload-artifact@v4
-        if: ${{ failure() && steps.check_artifact_changes.outputs.artifacts_changed == 'true' }}
+        uses: actions/upload-artifact@v3
+        if: ${{ failure() }}
         with:
-          name: 'schema_changes.txt'
+          name: 'schema_schanges.txt'
           path: '${{ env.SCHEMA_DIFF_ARTIFACT }}'
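Both sides of the "Compare schemas" step rely on `git diff -I<regex>` to ignore lines that only differ by a generated timestamp or version string. A sketch of the base branch's variant, assuming the environment variables defined at the top of the workflow:

```sh
cd "$SCHEMA_REPO_DIRECTORY"
git diff -I='*[0-9]{4}-[0-9]{2}-[0-9]{2}' \
         -I='*[0-9]+\.[0-9]+\.[0-9]+' \
         --exit-code > "$SCHEMA_DIFF_ARTIFACT"   # exits non-zero (failing the step) only when real schema changes remain
```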
(The next two hunks belong to a further workflow file whose header was not captured in this view.)

@@ -69,12 +69,12 @@ jobs:

     steps:
       - name: checkout dev
-        uses: actions/checkout@v4
+        uses: actions/checkout@v3
         with:
           persist-credentials: false

       - name: Setup Python
-        uses: actions/setup-python@v5
+        uses: actions/setup-python@v4
         with:
           python-version: "3.8"

@@ -94,11 +94,7 @@ jobs:
       # integration tests generate a ton of logs in different files. the next step will find them all.
       # we actually care if these pass, because the normal test run doesn't usually include many json log outputs
       - name: Run integration tests
-        uses: nick-fields/retry@v3
-        with:
-          timeout_minutes: 30
-          max_attempts: 3
-          command: tox -e integration -- -nauto
+        run: tox -e integration -- -nauto
         env:
           PYTEST_ADDOPTS: ${{ format('--splits {0} --group {1}', env.PYTHON_INTEGRATION_TEST_WORKERS, matrix.split-group) }}
.github/workflows/test-repeater.yml (vendored) — 8 changed lines

@@ -36,7 +36,7 @@ on:
         type: choice
         options:
           - 'ubuntu-latest'
-          - 'macos-12'
+          - 'macos-latest'
           - 'windows-latest'
       num_runs_per_batch:
         description: 'Max number of times to run the test per batch. We always run 10 batches.'
@@ -83,12 +83,12 @@ jobs:

     steps:
       - name: "Checkout code"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v3
         with:
           ref: ${{ inputs.branch }}

       - name: "Setup Python"
-        uses: actions/setup-python@v5
+        uses: actions/setup-python@v4
         with:
           python-version: "${{ inputs.python_version }}"

@@ -101,7 +101,7 @@ jobs:

       # mac and windows don't use make due to limitations with docker with those runners in GitHub
       - name: "Set up postgres (macos)"
-        if: inputs.os == 'macos-12'
+        if: inputs.os == 'macos-latest'
         uses: ./.github/actions/setup-postgres-macos

       - name: "Set up postgres (windows)"
.github/workflows/test/.actrc (vendored, new file) — 1 line

-P ubuntu-latest=ghcr.io/catthehacker/ubuntu:act-latest
.github/workflows/test/.gitignore (vendored, new file) — 1 line

.secrets
.github/workflows/test/.secrets.EXAMPLE (vendored, new file) — 1 line

GITHUB_TOKEN=GH_PERSONAL_ACCESS_TOKEN_GOES_HERE
.github/workflows/test/inputs/release_docker.json (vendored, new file) — 6 lines

{
  "inputs": {
    "version_number": "1.0.1",
    "package": "dbt-postgres"
  }
}
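The `.actrc`, `.secrets.EXAMPLE`, and `release_docker.json` fixtures suggest the workflows are exercised locally with [nektos/act](https://github.com/nektos/act). A possible invocation, assuming act is installed and `.secrets` has been created from the example file (exact flags may vary between act versions):

```sh
act workflow_dispatch \
  -W .github/workflows/release-docker.yml \
  -e .github/workflows/test/inputs/release_docker.json \
  --secret-file .github/workflows/test/.secrets
```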
.github/workflows/version-bump.yml (vendored, new file) — 28 lines

# **what?**
# This workflow will take the new version number to bump to. With that
# it will run versionbump to update the version number everywhere in the
# code base and then run changie to create the corresponding changelog.
# A PR will be created with the changes that can be reviewed before committing.

# **why?**
# This is to aid in releasing dbt and making sure we have updated
# the version in all places and generated the changelog.

# **when?**
# This is triggered manually

name: Version Bump

on:
  workflow_dispatch:
    inputs:
      version_number:
        description: 'The version number to bump to (ex. 1.2.0, 1.3.0b1)'
        required: true

jobs:
  version_bump_and_changie:
    uses: dbt-labs/actions/.github/workflows/version-bump.yml@main
    with:
      version_number: ${{ inputs.version_number }}
    secrets: inherit # ok since what we are calling is internally maintained
(The remaining hunks in this view belong to files whose headers were not captured.)

@@ -1,4 +0,0 @@
-[settings]
-profile=black
-extend_skip_glob=.github/*,third-party-stubs/*,scripts/*
-known_first_party=dbt,dbt_adapters,dbt_common,dbt_extractor,dbt_semantic_interface
@@ -19,10 +19,6 @@ repos:
       exclude_types:
         - "markdown"
     - id: check-case-conflict
-  - repo: https://github.com/pycqa/isort
-    rev: 5.12.0
-    hooks:
-      - id: isort
   - repo: https://github.com/psf/black
     rev: 22.3.0
     hooks:
@@ -31,8 +31,7 @@ This is the docs website code. It comes from the dbt-docs repository, and is gen

 ## Adapters

-dbt uses an adapter-plugin pattern to extend support to different databases, warehouses, query engines, etc.
-Note: dbt-postgres used to exist in dbt-core but is now in [its own repo](https://github.com/dbt-labs/dbt-postgres)
+dbt uses an adapter-plugin pattern to extend support to different databases, warehouses, query engines, etc. For testing and development purposes, the dbt-postgres plugin lives alongside the dbt-core codebase, in the [`plugins`](plugins) subdirectory. Like other adapter plugins, it is a self-contained codebase and package that builds on top of dbt-core.

 Each adapter is a mix of python, Jinja2, and SQL. The adapter code also makes heavy use of Jinja2 to wrap modular chunks of SQL functionality, define default implementations, and allow plugins to override it.
@@ -1,6 +1,6 @@
 # dbt Core Changelog

-- This file provides a full account of all changes to `dbt-core`
+- This file provides a full account of all changes to `dbt-core` and `dbt-postgres`
 - Changes are listed under the (pre)release in which they first appear. Subsequent releases include changes from previous releases.
 - "Breaking changes" listed under a version may require action from end users or external maintainers when upgrading to that version.
 - Do not edit this file directly. This file is auto-generated using [changie](https://github.com/miniscruff/changie). For details on how to document a change, see [the contributing guide](https://github.com/dbt-labs/dbt-core/blob/main/CONTRIBUTING.md#adding-changelog-entry)
@@ -10,7 +10,6 @@
 For information on prior major and minor releases, see their changelogs:

-* [1.8](https://github.com/dbt-labs/dbt-core/blob/1.8.latest/CHANGELOG.md)
 * [1.7](https://github.com/dbt-labs/dbt-core/blob/1.7.latest/CHANGELOG.md)
 * [1.6](https://github.com/dbt-labs/dbt-core/blob/1.6.latest/CHANGELOG.md)
 * [1.5](https://github.com/dbt-labs/dbt-core/blob/1.5.latest/CHANGELOG.md)
@@ -10,7 +10,6 @@
 6. [Debugging](#debugging)
 7. [Adding or modifying a changelog entry](#adding-or-modifying-a-changelog-entry)
 8. [Submitting a Pull Request](#submitting-a-pull-request)
-9. [Troubleshooting Tips](#troubleshooting-tips)

 ## About this document

@@ -22,10 +21,10 @@ If you get stuck, we're happy to help! Drop us a line in the `#dbt-core-developm

 ### Notes

-- **Adapters:** Is your issue or proposed code change related to a specific [database adapter](https://docs.getdbt.com/docs/available-adapters)? If so, please open issues, PRs, and discussions in that adapter's repository instead.
+- **Adapters:** Is your issue or proposed code change related to a specific [database adapter](https://docs.getdbt.com/docs/available-adapters)? If so, please open issues, PRs, and discussions in that adapter's repository instead. The sole exception is Postgres; the `dbt-postgres` plugin lives in this repository (`dbt-core`).
 - **CLA:** Please note that anyone contributing code to `dbt-core` must sign the [Contributor License Agreement](https://docs.getdbt.com/docs/contributor-license-agreements). If you are unable to sign the CLA, the `dbt-core` maintainers will unfortunately be unable to merge any of your Pull Requests. We welcome you to participate in discussions, open issues, and comment on existing ones.
 - **Branches:** All pull requests from community contributors should target the `main` branch (default). If the change is needed as a patch for a minor version of dbt that has already been released (or is already a release candidate), a maintainer will backport the changes in your PR to the relevant "latest" release branch (`1.0.latest`, `1.1.latest`, ...). If an issue fix applies to a release branch, that fix should be first committed to the development branch and then to the release branch (rarely release-branch fixes may not apply to `main`).
-- **Releases**: Before releasing a new minor version of Core, we prepare a series of alphas and release candidates to allow users (especially employees of dbt Labs!) to test the new version in live environments. This is an important quality assurance step, as it exposes the new code to a wide variety of complicated deployments and can surface bugs before official release. Releases are accessible via our [supported installation methods](https://docs.getdbt.com/docs/core/installation-overview#install-dbt-core).
+- **Releases**: Before releasing a new minor version of Core, we prepare a series of alphas and release candidates to allow users (especially employees of dbt Labs!) to test the new version in live environments. This is an important quality assurance step, as it exposes the new code to a wide variety of complicated deployments and can surface bugs before official release. Releases are accessible via pip, homebrew, and dbt Cloud.

 ## Getting the code

@@ -45,7 +44,9 @@ If you are not a member of the `dbt-labs` GitHub organization, you can contribut

 ### dbt Labs contributors

-If you are a member of the `dbt-labs` GitHub organization, you will have push access to the `dbt-core` repo. Rather than forking `dbt-core` to make your changes, just clone the repository, check out a new branch, and push directly to that branch.
+If you are a member of the `dbt-labs` GitHub organization, you will have push access to the `dbt-core` repo. Rather than forking `dbt-core` to make your changes, just clone the repository, check out a new branch, and push directly to that branch. Branch names should be fixed by `CT-XXX/` where:
+* CT stands for 'core team'
+* XXX stands for a JIRA ticket number

 ## Setting up an environment

@@ -170,9 +171,9 @@ Finally, you can also run a specific test or group of tests using [`pytest`](htt

 ```sh
 # run all unit tests in a file
-python3 -m pytest tests/unit/test_base_column.py
+python3 -m pytest tests/unit/test_graph.py
 # run a specific unit test
-python3 -m pytest tests/unit/test_base_column.py::TestNumericType::test__numeric_type
+python3 -m pytest tests/unit/test_graph.py::GraphTest::test__dependency_list
 # run specific Postgres functional tests
 python3 -m pytest tests/functional/sources
 ```
@@ -220,12 +221,10 @@ You don't need to worry about which `dbt-core` version your change will go into.

 ## Submitting a Pull Request

-Code can be merged into the current development branch `main` by opening a pull request. If the proposal looks like it's on the right track, then a `dbt-core` maintainer will triage the PR and label it as `ready_for_review`. From this point, two code reviewers will be assigned with the aim of responding to any updates to the PR within about one week. They may suggest code revision for style or clarity, or request that you add unit or integration test(s). These are good things! We believe that, with a little bit of help, anyone can contribute high-quality code. Once merged, your contribution will be available for the next release of `dbt-core`.
+Code can be merged into the current development branch `main` by opening a pull request. A `dbt-core` maintainer will review your PR. They may suggest code revision for style or clarity, or request that you add unit or integration test(s). These are good things! We believe that, with a little bit of help, anyone can contribute high-quality code.

 Automated tests run via GitHub Actions. If you're a first-time contributor, all tests (including code checks and unit tests) will require a maintainer to approve. Changes in the `dbt-core` repository trigger integration tests against Postgres. dbt Labs also provides CI environments in which to test changes to other adapters, triggered by PRs in those adapters' repositories, as well as periodic maintenance checks of each adapter in concert with the latest `dbt-core` code changes.

 Once all tests are passing and your PR has been approved, a `dbt-core` maintainer will merge your changes into the active development branch. And that's it! Happy developing :tada:

-## Troubleshooting Tips
-
 Sometimes, the content license agreement auto-check bot doesn't find a user's entry in its roster. If you need to force a rerun, add `@cla-bot check` in a comment on the pull request.
Some files were not shown because too many files have changed in this diff.