forked from repo-mirrors/dbt-core

Compare commits: jerco/pyth... → v1.3.0rc2 (6 Commits)
| Author | SHA1 | Date |
|---|---|---|
| | c94c891f73 | |
| | fc45a51582 | |
| | 6042469c71 | |
| | 2584465169 | |
| | 9a81a4dfe3 | |
| | b6a82446e5 | |
@@ -1,5 +1,5 @@
[bumpversion]
current_version = 1.3.0b2
current_version = 1.3.0rc2
parse = (?P<major>\d+)
	\.(?P<minor>\d+)
	\.(?P<patch>\d+)
75  .changes/1.3.0-rc1.md  Normal file
@@ -0,0 +1,75 @@
## dbt-core 1.3.0-rc1 - September 28, 2022

### Breaking Changes

- Renaming Metric Spec Attributes ([#5774](https://github.com/dbt-labs/dbt-core/issues/5774), [#5775](https://github.com/dbt-labs/dbt-core/pull/5775))

### Features

- merge_exclude_columns for incremental materialization ([#5260](https://github.com/dbt-labs/dbt-core/issues/5260), [#5457](https://github.com/dbt-labs/dbt-core/pull/5457))
- Search current working directory for `profiles.yml` ([#5411](https://github.com/dbt-labs/dbt-core/issues/5411), [#5717](https://github.com/dbt-labs/dbt-core/pull/5717))
- Adding the `window` parameter to the metric spec. ([#5721](https://github.com/dbt-labs/dbt-core/issues/5721), [#5722](https://github.com/dbt-labs/dbt-core/pull/5722))
- Add invocation args dict to ProviderContext class ([#5524](https://github.com/dbt-labs/dbt-core/issues/5524), [#5782](https://github.com/dbt-labs/dbt-core/pull/5782))
- Adds new cli framework ([#5526](https://github.com/dbt-labs/dbt-core/issues/5526), [#5647](https://github.com/dbt-labs/dbt-core/pull/5647))
- Flags work with new Click CLI ([#5529](https://github.com/dbt-labs/dbt-core/issues/5529), [#5790](https://github.com/dbt-labs/dbt-core/pull/5790))
- Add metadata env method to ProviderContext class ([#5522](https://github.com/dbt-labs/dbt-core/issues/5522), [#5794](https://github.com/dbt-labs/dbt-core/pull/5794))
- Array macros ([#5520](https://github.com/dbt-labs/dbt-core/issues/5520), [#5823](https://github.com/dbt-labs/dbt-core/pull/5823))
- Add enabled config to exposures and metrics ([#5422](https://github.com/dbt-labs/dbt-core/issues/5422), [#5815](https://github.com/dbt-labs/dbt-core/pull/5815))
- add -fr flag shorthand ([#5878](https://github.com/dbt-labs/dbt-core/issues/5878), [#5879](https://github.com/dbt-labs/dbt-core/pull/5879))
- add type_boolean as a data type macro ([#5739](https://github.com/dbt-labs/dbt-core/issues/5739), [#5875](https://github.com/dbt-labs/dbt-core/pull/5875))
- Support .dbtignore in project root to ignore certain files being read by dbt ([#5733](https://github.com/dbt-labs/dbt-core/issues/5733), [#5897](https://github.com/dbt-labs/dbt-core/pull/5897))
- This conditionally no-ops the warehouse connection at compile time, depending on an env var, disabling introspection/queries during compilation only. This is a temporary solution to more complex permissions requirements for the semantic layer. ([#5936](https://github.com/dbt-labs/dbt-core/issues/5936), [#5926](https://github.com/dbt-labs/dbt-core/pull/5926))
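To make the merge_exclude_columns entry above concrete, here is a minimal sketch of an incremental model that opts into it. The model, column, and source names are hypothetical and not taken from this changeset:

```sql
-- hypothetical model: models/orders_incremental.sql
{{
    config(
        materialized='incremental',
        unique_key='order_id',
        incremental_strategy='merge',
        -- columns listed here are skipped by the merge update, so values such as
        -- the row's original load timestamp are preserved on re-processing
        merge_exclude_columns=['inserted_at']
    )
}}

select * from {{ ref('stg_orders') }}
{% if is_incremental() %}
    where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```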

### Fixes

- Fix typos of comments in core/dbt/adapters/ ([#5690](https://github.com/dbt-labs/dbt-core/issues/5690), [#5693](https://github.com/dbt-labs/dbt-core/pull/5693))
- Include py.typed in MANIFEST.in. This enables packages that install dbt-core from pypi to use mypy. ([#5703](https://github.com/dbt-labs/dbt-core/issues/5703), [#5703](https://github.com/dbt-labs/dbt-core/pull/5703))
- Removal of all .coverage files when using the make clean command ([#5633](https://github.com/dbt-labs/dbt-core/issues/5633), [#5759](https://github.com/dbt-labs/dbt-core/pull/5759))
- Remove temp files generated by unit tests ([#5631](https://github.com/dbt-labs/dbt-core/issues/5631), [#5749](https://github.com/dbt-labs/dbt-core/pull/5749))
- Fix warnings as errors during tests ([#5424](https://github.com/dbt-labs/dbt-core/issues/5424), [#5800](https://github.com/dbt-labs/dbt-core/pull/5800))
- Prevent event_history from holding references ([#5848](https://github.com/dbt-labs/dbt-core/issues/5848), [#5858](https://github.com/dbt-labs/dbt-core/pull/5858))
- ConfigSelectorMethod should check for bools ([#5890](https://github.com/dbt-labs/dbt-core/issues/5890), [#5889](https://github.com/dbt-labs/dbt-core/pull/5889))
- shorthand for full refresh should be one character ([#5878](https://github.com/dbt-labs/dbt-core/issues/5878), [#5908](https://github.com/dbt-labs/dbt-core/pull/5908))
- Fix macro resolution order during static analysis for custom generic tests ([#5720](https://github.com/dbt-labs/dbt-core/issues/5720), [#5907](https://github.com/dbt-labs/dbt-core/pull/5907))
- Fix race condition when invoking dbt via lib.py concurrently ([#5919](https://github.com/dbt-labs/dbt-core/issues/5919), [#5921](https://github.com/dbt-labs/dbt-core/pull/5921))

### Docs

- Refer to exposures by their label by default. ([dbt-docs/#306](https://github.com/dbt-labs/dbt-docs/issues/306), [dbt-docs/#307](https://github.com/dbt-labs/dbt-docs/pull/307))

### Under the Hood

- Migrate integration test 014, fix the snapshot hard delete test's timezone logic, and force all integration tests to run flags.set_from_args so that environment variables are accessible to all integration test threads. ([#5760](https://github.com/dbt-labs/dbt-core/issues/5760), [#5760](https://github.com/dbt-labs/dbt-core/pull/5760))
- Support dbt-metrics compilation by rebuilding flat_graph ([#5525](https://github.com/dbt-labs/dbt-core/issues/5525), [#5786](https://github.com/dbt-labs/dbt-core/pull/5786))
- Reworking the way we define the window attribute of metrics to match freshness tests ([#5722](https://github.com/dbt-labs/dbt-core/issues/5722), [#5793](https://github.com/dbt-labs/dbt-core/pull/5793))
- Add PythonJobHelper base class in core and add more type checking ([#5802](https://github.com/dbt-labs/dbt-core/issues/5802), [#5802](https://github.com/dbt-labs/dbt-core/pull/5802))
- The link did not go to the anchor directly, now it does ([#5813](https://github.com/dbt-labs/dbt-core/issues/5813), [#5814](https://github.com/dbt-labs/dbt-core/pull/5814))
- remove key as reserved keyword from test_bool_or ([#5817](https://github.com/dbt-labs/dbt-core/issues/5817), [#5818](https://github.com/dbt-labs/dbt-core/pull/5818))
- Convert default selector tests to pytest ([#5728](https://github.com/dbt-labs/dbt-core/issues/5728), [#5820](https://github.com/dbt-labs/dbt-core/pull/5820))
- Compatibility for metric attribute renaming ([#5807](https://github.com/dbt-labs/dbt-core/issues/5807), [#5825](https://github.com/dbt-labs/dbt-core/pull/5825))
- remove source quoting setting in adapter tests ([#5836](https://github.com/dbt-labs/dbt-core/issues/5836), [#5839](https://github.com/dbt-labs/dbt-core/pull/5839))
- Add name validation for metrics ([#5456](https://github.com/dbt-labs/dbt-core/issues/5456), [#5841](https://github.com/dbt-labs/dbt-core/pull/5841))
- Validate exposure name and add label ([#5606](https://github.com/dbt-labs/dbt-core/issues/5606), [#5844](https://github.com/dbt-labs/dbt-core/pull/5844))
- Adding validation for metric expression attribute ([#5871](https://github.com/dbt-labs/dbt-core/issues/5871), [#5873](https://github.com/dbt-labs/dbt-core/pull/5873))
- Profiling and Adapter Management work with Click CLI ([#5531](https://github.com/dbt-labs/dbt-core/issues/5531), [#5892](https://github.com/dbt-labs/dbt-core/pull/5892))
- Reparse references to deleted metric ([#5444](https://github.com/dbt-labs/dbt-core/issues/5444), [#5920](https://github.com/dbt-labs/dbt-core/pull/5920))

### Dependency

- Bump black from 22.6.0 to 22.8.0 ([#4904](https://github.com/dbt-labs/dbt-core/issues/4904), [#5750](https://github.com/dbt-labs/dbt-core/pull/5750))
- Bump python from 3.10.6-slim-bullseye to 3.10.7-slim-bullseye in /docker ([#4904](https://github.com/dbt-labs/dbt-core/issues/4904), [#5805](https://github.com/dbt-labs/dbt-core/pull/5805))

### Contributors
- [@bbroeksema](https://github.com/bbroeksema) ([#5749](https://github.com/dbt-labs/dbt-core/pull/5749))
- [@callum-mcdata](https://github.com/callum-mcdata) ([#5775](https://github.com/dbt-labs/dbt-core/pull/5775), [#5722](https://github.com/dbt-labs/dbt-core/pull/5722), [#5793](https://github.com/dbt-labs/dbt-core/pull/5793), [#5825](https://github.com/dbt-labs/dbt-core/pull/5825), [#5873](https://github.com/dbt-labs/dbt-core/pull/5873))
- [@danielcmessias](https://github.com/danielcmessias) ([#5889](https://github.com/dbt-labs/dbt-core/pull/5889))
- [@dave-connors-3](https://github.com/dave-connors-3) ([#5457](https://github.com/dbt-labs/dbt-core/pull/5457), [#5879](https://github.com/dbt-labs/dbt-core/pull/5879), [#5908](https://github.com/dbt-labs/dbt-core/pull/5908))
- [@dbeatty10](https://github.com/dbeatty10) ([#5717](https://github.com/dbt-labs/dbt-core/pull/5717), [#5823](https://github.com/dbt-labs/dbt-core/pull/5823))
- [@drewbanin](https://github.com/drewbanin) ([#5921](https://github.com/dbt-labs/dbt-core/pull/5921))
- [@graciegoheen](https://github.com/graciegoheen) ([#5823](https://github.com/dbt-labs/dbt-core/pull/5823))
- [@jared-rimmer](https://github.com/jared-rimmer) ([#5782](https://github.com/dbt-labs/dbt-core/pull/5782), [#5794](https://github.com/dbt-labs/dbt-core/pull/5794), [#5759](https://github.com/dbt-labs/dbt-core/pull/5759))
- [@jpmmcneill](https://github.com/jpmmcneill) ([#5875](https://github.com/dbt-labs/dbt-core/pull/5875))
- [@panasenco](https://github.com/panasenco) ([#5703](https://github.com/dbt-labs/dbt-core/pull/5703))
- [@racheldaniel](https://github.com/racheldaniel) ([#5926](https://github.com/dbt-labs/dbt-core/pull/5926))
- [@sdebruyn](https://github.com/sdebruyn) ([#5814](https://github.com/dbt-labs/dbt-core/pull/5814), [#5818](https://github.com/dbt-labs/dbt-core/pull/5818), [#5839](https://github.com/dbt-labs/dbt-core/pull/5839))
- [@yoiki](https://github.com/yoiki) ([#5693](https://github.com/dbt-labs/dbt-core/pull/5693))
9  .changes/1.3.0-rc2.md  Normal file
@@ -0,0 +1,9 @@
## dbt-core 1.3.0-rc2 - October 03, 2022

### Features

- Migrate dbt-utils current_timestamp macros into core + adapters ([#5521](https://github.com/dbt-labs/dbt-core/issues/5521), [#5838](https://github.com/dbt-labs/dbt-core/pull/5838))

### Fixes

- Account for disabled flags on models in schema files more completely ([#3992](https://github.com/dbt-labs/dbt-core/issues/3992), [#5868](https://github.com/dbt-labs/dbt-core/pull/5868))
7  .changes/1.3.0/Features-20220914-095625.yaml  Normal file
@@ -0,0 +1,7 @@
kind: Features
body: Migrate dbt-utils current_timestamp macros into core + adapters
time: 2022-09-14T09:56:25.97818-07:00
custom:
  Author: colin-rogers-dbt
  Issue: "5521"
  PR: "5838"
7  .changes/1.3.0/Fixes-20220916-104854.yaml  Normal file
@@ -0,0 +1,7 @@
kind: Fixes
body: Account for disabled flags on models in schema files more completely
time: 2022-09-16T10:48:54.162273-05:00
custom:
  Author: emmyoop
  Issue: "3992"
  PR: "5868"
2  .changie.yaml  Executable file → Normal file
@@ -44,7 +44,7 @@ custom:
  footerFormat: |
    {{- $contributorDict := dict }}
    {{- /* any names added to this list should be all lowercase for later matching purposes */}}
    {{- $core_team := list "peterallenwebb" "emmyoop" "nathaniel-may" "gshank" "leahwicz" "chenyulinx" "stu-k" "iknox-fa" "versusfacit" "mcknight-42" "jtcohen6" "dependabot[bot]" "snyk-bot" }}
    {{- $core_team := list "peterallenwebb" "emmyoop" "nathaniel-may" "gshank" "leahwicz" "chenyulinx" "stu-k" "iknox-fa" "versusfacit" "mcknight-42" "jtcohen6" "dependabot[bot]" "snyk-bot" "colin-rogers-dbt" }}
    {{- range $change := .Changes }}
    {{- $authorList := splitList " " $change.Custom.Author }}
    {{- /* loop through all authors for a PR */}}
34  .github/workflows/version-bump.yml  vendored
@@ -46,6 +46,16 @@ jobs:
          source env/bin/activate
          pip install --upgrade pip

      - name: Add Homebrew to PATH
        run: |
          echo "/home/linuxbrew/.linuxbrew/bin:/home/linuxbrew/.linuxbrew/sbin" >> $GITHUB_PATH

      - name: Install Homebrew packages
        run: |
          brew install pre-commit
          brew tap miniscruff/changie https://github.com/miniscruff/changie
          brew install changie

      - name: Audit Version and Parse Into Parts
        id: semver
        uses: dbt-labs/actions/parse-semver@v1
@@ -70,18 +80,8 @@ jobs:
          env/bin/bumpversion --allow-dirty --new-version ${{ github.event.inputs.version_number }} major
          git status

      # this step will fail on whitespace errors but also correct them
      - name: Format bumpversion file
        continue-on-error: true
        run: |
          brew install pre-commit
          pre-commit run trailing-whitespace --files .bumpversion.cfg
          git status

      - name: Run changie
        run: |
          brew tap miniscruff/changie https://github.com/miniscruff/changie
          brew install changie
          if [[ ${{ steps.semver.outputs.is-pre-release }} -eq 1 ]]
          then
            changie batch ${{ steps.semver.outputs.base-version }} --move-dir '${{ steps.semver.outputs.base-version }}' --prerelease '${{ steps.semver.outputs.pre-release }}'
@@ -91,6 +91,20 @@ jobs:
          changie merge
          git status

      # this step will fail on whitespace errors but also correct them
      - name: Remove trailing whitespace
        continue-on-error: true
        run: |
          pre-commit run trailing-whitespace --files .bumpversion.cfg CHANGELOG.md .changes/*
          git status

      # this step will fail on newline errors but also correct them
      - name: Removing extra newlines
        continue-on-error: true
        run: |
          pre-commit run end-of-file-fixer --files .bumpversion.cfg CHANGELOG.md .changes/*
          git status

      - name: Commit version bump to branch
        uses: EndBug/add-and-commit@v7
        with:
89  CHANGELOG.md
@@ -5,6 +5,94 @@
- "Breaking changes" listed under a version may require action from end users or external maintainers when upgrading to that version.
- Do not edit this file directly. This file is auto-generated using [changie](https://github.com/miniscruff/changie). For details on how to document a change, see [the contributing guide](https://github.com/dbt-labs/dbt-core/blob/main/CONTRIBUTING.md#adding-changelog-entry)

## dbt-core 1.3.0-rc2 - October 03, 2022

### Features

- Migrate dbt-utils current_timestamp macros into core + adapters ([#5521](https://github.com/dbt-labs/dbt-core/issues/5521), [#5838](https://github.com/dbt-labs/dbt-core/pull/5838))

### Fixes

- Account for disabled flags on models in schema files more completely ([#3992](https://github.com/dbt-labs/dbt-core/issues/3992), [#5868](https://github.com/dbt-labs/dbt-core/pull/5868))


## dbt-core 1.3.0-rc1 - September 28, 2022

### Breaking Changes

- Renaming Metric Spec Attributes ([#5774](https://github.com/dbt-labs/dbt-core/issues/5774), [#5775](https://github.com/dbt-labs/dbt-core/pull/5775))

### Features

- merge_exclude_columns for incremental materialization ([#5260](https://github.com/dbt-labs/dbt-core/issues/5260), [#5457](https://github.com/dbt-labs/dbt-core/pull/5457))
- Search current working directory for `profiles.yml` ([#5411](https://github.com/dbt-labs/dbt-core/issues/5411), [#5717](https://github.com/dbt-labs/dbt-core/pull/5717))
- Adding the `window` parameter to the metric spec. ([#5721](https://github.com/dbt-labs/dbt-core/issues/5721), [#5722](https://github.com/dbt-labs/dbt-core/pull/5722))
- Add invocation args dict to ProviderContext class ([#5524](https://github.com/dbt-labs/dbt-core/issues/5524), [#5782](https://github.com/dbt-labs/dbt-core/pull/5782))
- Adds new cli framework ([#5526](https://github.com/dbt-labs/dbt-core/issues/5526), [#5647](https://github.com/dbt-labs/dbt-core/pull/5647))
- Flags work with new Click CLI ([#5529](https://github.com/dbt-labs/dbt-core/issues/5529), [#5790](https://github.com/dbt-labs/dbt-core/pull/5790))
- Add metadata env method to ProviderContext class ([#5522](https://github.com/dbt-labs/dbt-core/issues/5522), [#5794](https://github.com/dbt-labs/dbt-core/pull/5794))
- Array macros ([#5520](https://github.com/dbt-labs/dbt-core/issues/5520), [#5823](https://github.com/dbt-labs/dbt-core/pull/5823))
- Add enabled config to exposures and metrics ([#5422](https://github.com/dbt-labs/dbt-core/issues/5422), [#5815](https://github.com/dbt-labs/dbt-core/pull/5815))
- add -fr flag shorthand ([#5878](https://github.com/dbt-labs/dbt-core/issues/5878), [#5879](https://github.com/dbt-labs/dbt-core/pull/5879))
- add type_boolean as a data type macro ([#5739](https://github.com/dbt-labs/dbt-core/issues/5739), [#5875](https://github.com/dbt-labs/dbt-core/pull/5875))
- Support .dbtignore in project root to ignore certain files being read by dbt ([#5733](https://github.com/dbt-labs/dbt-core/issues/5733), [#5897](https://github.com/dbt-labs/dbt-core/pull/5897))
- This conditionally no-ops the warehouse connection at compile time, depending on an env var, disabling introspection/queries during compilation only. This is a temporary solution to more complex permissions requirements for the semantic layer. ([#5936](https://github.com/dbt-labs/dbt-core/issues/5936), [#5926](https://github.com/dbt-labs/dbt-core/pull/5926))

### Fixes

- Fix typos of comments in core/dbt/adapters/ ([#5690](https://github.com/dbt-labs/dbt-core/issues/5690), [#5693](https://github.com/dbt-labs/dbt-core/pull/5693))
- Include py.typed in MANIFEST.in. This enables packages that install dbt-core from pypi to use mypy. ([#5703](https://github.com/dbt-labs/dbt-core/issues/5703), [#5703](https://github.com/dbt-labs/dbt-core/pull/5703))
- Removal of all .coverage files when using the make clean command ([#5633](https://github.com/dbt-labs/dbt-core/issues/5633), [#5759](https://github.com/dbt-labs/dbt-core/pull/5759))
- Remove temp files generated by unit tests ([#5631](https://github.com/dbt-labs/dbt-core/issues/5631), [#5749](https://github.com/dbt-labs/dbt-core/pull/5749))
- Fix warnings as errors during tests ([#5424](https://github.com/dbt-labs/dbt-core/issues/5424), [#5800](https://github.com/dbt-labs/dbt-core/pull/5800))
- Prevent event_history from holding references ([#5848](https://github.com/dbt-labs/dbt-core/issues/5848), [#5858](https://github.com/dbt-labs/dbt-core/pull/5858))
- ConfigSelectorMethod should check for bools ([#5890](https://github.com/dbt-labs/dbt-core/issues/5890), [#5889](https://github.com/dbt-labs/dbt-core/pull/5889))
- shorthand for full refresh should be one character ([#5878](https://github.com/dbt-labs/dbt-core/issues/5878), [#5908](https://github.com/dbt-labs/dbt-core/pull/5908))
- Fix macro resolution order during static analysis for custom generic tests ([#5720](https://github.com/dbt-labs/dbt-core/issues/5720), [#5907](https://github.com/dbt-labs/dbt-core/pull/5907))
- Fix race condition when invoking dbt via lib.py concurrently ([#5919](https://github.com/dbt-labs/dbt-core/issues/5919), [#5921](https://github.com/dbt-labs/dbt-core/pull/5921))

### Docs

- Refer to exposures by their label by default. ([dbt-docs/#306](https://github.com/dbt-labs/dbt-docs/issues/306), [dbt-docs/#307](https://github.com/dbt-labs/dbt-docs/pull/307))

### Under the Hood

- Migrate integration test 014, fix the snapshot hard delete test's timezone logic, and force all integration tests to run flags.set_from_args so that environment variables are accessible to all integration test threads. ([#5760](https://github.com/dbt-labs/dbt-core/issues/5760), [#5760](https://github.com/dbt-labs/dbt-core/pull/5760))
- Support dbt-metrics compilation by rebuilding flat_graph ([#5525](https://github.com/dbt-labs/dbt-core/issues/5525), [#5786](https://github.com/dbt-labs/dbt-core/pull/5786))
- Reworking the way we define the window attribute of metrics to match freshness tests ([#5722](https://github.com/dbt-labs/dbt-core/issues/5722), [#5793](https://github.com/dbt-labs/dbt-core/pull/5793))
- Add PythonJobHelper base class in core and add more type checking ([#5802](https://github.com/dbt-labs/dbt-core/issues/5802), [#5802](https://github.com/dbt-labs/dbt-core/pull/5802))
- The link did not go to the anchor directly, now it does ([#5813](https://github.com/dbt-labs/dbt-core/issues/5813), [#5814](https://github.com/dbt-labs/dbt-core/pull/5814))
- remove key as reserved keyword from test_bool_or ([#5817](https://github.com/dbt-labs/dbt-core/issues/5817), [#5818](https://github.com/dbt-labs/dbt-core/pull/5818))
- Convert default selector tests to pytest ([#5728](https://github.com/dbt-labs/dbt-core/issues/5728), [#5820](https://github.com/dbt-labs/dbt-core/pull/5820))
- Compatibility for metric attribute renaming ([#5807](https://github.com/dbt-labs/dbt-core/issues/5807), [#5825](https://github.com/dbt-labs/dbt-core/pull/5825))
- remove source quoting setting in adapter tests ([#5836](https://github.com/dbt-labs/dbt-core/issues/5836), [#5839](https://github.com/dbt-labs/dbt-core/pull/5839))
- Add name validation for metrics ([#5456](https://github.com/dbt-labs/dbt-core/issues/5456), [#5841](https://github.com/dbt-labs/dbt-core/pull/5841))
- Validate exposure name and add label ([#5606](https://github.com/dbt-labs/dbt-core/issues/5606), [#5844](https://github.com/dbt-labs/dbt-core/pull/5844))
- Adding validation for metric expression attribute ([#5871](https://github.com/dbt-labs/dbt-core/issues/5871), [#5873](https://github.com/dbt-labs/dbt-core/pull/5873))
- Profiling and Adapter Management work with Click CLI ([#5531](https://github.com/dbt-labs/dbt-core/issues/5531), [#5892](https://github.com/dbt-labs/dbt-core/pull/5892))
- Reparse references to deleted metric ([#5444](https://github.com/dbt-labs/dbt-core/issues/5444), [#5920](https://github.com/dbt-labs/dbt-core/pull/5920))

### Dependency

- Bump black from 22.6.0 to 22.8.0 ([#4904](https://github.com/dbt-labs/dbt-core/issues/4904), [#5750](https://github.com/dbt-labs/dbt-core/pull/5750))
- Bump python from 3.10.6-slim-bullseye to 3.10.7-slim-bullseye in /docker ([#4904](https://github.com/dbt-labs/dbt-core/issues/4904), [#5805](https://github.com/dbt-labs/dbt-core/pull/5805))

### Contributors
- [@bbroeksema](https://github.com/bbroeksema) ([#5749](https://github.com/dbt-labs/dbt-core/pull/5749))
- [@callum-mcdata](https://github.com/callum-mcdata) ([#5775](https://github.com/dbt-labs/dbt-core/pull/5775), [#5722](https://github.com/dbt-labs/dbt-core/pull/5722), [#5793](https://github.com/dbt-labs/dbt-core/pull/5793), [#5825](https://github.com/dbt-labs/dbt-core/pull/5825), [#5873](https://github.com/dbt-labs/dbt-core/pull/5873))
- [@danielcmessias](https://github.com/danielcmessias) ([#5889](https://github.com/dbt-labs/dbt-core/pull/5889))
- [@dave-connors-3](https://github.com/dave-connors-3) ([#5457](https://github.com/dbt-labs/dbt-core/pull/5457), [#5879](https://github.com/dbt-labs/dbt-core/pull/5879), [#5908](https://github.com/dbt-labs/dbt-core/pull/5908))
- [@dbeatty10](https://github.com/dbeatty10) ([#5717](https://github.com/dbt-labs/dbt-core/pull/5717), [#5823](https://github.com/dbt-labs/dbt-core/pull/5823))
- [@drewbanin](https://github.com/drewbanin) ([#5921](https://github.com/dbt-labs/dbt-core/pull/5921))
- [@graciegoheen](https://github.com/graciegoheen) ([#5823](https://github.com/dbt-labs/dbt-core/pull/5823))
- [@jared-rimmer](https://github.com/jared-rimmer) ([#5782](https://github.com/dbt-labs/dbt-core/pull/5782), [#5794](https://github.com/dbt-labs/dbt-core/pull/5794), [#5759](https://github.com/dbt-labs/dbt-core/pull/5759))
- [@jpmmcneill](https://github.com/jpmmcneill) ([#5875](https://github.com/dbt-labs/dbt-core/pull/5875))
- [@panasenco](https://github.com/panasenco) ([#5703](https://github.com/dbt-labs/dbt-core/pull/5703))
- [@racheldaniel](https://github.com/racheldaniel) ([#5926](https://github.com/dbt-labs/dbt-core/pull/5926))
- [@sdebruyn](https://github.com/sdebruyn) ([#5814](https://github.com/dbt-labs/dbt-core/pull/5814), [#5818](https://github.com/dbt-labs/dbt-core/pull/5818), [#5839](https://github.com/dbt-labs/dbt-core/pull/5839))
- [@yoiki](https://github.com/yoiki) ([#5693](https://github.com/dbt-labs/dbt-core/pull/5693))

## dbt-core 1.3.0-b2 - August 29, 2022

### Features
@@ -55,7 +143,6 @@
- [@sungchun12](https://github.com/sungchun12) ([#5397](https://github.com/dbt-labs/dbt-core/pull/5397), [dbt-docs/#281](https://github.com/dbt-labs/dbt-docs/pull/281))
- [@varun-dc](https://github.com/varun-dc) ([#5627](https://github.com/dbt-labs/dbt-core/pull/5627))


## dbt-core 1.3.0-b1 - July 29, 2022
### Features
- Python model initial version ([#5261](https://github.com/dbt-labs/dbt-core/issues/5261), [#5421](https://github.com/dbt-labs/dbt-core/pull/5421))
@@ -15,7 +15,7 @@ def statically_extract_macro_calls(string, ctx, db_wrapper=None):
        if hasattr(func_call, "node") and hasattr(func_call.node, "name"):
            func_name = func_call.node.name
        else:
            # func_call for dbt_utils.current_timestamp macro
            # func_call for dbt.current_timestamp macro
            # Call(
            #     node=Getattr(
            #         node=Name(
@@ -1088,8 +1088,13 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
    def add_node(self, source_file: AnySourceFile, node: ManifestNodes, test_from=None):
        self.add_node_nofile(node)
        if isinstance(source_file, SchemaSourceFile):
            assert test_from
            source_file.add_test(node.unique_id, test_from)
            if isinstance(node, ParsedGenericTestNode):
                assert test_from
                source_file.add_test(node.unique_id, test_from)
            if isinstance(node, ParsedMetric):
                source_file.metrics.append(node.unique_id)
            if isinstance(node, ParsedExposure):
                source_file.exposures.append(node.unique_id)
        else:
            source_file.nodes.append(node.unique_id)
@@ -1,13 +1,3 @@
{% macro current_timestamp() -%}
  {{ adapter.dispatch('current_timestamp', 'dbt')() }}
{%- endmacro %}

{% macro default__current_timestamp() -%}
  {{ exceptions.raise_not_implemented(
    'current_timestamp macro not implemented for adapter '+adapter.type()) }}
{%- endmacro %}


{% macro collect_freshness(source, loaded_at_field, filter) %}
  {{ return(adapter.dispatch('collect_freshness', 'dbt')(source, loaded_at_field, filter))}}
{% endmacro %}
@@ -0,0 +1,44 @@
{%- macro current_timestamp() -%}
  {{ adapter.dispatch('current_timestamp', 'dbt')() }}
{%- endmacro -%}

{% macro default__current_timestamp() -%}
  {{ exceptions.raise_not_implemented(
    'current_timestamp macro not implemented for adapter ' + adapter.type()) }}
{%- endmacro %}

{%- macro snapshot_get_time() -%}
  {{ adapter.dispatch('snapshot_get_time', 'dbt')() }}
{%- endmacro -%}

{% macro default__snapshot_get_time() %}
  {{ current_timestamp() }}
{% endmacro %}

---------------------------------------------

/* {#
    DEPRECATED: DO NOT USE IN NEW PROJECTS

    This is ONLY to handle the fact that Snowflake + Postgres had functionally
    different implementations of {{ dbt.current_timestamp }} + {{ dbt_utils.current_timestamp }}

    If you had a project or package that called {{ dbt_utils.current_timestamp() }}, you should
    continue to use this macro to guarantee identical behavior on those two databases.
#} */

{% macro current_timestamp_backcompat() %}
  {{ return(adapter.dispatch('current_timestamp_backcompat', 'dbt')()) }}
{% endmacro %}

{% macro default__current_timestamp_backcompat() %}
  current_timestamp::timestamp
{% endmacro %}

{% macro current_timestamp_in_utc_backcompat() %}
  {{ return(adapter.dispatch('current_timestamp_in_utc_backcompat', 'dbt')()) }}
{% endmacro %}

{% macro default__current_timestamp_in_utc_backcompat() %}
  {{ return(adapter.dispatch('current_timestamp_backcompat', 'dbt')()) }}
{% endmacro %}
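A minimal sketch of how a model consumes these macros after the migration (the model name is hypothetical): current_timestamp() resolves through adapter.dispatch to an adapter implementation such as now() on Postgres, while the backcompat variants are only intended for projects that previously called dbt_utils.current_timestamp() and need identical behavior, per the deprecation note above.

```sql
-- hypothetical model: models/load_metadata.sql
select
    {{ current_timestamp() }} as loaded_at,
    -- only needed when migrating off dbt_utils.current_timestamp()
    {{ current_timestamp_backcompat() }} as loaded_at_backcompat
```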
@@ -46,19 +46,6 @@
  {%- endfor -%})
{%- endmacro %}


{#
    Get the current time cross-db
#}
{% macro snapshot_get_time() -%}
  {{ adapter.dispatch('snapshot_get_time', 'dbt')() }}
{%- endmacro %}

{% macro default__snapshot_get_time() -%}
  {{ current_timestamp() }}
{%- endmacro %}


{#
    Core strategy definitions
#}
@@ -1,3 +1,4 @@
from copy import deepcopy
from dataclasses import dataclass
from dataclasses import field
from datetime import datetime
@@ -368,6 +369,8 @@ class ManifestLoader:
                    project, project_parser_files[project.project_name], parser_types
                )

            self.process_nodes()

            self._perf_info.parse_project_elapsed = time.perf_counter() - start_parse_projects

        # patch_sources converts the UnparsedSourceDefinitions in the
@@ -468,6 +471,7 @@ class ManifestLoader:
            dct = block.file.pp_dict
        else:
            dct = block.file.dict_from_yaml
        # this is where the schema file gets parsed
        parser.parse_file(block, dct=dct)
        # Came out of here with UnpatchedSourceDefinition containing configs at the source level
        # and not configs at the table level (as expected)
@@ -926,6 +930,31 @@ class ManifestLoader:
                continue
            _process_sources_for_exposure(self.manifest, current_project, exposure)

    def process_nodes(self):
        # make sure the nodes are in the manifest.nodes or the disabled dict,
        # correctly now that the schema files are also parsed
        disabled_nodes = []
        for node in self.manifest.nodes.values():
            if not node.config.enabled:
                disabled_nodes.append(node.unique_id)
                self.manifest.add_disabled_nofile(node)
        for unique_id in disabled_nodes:
            self.manifest.nodes.pop(unique_id)

        disabled_copy = deepcopy(self.manifest.disabled)
        for disabled in disabled_copy.values():
            for node in disabled:
                if node.config.enabled:
                    for dis_index, dis_node in enumerate(disabled):
                        # Remove node from disabled and unique_id from disabled dict if necessary
                        del self.manifest.disabled[node.unique_id][dis_index]
                        if not self.manifest.disabled[node.unique_id]:
                            self.manifest.disabled.pop(node.unique_id)

                    self.manifest.add_node_nofile(node)

        self.manifest.rebuild_ref_lookup()


def invalid_ref_fail_unless_test(node, target_model_name, target_model_package, disabled):
@@ -834,17 +834,24 @@ class PartialParsing:
        # remove elem node and remove unique_id from node_patches
        if elem_unique_id:
            # might have been already removed
            if elem_unique_id in self.saved_manifest.nodes:
                node = self.saved_manifest.nodes.pop(elem_unique_id)
                self.deleted_manifest.nodes[elem_unique_id] = node
            if (
                elem_unique_id in self.saved_manifest.nodes
                or elem_unique_id in self.saved_manifest.disabled
            ):
                if elem_unique_id in self.saved_manifest.nodes:
                    nodes = [self.saved_manifest.nodes.pop(elem_unique_id)]
                else:
                    # The value of disabled items is a list of nodes
                    nodes = self.saved_manifest.disabled.pop(elem_unique_id)
                # need to add the node source_file to pp_files
                file_id = node.file_id
                # need to copy new file to saved files in order to get content
                if file_id in self.new_files:
                    self.saved_files[file_id] = deepcopy(self.new_files[file_id])
                if self.saved_files[file_id]:
                    source_file = self.saved_files[file_id]
                    self.add_to_pp_files(source_file)
                for node in nodes:
                    file_id = node.file_id
                    # need to copy new file to saved files in order to get content
                    if file_id in self.new_files:
                        self.saved_files[file_id] = deepcopy(self.new_files[file_id])
                    if self.saved_files[file_id]:
                        source_file = self.saved_files[file_id]
                        self.add_to_pp_files(source_file)
            # remove from patches
            schema_file.node_patches.remove(elem_unique_id)
@@ -508,6 +508,8 @@ class SchemaParser(SimpleParser[GenericTestBlock, ParsedGenericTestNode]):
        # NonSourceParser.parse(), TestablePatchParser is a variety of
        # NodePatchParser
        if "models" in dct:
            # the models are already in the manifest as nodes when we reach this code,
            # even if they are disabled in the schema file
            parser = TestablePatchParser(self, yaml_block, "models")
            for test_block in parser.parse():
                self.parse_tests(test_block)
@@ -775,9 +777,10 @@ class NonSourceParser(YamlDocsReader, Generic[NonSourceTarget, Parsed]):
            refs: ParserRef = ParserRef.from_target(node)
        else:
            refs = ParserRef()
        # This adds the node_block to self.manifest
        # as a ParsedNodePatch or ParsedMacroPatch

        # There's no unique_id on the node yet so cannot add to disabled dict
        self.parse_patch(node_block, refs)

        # This will always be empty if the node is a macro or analysis
        return test_blocks

@@ -881,15 +884,34 @@ class NodePatchParser(NonSourceParser[NodeTarget, ParsedNodePatch], Generic[Node
                f"Unexpected yaml_key {patch.yaml_key} for patch in "
                f"file {source_file.path.original_file_path}"
            )
        # handle disabled nodes
        if unique_id is None:
            # Node might be disabled. Following call returns list of matching disabled nodes
            found_nodes = self.manifest.disabled_lookup.find(patch.name, patch.package_name)
            if found_nodes:
                # There might be multiple disabled nodes for this model
                if len(found_nodes) > 1 and patch.config.get("enabled"):
                    # There are multiple disabled nodes for this model and the schema file wants to enable one.
                    # We have no way to know which one to enable.
                    resource_type = found_nodes[0].unique_id.split(".")[0]
                    msg = (
                        f"Found {len(found_nodes)} matching disabled nodes for "
                        f"{resource_type} '{patch.name}'. Multiple nodes for the same "
                        "unique id cannot be enabled in the schema file. They must be enabled "
                        "in `dbt_project.yml` or in the sql files."
                    )
                    raise ParsingException(msg)

                # all nodes in the disabled dict have the same unique_id so just grab the first one
                # to append with the unique id
                source_file.append_patch(patch.yaml_key, found_nodes[0].unique_id)
                for node in found_nodes:
                    # We're saving the patch_path because we need to schedule
                    # re-application of the patch in partial parsing.
                    node.patch_path = source_file.file_id
                    # re-calculate the node config with the patch config. Always do this
                    # for the case when no config is set to ensure the default of true gets captured
                    if patch.config:
                        self.patch_node_config(node, patch)

                    node.patch(patch)
            else:
                msg = (
                    f"Did not find matching node for patch with name '{patch.name}' "
@@ -905,11 +927,13 @@ class NodePatchParser(NonSourceParser[NodeTarget, ParsedNodePatch], Generic[Node
        if node.patch_path:
            package_name, existing_file_path = node.patch_path.split("://")
            raise_duplicate_patch_name(patch, existing_file_path)
        source_file.append_patch(patch.yaml_key, unique_id)
        # If this patch has config changes, re-calculate the node config
        # with the patch config

        source_file.append_patch(patch.yaml_key, node.unique_id)
        # re-calculate the node config with the patch config. Always do this
        # for the case when no config is set to ensure the default of true gets captured
        if patch.config:
            self.patch_node_config(node, patch)

        node.patch(patch)
@@ -4,7 +4,7 @@ import yaml
import json
import warnings
from datetime import datetime
from typing import List
from typing import Dict, List
from contextlib import contextmanager
from dbt.adapters.factory import Adapter

@@ -35,6 +35,7 @@ from dbt.events.test_types import IntegrationTestDebug
# relation_from_name
# check_relation_types (table/view)
# check_relations_equal
# check_relation_has_expected_schema
# check_relations_equal_with_relations
# check_table_does_exist
# check_table_does_not_exist
@@ -321,6 +322,17 @@ def check_relations_equal(adapter, relation_names: List, compare_snapshot_cols=F
    )


# Used to check that a particular relation has an expected schema
# expected_schema should look like {"column_name": "expected datatype"}
def check_relation_has_expected_schema(adapter, relation_name, expected_schema: Dict):
    relation = relation_from_name(adapter, relation_name)
    with get_connection(adapter):
        actual_columns = {c.name: c.data_type for c in adapter.get_columns_in_relation(relation)}
    assert (
        actual_columns == expected_schema
    ), f"Actual schema did not match expected, actual: {json.dumps(actual_columns)}"


# This can be used when checking relations in different schemas, by supplying
# a list of relations. Called by 'check_relations_equal'.
# Uses:
@@ -235,5 +235,5 @@ def _get_adapter_plugin_names() -> Iterator[str]:
        yield plugin_name


__version__ = "1.3.0b2"
__version__ = "1.3.0rc2"
installed = get_installed_version()
@@ -25,7 +25,7 @@ with open(os.path.join(this_directory, "README.md")) as f:


package_name = "dbt-core"
package_version = "1.3.0b2"
package_version = "1.3.0rc2"
description = """With dbt, data analysts and engineers can build analytics \
    the way engineers build applications."""
@@ -14,12 +14,12 @@ FROM --platform=$build_for python:3.10.7-slim-bullseye as base
# N.B. The refs are updated automagically every release via bumpversion
# N.B. dbt-postgres is currently found in the core codebase so a value of dbt-core@<some_version> is correct

ARG dbt_core_ref=dbt-core@v1.3.0b2
ARG dbt_postgres_ref=dbt-core@v1.3.0b2
ARG dbt_redshift_ref=dbt-redshift@v1.3.0b2
ARG dbt_bigquery_ref=dbt-bigquery@v1.3.0b2
ARG dbt_snowflake_ref=dbt-snowflake@v1.3.0b2
ARG dbt_spark_ref=dbt-spark@v1.3.0b2
ARG dbt_core_ref=dbt-core@v1.3.0rc2
ARG dbt_postgres_ref=dbt-core@v1.3.0rc2
ARG dbt_redshift_ref=dbt-redshift@v1.3.0rc2
ARG dbt_bigquery_ref=dbt-bigquery@v1.3.0rc2
ARG dbt_snowflake_ref=dbt-snowflake@v1.3.0rc2
ARG dbt_spark_ref=dbt-spark@v1.3.0rc2
# special case args
ARG dbt_spark_version=all
ARG dbt_third_party
@@ -1 +1 @@
version = "1.3.0b2"
version = "1.3.0rc2"
@@ -117,23 +117,8 @@
  {{ return(load_result('check_schema_exists').table) }}
{% endmacro %}


{% macro postgres__current_timestamp() -%}
  now()
{%- endmacro %}

{% macro postgres__snapshot_string_as_time(timestamp) -%}
  {%- set result = "'" ~ timestamp ~ "'::timestamp without time zone" -%}
  {{ return(result) }}
{%- endmacro %}


{% macro postgres__snapshot_get_time() -%}
  {{ current_timestamp() }}::timestamp without time zone
{%- endmacro %}

{#
  Postgres tables have a maximum length off 63 characters, anything longer is silently truncated.
  Postgres tables have a maximum length of 63 characters, anything longer is silently truncated.
  Temp and backup relations add a lot of extra characters to the end of table names to ensure uniqueness.
  To prevent this going over the character limit, the base_relation name is truncated to ensure
  that name + suffix + uniquestring is < 63 characters.
20  plugins/postgres/dbt/include/postgres/macros/timestamps.sql  Normal file
@@ -0,0 +1,20 @@
{% macro postgres__current_timestamp() -%}
  now()
{%- endmacro %}

{% macro postgres__snapshot_string_as_time(timestamp) -%}
  {%- set result = "'" ~ timestamp ~ "'::timestamp without time zone" -%}
  {{ return(result) }}
{%- endmacro %}

{% macro postgres__snapshot_get_time() -%}
  {{ current_timestamp() }}::timestamp without time zone
{%- endmacro %}

{% macro postgres__current_timestamp_backcompat() %}
  current_timestamp::{{ type_timestamp() }}
{% endmacro %}

{% macro postgres__current_timestamp_in_utc_backcompat() %}
  (current_timestamp at time zone 'utc')::{{ type_timestamp() }}
{% endmacro %}
@@ -41,7 +41,7 @@ def _dbt_psycopg2_name():


package_name = "dbt-postgres"
package_version = "1.3.0b2"
package_version = "1.3.0rc2"
description = """The postgres adapter plugin for dbt (data build tool)"""

this_directory = os.path.abspath(os.path.dirname(__file__))
@@ -0,0 +1,13 @@
version: 2

models:
  - name: model_one
    description: "The first model"
  - name: model_three
    description: "The third model"
    config:
      enabled: false
    columns:
      - name: id
        tests:
          - unique
@@ -0,0 +1,13 @@
version: 2

models:
  - name: model_one
    description: "The first model"
  - name: model_three
    description: "The third model"
    config:
      enabled: true
    columns:
      - name: id
        tests:
          - unique
@@ -172,6 +172,26 @@ class ModelTest(BasePPTest):
        results = self.run_dbt(["--partial-parse", "run"])
        self.assertEqual(len(results), 3)

        # disable model three in the schema file
        self.copy_file('test-files/models-schema4.yml', 'models/schema.yml')
        results = self.run_dbt(["--partial-parse", "run"])
        self.assertEqual(len(results), 2)

        # update enabled config to be true for model three in the schema file
        self.copy_file('test-files/models-schema4b.yml', 'models/schema.yml')
        results = self.run_dbt(["--partial-parse", "run"])
        self.assertEqual(len(results), 3)

        # disable model three in the schema file again
        self.copy_file('test-files/models-schema4.yml', 'models/schema.yml')
        results = self.run_dbt(["--partial-parse", "run"])
        self.assertEqual(len(results), 2)

        # remove disabled config for model three in the schema file to check it gets enabled
        self.copy_file('test-files/models-schema3.yml', 'models/schema.yml')
        results = self.run_dbt(["--partial-parse", "run"])
        self.assertEqual(len(results), 3)

        # Add a macro
        self.copy_file('test-files/my_macro.sql', 'macros/my_macro.sql')
        results = self.run_dbt(["--partial-parse", "run"])
@@ -23,7 +23,7 @@ class MacroCalls(unittest.TestCase):
                *
            from {{ model }} )
            {% endmacro %}""",
            "{% macro test_my_test(model) %} select {{ dbt_utils.current_timestamp() }} {% endmacro %}",
            "{% macro test_my_test(model) %} select {{ current_timestamp_backcompat() }} {% endmacro %}",
            "{% macro some_test(model) -%} {{ return(adapter.dispatch('test_some_kind4', 'foo_utils4')) }} {%- endmacro %}",
            "{% macro some_test(model) -%} {{ return(adapter.dispatch('test_some_kind5', macro_namespace = 'foo_utils5')) }} {%- endmacro %}",
        ]
@@ -34,7 +34,7 @@ class MacroCalls(unittest.TestCase):
            ['get_snapshot_unique_id'],
            ['get_columns_in_query'],
            ['get_snapshot_unique_id'],
            ['dbt_utils.current_timestamp'],
            ['current_timestamp_backcompat'],
            ['test_some_kind4', 'foo_utils4.test_some_kind4'],
            ['test_some_kind5', 'foo_utils5.test_some_kind5'],
        ]
@@ -1 +1 @@
version = "1.3.0b2"
version = "1.3.0rc2"
55  tests/adapter/dbt/tests/adapter/utils/test_timestamps.py  Normal file
@@ -0,0 +1,55 @@
import pytest
import re
from dbt.tests.util import check_relation_has_expected_schema, run_dbt

_MODEL_CURRENT_TIMESTAMP = """
select {{ current_timestamp() }} as current_timestamp,
       {{ current_timestamp_in_utc_backcompat() }} as current_timestamp_in_utc_backcompat,
       {{ current_timestamp_backcompat() }} as current_timestamp_backcompat
"""

_MODEL_EXPECTED_SQL = """
select now() as current_timestamp,
       (current_timestamp at time zone 'utc')::TIMESTAMP as current_timestamp_in_utc_backcompat,
       current_timestamp::TIMESTAMP as current_timestamp_backcompat
"""


class BaseCurrentTimestamps:
    @pytest.fixture(scope="class")
    def models(self):
        return {"get_current_timestamp.sql": _MODEL_CURRENT_TIMESTAMP}

    # any adapters that don't want to check can set expected schema to None
    @pytest.fixture(scope="class")
    def expected_sql(self):
        return _MODEL_EXPECTED_SQL

    @pytest.fixture(scope="class")
    def expected_schema(self):
        return {
            "current_timestamp": "timestamp with time zone",
            "current_timestamp_in_utc_backcompat": "timestamp without time zone",
            "current_timestamp_backcompat": "timestamp without time zone",
        }

    def test_current_timestamps(self, project, models, expected_schema, expected_sql):
        results = run_dbt(["run"])
        assert len(results) == 1
        check_relation_has_expected_schema(
            project.adapter,
            relation_name="get_current_timestamp",
            expected_schema=expected_schema,
        )

        if expected_sql:
            generated_sql = results.results[0].node.compiled_code
            generated_sql_check = re.sub(r"\s+", "", generated_sql).lower()
            expected_sql_check = re.sub(r"\s+", "", expected_sql).lower()
            assert (
                expected_sql_check == generated_sql_check
            ), f"generated sql did not match expected: {generated_sql}"


class TestCurrentTimestamps(BaseCurrentTimestamps):
    pass
@@ -20,7 +20,7 @@ except ImportError:


package_name = "dbt-tests-adapter"
package_version = "1.3.0b2"
package_version = "1.3.0rc2"
description = """The dbt adapter tests for adapter plugins"""

this_directory = os.path.abspath(os.path.dirname(__file__))
@@ -78,6 +78,82 @@ select 1 as fun

"""

my_model = """
select 1 as user
"""

my_model_2 = """
select * from {{ ref('my_model') }}
"""

my_model_3 = """
select * from {{ ref('my_model_2') }}
"""

my_model_2_disabled = """
{{ config(enabled=false) }}
select * from {{ ref('my_model') }}
"""

my_model_3_disabled = """
{{ config(enabled=false) }}
select * from {{ ref('my_model_2') }}
"""

my_model_2_enabled = """
{{ config(enabled=true) }}
select * from {{ ref('my_model') }}
"""

my_model_3_enabled = """
{{ config(enabled=true) }}
select * from {{ ref('my_model') }}
"""

schema_all_disabled_yml = """
version: 2
models:
  - name: my_model
  - name: my_model_2
    config:
      enabled: false
  - name: my_model_3
    config:
      enabled: false
"""

schema_explicit_enabled_yml = """
version: 2
models:
  - name: my_model
  - name: my_model_2
    config:
      enabled: true
  - name: my_model_3
    config:
      enabled: true
"""

schema_partial_disabled_yml = """
version: 2
models:
  - name: my_model
  - name: my_model_2
    config:
      enabled: false
  - name: my_model_3
"""

schema_partial_enabled_yml = """
version: 2
models:
  - name: my_model
  - name: my_model_2
    config:
      enabled: True
  - name: my_model_3
"""


class BaseConfigProject:
    @pytest.fixture(scope="class")
385  tests/functional/configs/test_disabled_model.py  Normal file
@@ -0,0 +1,385 @@
import pytest
from dbt.tests.util import run_dbt, get_manifest

from dbt.exceptions import CompilationException, ParsingException

from tests.functional.configs.fixtures import (
    schema_all_disabled_yml,
    schema_partial_enabled_yml,
    schema_partial_disabled_yml,
    schema_explicit_enabled_yml,
    my_model,
    my_model_2,
    my_model_2_enabled,
    my_model_2_disabled,
    my_model_3,
    my_model_3_disabled,
    my_model_3_enabled,
)


# ensure double disabled doesn't throw error when set at schema level
class TestSchemaDisabledConfigs:
    @pytest.fixture(scope="class")
    def models(self):
        return {
            "schema.yml": schema_all_disabled_yml,
            "my_model.sql": my_model,
            "my_model_2.sql": my_model_2,
            "my_model_3.sql": my_model_3,
        }

    def test_disabled_config(self, project):
        run_dbt(["parse"])


# ensure this throws a specific error that the model is disabled
class TestSchemaDisabledConfigsFailure:
    @pytest.fixture(scope="class")
    def models(self):
        return {
            "schema.yml": schema_partial_disabled_yml,
            "my_model.sql": my_model,
            "my_model_2.sql": my_model_2,
            "my_model_3.sql": my_model_3,
        }

    def test_disabled_config(self, project):
        with pytest.raises(CompilationException) as exc:
            run_dbt(["parse"])
        exc_str = " ".join(str(exc.value).split())  # flatten all whitespace
        expected_msg = "which is disabled"
        assert expected_msg in exc_str


# ensure double disabled doesn't throw error when set in model configs
class TestModelDisabledConfigs:
    @pytest.fixture(scope="class")
    def models(self):
        return {
            "my_model.sql": my_model,
            "my_model_2.sql": my_model_2_disabled,
            "my_model_3.sql": my_model_3_disabled,
        }

    def test_disabled_config(self, project):
        run_dbt(["parse"])
        manifest = get_manifest(project.project_root)
        assert "model.test.my_model_2" not in manifest.nodes
        assert "model.test.my_model_3" not in manifest.nodes

        assert "model.test.my_model_2" in manifest.disabled
        assert "model.test.my_model_3" in manifest.disabled


# ensure config set in project.yml can be overridden in yaml file
class TestOverrideProjectConfigsInYaml:
    @pytest.fixture(scope="class")
    def models(self):
        return {
            "schema.yml": schema_partial_enabled_yml,
            "my_model.sql": my_model,
            "my_model_2.sql": my_model_2,
            "my_model_3.sql": my_model_3,
        }

    @pytest.fixture(scope="class")
    def project_config_update(self):
        return {
            "models": {
                "test": {
                    "my_model_2": {
                        "enabled": False,
                    },
                    "my_model_3": {
                        "enabled": False,
                    },
                },
            }
        }

    def test_override_project_yaml_config(self, project):
        run_dbt(["parse"])
        manifest = get_manifest(project.project_root)
        assert "model.test.my_model_2" in manifest.nodes
        assert "model.test.my_model_3" not in manifest.nodes

        assert "model.test.my_model_2" not in manifest.disabled
        assert "model.test.my_model_3" in manifest.disabled


# ensure config set in project.yml can be overridden in sql file
class TestOverrideProjectConfigsInSQL:
    @pytest.fixture(scope="class")
    def models(self):
        return {
            "my_model.sql": my_model,
            "my_model_2.sql": my_model_2_enabled,
            "my_model_3.sql": my_model_3,
        }

    @pytest.fixture(scope="class")
    def project_config_update(self):
        return {
            "models": {
                "test": {
                    "my_model_2": {
                        "enabled": False,
                    },
                    "my_model_3": {
                        "enabled": False,
                    },
                },
            }
        }

    def test_override_project_sql_config(self, project):
        run_dbt(["parse"])
        manifest = get_manifest(project.project_root)
        assert "model.test.my_model_2" in manifest.nodes
        assert "model.test.my_model_3" not in manifest.nodes

        assert "model.test.my_model_2" not in manifest.disabled
        assert "model.test.my_model_3" in manifest.disabled


# ensure false config set in yaml file can be overridden in sql file
class TestOverrideFalseYAMLConfigsInSQL:
    @pytest.fixture(scope="class")
    def models(self):
        return {
            "schema.yml": schema_all_disabled_yml,
            "my_model.sql": my_model,
            "my_model_2.sql": my_model_2_enabled,
            "my_model_3.sql": my_model_3,
        }

    def test_override_yaml_sql_config(self, project):
        run_dbt(["parse"])
        manifest = get_manifest(project.project_root)
        assert "model.test.my_model_2" in manifest.nodes
        assert "model.test.my_model_3" not in manifest.nodes

        assert "model.test.my_model_2" not in manifest.disabled
        assert "model.test.my_model_3" in manifest.disabled


# ensure true config set in yaml file can be overridden by false in sql file
class TestOverrideTrueYAMLConfigsInSQL:
    @pytest.fixture(scope="class")
    def models(self):
        return {
            "schema.yml": schema_explicit_enabled_yml,
            "my_model.sql": my_model,
            "my_model_2.sql": my_model_2_enabled,
            "my_model_3.sql": my_model_3_disabled,
        }

    def test_override_yaml_sql_config(self, project):
        run_dbt(["parse"])
        manifest = get_manifest(project.project_root)
        assert "model.test.my_model_2" in manifest.nodes
        assert "model.test.my_model_3" not in manifest.nodes

        assert "model.test.my_model_2" not in manifest.disabled
        assert "model.test.my_model_3" in manifest.disabled


# ensure error when enabling in schema file when multiple nodes exist within disabled
class TestMultipleDisabledNodesForUniqueIDFailure:
    @pytest.fixture(scope="class")
    def models(self):
        return {
            "schema.yml": schema_partial_enabled_yml,
            "my_model.sql": my_model,
            "folder_1": {
                "my_model_2.sql": my_model_2_disabled,
                "my_model_3.sql": my_model_3_disabled,
            },
            "folder_2": {
                "my_model_2.sql": my_model_2_disabled,
                "my_model_3.sql": my_model_3_disabled,
            },
            "folder_3": {
                "my_model_2.sql": my_model_2_disabled,
                "my_model_3.sql": my_model_3_disabled,
            },
        }

    def test_disabled_config(self, project):
        with pytest.raises(ParsingException) as exc:
            run_dbt(["parse"])
        exc_str = " ".join(str(exc.value).split())  # flatten all whitespace
        expected_msg = "Found 3 matching disabled nodes for model 'my_model_2'"
        assert expected_msg in exc_str


# ensure error when enabling in schema file when multiple nodes exist within disabled
class TestMultipleDisabledNodesSuccess:
    @pytest.fixture(scope="class")
    def models(self):
        return {
            "my_model.sql": my_model,
            "folder_1": {
                "my_model_2.sql": my_model_2,
                "my_model_3.sql": my_model_3,
            },
            "folder_2": {
                "my_model_2.sql": my_model_2,
                "my_model_3.sql": my_model_3,
            },
        }

    @pytest.fixture(scope="class")
    def project_config_update(self):
        return {
            "models": {
                "test": {
                    "folder_1": {
                        "enabled": False,
                    },
                    "folder_2": {
                        "enabled": True,
                    },
                },
            }
        }

    def test_multiple_disabled_config(self, project):
        run_dbt(["parse"])
        manifest = get_manifest(project.project_root)
        assert "model.test.my_model_2" in manifest.nodes
        assert "model.test.my_model_3" in manifest.nodes

        expected_file_path = "folder_2"
        assert expected_file_path in manifest.nodes["model.test.my_model_2"].original_file_path
        assert expected_file_path in manifest.nodes["model.test.my_model_3"].original_file_path

        assert "model.test.my_model_2" in manifest.disabled
        assert "model.test.my_model_3" in manifest.disabled

        expected_disabled_file_path = "folder_1"
        assert (
            expected_disabled_file_path
            in manifest.disabled["model.test.my_model_2"][0].original_file_path
        )
        assert (
            expected_disabled_file_path
            in manifest.disabled["model.test.my_model_3"][0].original_file_path
        )


# ensure overrides work when enabling in sql file when multiple nodes exist within disabled
class TestMultipleDisabledNodesOverrideModel:
    @pytest.fixture(scope="class")
    def models(self):
        return {
            "my_model.sql": my_model,
            "folder_1": {
                "my_model_2.sql": my_model_2_enabled,
                "my_model_3.sql": my_model_3,
            },
            "folder_2": {
                "my_model_2.sql": my_model_2,
                "my_model_3.sql": my_model_3_enabled,
            },
        }

    @pytest.fixture(scope="class")
    def project_config_update(self):
        return {
            "models": {
                "test": {
                    "folder_1": {
                        "enabled": False,
                    },
                    "folder_2": {
                        "enabled": False,
                    },
                },
            }
        }

    def test_multiple_disabled_config(self, project):
        run_dbt(["parse"])
        manifest = get_manifest(project.project_root)
        assert "model.test.my_model_2" in manifest.nodes
        assert "model.test.my_model_3" in manifest.nodes

        expected_file_path_2 = "folder_1"
        assert expected_file_path_2 in manifest.nodes["model.test.my_model_2"].original_file_path
        expected_file_path_3 = "folder_2"
        assert expected_file_path_3 in manifest.nodes["model.test.my_model_3"].original_file_path

        assert "model.test.my_model_2" in manifest.disabled
        assert "model.test.my_model_3" in manifest.disabled

        expected_disabled_file_path_2 = "folder_2"
        assert (
            expected_disabled_file_path_2
            in manifest.disabled["model.test.my_model_2"][0].original_file_path
        )
        expected_disabled_file_path_3 = "folder_1"
        assert (
            expected_disabled_file_path_3
            in manifest.disabled["model.test.my_model_3"][0].original_file_path
        )


# ensure everything lands where it should when disabling multiple nodes with the same unique id
class TestManyDisabledNodesSuccess:
    @pytest.fixture(scope="class")
    def models(self):
        return {
            "my_model.sql": my_model,
            "folder_1": {
                "my_model_2.sql": my_model_2,
                "my_model_3.sql": my_model_3,
            },
            "folder_2": {
                "my_model_2.sql": my_model_2,
                "my_model_3.sql": my_model_3,
            },
            "folder_3": {
                "my_model_2.sql": my_model_2,
                "my_model_3.sql": my_model_3,
            },
            "folder_4": {
                "my_model_2.sql": my_model_2,
                "my_model_3.sql": my_model_3,
            },
        }

    @pytest.fixture(scope="class")
    def project_config_update(self):
        return {
            "models": {
                "test": {
                    "folder_1": {
                        "enabled": False,
                    },
                    "folder_2": {
                        "enabled": True,
                    },
                    "folder_3": {
                        "enabled": False,
                    },
                    "folder_4": {
                        "enabled": False,
                    },
                },
            }
        }

    def test_many_disabled_config(self, project):
        run_dbt(["parse"])
        manifest = get_manifest(project.project_root)
        assert "model.test.my_model_2" in manifest.nodes
        assert "model.test.my_model_3" in manifest.nodes

        expected_file_path = "folder_2"
        assert expected_file_path in manifest.nodes["model.test.my_model_2"].original_file_path
        assert expected_file_path in manifest.nodes["model.test.my_model_3"].original_file_path

        assert len(manifest.disabled["model.test.my_model_2"]) == 3
        assert len(manifest.disabled["model.test.my_model_3"]) == 3