Compare commits

..

34 Commits

Author SHA1 Message Date
Kshitij Aranke
57aef33fb3 Click cli main merge (#6926) 2023-02-09 12:21:39 -08:00
Kshitij Aranke
8bbae7926b Fix Project Env Var Tests (#6916)
Co-authored-by: Jeremy Cohen <jeremy@dbtlabs.com>
Co-authored-by: Ian Knox <ian.knox@dbtlabs.com>
2023-02-09 08:53:39 -08:00
Jeremy Cohen
db2b12021e Fix test_builtin_invocation_args_dict_function (#6898)
Co-authored-by: Ian Knox <ian.knox@dbtlabs.com>
2023-02-09 09:53:42 -06:00
Michelle Ark
77748571b4 profiles dir exists=False for dbt debug (#6910)
profiles dir exists=False for dbt debug
2023-02-08 18:32:06 -05:00
Stu Kilgore
8ce4c289c5 Docs generate doesn't write manifest (#6905) 2023-02-08 16:16:31 -06:00
Kshitij Aranke
c6d0e7c926 Fix Click CLI test DB Name (#6895) 2023-02-08 09:58:16 -08:00
Michelle Ark
bc015843d4 use UnsetProfile in deps and clean commands (#6890)
use unset_profile in deps and clean commands
2023-02-08 12:26:58 -05:00
Michelle Ark
db0981afe7 NoneConfig for DebugTask (#6893)
NoneConfig for DebugTask
2023-02-07 21:33:15 -05:00
Michelle Ark
dcf6544f93 flags.THREADS defaults to None (#6887)
flags.THREADS defaults to None
2023-02-07 17:15:26 -05:00
Ian Knox
c2c8959fee Merge main to feature (#6817)
merge main to feature/click-cli
2023-02-07 16:32:31 -05:00
Chenyu Li
d0b5d752df consolidate flags (#6788)
Co-authored-by: Michelle Ark <michelle.ark@dbtlabs.com>
Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>
2023-02-07 08:35:35 -08:00
Kshitij Aranke
9c0b62b4f5 Fix CLI vars test to check for object instead of string (#6850) 2023-02-06 15:13:40 -08:00
Ian Knox
e08eede5e2 Remove unused cli_runner (#6877) 2023-02-06 15:24:09 -06:00
Ian Knox
05e53d4143 Test fix: TestProfileEnvVars::test_profile_env_vars (#6856) 2023-02-06 10:49:40 -06:00
Stu Kilgore
0503c141b7 Lazily call --version (#6813)
* Lazily call --version

* Add generated CLI API docs

---------

Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>
2023-01-31 17:20:18 -08:00
Stu Kilgore
59d773ea7e Implement --version in click (#6802) 2023-01-31 12:40:01 -06:00
Kshitij Aranke
84bf5b4620 [CT-1947] Alias --models to --select for all commands except dbt ls (#6787)
Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>
2023-01-31 10:18:10 -08:00
Ian Knox
726c4d6c58 Enable the new Click Cli (#6785) 2023-01-30 19:30:01 -06:00
Michelle Ark
acc88d47a3 mutually exclusive handling for warn_error_options and warn_error params in Click CLI (#6771)
warn_error_options, warn_error mutual exclusivity with click
2023-01-30 18:38:36 -05:00
Chenyu Li
0a74594d09 move favor state arg to click (#6774) 2023-01-30 11:55:31 -08:00
Ian Knox
d6ac340df0 Merge main into feature/click-cli (#6761) 2023-01-27 15:07:30 -06:00
Kshitij Aranke
08b2d94ccd [CT-920][CT-1900] Create Click CLI runner and use it to fix dbt docs … (#6723)
Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>
2023-01-26 10:42:49 -08:00
Michelle Ark
7fa61f0816 dbt init works with click (#6698)
dbt init works with click
2023-01-26 12:45:15 -05:00
Stu Kilgore
d27016a4e7 Migrate debug task to click (#6728) 2023-01-25 14:52:57 -06:00
Stu Kilgore
92b7166c10 Abstract manifest generation from tasks (#6565) 2023-01-24 11:05:57 -06:00
Michelle Ark
a0ade13f5a dbt docs generate works with click (#6681) 2023-01-20 15:26:20 -05:00
Michelle Ark
9823a56e1d dbt build works in click (#6680)
* build working with click
2023-01-20 14:27:39 -05:00
Jeremy Cohen
11c622230c Add run-operation to click CLI (#5552) (#6656)
* Add run-operation to click CLI

* Add changelog entry

* PR feedback

* Fix unit test
2023-01-20 01:58:47 +01:00
Chenyu Li
f0349488ed Seed and freshness works with click (#6651) 2023-01-19 16:14:26 -08:00
Chenyu Li
c85be323f5 fix MP_CONTEXT is not JSON serializable (#6650) 2023-01-19 08:16:58 -08:00
Michelle Ark
30a1595f72 click working with list (#6641)
* list working with click
2023-01-19 09:28:32 -05:00
Chenyu Li
1913eac5ed Click snapshot as click command (#5972) (#6640) 2023-01-18 16:01:58 -08:00
Kshitij Aranke
53127daad8 [CT-921] dbt compile works in click (#5545) (#6586)
Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>
resolves https://github.com/dbt-labs/dbt-core/issues/5545
2023-01-11 15:01:50 -08:00
Kshitij Aranke
91b20b7482 dbt test works with Click (#5556)
Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>
resolves https://github.com/dbt-labs/dbt-core/issues/5556
2023-01-11 12:19:43 -08:00
560 changed files with 16473 additions and 27459 deletions

View File

@@ -1,13 +1,21 @@
[bumpversion]
current_version = 1.4.0b1
parse = (?P<major>\d+)
\.(?P<minor>\d+)
\.(?P<patch>\d+)
((?P<prekind>a|b|rc)
(?P<pre>\d+) # pre-release version num
current_version = 1.5.0a1
# `parse` allows parsing the version into the parts we need to check. There are some
# unnamed groups and that's okay because they do not need to be audited. If any part
# of the version passed does not match the regex, it will fail.
# expected matches: `1.5.0`, `1.5.0a1`, `1.5.0a1.dev123457+nightly`
# expected failures: `1`, `1.5`, `1.5.2-a1`, `text1.5.0`
parse = (?P<major>[\d]+) # major version number
\.(?P<minor>[\d]+) # minor version number
\.(?P<patch>[\d]+) # patch version number
(((?P<prekind>a|b|rc) # optional pre-release type
?(?P<num>[\d]+?)) # optional pre-release version number
\.?(?P<nightly>[a-z0-9]+\+[a-z]+)? # optional nightly release indicator
)?
serialize =
{major}.{minor}.{patch}{prekind}{pre}
{major}.{minor}.{patch}{prekind}{num}.{nightly}
{major}.{minor}.{patch}{prekind}{num}
{major}.{minor}.{patch}
commit = False
tag = False
@@ -21,9 +29,11 @@ values =
rc
final
[bumpversion:part:pre]
[bumpversion:part:num]
first_value = 1
[bumpversion:part:nightly]
[bumpversion:file:core/setup.py]
[bumpversion:file:core/dbt/version.py]
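The comments above list which strings should parse and which should fail. A quick way to sanity-check that claim is to rebuild the pattern in Python with verbose-regex semantics (an assumption here, so whitespace and the inline `#` comments are ignored) and try the documented examples with an anchored match:

```python
# Sketch: verify the documented examples against the new `parse` pattern.
# Assumes verbose-mode regex semantics and anchored (full) matching.
import re

PARSE = re.compile(
    r"""(?P<major>[\d]+)                    # major version number
    \.(?P<minor>[\d]+)                      # minor version number
    \.(?P<patch>[\d]+)                      # patch version number
    (((?P<prekind>a|b|rc)                   # optional pre-release type
    ?(?P<num>[\d]+?))                       # optional pre-release version number
    \.?(?P<nightly>[a-z0-9]+\+[a-z]+)?      # optional nightly release indicator
    )?""",
    re.VERBOSE,
)

for version in ["1.5.0", "1.5.0a1", "1.5.0a1.dev123457+nightly", "1", "1.5", "1.5.2-a1", "text1.5.0"]:
    print(version, bool(PARSE.fullmatch(version)))
# prints True for the three expected matches and False for the four expected failures
```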

View File

@@ -3,6 +3,7 @@
For information on prior major and minor releases, see their changelogs:
* [1.4](https://github.com/dbt-labs/dbt-core/blob/1.4.latest/CHANGELOG.md)
* [1.3](https://github.com/dbt-labs/dbt-core/blob/1.3.latest/CHANGELOG.md)
* [1.2](https://github.com/dbt-labs/dbt-core/blob/1.2.latest/CHANGELOG.md)
* [1.1](https://github.com/dbt-labs/dbt-core/blob/1.1.latest/CHANGELOG.md)

View File

@@ -0,0 +1,6 @@
kind: Features
body: Have dbt debug spit out structured json logs with flags enabled.
time: 2023-01-07T00:31:57.516063-08:00
custom:
Author: versusfacit
Issue: "5353"

View File

@@ -0,0 +1,6 @@
kind: Features
body: add adapter_response to dbt test and freshness result
time: 2023-01-18T23:38:01.857342+08:00
custom:
Author: aezomz
Issue: "2964"

View File

@@ -0,0 +1,6 @@
kind: Features
body: Improve error message for packages missing `dbt_project.yml`
time: 2023-01-20T11:29:21.509967-07:00
custom:
Author: dbeatty10
Issue: "6663"

View File

@@ -0,0 +1,6 @@
kind: Features
body: Adjust makefile to have clearer instructions for CI env var changes.
time: 2023-01-26T15:47:16.887327-08:00
custom:
Author: versusfacit
Issue: "6689"

View File

@@ -0,0 +1,6 @@
kind: Features
body: Stand-alone Python module for PostgresColumn
time: 2023-01-27T16:28:12.212427-08:00
custom:
Author: nssalian
Issue: "6772"

View File

@@ -0,0 +1,6 @@
kind: Fixes
body: Respect quoting config for dbt.ref(), dbt.source(), and dbt.this() in dbt-py models
time: 2023-01-16T12:36:45.63092+01:00
custom:
Author: jtcohen6
Issue: 6103 6619

View File

@@ -0,0 +1,6 @@
kind: Fixes
body: Provide backward compatibility for `get_merge_sql` arguments
time: 2023-01-17T10:13:42.118336-06:00
custom:
Author: dave-connors-3
Issue: "6625"

View File

@@ -0,0 +1,6 @@
kind: Fixes
body: add merge_exclude_columns adapter tests
time: 2023-01-23T13:28:14.808748-06:00
custom:
Author: dave-connors-3
Issue: "6699"

View File

@@ -0,0 +1,6 @@
kind: Fixes
body: Include adapter_response in NodeFinished run_result log event
time: 2023-01-24T11:58:37.74179-05:00
custom:
Author: gshank
Issue: "6703"

View File

@@ -0,0 +1,6 @@
kind: Fixes
body: Sort cli vars before hashing for partial parsing
time: 2023-01-24T14:19:43.333628-05:00
custom:
Author: gshank
Issue: "6710"

View File

@@ -0,0 +1,6 @@
kind: Fixes
body: '[Regression] exposure_content referenced incorrectly'
time: 2023-01-25T19:17:39.942081-05:00
custom:
Author: Mathyoub
Issue: "6738"

View File

@@ -0,0 +1,6 @@
kind: Fixes
body: Remove pin on packaging and stop using it for prerelease comparisons
time: 2023-02-01T15:44:18.279158-05:00
custom:
Author: gshank
Issue: "6834"

View File

@@ -0,0 +1,6 @@
kind: Fixes
body: Fix regression of --quiet cli parameter behavior
time: 2023-02-07T14:35:44.160163-05:00
custom:
Author: peterallenwebb
Issue: "6749"

View File

@@ -0,0 +1,6 @@
kind: Fixes
body: Ensure results from hooks contain nodes when processing them
time: 2023-02-08T11:05:51.952494-06:00
custom:
Author: emmyoop
Issue: "6796"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: '[CT-932] Implement `dbt test` in Click'
time: 2023-01-09T15:14:17.524221-08:00
custom:
Author: aranke
Issue: "5556"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Abstract manifest generation
time: 2023-01-10T11:57:25.193965-06:00
custom:
Author: stu-k
Issue: "6357"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: '[CT-921] dbt compile works in click'
time: 2023-01-11T14:51:43.324107-08:00
custom:
Author: aranke
Issue: "5545"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Fix use of ConnectionReused logging event
time: 2023-01-13T13:25:13.023168-05:00
custom:
Author: gshank
Issue: "6168"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Port docs tests to pytest
time: 2023-01-13T15:07:00.477038-05:00
custom:
Author: peterallenwebb
Issue: "6573"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Update deprecated github action command
time: 2023-01-17T11:17:37.046095-06:00
custom:
Author: davidbloss
Issue: "6153"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: dbt snapshot works in click
time: 2023-01-17T16:25:05.973769-08:00
custom:
Author: ChenyuLInx
Issue: "5554"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: dbt list working with click
time: 2023-01-17T21:37:29.91632-05:00
custom:
Author: michelleark
Issue: "5549"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Add dbt run-operation to click CLI
time: 2023-01-19T10:53:04.154871+01:00
custom:
Author: jtcohen6
Issue: "5552"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: dbt build working with new click framework
time: 2023-01-19T20:56:50.50549-05:00
custom:
Author: michelleark
Issue: "5541"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: dbt docs generate works with new click framework
time: 2023-01-19T21:10:40.698851-05:00
custom:
Author: michelleark
Issue: "5543"

View File

@@ -0,0 +1,7 @@
kind: Under the Hood
body: Replaced the EmptyLine event with a more general Formatting event, and added
a Note event.
time: 2023-01-20T17:22:54.45828-05:00
custom:
Author: peterallenwebb
Issue: "6481"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Small optimization on manifest parsing benefitting large DAGs
time: 2023-01-22T21:52:35.549814+01:00
custom:
Author: boxysean
Issue: "6697"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Revised and simplified various structured logging events
time: 2023-01-24T15:35:53.065356-05:00
custom:
Author: peterallenwebb
Issue: 6664 6665 6666

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: dbt init works with click
time: 2023-01-24T17:51:10.74065-05:00
custom:
Author: michelleark
Issue: "5548"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: '[CT-920][CT-1900] Create Click CLI runner and use it to fix dbt docs commands'
time: 2023-01-25T04:11:36.57506-08:00
custom:
Author: aranke
Issue: 5544 6722

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Migrate debug task to click
time: 2023-01-25T10:26:06.735994-06:00
custom:
Author: stu-k
Issue: "5546"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: ' Optimized GraphQueue to remove graph analysis bottleneck in large dags.'
time: 2023-01-26T13:59:39.518345-05:00
custom:
Author: peterallenwebb
Issue: "6759"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Implement --version for click cli
time: 2023-01-26T14:31:02.740282-06:00
custom:
Author: stu-k
Issue: "6757"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: '[CT-1841] Convert custom target test to Pytest'
time: 2023-01-26T16:47:41.198714-08:00
custom:
Author: aranke
Issue: "6638"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: "Enables the new Click Cli on the commandline! \U0001F680"
time: 2023-01-30T17:57:52.65626-06:00
custom:
Author: iknox-fa
Issue: "6784"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: warn_error/warn_error_options mutual exclusivity in click
time: 2023-01-30T18:09:17.240662-05:00
custom:
Author: michelleark
Issue: "6579"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Lazily call --version
time: 2023-01-31T14:18:06.02312-06:00
custom:
Author: stu-k
Issue: "6812"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Moving simple_seed to adapter zone to help adapter test conversions
time: 2023-02-03T14:35:51.481856-08:00
custom:
Author: nssalian
Issue: CT-1959

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: flags.THREADS defaults to None
time: 2023-02-07T16:51:11.011984-05:00
custom:
Author: michelleark
Issue: "6887"

View File

@@ -97,22 +97,28 @@ footerFormat: |
{{- /* we only want to include non-core team contributors */}}
{{- if not (has $authorLower $core_team)}}
{{- $changeList := splitList " " $change.Custom.Author }}
{{- /* Docs kind link back to dbt-docs instead of dbt-core issues */}}
{{- $IssueList := list }}
{{- $changeLink := $change.Kind }}
{{- if or (eq $change.Kind "Dependencies") (eq $change.Kind "Security") }}
{{- $changeLink = "[#nbr](https://github.com/dbt-labs/dbt-core/pull/nbr)" | replace "nbr" $change.Custom.PR }}
{{- else if eq $change.Kind "Docs"}}
{{- $changeLink = "[dbt-docs/#nbr](https://github.com/dbt-labs/dbt-docs/issues/nbr)" | replace "nbr" $change.Custom.Issue }}
{{- $changes := splitList " " $change.Custom.PR }}
{{- range $issueNbr := $changes }}
{{- $changeLink := "[#nbr](https://github.com/dbt-labs/dbt-core/pull/nbr)" | replace "nbr" $issueNbr }}
{{- $IssueList = append $IssueList $changeLink }}
{{- end -}}
{{- else }}
{{- $changeLink = "[#nbr](https://github.com/dbt-labs/dbt-core/issues/nbr)" | replace "nbr" $change.Custom.Issue }}
{{- $changes := splitList " " $change.Custom.Issue }}
{{- range $issueNbr := $changes }}
{{- $changeLink := "[#nbr](https://github.com/dbt-labs/dbt-core/issues/nbr)" | replace "nbr" $issueNbr }}
{{- $IssueList = append $IssueList $changeLink }}
{{- end -}}
{{- end }}
{{- /* check if this contributor has other changes associated with them already */}}
{{- if hasKey $contributorDict $author }}
{{- $contributionList := get $contributorDict $author }}
{{- $contributionList = append $contributionList $changeLink }}
{{- $contributionList = concat $contributionList $IssueList }}
{{- $contributorDict := set $contributorDict $author $contributionList }}
{{- else }}
{{- $contributionList := list $changeLink }}
{{- $contributionList := $IssueList }}
{{- $contributorDict := set $contributorDict $author $contributionList }}
{{- end }}
{{- end}}
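Paraphrased outside Go-template syntax, the new footer logic splits the space-separated number list and builds one link per number: PR links for Dependencies and Security changes, issue links for everything else (as the new branches read here). A Python rendering of that rule, with illustrative names that are not part of changie:

```python
def contributor_links(kind: str, pr_numbers: str, issue_numbers: str) -> list[str]:
    """Paraphrase of the footerFormat branching above; names are illustrative."""
    if kind in ("Dependencies", "Security"):
        numbers, path = pr_numbers.split(" "), "pull"
    else:
        numbers, path = issue_numbers.split(" "), "issues"
    return [f"[#{n}](https://github.com/dbt-labs/dbt-core/{path}/{n})" for n in numbers]

# e.g. the "Revised and simplified..." changelog entry above lists three issues:
print(contributor_links("Under the Hood", "", "6664 6665 6666"))
```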

View File

@@ -9,4 +9,4 @@ ignore =
E203 # makes Flake8 work like black
E741
E501 # long line checking is done in black
exclude = test
exclude = test/

.github/_README.md vendored (20 changes)
View File

@@ -63,12 +63,12 @@ permissions:
contents: read
pull-requests: write
```
### Secrets
- When to use a [Personal Access Token (PAT)](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/creating-a-personal-access-token) vs the [GITHUB_TOKEN](https://docs.github.com/en/actions/security-guides/automatic-token-authentication) generated for the action?
The `GITHUB_TOKEN` is used by default. In most cases it is sufficient for what you need.
If you expect the workflow to result in a commit that should retrigger workflows, you will need to use a Personal Access Token for the bot to commit the file. When using the GITHUB_TOKEN, the resulting commit will not trigger another GitHub Actions Workflow run. This is due to limitations set by GitHub. See [the docs](https://docs.github.com/en/actions/security-guides/automatic-token-authentication#using-the-github_token-in-a-workflow) for a more detailed explanation.
For example, we must use a PAT in our workflow to commit a new changelog yaml file for bot PRs. Once the file has been committed to the branch, it should retrigger the check to validate that a changelog exists on the PR. Otherwise, it would stay in a failed state since the check would never retrigger.
@@ -105,7 +105,7 @@ Some triggers of note that we use:
```
# **what?**
# Describe what the action does.
# **why?**
# Why does this action exist?
@@ -138,7 +138,7 @@ Some triggers of note that we use:
id: fp
run: |
FILEPATH=.changes/unreleased/Dependencies-${{ steps.filename_time.outputs.time }}.yaml
echo "::set-output name=FILEPATH::$FILEPATH"
echo "FILEPATH=$FILEPATH" >> $GITHUB_OUTPUT
```
- Print out all variables you will reference as the first step of a job. This allows for easier debugging. The first job should log all inputs. Subsequent jobs should reference outputs of other jobs, if present.
@@ -158,14 +158,14 @@ Some triggers of note that we use:
echo "The build_script_path: ${{ inputs.build_script_path }}"
echo "The s3_bucket_name: ${{ inputs.s3_bucket_name }}"
echo "The package_test_command: ${{ inputs.package_test_command }}"
# collect all the variables that need to be used in subsequent jobs
- name: Set Variables
id: variables
run: |
echo "::set-output name=important_path::'performance/runner/Cargo.toml'"
echo "::set-output name=release_id::${{github.event.inputs.release_id}}"
echo "::set-output name=open_prs::${{github.event.inputs.open_prs}}"
echo "important_path='performance/runner/Cargo.toml'" >> $GITHUB_OUTPUT
echo "release_id=${{github.event.inputs.release_id}}" >> $GITHUB_OUTPUT
echo "open_prs=${{github.event.inputs.open_prs}}" >> $GITHUB_OUTPUT
job2:
needs: [job1]
@@ -190,7 +190,7 @@ ___
### Actions from the Marketplace
- Don't use external actions for things that can easily be accomplished manually.
- Always read through what an external action does before using it! Often an action in the GitHub Actions Marketplace can be replaced with a few lines in bash. This is much more maintainable (and won't change under us) and clear as to what's actually happening. It also prevents any
- Pin actions _we don't control_ to tags.
### Connecting to AWS
- Authenticate with the aws managed workflow
@@ -208,7 +208,7 @@ ___
```yaml
- name: Copy Artifacts from S3 via CLI
run: aws s3 cp ${{ env.s3_bucket }} . --recursive
```
### Testing

View File

@@ -28,11 +28,12 @@ if __name__ == "__main__":
if package_request.status_code == 404:
if halt_on_missing:
sys.exit(1)
else:
# everything is the latest if the package doesn't exist
print(f"::set-output name=latest::{True}")
print(f"::set-output name=minor_latest::{True}")
sys.exit(0)
# everything is the latest if the package doesn't exist
github_output = os.environ.get("GITHUB_OUTPUT")
with open(github_output, "at", encoding="utf-8") as gh_output:
gh_output.write("latest=True")
gh_output.write("minor_latest=True")
sys.exit(0)
# TODO: verify package meta is "correct"
# https://github.com/dbt-labs/dbt-core/issues/4640
@@ -91,5 +92,7 @@ if __name__ == "__main__":
latest = is_latest(pre_rel, new_version, current_latest)
minor_latest = is_latest(pre_rel, new_version, current_minor_latest)
print(f"::set-output name=latest::{latest}")
print(f"::set-output name=minor_latest::{minor_latest}")
github_output = os.environ.get("GITHUB_OUTPUT")
with open(github_output, "at", encoding="utf-8") as gh_output:
gh_output.write(f"latest={latest}")
gh_output.write(f"minor_latest={minor_latest}")

View File

@@ -145,7 +145,6 @@ jobs:
echo "creating docs"
make html
make markdown
- name: debug
run: |

View File

@@ -101,7 +101,9 @@ jobs:
- name: Get current date
if: always()
id: date
run: echo "::set-output name=date::$(date +'%Y-%m-%dT%H_%M_%S')" #no colons allowed for artifacts
run: |
CURRENT_DATE=$(date +'%Y-%m-%dT%H_%M_%S') # no colons allowed for artifacts
echo "date=$CURRENT_DATE" >> $GITHUB_OUTPUT
- uses: actions/upload-artifact@v2
if: always()
@@ -168,7 +170,9 @@ jobs:
- name: Get current date
if: always()
id: date
run: echo "::set-output name=date::$(date +'%Y_%m_%dT%H_%M_%S')" #no colons allowed for artifacts
run: |
CURRENT_DATE=$(date +'%Y-%m-%dT%H_%M_%S') # no colons allowed for artifacts
echo "date=$CURRENT_DATE" >> $GITHUB_OUTPUT
- uses: actions/upload-artifact@v2
if: always()

.github/workflows/nightly-release.yml vendored, new file (109 changes)
View File

@@ -0,0 +1,109 @@
# **what?**
# Nightly releases to GitHub and PyPI. This workflow produces the following outcomes:
# - generate and validate data for the nightly release (commit SHA, version number, release branch);
# - pass data to the release workflow;
# - the nightly release is pushed to GitHub as a draft release;
# - the nightly build is pushed to test PyPI;
#
# **why?**
# Ensure an automated and tested release process for nightly builds
#
# **when?**
# This workflow runs on schedule or can be run manually on demand.
name: Nightly Test Release to GitHub and PyPI
on:
workflow_dispatch: # for manual triggering
schedule:
- cron: 0 9 * * *
permissions:
contents: write # this is the permission that allows creating a new release
defaults:
run:
shell: bash
env:
RELEASE_BRANCH: "main"
jobs:
aggregate-release-data:
runs-on: ubuntu-latest
outputs:
commit_sha: ${{ steps.resolve-commit-sha.outputs.release_commit }}
version_number: ${{ steps.nightly-release-version.outputs.number }}
release_branch: ${{ steps.release-branch.outputs.name }}
steps:
- name: "Checkout ${{ github.repository }} Branch ${{ env.RELEASE_BRANCH }}"
uses: actions/checkout@v3
with:
ref: ${{ env.RELEASE_BRANCH }}
- name: "Resolve Commit To Release"
id: resolve-commit-sha
run: |
commit_sha=$(git rev-parse HEAD)
echo "release_commit=$commit_sha" >> $GITHUB_OUTPUT
- name: "Get Current Version Number"
id: version-number-sources
run: |
current_version=`awk -F"current_version = " '{print $2}' .bumpversion.cfg | tr '\n' ' '`
echo "current_version=$current_version" >> $GITHUB_OUTPUT
- name: "Audit Version And Parse Into Parts"
id: semver
uses: dbt-labs/actions/parse-semver@v1.1.0
with:
version: ${{ steps.version-number-sources.outputs.current_version }}
- name: "Get Current Date"
id: current-date
run: echo "date=$(date +'%m%d%Y')" >> $GITHUB_OUTPUT
- name: "Generate Nightly Release Version Number"
id: nightly-release-version
run: |
number="${{ steps.semver.outputs.version }}.dev${{ steps.current-date.outputs.date }}+nightly"
echo "number=$number" >> $GITHUB_OUTPUT
- name: "Audit Nightly Release Version And Parse Into Parts"
uses: dbt-labs/actions/parse-semver@v1.1.0
with:
version: ${{ steps.nightly-release-version.outputs.number }}
- name: "Set Release Branch"
id: release-branch
run: |
echo "name=${{ env.RELEASE_BRANCH }}" >> $GITHUB_OUTPUT
log-outputs-aggregate-release-data:
runs-on: ubuntu-latest
needs: [aggregate-release-data]
steps:
- name: "[DEBUG] Log Outputs"
run: |
echo commit_sha : ${{ needs.aggregate-release-data.outputs.commit_sha }}
echo version_number: ${{ needs.aggregate-release-data.outputs.version_number }}
echo release_branch: ${{ needs.aggregate-release-data.outputs.release_branch }}
release-github-pypi:
needs: [aggregate-release-data]
uses: ./.github/workflows/release.yml
with:
sha: ${{ needs.aggregate-release-data.outputs.commit_sha }}
target_branch: ${{ needs.aggregate-release-data.outputs.release_branch }}
version_number: ${{ needs.aggregate-release-data.outputs.version_number }}
build_script_path: "scripts/build-dist.sh"
env_setup_script_path: "scripts/env-setup.sh"
s3_bucket_name: "core-team-artifacts"
package_test_command: "dbt --version"
test_run: true
nightly_release: true
secrets: inherit
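The `Generate Nightly Release Version Number` step composes the parsed base version with a date-stamped suffix. In Python terms (a sketch of the scheme, not the workflow's implementation):

```python
from datetime import date

def nightly_version(base_version: str, on: date) -> str:
    # Mirrors the workflow step: <semver>.dev<MMDDYYYY>+nightly
    return f"{base_version}.dev{on.strftime('%m%d%Y')}+nightly"

print(nightly_version("1.5.0a1", date(2023, 2, 9)))  # 1.5.0a1.dev02092023+nightly
```

Note that the result also satisfies the `nightly` group of the `.bumpversion.cfg` pattern shown earlier.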

View File

@@ -28,7 +28,33 @@ on:
permissions: read-all
jobs:
fetch-latest-branches:
runs-on: ubuntu-latest
outputs:
latest-branches: ${{ steps.get-latest-branches.outputs.repo-branches }}
steps:
- name: "Fetch dbt-core Latest Branches"
uses: dbt-labs/actions/fetch-repo-branches@v1.1.1
id: get-latest-branches
with:
repo_name: ${{ github.event.repository.name }}
organization: "dbt-labs"
pat: ${{ secrets.GITHUB_TOKEN }}
fetch_protected_branches_only: true
regex: "^1.[0-9]+.latest$"
perform_match_method: "match"
retries: 3
- name: "[ANNOTATION] ${{ github.event.repository.name }} - branches to test"
run: |
title="${{ github.event.repository.name }} - branches to test"
message="The workflow will run tests for the following branches of the ${{ github.event.repository.name }} repo: ${{ steps.get-latest-branches.outputs.repo-branches }}"
echo "::notice $title::$message"
kick-off-ci:
needs: [fetch-latest-branches]
name: Kick-off CI
runs-on: ubuntu-latest
@@ -39,7 +65,9 @@ jobs:
max-parallel: 1
fail-fast: false
matrix:
branch: [1.0.latest, 1.1.latest, 1.2.latest, 1.3.latest, main]
branch: ${{ fromJSON(needs.fetch-latest-branches.outputs.latest-branches) }}
include:
- branch: 'main'
steps:
- name: Call CI workflow for ${{ matrix.branch }} branch
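The branch filter above relies on the action's `regex` and `perform_match_method: match` inputs. Assuming semantics like Python's anchored `re.match` (an assumption about the action's internals), the pattern selects branches as follows; note the unescaped dots, which match any single character:

```python
import re

# the workflow's branch filter, as given
pattern = re.compile(r"^1.[0-9]+.latest$")

for branch in ["1.0.latest", "1.3.latest", "1.10.latest", "main", "2.0.latest"]:
    print(branch, bool(pattern.match(branch)))  # True, True, True, False, False
```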

View File

@@ -41,9 +41,9 @@ jobs:
id: version
run: |
IFS="." read -r MAJOR MINOR PATCH <<< ${{ github.event.inputs.version_number }}
echo "::set-output name=major::$MAJOR"
echo "::set-output name=minor::$MINOR"
echo "::set-output name=patch::$PATCH"
echo "major=$MAJOR" >> $GITHUB_OUTPUT
echo "minor=$MINOR" >> $GITHUB_OUTPUT
echo "patch=$PATCH" >> $GITHUB_OUTPUT
- name: Is pkg 'latest'
id: latest
@@ -70,8 +70,10 @@ jobs:
- name: Get docker build arg
id: build_arg
run: |
echo "::set-output name=build_arg_name::"$(echo ${{ github.event.inputs.package }} | sed 's/\-/_/g')
echo "::set-output name=build_arg_value::"$(echo ${{ github.event.inputs.package }} | sed 's/postgres/core/g')
BUILD_ARG_NAME=$(echo ${{ github.event.inputs.package }} | sed 's/\-/_/g')
BUILD_ARG_VALUE=$(echo ${{ github.event.inputs.package }} | sed 's/postgres/core/g')
echo "build_arg_name=$BUILD_ARG_NAME" >> $GITHUB_OUTPUT
echo "build_arg_value=$BUILD_ARG_VALUE" >> $GITHUB_OUTPUT
- name: Log in to the GHCR
uses: docker/login-action@v1

View File

@@ -1,24 +1,110 @@
# **what?**
# Take the given commit, run unit tests specifically on that sha, build and
# package it, and then release to GitHub and PyPi with that specific build
# The release workflow provides the following steps:
# - check out the given commit;
# - validate the version in the sources and in the changelog file;
# - bump the version and generate a changelog if needed;
# - merge all changes to the target branch if needed;
# - run unit and integration tests against the given commit;
# - build and package that SHA;
# - release it to GitHub and PyPI with that specific build;
#
# **why?**
# Ensure an automated and tested release process
#
# **when?**
# This will only run manually with a given sha and version
# This workflow can be run manually on demand or can be called by other workflows
name: Release to GitHub and PyPi
name: Release to GitHub and PyPI
on:
workflow_dispatch:
inputs:
sha:
description: 'The last commit sha in the release'
required: true
description: "The last commit sha in the release"
type: string
required: true
target_branch:
description: "The branch to release from"
type: string
required: true
version_number:
description: 'The release version number (i.e. 1.0.0b1)'
required: true
description: "The release version number (i.e. 1.0.0b1)"
type: string
required: true
build_script_path:
description: "Build script path"
type: string
default: "scripts/build-dist.sh"
required: true
env_setup_script_path:
description: "Environment setup script path"
type: string
default: "scripts/env-setup.sh"
required: false
s3_bucket_name:
description: "AWS S3 bucket name"
type: string
default: "core-team-artifacts"
required: true
package_test_command:
description: "Package test command"
type: string
default: "dbt --version"
required: true
test_run:
description: "Test run (Publish release as draft)"
type: boolean
default: true
required: false
nightly_release:
description: "Nightly release to dev environment"
type: boolean
default: false
required: false
workflow_call:
inputs:
sha:
description: "The last commit sha in the release"
type: string
required: true
target_branch:
description: "The branch to release from"
type: string
required: true
version_number:
description: "The release version number (i.e. 1.0.0b1)"
type: string
required: true
build_script_path:
description: "Build script path"
type: string
default: "scripts/build-dist.sh"
required: true
env_setup_script_path:
description: "Environment setup script path"
type: string
default: "scripts/env-setup.sh"
required: false
s3_bucket_name:
description: "AWS S3 bucket name"
type: string
default: "core-team-artifacts"
required: true
package_test_command:
description: "Package test command"
type: string
default: "dbt --version"
required: true
test_run:
description: "Test run (Publish release as draft)"
type: boolean
default: true
required: false
nightly_release:
description: "Nightly release to dev environment"
type: boolean
default: false
required: false
permissions:
contents: write # this is the permission that allows creating a new release
@@ -28,175 +114,117 @@ defaults:
shell: bash
jobs:
unit:
name: Unit test
log-inputs:
name: Log Inputs
runs-on: ubuntu-latest
env:
TOXENV: "unit"
steps:
- name: Check out the repository
uses: actions/checkout@v2
with:
persist-credentials: false
ref: ${{ github.event.inputs.sha }}
- name: Set up Python
uses: actions/setup-python@v2
with:
python-version: 3.8
- name: Install python dependencies
- name: "[DEBUG] Print Variables"
run: |
pip install --user --upgrade pip
pip install tox
pip --version
tox --version
echo The last commit sha in the release: ${{ inputs.sha }}
echo The branch to release from: ${{ inputs.target_branch }}
echo The release version number: ${{ inputs.version_number }}
echo Build script path: ${{ inputs.build_script_path }}
echo Environment setup script path: ${{ inputs.env_setup_script_path }}
echo AWS S3 bucket name: ${{ inputs.s3_bucket_name }}
echo Package test command: ${{ inputs.package_test_command }}
echo Test run: ${{ inputs.test_run }}
echo Nightly release: ${{ inputs.nightly_release }}
- name: Run tox
run: tox
bump-version-generate-changelog:
name: Bump package version, Generate changelog
build:
name: build packages
uses: dbt-labs/dbt-release/.github/workflows/release-prep.yml@main
with:
sha: ${{ inputs.sha }}
version_number: ${{ inputs.version_number }}
target_branch: ${{ inputs.target_branch }}
env_setup_script_path: ${{ inputs.env_setup_script_path }}
test_run: ${{ inputs.test_run }}
nightly_release: ${{ inputs.nightly_release }}
secrets:
FISHTOWN_BOT_PAT: ${{ secrets.FISHTOWN_BOT_PAT }}
log-outputs-bump-version-generate-changelog:
name: "[Log output] Bump package version, Generate changelog"
if: ${{ !failure() && !cancelled() }}
needs: [bump-version-generate-changelog]
runs-on: ubuntu-latest
steps:
- name: Check out the repository
uses: actions/checkout@v2
with:
persist-credentials: false
ref: ${{ github.event.inputs.sha }}
- name: Set up Python
uses: actions/setup-python@v2
with:
python-version: 3.8
- name: Install python dependencies
- name: Print variables
run: |
pip install --user --upgrade pip
pip install --upgrade setuptools wheel twine check-wheel-contents
pip --version
echo Final SHA : ${{ needs.bump-version-generate-changelog.outputs.final_sha }}
echo Changelog path: ${{ needs.bump-version-generate-changelog.outputs.changelog_path }}
- name: Build distributions
run: ./scripts/build-dist.sh
build-test-package:
name: Build, Test, Package
if: ${{ !failure() && !cancelled() }}
needs: [bump-version-generate-changelog]
- name: Show distributions
run: ls -lh dist/
uses: dbt-labs/dbt-release/.github/workflows/build.yml@main
- name: Check distribution descriptions
run: |
twine check dist/*
with:
sha: ${{ needs.bump-version-generate-changelog.outputs.final_sha }}
version_number: ${{ inputs.version_number }}
changelog_path: ${{ needs.bump-version-generate-changelog.outputs.changelog_path }}
build_script_path: ${{ inputs.build_script_path }}
s3_bucket_name: ${{ inputs.s3_bucket_name }}
package_test_command: ${{ inputs.package_test_command }}
test_run: ${{ inputs.test_run }}
nightly_release: ${{ inputs.nightly_release }}
- name: Check wheel contents
run: |
check-wheel-contents dist/*.whl --ignore W007,W008
- uses: actions/upload-artifact@v2
with:
name: dist
path: |
dist/
!dist/dbt-${{github.event.inputs.version_number}}.tar.gz
test-build:
name: verify packages
needs: [build, unit]
runs-on: ubuntu-latest
steps:
- name: Set up Python
uses: actions/setup-python@v2
with:
python-version: 3.8
- name: Install python dependencies
run: |
pip install --user --upgrade pip
pip install --upgrade wheel
pip --version
- uses: actions/download-artifact@v2
with:
name: dist
path: dist/
- name: Show distributions
run: ls -lh dist/
- name: Install wheel distributions
run: |
find ./dist/*.whl -maxdepth 1 -type f | xargs pip install --force-reinstall --find-links=dist/
- name: Check wheel distributions
run: |
dbt --version
- name: Install source distributions
run: |
find ./dist/*.gz -maxdepth 1 -type f | xargs pip install --force-reinstall --find-links=dist/
- name: Check source distributions
run: |
dbt --version
secrets:
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
github-release:
name: GitHub Release
if: ${{ !failure() && !cancelled() }}
needs: test-build
needs: [bump-version-generate-changelog, build-test-package]
runs-on: ubuntu-latest
uses: dbt-labs/dbt-release/.github/workflows/github-release.yml@main
steps:
- uses: actions/download-artifact@v2
with:
name: dist
path: '.'
# Need to set an output variable because env variables can't be taken as input
# This is needed for the next step with releasing to GitHub
- name: Find release type
id: release_type
env:
IS_PRERELEASE: ${{ contains(github.event.inputs.version_number, 'rc') || contains(github.event.inputs.version_number, 'b') }}
run: |
echo ::set-output name=isPrerelease::$IS_PRERELEASE
- name: Creating GitHub Release
uses: softprops/action-gh-release@v1
with:
name: dbt-core v${{github.event.inputs.version_number}}
tag_name: v${{github.event.inputs.version_number}}
prerelease: ${{ steps.release_type.outputs.isPrerelease }}
target_commitish: ${{github.event.inputs.sha}}
body: |
[Release notes](https://github.com/dbt-labs/dbt-core/blob/main/CHANGELOG.md)
files: |
dbt_postgres-${{github.event.inputs.version_number}}-py3-none-any.whl
dbt_core-${{github.event.inputs.version_number}}-py3-none-any.whl
dbt-postgres-${{github.event.inputs.version_number}}.tar.gz
dbt-core-${{github.event.inputs.version_number}}.tar.gz
with:
sha: ${{ needs.bump-version-generate-changelog.outputs.final_sha }}
version_number: ${{ inputs.version_number }}
changelog_path: ${{ needs.bump-version-generate-changelog.outputs.changelog_path }}
test_run: ${{ inputs.test_run }}
pypi-release:
name: Pypi release
name: PyPI Release
runs-on: ubuntu-latest
needs: [github-release]
needs: github-release
uses: dbt-labs/dbt-release/.github/workflows/pypi-release.yml@main
environment: PypiProd
steps:
- uses: actions/download-artifact@v2
with:
name: dist
path: 'dist'
with:
version_number: ${{ inputs.version_number }}
test_run: ${{ inputs.test_run }}
- name: Publish distribution to PyPI
uses: pypa/gh-action-pypi-publish@v1.4.2
with:
password: ${{ secrets.PYPI_API_TOKEN }}
secrets:
PYPI_API_TOKEN: ${{ secrets.PYPI_API_TOKEN }}
TEST_PYPI_API_TOKEN: ${{ secrets.TEST_PYPI_API_TOKEN }}
slack-notification:
name: Slack Notification
if: ${{ failure() && (!inputs.test_run || inputs.nightly_release) }}
needs:
[
bump-version-generate-changelog,
build-test-package,
github-release,
pypi-release,
]
uses: dbt-labs/dbt-release/.github/workflows/slack-post-notification.yml@main
with:
status: "failure"
secrets:
SLACK_WEBHOOK_URL: ${{ secrets.SLACK_DEV_CORE_ALERTS }}

View File

@@ -65,7 +65,7 @@ jobs:
- name: Set branch value
id: variables
run: |
echo "::set-output name=BRANCH_NAME::prep-release/${{ github.event.inputs.version_number }}_$GITHUB_RUN_ID"
echo "BRANCH_NAME=prep-release/${{ github.event.inputs.version_number }}_$GITHUB_RUN_ID" >> $GITHUB_OUTPUT
- name: Create PR branch
run: |

.gitignore vendored (1 change)
View File

@@ -51,6 +51,7 @@ coverage.xml
*,cover
.hypothesis/
test.env
makefile.test.env
*.pytest_cache/

View File

@@ -5,102 +5,12 @@
- "Breaking changes" listed under a version may require action from end users or external maintainers when upgrading to that version.
- Do not edit this file directly. This file is auto-generated using [changie](https://github.com/miniscruff/changie). For details on how to document a change, see [the contributing guide](https://github.com/dbt-labs/dbt-core/blob/main/CONTRIBUTING.md#adding-changelog-entry)
## dbt-core 1.4.0-b1 - December 15, 2022
### Features
- Added favor-state flag to optionally favor state nodes even if unselected node exists ([#2968](https://github.com/dbt-labs/dbt-core/issues/2968))
- Update structured logging. Convert to using protobuf messages. Ensure events are enriched with node_info. ([#5610](https://github.com/dbt-labs/dbt-core/issues/5610))
- Friendlier error messages when packages.yml is malformed ([#5486](https://github.com/dbt-labs/dbt-core/issues/5486))
- Migrate dbt-utils current_timestamp macros into core + adapters ([#5521](https://github.com/dbt-labs/dbt-core/issues/5521))
- Allow partitions in external tables to be supplied as a list ([#5929](https://github.com/dbt-labs/dbt-core/issues/5929))
- extend -f flag shorthand for seed command ([#5990](https://github.com/dbt-labs/dbt-core/issues/5990))
- This pulls the profile name from args when constructing a RuntimeConfig in lib.py, enabling the dbt-server to override the value that's in the dbt_project.yml ([#6201](https://github.com/dbt-labs/dbt-core/issues/6201))
- Adding tarball install method for packages. Allowing package tarball to be specified via url in the packages.yaml. ([#4205](https://github.com/dbt-labs/dbt-core/issues/4205))
- Added an md5 function to the base context ([#6246](https://github.com/dbt-labs/dbt-core/issues/6246))
- Exposures support metrics in lineage ([#6057](https://github.com/dbt-labs/dbt-core/issues/6057))
- Add support for Python 3.11 ([#6147](https://github.com/dbt-labs/dbt-core/issues/6147))
- incremental predicates ([#5680](https://github.com/dbt-labs/dbt-core/issues/5680))
### Fixes
- Account for disabled flags on models in schema files more completely ([#3992](https://github.com/dbt-labs/dbt-core/issues/3992))
- Add validation of enabled config for metrics, exposures and sources ([#6030](https://github.com/dbt-labs/dbt-core/issues/6030))
- check length of args of python model function before accessing it ([#6041](https://github.com/dbt-labs/dbt-core/issues/6041))
- Add functors to ensure event types with str-type attributes are initialized to spec, even when provided non-str type params. ([#5436](https://github.com/dbt-labs/dbt-core/issues/5436))
- Allow hooks to fail without halting execution flow ([#5625](https://github.com/dbt-labs/dbt-core/issues/5625))
- Clarify Error Message for how many models are allowed in a Python file ([#6245](https://github.com/dbt-labs/dbt-core/issues/6245))
- After this, will be possible to use default values for dbt.config.get ([#6309](https://github.com/dbt-labs/dbt-core/issues/6309))
- Use full path for writing manifest ([#6055](https://github.com/dbt-labs/dbt-core/issues/6055))
- [CT-1284] Change Python model default materialization to table ([#6345](https://github.com/dbt-labs/dbt-core/issues/6345))
- Repair a regression which prevented basic logging before the logging subsystem is completely configured. ([#6434](https://github.com/dbt-labs/dbt-core/issues/6434))
### Docs
- minor doc correction ([dbt-docs/#5791](https://github.com/dbt-labs/dbt-docs/issues/5791))
- Generate API docs for new CLI interface ([dbt-docs/#5528](https://github.com/dbt-labs/dbt-docs/issues/5528))
- ([dbt-docs/#5880](https://github.com/dbt-labs/dbt-docs/issues/5880))
- Fix rendering of sample code for metrics ([dbt-docs/#323](https://github.com/dbt-labs/dbt-docs/issues/323))
- Alphabetize `core/dbt/README.md` ([dbt-docs/#6368](https://github.com/dbt-labs/dbt-docs/issues/6368))
### Under the Hood
- Put black config in explicit config ([#5946](https://github.com/dbt-labs/dbt-core/issues/5946))
- Added flat_graph attribute the Manifest class's deepcopy() coverage ([#5809](https://github.com/dbt-labs/dbt-core/issues/5809))
- Add mypy configs so `mypy` passes from CLI ([#5983](https://github.com/dbt-labs/dbt-core/issues/5983))
- Exception message cleanup. ([#6023](https://github.com/dbt-labs/dbt-core/issues/6023))
- Add dmypy cache to gitignore ([#6028](https://github.com/dbt-labs/dbt-core/issues/6028))
- Provide useful errors when the value of 'materialized' is invalid ([#5229](https://github.com/dbt-labs/dbt-core/issues/5229))
- Clean up string formatting ([#6068](https://github.com/dbt-labs/dbt-core/issues/6068))
- Fixed extra whitespace in strings introduced by black. ([#1350](https://github.com/dbt-labs/dbt-core/issues/1350))
- Remove the 'root_path' field from most nodes ([#6171](https://github.com/dbt-labs/dbt-core/issues/6171))
- Combine certain logging events with different levels ([#6173](https://github.com/dbt-labs/dbt-core/issues/6173))
- Convert threading tests to pytest ([#5942](https://github.com/dbt-labs/dbt-core/issues/5942))
- Convert postgres index tests to pytest ([#5770](https://github.com/dbt-labs/dbt-core/issues/5770))
- Convert use color tests to pytest ([#5771](https://github.com/dbt-labs/dbt-core/issues/5771))
- Add github actions workflow to generate high level CLI API docs ([#5942](https://github.com/dbt-labs/dbt-core/issues/5942))
- Functionality-neutral refactor of event logging system to improve encapsulation and modularity. ([#6139](https://github.com/dbt-labs/dbt-core/issues/6139))
- Consolidate ParsedNode and CompiledNode classes ([#6383](https://github.com/dbt-labs/dbt-core/issues/6383))
- Prevent doc gen workflow from running on forks ([#6386](https://github.com/dbt-labs/dbt-core/issues/6386))
- Fix intermittent database connection failure in Windows CI test ([#6394](https://github.com/dbt-labs/dbt-core/issues/6394))
- Refactor and clean up manifest nodes ([#6426](https://github.com/dbt-labs/dbt-core/issues/6426))
- Restore important legacy logging behaviors, following refactor which removed them ([#6437](https://github.com/dbt-labs/dbt-core/issues/6437))
### Dependencies
- Update pathspec requirement from ~=0.9.0 to >=0.9,<0.11 in /core ([#5917](https://github.com/dbt-labs/dbt-core/pull/5917))
- Bump black from 22.8.0 to 22.10.0 ([#6019](https://github.com/dbt-labs/dbt-core/pull/6019))
- Bump mashumaro[msgpack] from 3.0.4 to 3.1.1 in /core ([#6108](https://github.com/dbt-labs/dbt-core/pull/6108))
- Update colorama requirement from <0.4.6,>=0.3.9 to >=0.3.9,<0.4.7 in /core ([#6144](https://github.com/dbt-labs/dbt-core/pull/6144))
- Bump mashumaro[msgpack] from 3.1.1 to 3.2 in /core ([#4904](https://github.com/dbt-labs/dbt-core/issues/4904))
### Contributors
- [@andy-clapson](https://github.com/andy-clapson) ([dbt-docs/#5791](https://github.com/dbt-labs/dbt-docs/issues/5791))
- [@chamini2](https://github.com/chamini2) ([#6041](https://github.com/dbt-labs/dbt-core/issues/6041))
- [@daniel-murray](https://github.com/daniel-murray) ([#2968](https://github.com/dbt-labs/dbt-core/issues/2968))
- [@dave-connors-3](https://github.com/dave-connors-3) ([#5990](https://github.com/dbt-labs/dbt-core/issues/5990))
- [@dbeatty10](https://github.com/dbeatty10) ([dbt-docs/#6368](https://github.com/dbt-labs/dbt-docs/issues/6368), [#6394](https://github.com/dbt-labs/dbt-core/issues/6394))
- [@devmessias](https://github.com/devmessias) ([#6309](https://github.com/dbt-labs/dbt-core/issues/6309))
- [@eve-johns](https://github.com/eve-johns) ([#6068](https://github.com/dbt-labs/dbt-core/issues/6068))
- [@haritamar](https://github.com/haritamar) ([#6246](https://github.com/dbt-labs/dbt-core/issues/6246))
- [@jared-rimmer](https://github.com/jared-rimmer) ([#5486](https://github.com/dbt-labs/dbt-core/issues/5486))
- [@josephberni](https://github.com/josephberni) ([#2968](https://github.com/dbt-labs/dbt-core/issues/2968))
- [@joshuataylor](https://github.com/joshuataylor) ([#6147](https://github.com/dbt-labs/dbt-core/issues/6147))
- [@justbldwn](https://github.com/justbldwn) ([#6245](https://github.com/dbt-labs/dbt-core/issues/6245))
- [@luke-bassett](https://github.com/luke-bassett) ([#1350](https://github.com/dbt-labs/dbt-core/issues/1350))
- [@max-sixty](https://github.com/max-sixty) ([#5946](https://github.com/dbt-labs/dbt-core/issues/5946), [#5983](https://github.com/dbt-labs/dbt-core/issues/5983), [#6028](https://github.com/dbt-labs/dbt-core/issues/6028))
- [@paulbenschmidt](https://github.com/paulbenschmidt) ([dbt-docs/#5880](https://github.com/dbt-labs/dbt-docs/issues/5880))
- [@pgoslatara](https://github.com/pgoslatara) ([#5929](https://github.com/dbt-labs/dbt-core/issues/5929))
- [@racheldaniel](https://github.com/racheldaniel) ([#6201](https://github.com/dbt-labs/dbt-core/issues/6201))
- [@timle2](https://github.com/timle2) ([#4205](https://github.com/dbt-labs/dbt-core/issues/4205))
- [@dave-connors-3](https://github.com/dave-connors-3) ([#5680](https://github.com/dbt-labs/dbt-core/issues/5680))
## Previous Releases
For information on prior major and minor releases, see their changelogs:
* [1.4](https://github.com/dbt-labs/dbt-core/blob/1.4.latest/CHANGELOG.md)
* [1.3](https://github.com/dbt-labs/dbt-core/blob/1.3.latest/CHANGELOG.md)
* [1.2](https://github.com/dbt-labs/dbt-core/blob/1.2.latest/CHANGELOG.md)
* [1.1](https://github.com/dbt-labs/dbt-core/blob/1.1.latest/CHANGELOG.md)

View File

@@ -96,12 +96,15 @@ brew install postgresql
### Installation
First make sure that you set up your `virtualenv` as described in [Setting up an environment](#setting-up-an-environment). Also ensure you have the latest version of pip installed with `pip install --upgrade pip`. Next, install `dbt-core` (and its dependencies) with:
First make sure that you set up your `virtualenv` as described in [Setting up an environment](#setting-up-an-environment). Also ensure you have the latest version of pip installed with `pip install --upgrade pip`. Next, install `dbt-core` (and its dependencies):
```sh
make dev
# or
```
or, alternatively:
```sh
pip install -r dev-requirements.txt -r editable-requirements.txt
pre-commit install
```
When installed in this way, any changes you make to your local copy of the source code will be reflected immediately in your next `dbt` run.

View File

@@ -6,24 +6,37 @@ ifeq ($(USE_DOCKER),true)
DOCKER_CMD := docker-compose run --rm test
endif
LOGS_DIR := ./logs
#
# To override CI_FLAGS, create a file at this repo's root dir named `makefile.test.env`. Fill it
# with any ENV_VAR overrides required by your test environment, e.g.
# DBT_TEST_USER_1=user
# LOG_DIR="dir with a space in it"
#
# Warn: Restrict each line to one variable only.
#
ifeq (./makefile.test.env,$(wildcard ./makefile.test.env))
include ./makefile.test.env
endif
# Optional flag to invoke tests using our CI env.
# But we always want these active for structured
# log testing.
CI_FLAGS =\
DBT_TEST_USER_1=dbt_test_user_1\
DBT_TEST_USER_2=dbt_test_user_2\
DBT_TEST_USER_3=dbt_test_user_3\
RUSTFLAGS="-D warnings"\
LOG_DIR=./logs\
DBT_LOG_FORMAT=json
DBT_TEST_USER_1=$(if $(DBT_TEST_USER_1),$(DBT_TEST_USER_1),dbt_test_user_1)\
DBT_TEST_USER_2=$(if $(DBT_TEST_USER_2),$(DBT_TEST_USER_2),dbt_test_user_2)\
DBT_TEST_USER_3=$(if $(DBT_TEST_USER_3),$(DBT_TEST_USER_3),dbt_test_user_3)\
RUSTFLAGS=$(if $(RUSTFLAGS),$(RUSTFLAGS),"-D warnings")\
LOG_DIR=$(if $(LOG_DIR),$(LOG_DIR),./logs)\
DBT_LOG_FORMAT=$(if $(DBT_LOG_FORMAT),$(DBT_LOG_FORMAT),json)
.PHONY: dev
dev: ## Installs dbt-* packages in develop mode along with development dependencies.
.PHONY: dev_req
dev_req: ## Installs dbt-* packages in develop mode along with only development dependencies.
@\
pip install -r dev-requirements.txt -r editable-requirements.txt
.PHONY: dev
dev: dev_req ## Installs dbt-* packages in develop mode along with development dependencies and pre-commit.
@\
pre-commit install
.PHONY: mypy
mypy: .env ## Runs mypy against staged changes for static type checking.
@\
@@ -61,7 +74,7 @@ test: .env ## Runs unit tests with py and code checks against staged changes.
.PHONY: integration
integration: .env ## Runs postgres integration tests with py-integration
@\
$(if $(USE_CI_FLAGS), $(CI_FLAGS)) $(DOCKER_CMD) tox -e py-integration -- -nauto
$(CI_FLAGS) $(DOCKER_CMD) tox -e py-integration -- -nauto
.PHONY: integration-fail-fast
integration-fail-fast: .env ## Runs postgres integration tests with py-integration in "fail fast" mode.
@@ -71,9 +84,9 @@ integration-fail-fast: .env ## Runs postgres integration tests with py-integrati
.PHONY: interop
interop: clean
@\
mkdir $(LOGS_DIR) && \
mkdir $(LOG_DIR) && \
$(CI_FLAGS) $(DOCKER_CMD) tox -e py-integration -- -nauto && \
LOG_DIR=$(LOGS_DIR) cargo run --manifest-path test/interop/log_parsing/Cargo.toml
LOG_DIR=$(LOG_DIR) cargo run --manifest-path test/interop/log_parsing/Cargo.toml
.PHONY: setup-db
setup-db: ## Setup Postgres database with docker-compose for system testing.

View File

@@ -2,7 +2,7 @@ from dataclasses import dataclass
import re
from typing import Dict, ClassVar, Any, Optional
from dbt.exceptions import RuntimeException
from dbt.exceptions import DbtRuntimeError
@dataclass
@@ -85,7 +85,7 @@ class Column:
def string_size(self) -> int:
if not self.is_string():
raise RuntimeException("Called string_size() on non-string field!")
raise DbtRuntimeError("Called string_size() on non-string field!")
if self.dtype == "text" or self.char_size is None:
# char_size should never be None. Handle it reasonably just in case
@@ -124,7 +124,7 @@ class Column:
def from_description(cls, name: str, raw_data_type: str) -> "Column":
match = re.match(r"([^(]+)(\([^)]+\))?", raw_data_type)
if match is None:
raise RuntimeException(f'Could not interpret data type "{raw_data_type}"')
raise DbtRuntimeError(f'Could not interpret data type "{raw_data_type}"')
data_type, size_info = match.groups()
char_size = None
numeric_precision = None
@@ -137,7 +137,7 @@ class Column:
try:
char_size = int(parts[0])
except ValueError:
raise RuntimeException(
raise DbtRuntimeError(
f'Could not interpret data_type "{raw_data_type}": '
f'could not convert "{parts[0]}" to an integer'
)
@@ -145,14 +145,14 @@ class Column:
try:
numeric_precision = int(parts[0])
except ValueError:
raise RuntimeException(
raise DbtRuntimeError(
f'Could not interpret data_type "{raw_data_type}": '
f'could not convert "{parts[0]}" to an integer'
)
try:
numeric_scale = int(parts[1])
except ValueError:
raise RuntimeException(
raise DbtRuntimeError(
f'Could not interpret data_type "{raw_data_type}": '
f'could not convert "{parts[1]}" to an integer'
)

View File

@@ -91,13 +91,13 @@ class BaseConnectionManager(metaclass=abc.ABCMeta):
key = self.get_thread_identifier()
with self.lock:
if key not in self.thread_connections:
raise dbt.exceptions.InvalidConnectionException(key, list(self.thread_connections))
raise dbt.exceptions.InvalidConnectionError(key, list(self.thread_connections))
return self.thread_connections[key]
def set_thread_connection(self, conn: Connection) -> None:
key = self.get_thread_identifier()
if key in self.thread_connections:
raise dbt.exceptions.InternalException(
raise dbt.exceptions.DbtInternalError(
"In set_thread_connection, existing connection exists for {}"
)
self.thread_connections[key] = conn
@@ -137,49 +137,49 @@ class BaseConnectionManager(metaclass=abc.ABCMeta):
:return: A context manager that handles exceptions raised by the
underlying database.
"""
raise dbt.exceptions.NotImplementedException(
raise dbt.exceptions.NotImplementedError(
"`exception_handler` is not implemented for this adapter!"
)
def set_connection_name(self, name: Optional[str] = None) -> Connection:
conn_name: str
if name is None:
# if a name isn't specified, we'll re-use a single handle
# named 'master'
conn_name = "master"
else:
if not isinstance(name, str):
raise dbt.exceptions.CompilerException(
f"For connection name, got {name} - not a string!"
)
assert isinstance(name, str)
conn_name = name
"""Called by 'acquire_connection' in BaseAdapter, which is called by
'connection_named', called by 'connection_for(node)'.
Creates a connection for this thread if one doesn't already
exist, and will rename an existing connection."""
conn_name: str = "master" if name is None else name
# Get a connection for this thread
conn = self.get_if_exists()
if conn and conn.name == conn_name and conn.state == "open":
# Found a connection and nothing to do, so just return it
return conn
if conn is None:
# Create a new connection
conn = Connection(
type=Identifier(self.TYPE),
name=None,
name=conn_name,
state=ConnectionState.INIT,
transaction_open=False,
handle=None,
credentials=self.profile.credentials,
)
self.set_thread_connection(conn)
if conn.name == conn_name and conn.state == "open":
return conn
fire_event(
NewConnection(conn_name=conn_name, conn_type=self.TYPE, node_info=get_node_info())
)
if conn.state == "open":
fire_event(ConnectionReused(conn_name=conn_name))
else:
conn.handle = LazyHandle(self.open)
# Add the connection to thread_connections for this thread
self.set_thread_connection(conn)
fire_event(
NewConnection(conn_name=conn_name, conn_type=self.TYPE, node_info=get_node_info())
)
else: # existing connection either wasn't open or didn't have the right name
if conn.state != "open":
conn.handle = LazyHandle(self.open)
if conn.name != conn_name:
orig_conn_name: str = conn.name or ""
conn.name = conn_name
fire_event(ConnectionReused(orig_conn_name=orig_conn_name, conn_name=conn_name))
conn.name = conn_name
return conn
@classmethod
@@ -211,7 +211,7 @@ class BaseConnectionManager(metaclass=abc.ABCMeta):
connect should trigger a retry.
:type retryable_exceptions: Iterable[Type[Exception]]
:param int retry_limit: How many times to retry the call to connect. If this limit
is exceeded before a successful call, a FailedToConnectException will be raised.
is exceeded before a successful call, a FailedToConnectError will be raised.
Must be non-negative.
:param retry_timeout: Time to wait between attempts to connect. Can also take a
Callable that takes the number of attempts so far, beginning at 0, and returns an int
@@ -220,14 +220,14 @@ class BaseConnectionManager(metaclass=abc.ABCMeta):
:param int _attempts: Parameter used to keep track of the number of attempts in calling the
connect function across recursive calls. Passed as an argument to retry_timeout if it
is a Callable. This parameter should not be set by the initial caller.
:raises dbt.exceptions.FailedToConnectException: Upon exhausting all retry attempts without
:raises dbt.exceptions.FailedToConnectError: Upon exhausting all retry attempts without
successfully acquiring a handle.
:return: The given connection with its appropriate state and handle attributes set
depending on whether we successfully acquired a handle or not.
"""
timeout = retry_timeout(_attempts) if callable(retry_timeout) else retry_timeout
if timeout < 0:
raise dbt.exceptions.FailedToConnectException(
raise dbt.exceptions.FailedToConnectError(
"retry_timeout cannot be negative or return a negative time."
)
@@ -235,7 +235,7 @@ class BaseConnectionManager(metaclass=abc.ABCMeta):
# This guard is not perfect; others may add to the recursion limit (e.g. built-ins).
connection.handle = None
connection.state = ConnectionState.FAIL
raise dbt.exceptions.FailedToConnectException("retry_limit cannot be negative")
raise dbt.exceptions.FailedToConnectError("retry_limit cannot be negative")
try:
connection.handle = connect()
@@ -246,7 +246,7 @@ class BaseConnectionManager(metaclass=abc.ABCMeta):
if retry_limit <= 0:
connection.handle = None
connection.state = ConnectionState.FAIL
raise dbt.exceptions.FailedToConnectException(str(e))
raise dbt.exceptions.FailedToConnectError(str(e))
logger.debug(
f"Got a retryable error when attempting to open a {cls.TYPE} connection.\n"
@@ -268,12 +268,12 @@ class BaseConnectionManager(metaclass=abc.ABCMeta):
except Exception as e:
connection.handle = None
connection.state = ConnectionState.FAIL
raise dbt.exceptions.FailedToConnectException(str(e))
raise dbt.exceptions.FailedToConnectError(str(e))
@abc.abstractmethod
def cancel_open(self) -> Optional[List[str]]:
"""Cancel all open connections on the adapter. (passable)"""
raise dbt.exceptions.NotImplementedException(
raise dbt.exceptions.NotImplementedError(
"`cancel_open` is not implemented for this adapter!"
)
@@ -288,7 +288,7 @@ class BaseConnectionManager(metaclass=abc.ABCMeta):
This should be thread-safe, or hold the lock if necessary. The given
connection should not be in either in_use or available.
"""
raise dbt.exceptions.NotImplementedException("`open` is not implemented for this adapter!")
raise dbt.exceptions.NotImplementedError("`open` is not implemented for this adapter!")
def release(self) -> None:
with self.lock:
@@ -320,16 +320,12 @@ class BaseConnectionManager(metaclass=abc.ABCMeta):
@abc.abstractmethod
def begin(self) -> None:
"""Begin a transaction. (passable)"""
raise dbt.exceptions.NotImplementedException(
"`begin` is not implemented for this adapter!"
)
raise dbt.exceptions.NotImplementedError("`begin` is not implemented for this adapter!")
@abc.abstractmethod
def commit(self) -> None:
"""Commit a transaction. (passable)"""
raise dbt.exceptions.NotImplementedException(
"`commit` is not implemented for this adapter!"
)
raise dbt.exceptions.NotImplementedError("`commit` is not implemented for this adapter!")
@classmethod
def _rollback_handle(cls, connection: Connection) -> None:
@@ -365,7 +361,7 @@ class BaseConnectionManager(metaclass=abc.ABCMeta):
def _rollback(cls, connection: Connection) -> None:
"""Roll back the given connection."""
if connection.transaction_open is False:
raise dbt.exceptions.InternalException(
raise dbt.exceptions.DbtInternalError(
f"Tried to rollback transaction on connection "
f'"{connection.name}", but it does not have one open!'
)
@@ -415,6 +411,4 @@ class BaseConnectionManager(metaclass=abc.ABCMeta):
:return: A tuple of the query status and results (empty if fetch=False).
:rtype: Tuple[AdapterResponse, agate.Table]
"""
raise dbt.exceptions.NotImplementedException(
"`execute` is not implemented for this adapter!"
)
raise dbt.exceptions.NotImplementedError("`execute` is not implemented for this adapter!")


@@ -17,25 +17,24 @@ from typing import (
Iterator,
Set,
)
import agate
import pytz
from dbt.exceptions import (
InternalException,
InvalidMacroArgType,
InvalidMacroResult,
InvalidQuoteConfigType,
NotImplementedException,
NullRelationCacheAttempted,
NullRelationDropAttempted,
RelationReturnedMultipleResults,
RenameToNoneAttempted,
RuntimeException,
SnapshotTargetIncomplete,
SnapshotTargetNotSnapshotTable,
UnexpectedNull,
UnexpectedNonTimestamp,
DbtInternalError,
MacroArgTypeError,
MacroResultError,
QuoteConfigTypeError,
NotImplementedError,
NullRelationCacheAttemptedError,
NullRelationDropAttemptedError,
RelationReturnedMultipleResultsError,
RenameToNoneAttemptedError,
DbtRuntimeError,
SnapshotTargetIncompleteError,
SnapshotTargetNotSnapshotTableError,
UnexpectedNullError,
UnexpectedNonTimestampError,
)
from dbt.adapters.protocol import (
@@ -54,7 +53,7 @@ from dbt.events.types import (
CodeExecutionStatus,
CatalogGenerationError,
)
from dbt.utils import filter_null_values, executor, cast_to_str
from dbt.utils import filter_null_values, executor, cast_to_str, AttrDict
from dbt.adapters.base.connections import Connection, AdapterResponse
from dbt.adapters.base.meta import AdapterMeta, available
@@ -75,7 +74,7 @@ FRESHNESS_MACRO_NAME = "collect_freshness"
def _expect_row_value(key: str, row: agate.Row):
if key not in row.keys():
raise InternalException(
raise DbtInternalError(
'Got a row without "{}" column, columns: {}'.format(key, row.keys())
)
return row[key]
@@ -104,10 +103,10 @@ def _utc(dt: Optional[datetime], source: BaseRelation, field_name: str) -> datet
assume the datetime is already for UTC and add the timezone.
"""
if dt is None:
raise UnexpectedNull(field_name, source)
raise UnexpectedNullError(field_name, source)
elif not hasattr(dt, "tzinfo"):
raise UnexpectedNonTimestamp(field_name, source, dt)
raise UnexpectedNonTimestampError(field_name, source, dt)
elif dt.tzinfo:
return dt.astimezone(pytz.UTC)
@@ -433,7 +432,7 @@ class BaseAdapter(metaclass=AdapterMeta):
"""Cache a new relation in dbt. It will show up in `list relations`."""
if relation is None:
name = self.nice_connection_name()
raise NullRelationCacheAttempted(name)
raise NullRelationCacheAttemptedError(name)
self.cache.add(relation)
# so jinja doesn't render things
return ""
@@ -445,7 +444,7 @@ class BaseAdapter(metaclass=AdapterMeta):
"""
if relation is None:
name = self.nice_connection_name()
raise NullRelationDropAttempted(name)
raise NullRelationDropAttemptedError(name)
self.cache.drop(relation)
return ""
@@ -462,7 +461,7 @@ class BaseAdapter(metaclass=AdapterMeta):
name = self.nice_connection_name()
src_name = _relation_name(from_relation)
dst_name = _relation_name(to_relation)
raise RenameToNoneAttempted(src_name, dst_name, name)
raise RenameToNoneAttemptedError(src_name, dst_name, name)
self.cache.rename(from_relation, to_relation)
return ""
@@ -474,12 +473,12 @@ class BaseAdapter(metaclass=AdapterMeta):
@abc.abstractmethod
def date_function(cls) -> str:
"""Get the date function used by this adapter's database."""
raise NotImplementedException("`date_function` is not implemented for this adapter!")
raise NotImplementedError("`date_function` is not implemented for this adapter!")
@classmethod
@abc.abstractmethod
def is_cancelable(cls) -> bool:
raise NotImplementedException("`is_cancelable` is not implemented for this adapter!")
raise NotImplementedError("`is_cancelable` is not implemented for this adapter!")
###
# Abstract methods about schemas
@@ -487,7 +486,7 @@ class BaseAdapter(metaclass=AdapterMeta):
@abc.abstractmethod
def list_schemas(self, database: str) -> List[str]:
"""Get a list of existing schemas in database"""
raise NotImplementedException("`list_schemas` is not implemented for this adapter!")
raise NotImplementedError("`list_schemas` is not implemented for this adapter!")
@available.parse(lambda *a, **k: False)
def check_schema_exists(self, database: str, schema: str) -> bool:
@@ -510,13 +509,13 @@ class BaseAdapter(metaclass=AdapterMeta):
*Implementors must call self.cache.drop() to preserve cache state!*
"""
raise NotImplementedException("`drop_relation` is not implemented for this adapter!")
raise NotImplementedError("`drop_relation` is not implemented for this adapter!")
@abc.abstractmethod
@available.parse_none
def truncate_relation(self, relation: BaseRelation) -> None:
"""Truncate the given relation."""
raise NotImplementedException("`truncate_relation` is not implemented for this adapter!")
raise NotImplementedError("`truncate_relation` is not implemented for this adapter!")
@abc.abstractmethod
@available.parse_none
@@ -525,15 +524,13 @@ class BaseAdapter(metaclass=AdapterMeta):
Implementors must call self.cache.rename() to preserve cache state.
"""
raise NotImplementedException("`rename_relation` is not implemented for this adapter!")
raise NotImplementedError("`rename_relation` is not implemented for this adapter!")
@abc.abstractmethod
@available.parse_list
def get_columns_in_relation(self, relation: BaseRelation) -> List[BaseColumn]:
"""Get a list of the columns in the given Relation."""
raise NotImplementedException(
"`get_columns_in_relation` is not implemented for this adapter!"
)
raise NotImplementedError("`get_columns_in_relation` is not implemented for this adapter!")
@available.deprecated("get_columns_in_relation", lambda *a, **k: [])
def get_columns_in_table(self, schema: str, identifier: str) -> List[BaseColumn]:
@@ -555,7 +552,7 @@ class BaseAdapter(metaclass=AdapterMeta):
:param self.Relation current: A relation that currently exists in the
database with columns of unspecified types.
"""
raise NotImplementedException(
raise NotImplementedError(
"`expand_target_column_types` is not implemented for this adapter!"
)
@@ -570,7 +567,7 @@ class BaseAdapter(metaclass=AdapterMeta):
:return: The relations in schema
:rtype: List[self.Relation]
"""
raise NotImplementedException(
raise NotImplementedError(
"`list_relations_without_caching` is not implemented for this adapter!"
)
@@ -612,7 +609,7 @@ class BaseAdapter(metaclass=AdapterMeta):
to_relation.
"""
if not isinstance(from_relation, self.Relation):
raise InvalidMacroArgType(
raise MacroArgTypeError(
method_name="get_missing_columns",
arg_name="from_relation",
got_value=from_relation,
@@ -620,7 +617,7 @@ class BaseAdapter(metaclass=AdapterMeta):
)
if not isinstance(to_relation, self.Relation):
raise InvalidMacroArgType(
raise MacroArgTypeError(
method_name="get_missing_columns",
arg_name="to_relation",
got_value=to_relation,
@@ -641,11 +638,11 @@ class BaseAdapter(metaclass=AdapterMeta):
expected columns.
:param Relation relation: The relation to check
:raises CompilationException: If the columns are
:raises InvalidMacroArgType: If the columns are
incorrect.
"""
if not isinstance(relation, self.Relation):
raise InvalidMacroArgType(
raise MacroArgTypeError(
method_name="valid_snapshot_target",
arg_name="relation",
got_value=relation,
@@ -666,16 +663,16 @@ class BaseAdapter(metaclass=AdapterMeta):
if missing:
if extra:
raise SnapshotTargetIncomplete(extra, missing)
raise SnapshotTargetIncompleteError(extra, missing)
else:
raise SnapshotTargetNotSnapshotTable(missing)
raise SnapshotTargetNotSnapshotTableError(missing)
@available.parse_none
def expand_target_column_types(
self, from_relation: BaseRelation, to_relation: BaseRelation
) -> None:
if not isinstance(from_relation, self.Relation):
raise InvalidMacroArgType(
raise MacroArgTypeError(
method_name="expand_target_column_types",
arg_name="from_relation",
got_value=from_relation,
@@ -683,7 +680,7 @@ class BaseAdapter(metaclass=AdapterMeta):
)
if not isinstance(to_relation, self.Relation):
raise InvalidMacroArgType(
raise MacroArgTypeError(
method_name="expand_target_column_types",
arg_name="to_relation",
got_value=to_relation,
@@ -765,7 +762,7 @@ class BaseAdapter(metaclass=AdapterMeta):
"schema": schema,
"database": database,
}
raise RelationReturnedMultipleResults(kwargs, matches)
raise RelationReturnedMultipleResultsError(kwargs, matches)
elif matches:
return matches[0]
@@ -787,20 +784,20 @@ class BaseAdapter(metaclass=AdapterMeta):
@available.parse_none
def create_schema(self, relation: BaseRelation):
"""Create the given schema if it does not exist."""
raise NotImplementedException("`create_schema` is not implemented for this adapter!")
raise NotImplementedError("`create_schema` is not implemented for this adapter!")
@abc.abstractmethod
@available.parse_none
def drop_schema(self, relation: BaseRelation):
"""Drop the given schema (and everything in it) if it exists."""
raise NotImplementedException("`drop_schema` is not implemented for this adapter!")
raise NotImplementedError("`drop_schema` is not implemented for this adapter!")
@available
@classmethod
@abc.abstractmethod
def quote(cls, identifier: str) -> str:
"""Quote the given identifier, as appropriate for the database."""
raise NotImplementedException("`quote` is not implemented for this adapter!")
raise NotImplementedError("`quote` is not implemented for this adapter!")
@available
def quote_as_configured(self, identifier: str, quote_key: str) -> str:
@@ -829,7 +826,7 @@ class BaseAdapter(metaclass=AdapterMeta):
elif quote_config is None:
pass
else:
raise InvalidQuoteConfigType(quote_config)
raise QuoteConfigTypeError(quote_config)
if quote_columns:
return self.quote(column)
@@ -850,7 +847,7 @@ class BaseAdapter(metaclass=AdapterMeta):
:param col_idx: The index into the agate table for the column.
:return: The name of the type in the database
"""
raise NotImplementedException("`convert_text_type` is not implemented for this adapter!")
raise NotImplementedError("`convert_text_type` is not implemented for this adapter!")
@classmethod
@abc.abstractmethod
@@ -862,7 +859,7 @@ class BaseAdapter(metaclass=AdapterMeta):
:param col_idx: The index into the agate table for the column.
:return: The name of the type in the database
"""
raise NotImplementedException("`convert_number_type` is not implemented for this adapter!")
raise NotImplementedError("`convert_number_type` is not implemented for this adapter!")
@classmethod
@abc.abstractmethod
@@ -874,9 +871,7 @@ class BaseAdapter(metaclass=AdapterMeta):
:param col_idx: The index into the agate table for the column.
:return: The name of the type in the database
"""
raise NotImplementedException(
"`convert_boolean_type` is not implemented for this adapter!"
)
raise NotImplementedError("`convert_boolean_type` is not implemented for this adapter!")
@classmethod
@abc.abstractmethod
@@ -888,9 +883,7 @@ class BaseAdapter(metaclass=AdapterMeta):
:param col_idx: The index into the agate table for the column.
:return: The name of the type in the database
"""
raise NotImplementedException(
"`convert_datetime_type` is not implemented for this adapter!"
)
raise NotImplementedError("`convert_datetime_type` is not implemented for this adapter!")
@classmethod
@abc.abstractmethod
@@ -902,7 +895,7 @@ class BaseAdapter(metaclass=AdapterMeta):
:param col_idx: The index into the agate table for the column.
:return: The name of the type in the database
"""
raise NotImplementedException("`convert_date_type` is not implemented for this adapter!")
raise NotImplementedError("`convert_date_type` is not implemented for this adapter!")
@classmethod
@abc.abstractmethod
@@ -914,7 +907,7 @@ class BaseAdapter(metaclass=AdapterMeta):
:param col_idx: The index into the agate table for the column.
:return: The name of the type in the database
"""
raise NotImplementedException("`convert_time_type` is not implemented for this adapter!")
raise NotImplementedError("`convert_time_type` is not implemented for this adapter!")
@available
@classmethod
@@ -949,7 +942,7 @@ class BaseAdapter(metaclass=AdapterMeta):
context_override: Optional[Dict[str, Any]] = None,
kwargs: Dict[str, Any] = None,
text_only_columns: Optional[Iterable[str]] = None,
) -> agate.Table:
) -> AttrDict:
"""Look macro_name up in the manifest and execute its results.
:param macro_name: The name of the macro to execute.
@@ -981,7 +974,7 @@ class BaseAdapter(metaclass=AdapterMeta):
else:
package_name = 'the "{}" package'.format(project)
raise RuntimeException(
raise DbtRuntimeError(
'dbt could not find a macro with the name "{}" in {}'.format(
macro_name, package_name
)
@@ -1034,7 +1027,7 @@ class BaseAdapter(metaclass=AdapterMeta):
manifest=manifest,
)
results = self._catalog_filter_table(table, manifest)
results = self._catalog_filter_table(table, manifest) # type: ignore[arg-type]
return results
def get_catalog(self, manifest: Manifest) -> Tuple[agate.Table, List[Exception]]:
@@ -1066,7 +1059,7 @@ class BaseAdapter(metaclass=AdapterMeta):
loaded_at_field: str,
filter: Optional[str],
manifest: Optional[Manifest] = None,
) -> Dict[str, Any]:
) -> Tuple[AdapterResponse, Dict[str, Any]]:
"""Calculate the freshness of sources in dbt, and return it"""
kwargs: Dict[str, Any] = {
"source": source,
@@ -1075,11 +1068,12 @@ class BaseAdapter(metaclass=AdapterMeta):
}
# run the macro
table = self.execute_macro(FRESHNESS_MACRO_NAME, kwargs=kwargs, manifest=manifest)
result = self.execute_macro(FRESHNESS_MACRO_NAME, kwargs=kwargs, manifest=manifest)
adapter_response, table = result.response, result.table # type: ignore[attr-defined]
# now we have a 1-row table of the maximum `loaded_at_field` value and
# the current time according to the db.
if len(table) != 1 or len(table[0]) != 2:
raise InvalidMacroResult(FRESHNESS_MACRO_NAME, table)
raise MacroResultError(FRESHNESS_MACRO_NAME, table)
if table[0][0] is None:
# no records in the table, so really the max_loaded_at was
# infinitely long ago. Just call it 0:00 January 1, year 1 UTC
@@ -1089,11 +1083,12 @@ class BaseAdapter(metaclass=AdapterMeta):
snapshotted_at = _utc(table[0][1], source, loaded_at_field)
age = (snapshotted_at - max_loaded_at).total_seconds()
return {
freshness = {
"max_loaded_at": max_loaded_at,
"snapshotted_at": snapshotted_at,
"age": age,
}
return adapter_response, freshness
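Since calculate_freshness now returns a pair instead of the bare dict, call sites unpack it. A hedged sketch of the new caller shape (the relation and column names are illustrative):

adapter_response, freshness = adapter.calculate_freshness(
    source=relation,
    loaded_at_field="updated_at",
    filter=None,
    manifest=manifest,
)
# freshness carries max_loaded_at, snapshotted_at, and age (in seconds)
print(adapter_response, freshness["age"])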
def pre_model_hook(self, config: Mapping[str, Any]) -> Any:
"""A hook for running some operation before the model materialization
@@ -1156,7 +1151,7 @@ class BaseAdapter(metaclass=AdapterMeta):
elif location == "prepend":
return f"'{value}' || {add_to}"
else:
raise RuntimeException(f'Got an unexpected location value of "{location}"')
raise DbtRuntimeError(f'Got an unexpected location value of "{location}"')
def get_rows_different_sql(
self,
@@ -1214,7 +1209,7 @@ class BaseAdapter(metaclass=AdapterMeta):
return self.generate_python_submission_response(submission_result)
def generate_python_submission_response(self, submission_result: Any) -> AdapterResponse:
raise NotImplementedException(
raise NotImplementedError(
"Your adapter needs to implement generate_python_submission_response"
)
@@ -1238,7 +1233,7 @@ class BaseAdapter(metaclass=AdapterMeta):
valid_strategies.append("default")
builtin_strategies = self.builtin_incremental_strategies()
if strategy in builtin_strategies and strategy not in valid_strategies:
raise RuntimeException(
raise DbtRuntimeError(
f"The incremental strategy '{strategy}' is not valid for this adapter"
)
@@ -1246,7 +1241,7 @@ class BaseAdapter(metaclass=AdapterMeta):
macro_name = f"get_incremental_{strategy}_sql"
# The model_context should have MacroGenerator callable objects for all macros
if macro_name not in model_context:
raise RuntimeException(
raise DbtRuntimeError(
'dbt could not find an incremental strategy macro with the name "{}" in {}'.format(
macro_name, self.config.project_name
)


@@ -1,7 +1,7 @@
from typing import List, Optional, Type
from dbt.adapters.base import Credentials
from dbt.exceptions import CompilationException
from dbt.exceptions import CompilationError
from dbt.adapters.protocol import AdapterProtocol
@@ -11,7 +11,7 @@ def project_name_from_path(include_path: str) -> str:
partial = PartialProject.from_project_root(include_path)
if partial.project_name is None:
raise CompilationException(f"Invalid project at {include_path}: name not set!")
raise CompilationError(f"Invalid project at {include_path}: name not set!")
return partial.project_name


@@ -7,7 +7,7 @@ from dbt.context.manifest import generate_query_header_context
from dbt.contracts.connection import AdapterRequiredConfig, QueryComment
from dbt.contracts.graph.nodes import ResultNode
from dbt.contracts.graph.manifest import Manifest
from dbt.exceptions import RuntimeException
from dbt.exceptions import DbtRuntimeError
class NodeWrapper:
@@ -48,7 +48,7 @@ class _QueryComment(local):
if isinstance(comment, str) and "*/" in comment:
# tell the user "no" so they don't hurt themselves by writing
# garbage
raise RuntimeException(f'query comment contains illegal value "*/": {comment}')
raise DbtRuntimeError(f'query comment contains illegal value "*/": {comment}')
self.query_comment = comment
self.append = append


@@ -11,7 +11,11 @@ from dbt.contracts.relation import (
Policy,
Path,
)
from dbt.exceptions import ApproximateMatch, InternalException, MultipleDatabasesNotAllowed
from dbt.exceptions import (
ApproximateMatchError,
DbtInternalError,
MultipleDatabasesNotAllowedError,
)
from dbt.node_types import NodeType
from dbt.utils import filter_null_values, deep_merge, classproperty
@@ -83,7 +87,7 @@ class BaseRelation(FakeAPIObject, Hashable):
if not search:
# nothing was passed in
raise dbt.exceptions.RuntimeException(
raise dbt.exceptions.DbtRuntimeError(
"Tried to match relation, but no search path was passed!"
)
@@ -100,7 +104,7 @@ class BaseRelation(FakeAPIObject, Hashable):
if approximate_match and not exact_match:
target = self.create(database=database, schema=schema, identifier=identifier)
raise ApproximateMatch(target, self)
raise ApproximateMatchError(target, self)
return exact_match
@@ -249,14 +253,14 @@ class BaseRelation(FakeAPIObject, Hashable):
) -> Self:
if node.resource_type == NodeType.Source:
if not isinstance(node, SourceDefinition):
raise InternalException(
raise DbtInternalError(
"type mismatch, expected SourceDefinition but got {}".format(type(node))
)
return cls.create_from_source(node, **kwargs)
else:
# Can't use ManifestNode here because of parameterized generics
if not isinstance(node, (ParsedNode)):
raise InternalException(
raise DbtInternalError(
f"type mismatch, expected ManifestNode but got {type(node)}"
)
return cls.create_from_node(config, node, **kwargs)
@@ -354,7 +358,7 @@ class InformationSchema(BaseRelation):
def __post_init__(self):
if not isinstance(self.information_schema_view, (type(None), str)):
raise dbt.exceptions.CompilationException(
raise dbt.exceptions.CompilationError(
"Got an invalid name: {}".format(self.information_schema_view)
)
@@ -438,7 +442,7 @@ class SchemaSearchMap(Dict[InformationSchema, Set[Optional[str]]]):
if not allow_multiple_databases:
seen = {r.database.lower() for r in self if r.database}
if len(seen) > 1:
raise MultipleDatabasesNotAllowed(seen)
raise MultipleDatabasesNotAllowedError(seen)
for information_schema_name, schema in self.search():
path = {"database": information_schema_name.database, "schema": schema}


@@ -9,29 +9,15 @@ from dbt.adapters.reference_keys import (
_ReferenceKey,
)
from dbt.exceptions import (
DependentLinkNotCached,
NewNameAlreadyInCache,
NoneRelationFound,
ReferencedLinkNotCached,
TruncatedModelNameCausedCollision,
DependentLinkNotCachedError,
NewNameAlreadyInCacheError,
NoneRelationFoundError,
ReferencedLinkNotCachedError,
TruncatedModelNameCausedCollisionError,
)
from dbt.events.functions import fire_event, fire_event_if
from dbt.events.types import (
AddLink,
AddRelation,
DropCascade,
DropMissingRelation,
DropRelation,
DumpAfterAddGraph,
DumpAfterRenameSchema,
DumpBeforeAddGraph,
DumpBeforeRenameSchema,
RenameSchema,
TemporaryRelation,
UncachedRelation,
UpdateReference,
)
import dbt.flags as flags
from dbt.events.types import CacheAction, CacheDumpGraph
from dbt.flags import get_flags
from dbt.utils import lowercase
@@ -155,7 +141,7 @@ class _CachedRelation:
:raises InternalError: If the new key already exists.
"""
if new_key in self.referenced_by:
raise NewNameAlreadyInCache(old_key, new_key)
raise NewNameAlreadyInCacheError(old_key, new_key)
if old_key not in self.referenced_by:
return
@@ -271,17 +257,17 @@ class RelationsCache:
if referenced is None:
return
if referenced is None:
raise ReferencedLinkNotCached(referenced_key)
raise ReferencedLinkNotCachedError(referenced_key)
dependent = self.relations.get(dependent_key)
if dependent is None:
raise DependentLinkNotCached(dependent_key)
raise DependentLinkNotCachedError(dependent_key)
assert dependent is not None # we just raised!
referenced.add_reference(dependent)
# TODO: Is this dead code? I can't seem to find it grepping the codebase.
# This is called in plugins/postgres/dbt/adapters/postgres/impl.py
def add_link(self, referenced, dependent):
"""Add a link between two relations to the database. If either relation
does not exist, it will be added as an "external" relation.
@@ -303,9 +289,9 @@ class RelationsCache:
# referring to a table outside our control. There's no need to make
# a link - we will never drop the referenced relation during a run.
fire_event(
UncachedRelation(
dep_key=_make_msg_from_ref_key(dep_key),
CacheAction(
ref_key=_make_msg_from_ref_key(ref_key),
ref_key_2=_make_msg_from_ref_key(dep_key),
)
)
return
@@ -318,8 +304,10 @@ class RelationsCache:
dependent = dependent.replace(type=referenced.External)
self.add(dependent)
fire_event(
AddLink(
dep_key=_make_msg_from_ref_key(dep_key), ref_key=_make_msg_from_ref_key(ref_key)
CacheAction(
action="add_link",
ref_key=_make_msg_from_ref_key(dep_key),
ref_key_2=_make_msg_from_ref_key(ref_key),
)
)
with self.lock:
@@ -331,13 +319,20 @@ class RelationsCache:
:param BaseRelation relation: The underlying relation.
"""
flags = get_flags()
cached = _CachedRelation(relation)
fire_event(AddRelation(relation=_make_ref_key_msg(cached)))
fire_event_if(flags.LOG_CACHE_EVENTS, lambda: DumpBeforeAddGraph(dump=self.dump_graph()))
fire_event_if(
flags.LOG_CACHE_EVENTS,
lambda: CacheDumpGraph(before_after="before", action="adding", dump=self.dump_graph()),
)
fire_event(CacheAction(action="add_relation", ref_key=_make_ref_key_msg(cached)))
with self.lock:
self._setdefault(cached)
fire_event_if(flags.LOG_CACHE_EVENTS, lambda: DumpAfterAddGraph(dump=self.dump_graph()))
fire_event_if(
flags.LOG_CACHE_EVENTS,
lambda: CacheDumpGraph(before_after="after", action="adding", dump=self.dump_graph()),
)
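The graph dumps are wrapped in lambdas so the potentially large dump is only built when cache logging is enabled. A sketch of the guard's assumed shape, inferred from the call sites above rather than copied from dbt.events.functions:

def fire_event_if(conditional, lazy_e):
    # construct and fire the event only when the guard flag is true
    if conditional:
        fire_event(lazy_e())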
def _remove_refs(self, keys):
"""Removes all references to all entries in keys. This does not
@@ -365,16 +360,19 @@ class RelationsCache:
"""
dropped_key = _make_ref_key(relation)
dropped_key_msg = _make_ref_key_msg(relation)
fire_event(DropRelation(dropped=dropped_key_msg))
fire_event(CacheAction(action="drop_relation", ref_key=dropped_key_msg))
with self.lock:
if dropped_key not in self.relations:
fire_event(DropMissingRelation(relation=dropped_key_msg))
fire_event(CacheAction(action="drop_missing_relation", ref_key=dropped_key_msg))
return
consequences = self.relations[dropped_key].collect_consequences()
# convert from a list of _ReferenceKeys to a list of ReferenceKeyMsgs
consequence_msgs = [_make_msg_from_ref_key(key) for key in consequences]
fire_event(DropCascade(dropped=dropped_key_msg, consequences=consequence_msgs))
fire_event(
CacheAction(
action="drop_cascade", ref_key=dropped_key_msg, ref_list=consequence_msgs
)
)
self._remove_refs(consequences)
def _rename_relation(self, old_key, new_relation):
@@ -397,12 +395,14 @@ class RelationsCache:
for cached in self.relations.values():
if cached.is_referenced_by(old_key):
fire_event(
UpdateReference(
old_key=_make_ref_key_msg(old_key),
new_key=_make_ref_key_msg(new_key),
cached_key=_make_ref_key_msg(cached.key()),
CacheAction(
action="update_reference",
ref_key=_make_ref_key_msg(old_key),
ref_key_2=_make_ref_key_msg(new_key),
ref_key_3=_make_ref_key_msg(cached.key()),
)
)
cached.rename_key(old_key, new_key)
self.relations[new_key] = relation
@@ -427,10 +427,12 @@ class RelationsCache:
if new_key in self.relations:
# Tell user when collision caused by model names truncated during
# materialization.
raise TruncatedModelNameCausedCollision(new_key, self.relations)
raise TruncatedModelNameCausedCollisionError(new_key, self.relations)
if old_key not in self.relations:
fire_event(TemporaryRelation(key=_make_msg_from_ref_key(old_key)))
fire_event(
CacheAction(action="temporary_relation", ref_key=_make_msg_from_ref_key(old_key))
)
return False
return True
@@ -449,13 +451,16 @@ class RelationsCache:
old_key = _make_ref_key(old)
new_key = _make_ref_key(new)
fire_event(
RenameSchema(
old_key=_make_msg_from_ref_key(old_key), new_key=_make_msg_from_ref_key(new)
CacheAction(
action="rename_relation",
ref_key=_make_msg_from_ref_key(old_key),
ref_key_2=_make_msg_from_ref_key(new),
)
)
flags = get_flags()
fire_event_if(
flags.LOG_CACHE_EVENTS, lambda: DumpBeforeRenameSchema(dump=self.dump_graph())
flags.LOG_CACHE_EVENTS,
lambda: CacheDumpGraph(before_after="before", action="rename", dump=self.dump_graph()),
)
with self.lock:
@@ -465,7 +470,8 @@ class RelationsCache:
self._setdefault(_CachedRelation(new))
fire_event_if(
flags.LOG_CACHE_EVENTS, lambda: DumpAfterRenameSchema(dump=self.dump_graph())
flags.LOG_CACHE_EVENTS,
lambda: CacheDumpGraph(before_after="after", action="rename", dump=self.dump_graph()),
)
def get_relations(self, database: Optional[str], schema: Optional[str]) -> List[Any]:
@@ -485,7 +491,7 @@ class RelationsCache:
]
if None in results:
raise NoneRelationFound()
raise NoneRelationFoundError()
return results
def clear(self):


@@ -10,7 +10,7 @@ from dbt.adapters.protocol import AdapterConfig, AdapterProtocol, RelationProtoc
from dbt.contracts.connection import AdapterRequiredConfig, Credentials
from dbt.events.functions import fire_event
from dbt.events.types import AdapterImportError, PluginLoadError
from dbt.exceptions import InternalException, RuntimeException
from dbt.exceptions import DbtInternalError, DbtRuntimeError
from dbt.include.global_project import PACKAGE_PATH as GLOBAL_PROJECT_PATH
from dbt.include.global_project import PROJECT_NAME as GLOBAL_PROJECT_NAME
@@ -34,7 +34,7 @@ class AdapterContainer:
names = ", ".join(self.plugins.keys())
message = f"Invalid adapter type {name}! Must be one of {names}"
raise RuntimeException(message)
raise DbtRuntimeError(message)
def get_adapter_class_by_name(self, name: str) -> Type[Adapter]:
plugin = self.get_plugin_by_name(name)
@@ -60,7 +60,7 @@ class AdapterContainer:
# the user about it via a runtime error
if exc.name == "dbt.adapters." + name:
fire_event(AdapterImportError(exc=str(exc)))
raise RuntimeException(f"Could not find adapter type {name}!")
raise DbtRuntimeError(f"Could not find adapter type {name}!")
# otherwise, the error had to have come from some underlying
# library. Log the stack trace.
@@ -70,7 +70,7 @@ class AdapterContainer:
plugin_type = plugin.adapter.type()
if plugin_type != name:
raise RuntimeException(
raise DbtRuntimeError(
f"Expected to find adapter with type named {name}, got "
f"adapter with type {plugin_type}"
)
@@ -132,7 +132,7 @@ class AdapterContainer:
try:
plugin = self.plugins[plugin_name]
except KeyError:
raise InternalException(f"No plugin found for {plugin_name}") from None
raise DbtInternalError(f"No plugin found for {plugin_name}") from None
plugins.append(plugin)
seen.add(plugin_name)
for dep in plugin.dependencies:
@@ -151,7 +151,7 @@ class AdapterContainer:
try:
path = self.packages[package_name]
except KeyError:
raise InternalException(f"No internal package listing found for {package_name}")
raise DbtInternalError(f"No internal package listing found for {package_name}")
paths.append(path)
return paths


@@ -27,9 +27,7 @@ class SQLConnectionManager(BaseConnectionManager):
@abc.abstractmethod
def cancel(self, connection: Connection):
"""Cancel the given connection."""
raise dbt.exceptions.NotImplementedException(
"`cancel` is not implemented for this adapter!"
)
raise dbt.exceptions.NotImplementedError("`cancel` is not implemented for this adapter!")
def cancel_open(self) -> List[str]:
names = []
@@ -95,7 +93,7 @@ class SQLConnectionManager(BaseConnectionManager):
@abc.abstractmethod
def get_response(cls, cursor: Any) -> AdapterResponse:
"""Get the status of the cursor."""
raise dbt.exceptions.NotImplementedException(
raise dbt.exceptions.NotImplementedError(
"`get_response` is not implemented for this adapter!"
)
@@ -151,7 +149,7 @@ class SQLConnectionManager(BaseConnectionManager):
def begin(self):
connection = self.get_thread_connection()
if connection.transaction_open is True:
raise dbt.exceptions.InternalException(
raise dbt.exceptions.DbtInternalError(
'Tried to begin a new transaction on connection "{}", but '
"it already had one open!".format(connection.name)
)
@@ -164,7 +162,7 @@ class SQLConnectionManager(BaseConnectionManager):
def commit(self):
connection = self.get_thread_connection()
if connection.transaction_open is False:
raise dbt.exceptions.InternalException(
raise dbt.exceptions.DbtInternalError(
'Tried to commit transaction on connection "{}", but '
"it does not have one open!".format(connection.name)
)


@@ -2,7 +2,7 @@ import agate
from typing import Any, Optional, Tuple, Type, List
from dbt.contracts.connection import Connection
from dbt.exceptions import RelationTypeNull
from dbt.exceptions import RelationTypeNullError
from dbt.adapters.base import BaseAdapter, available
from dbt.adapters.cache import _make_ref_key_msg
from dbt.adapters.sql import SQLConnectionManager
@@ -131,7 +131,7 @@ class SQLAdapter(BaseAdapter):
def drop_relation(self, relation):
if relation.type is None:
raise RelationTypeNull(relation)
raise RelationTypeNullError(relation)
self.cache_dropped(relation)
self.execute_macro(DROP_RELATION_MACRO_NAME, kwargs={"relation": relation})


@@ -5,37 +5,92 @@ from dataclasses import dataclass
from importlib import import_module
from multiprocessing import get_context
from pprint import pformat as pf
from typing import Set
from typing import Set, List
from click import Context, get_current_context
from click import Context, get_current_context, BadOptionUsage
from click.core import ParameterSource
from dbt.config.profile import read_user_config
from dbt.contracts.project import UserConfig
from dbt.helper_types import WarnErrorOptions
from dbt.config.project import PartialProject
from dbt.exceptions import DbtProjectError
if os.name != "nt":
# https://bugs.python.org/issue41567
import multiprocessing.popen_spawn_posix # type: ignore # noqa: F401
# TODO: anything that has a default in params should be removed here?
# Or maybe only the ones that are in the root click group
FLAGS_DEFAULTS = {
"INDIRECT_SELECTION": "eager",
"TARGET_PATH": None,
# cli args without user_config or env var option
"FULL_REFRESH": False,
"STRICT_MODE": False,
"STORE_FAILURES": False,
}
# For backwards compatibility, some params are defined across multiple levels;
# the top-level value should take precedence.
# e.g. for `dbt --target-path test1 run --target-path test2`, test1 wins.
EXPECTED_DUPLICATE_PARAMS = [
"full_refresh",
"target_path",
"version_check",
"fail_fast",
"indirect_selection",
"store_failures",
]
def convert_config(config_name, config_value):
# This function should take care of converting the values from config and original
# set_from_args to the correct type
ret = config_value
if config_name.lower() == "warn_error_options":
ret = WarnErrorOptions(
include=config_value.get("include", []), exclude=config_value.get("exclude", [])
)
return ret
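As written, convert_config only transforms warn_error_options; every other value passes through unchanged. Illustrative behavior (the include list is an arbitrary example):

convert_config("warn_error_options", {"include": ["Deprecations"]})
# -> WarnErrorOptions(include=["Deprecations"], exclude=[])
convert_config("fail_fast", True)
# -> True, returned as-is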
@dataclass(frozen=True)
class Flags:
def __init__(self, ctx: Context = None, user_config: UserConfig = None) -> None:
# set the default flags
for key, value in FLAGS_DEFAULTS.items():
object.__setattr__(self, key, value)
if ctx is None:
ctx = get_current_context()
def assign_params(ctx, params_assigned_from_default):
"""Recursively adds all click params to flag object"""
for param_name, param_value in ctx.params.items():
# TODO: this is to avoid duplicate params being defined in two places (version_check in run and cli)
# However, this is a bit of a hack and we should find a better way to do this
# N.B. You have to use the base MRO method (object.__setattr__) to set attributes
# when using frozen dataclasses.
# https://docs.python.org/3/library/dataclasses.html#frozen-instances
if hasattr(self, param_name):
raise Exception(f"Duplicate flag names found in click command: {param_name}")
object.__setattr__(self, param_name.upper(), param_value)
if ctx.get_parameter_source(param_name) == ParameterSource.DEFAULT:
params_assigned_from_default.add(param_name)
if hasattr(self, param_name.upper()):
if param_name not in EXPECTED_DUPLICATE_PARAMS:
raise Exception(
f"Duplicate flag names found in click command: {param_name}"
)
else:
# Expected duplicate param from multi-level click command (ex: dbt --full-refresh run --full-refresh)
# Overwrite user-configured param with value from parent context
if ctx.get_parameter_source(param_name) != ParameterSource.DEFAULT:
object.__setattr__(self, param_name.upper(), param_value)
else:
object.__setattr__(self, param_name.upper(), param_value)
if ctx.get_parameter_source(param_name) == ParameterSource.DEFAULT:
params_assigned_from_default.add(param_name)
if ctx.parent:
assign_params(ctx.parent, params_assigned_from_default)
@@ -59,24 +114,49 @@ class Flags:
# Overwrite default assignments with user config if available
if user_config:
param_assigned_from_default_copy = params_assigned_from_default.copy()
for param_assigned_from_default in params_assigned_from_default:
user_config_param_value = getattr(user_config, param_assigned_from_default, None)
if user_config_param_value is not None:
object.__setattr__(
self, param_assigned_from_default.upper(), user_config_param_value
self,
param_assigned_from_default.upper(),
convert_config(param_assigned_from_default, user_config_param_value),
)
param_assigned_from_default_copy.remove(param_assigned_from_default)
params_assigned_from_default = param_assigned_from_default_copy
# Hard coded flags
object.__setattr__(self, "WHICH", invoked_subcommand_name or ctx.info_name)
object.__setattr__(self, "MP_CONTEXT", get_context("spawn"))
# Default LOG_PATH from PROJECT_DIR, if available.
if getattr(self, "LOG_PATH", None) is None:
log_path = "logs"
project_dir = getattr(self, "PROJECT_DIR", None)
# If available, set LOG_PATH from log-path in dbt_project.yml
# Known limitations:
# 1. Using PartialProject here, so no jinja rendering of log-path.
# 2. Programmatic invocations of the cli via dbtRunner may pass a Project object directly,
# which is not being used here to extract log-path.
if project_dir:
try:
partial = PartialProject.from_project_root(
project_dir, verify_version=getattr(self, "VERSION_CHECK", True)
)
log_path = str(partial.project_dict.get("log-path", log_path))
except DbtProjectError:
pass
object.__setattr__(self, "LOG_PATH", log_path)
# Support the console DO NOT TRACK initiative
object.__setattr__(
self,
"ANONYMOUS_USAGE_STATS",
False
if os.getenv("DO_NOT_TRACK", "").lower() in ("1", "t", "true", "y", "yes")
else True,
if os.getenv("DO_NOT_TRACK", "").lower() in ("1", "t", "true", "y", "yes"):
object.__setattr__(self, "SEND_ANONYMOUS_USAGE_STATS", False)
# Check mutual exclusivity once all flags are set
self._assert_mutually_exclusive(
params_assigned_from_default, ["WARN_ERROR", "WARN_ERROR_OPTIONS"]
)
# Support lower cased access for legacy code
@@ -88,3 +168,20 @@ class Flags:
def __str__(self) -> str:
return str(pf(self.__dict__))
def _assert_mutually_exclusive(
self, params_assigned_from_default: Set[str], group: List[str]
) -> None:
"""
Ensure that at most one flag in group was provided by the user, as inferred from params_assigned_from_default.
Raises click.BadOptionUsage if two or more flags in group were provided by the user.
"""
set_flag = None
for flag in group:
flag_set_by_user = flag.lower() not in params_assigned_from_default
if flag_set_by_user and set_flag:
raise BadOptionUsage(
flag.lower(), f"{flag.lower()}: not allowed with argument {set_flag.lower()}"
)
elif flag_set_by_user:
set_flag = flag
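To make the contract concrete: params_assigned_from_default holds the lower-cased names click filled from defaults, so any group member missing from it was set by the user. A standalone re-creation for illustration (not the dbt code itself):

from click import BadOptionUsage

def assert_mutually_exclusive(params_assigned_from_default, group):
    set_flag = None
    for flag in group:
        if flag.lower() not in params_assigned_from_default:  # user set this flag
            if set_flag:
                raise BadOptionUsage(
                    flag.lower(),
                    f"{flag.lower()}: not allowed with argument {set_flag.lower()}",
                )
            set_flag = flag

# a user passing both --warn-error and --warn-error-options hits the guard:
assert_mutually_exclusive(set(), ["WARN_ERROR", "WARN_ERROR_OPTIONS"])
# -> BadOptionUsage: warn_error_options: not allowed with argument warn_error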


@@ -1,34 +1,36 @@
import inspect # This is temporary for RAT-ing
from copy import copy
from pprint import pformat as pf # This is temporary for RAT-ing
from typing import List, Tuple, Optional
import click
from dbt.cli import requires, params as p
from dbt.config import RuntimeConfig
from dbt.config.project import Project
from dbt.config.profile import Profile
from dbt.contracts.graph.manifest import Manifest
from dbt.task.clean import CleanTask
from dbt.task.compile import CompileTask
from dbt.task.deps import DepsTask
from dbt.task.debug import DebugTask
from dbt.task.run import RunTask
# CLI invocation
def cli_runner():
# Alias "list" to "ls"
ls = copy(cli.commands["list"])
ls.hidden = True
cli.add_command(ls, "ls")
# Run the cli
cli()
from dbt.task.serve import ServeTask
from dbt.task.test import TestTask
from dbt.task.snapshot import SnapshotTask
from dbt.task.seed import SeedTask
from dbt.task.list import ListTask
from dbt.task.freshness import FreshnessTask
from dbt.task.run_operation import RunOperationTask
from dbt.task.build import BuildTask
from dbt.task.generate import GenerateTask
from dbt.task.init import InitTask
class dbtUsageException(Exception):
pass
class dbtInternalException(Exception):
pass
# Programmatic invocation
class dbtRunner:
def __init__(
@@ -41,11 +43,17 @@ class dbtRunner:
def invoke(self, args: List[str]) -> Tuple[Optional[List], bool]:
try:
dbt_ctx = cli.make_context(cli.name, args)
dbt_ctx.obj = {}
dbt_ctx.obj["project"] = self.project
dbt_ctx.obj["profile"] = self.profile
dbt_ctx.obj["manifest"] = self.manifest
dbt_ctx.obj = {
"project": self.project,
"profile": self.profile,
"manifest": self.manifest,
}
return cli.invoke(dbt_ctx)
except click.exceptions.Exit as e:
# 0 exit code, expected for --version early exit
if str(e) == "0":
return [], True
raise dbtInternalException(f"unhandled exit code {str(e)}")
except (click.NoSuchOption, click.UsageError) as e:
raise dbtUsageException(e.message)
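Programmatic invocation then becomes a few lines; the return value matches invoke's Tuple[Optional[List], bool] signature (the selector is illustrative):

from dbt.cli.main import dbtRunner

runner = dbtRunner()  # optionally pass a prebuilt project / profile / manifest
results, success = runner.invoke(["run", "--select", "my_model"])
if not success:
    raise SystemExit(1)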
@@ -58,7 +66,7 @@ class dbtRunner:
epilog="Specify one of these sub-commands and you can find more help from there.",
)
@click.pass_context
@p.anonymous_usage_stats
@p.send_anonymous_usage_stats
@p.cache_selected_only
@p.debug
@p.enable_legacy_logger
@@ -79,15 +87,12 @@ class dbtRunner:
@p.version
@p.version_check
@p.warn_error
@p.warn_error_options
@p.write_json
def cli(ctx, **kwargs):
"""An ELT tool for managing your SQL transformations and data models.
For more documentation on these commands, visit: docs.getdbt.com
"""
# Version info
if ctx.params["version"]:
click.echo(f"`version` called\n ctx.params: {pf(ctx.params)}")
return
# dbt build
@@ -96,11 +101,13 @@ def cli(ctx, **kwargs):
@p.defer
@p.exclude
@p.fail_fast
@p.favor_state
@p.full_refresh
@p.indirect_selection
@p.profile
@p.profiles_dir
@p.project_dir
@p.resource_type
@p.select
@p.selector
@p.show
@@ -112,10 +119,21 @@ def cli(ctx, **kwargs):
@p.vars
@p.version_check
@requires.preflight
@requires.profile
@requires.project
@requires.runtime_config
@requires.manifest
def build(ctx, **kwargs):
"""Run all Seeds, Models, Snapshots, and tests in DAG order"""
click.echo(f"`{inspect.stack()[0][3]}` called\n flags: {ctx.obj['flags']}")
return None, True
task = BuildTask(
ctx.obj["flags"],
ctx.obj["runtime_config"],
ctx.obj["manifest"],
)
results = task.run()
success = task.interpret_results(results)
return results, success
# dbt clean
@@ -127,7 +145,7 @@ def build(ctx, **kwargs):
@p.target
@p.vars
@requires.preflight
@requires.profile
@requires.unset_profile
@requires.project
def clean(ctx, **kwargs):
"""Delete all folders in the clean-targets list (usually the dbt_packages and target directories)."""
@@ -151,6 +169,7 @@ def docs(ctx, **kwargs):
@p.compile_docs
@p.defer
@p.exclude
@p.favor_state
@p.profile
@p.profiles_dir
@p.project_dir
@@ -163,10 +182,21 @@ def docs(ctx, **kwargs):
@p.vars
@p.version_check
@requires.preflight
@requires.profile
@requires.project
@requires.runtime_config
@requires.manifest(write=False)
def docs_generate(ctx, **kwargs):
"""Generate the documentation website for your project"""
click.echo(f"`{inspect.stack()[0][3]}` called\n flags: {ctx.obj['flags']}")
return None, True
task = GenerateTask(
ctx.obj["flags"],
ctx.obj["runtime_config"],
ctx.obj["manifest"],
)
results = task.run()
success = task.interpret_results(results)
return results, success
# dbt docs serve
@@ -180,10 +210,21 @@ def docs_generate(ctx, **kwargs):
@p.target
@p.vars
@requires.preflight
@requires.profile
@requires.project
@requires.runtime_config
@requires.manifest
def docs_serve(ctx, **kwargs):
"""Serve the documentation website for your project"""
click.echo(f"`{inspect.stack()[0][3]}` called\n flags: {ctx.obj['flags']}")
return None, True
task = ServeTask(
ctx.obj["flags"],
ctx.obj["runtime_config"],
ctx.obj["manifest"],
)
results = task.run()
success = task.interpret_results(results)
return results, success
# dbt compile
@@ -191,6 +232,7 @@ def docs_serve(ctx, **kwargs):
@click.pass_context
@p.defer
@p.exclude
@p.favor_state
@p.full_refresh
@p.parse_only
@p.profile
@@ -205,10 +247,22 @@ def docs_serve(ctx, **kwargs):
@p.vars
@p.version_check
@requires.preflight
@requires.profile
@requires.project
@requires.runtime_config
@requires.manifest
def compile(ctx, **kwargs):
"""Generates executable SQL from source, model, test, and analysis files. Compiled SQL files are written to the target/ directory."""
click.echo(f"`{inspect.stack()[0][3]}` called\n flags: {ctx.obj['flags']}")
return None, True
"""Generates executable SQL from source, model, test, and analysis files. Compiled SQL files are written to the
target/ directory."""
task = CompileTask(
ctx.obj["flags"],
ctx.obj["runtime_config"],
ctx.obj["manifest"],
)
results = task.run()
success = task.interpret_results(results)
return results, success
# dbt debug
@@ -216,7 +270,7 @@ def compile(ctx, **kwargs):
@click.pass_context
@p.config_dir
@p.profile
@p.profiles_dir
@p.profiles_dir_exists_false
@p.project_dir
@p.target
@p.vars
@@ -224,8 +278,14 @@ def compile(ctx, **kwargs):
@requires.preflight
def debug(ctx, **kwargs):
"""Show some helpful information about dbt for debugging. Not to be confused with the --debug option which increases verbosity."""
click.echo(f"`{inspect.stack()[0][3]}` called\n flags: {ctx.obj['flags']}")
return None, True
task = DebugTask(
ctx.obj["flags"],
None,
)
results = task.run()
success = task.interpret_results(results)
return results, success
# dbt deps
@@ -237,12 +297,11 @@ def debug(ctx, **kwargs):
@p.target
@p.vars
@requires.preflight
@requires.profile
@requires.unset_profile
@requires.project
def deps(ctx, **kwargs):
"""Pull the most recent version of the dependencies listed in packages.yml"""
task = DepsTask(ctx.obj["flags"], ctx.obj["project"])
results = task.run()
success = task.interpret_results(results)
return results, success
@@ -251,6 +310,8 @@ def deps(ctx, **kwargs):
# dbt init
@cli.command("init")
@click.pass_context
# for backwards compatibility, accept 'project_name' as an optional positional argument
@click.argument("project_name", required=False)
@p.profile
@p.profiles_dir
@p.project_dir
@@ -259,9 +320,12 @@ def deps(ctx, **kwargs):
@p.vars
@requires.preflight
def init(ctx, **kwargs):
"""Initialize a new DBT project."""
click.echo(f"`{inspect.stack()[0][3]}` called\n flags: {ctx.obj['flags']}")
return None, True
"""Initialize a new dbt project."""
task = InitTask(ctx.obj["flags"], None)
results = task.run()
success = task.interpret_results(results)
return results, success
# dbt list
@@ -269,22 +333,40 @@ def init(ctx, **kwargs):
@click.pass_context
@p.exclude
@p.indirect_selection
@p.models
@p.output
@p.output_keys
@p.profile
@p.profiles_dir
@p.project_dir
@p.resource_type
@p.select
@p.raw_select
@p.selector
@p.state
@p.target
@p.vars
@requires.preflight
@requires.profile
@requires.project
@requires.runtime_config
@requires.manifest
def list(ctx, **kwargs):
"""List the resources in your project"""
click.echo(f"`{inspect.stack()[0][3]}` called\n flags: {ctx.obj['flags']}")
return None, True
task = ListTask(
ctx.obj["flags"],
ctx.obj["runtime_config"],
ctx.obj["manifest"],
)
results = task.run()
success = task.interpret_results(results)
return results, success
# Alias "list" to "ls"
ls = copy(cli.commands["list"])
ls.hidden = True
cli.add_command(ls, "ls")
# dbt parse
@@ -301,9 +383,13 @@ def list(ctx, **kwargs):
@p.version_check
@p.write_manifest
@requires.preflight
@requires.profile
@requires.project
@requires.runtime_config
@requires.manifest(write_perf_info=True)
def parse(ctx, **kwargs):
"""Parses the project and provides information on performance"""
click.echo(f"`{inspect.stack()[0][3]}` called\n flags: {ctx.obj['flags']}")
# manifest generation and writing happens in @requires.manifest
return None, True
@@ -311,6 +397,7 @@ def parse(ctx, **kwargs):
@cli.command("run")
@click.pass_context
@p.defer
@p.favor_state
@p.exclude
@p.fail_fast
@p.full_refresh
@@ -328,10 +415,15 @@ def parse(ctx, **kwargs):
@requires.preflight
@requires.profile
@requires.project
@requires.runtime_config
@requires.manifest
def run(ctx, **kwargs):
"""Compile SQL and execute against the current target database."""
config = RuntimeConfig.from_parts(ctx.obj["project"], ctx.obj["profile"], ctx.obj["flags"])
task = RunTask(ctx.obj["flags"], config)
task = RunTask(
ctx.obj["flags"],
ctx.obj["runtime_config"],
ctx.obj["manifest"],
)
results = task.run()
success = task.interpret_results(results)
@@ -341,6 +433,7 @@ def run(ctx, **kwargs):
# dbt run operation
@cli.command("run-operation")
@click.pass_context
@click.argument("macro")
@p.args
@p.profile
@p.profiles_dir
@@ -348,10 +441,21 @@ def run(ctx, **kwargs):
@p.target
@p.vars
@requires.preflight
@requires.profile
@requires.project
@requires.runtime_config
@requires.manifest
def run_operation(ctx, **kwargs):
"""Run the named macro with any supplied arguments."""
click.echo(f"`{inspect.stack()[0][3]}` called\n flags: {ctx.obj['flags']}")
return None, True
task = RunOperationTask(
ctx.obj["flags"],
ctx.obj["runtime_config"],
ctx.obj["manifest"],
)
results = task.run()
success = task.interpret_results(results)
return results, success
# dbt seed
@@ -372,10 +476,20 @@ def run_operation(ctx, **kwargs):
@p.vars
@p.version_check
@requires.preflight
@requires.profile
@requires.project
@requires.runtime_config
@requires.manifest
def seed(ctx, **kwargs):
"""Load data from csv files into your data warehouse."""
click.echo(f"`{inspect.stack()[0][3]}` called\n flags: {ctx.obj['flags']}")
return None, True
task = SeedTask(
ctx.obj["flags"],
ctx.obj["runtime_config"],
ctx.obj["manifest"],
)
results = task.run()
success = task.interpret_results(results)
return results, success
# dbt snapshot
@@ -383,6 +497,7 @@ def seed(ctx, **kwargs):
@click.pass_context
@p.defer
@p.exclude
@p.favor_state
@p.profile
@p.profiles_dir
@p.project_dir
@@ -393,10 +508,21 @@ def seed(ctx, **kwargs):
@p.threads
@p.vars
@requires.preflight
@requires.profile
@requires.project
@requires.runtime_config
@requires.manifest
def snapshot(ctx, **kwargs):
"""Execute snapshots defined in your project"""
click.echo(f"`{inspect.stack()[0][3]}` called\n flags: {ctx.obj['flags']}")
return None, True
task = SnapshotTask(
ctx.obj["flags"],
ctx.obj["runtime_config"],
ctx.obj["manifest"],
)
results = task.run()
success = task.interpret_results(results)
return results, success
# dbt source
@@ -421,10 +547,27 @@ def source(ctx, **kwargs):
@p.threads
@p.vars
@requires.preflight
@requires.profile
@requires.project
@requires.runtime_config
@requires.manifest
def freshness(ctx, **kwargs):
"""Snapshots the current freshness of the project's sources"""
click.echo(f"`{inspect.stack()[0][3]}` called\n flags: {ctx.obj['flags']}")
return None, True
"""Check the current freshness of the project's sources"""
task = FreshnessTask(
ctx.obj["flags"],
ctx.obj["runtime_config"],
ctx.obj["manifest"],
)
results = task.run()
success = task.interpret_results(results)
return results, success
# Alias "source freshness" to "snapshot-freshness"
snapshot_freshness = copy(cli.commands["source"].commands["freshness"]) # type: ignore
snapshot_freshness.hidden = True
cli.commands["source"].add_command(snapshot_freshness, "snapshot-freshness") # type: ignore
# dbt test
@@ -433,6 +576,7 @@ def freshness(ctx, **kwargs):
@p.defer
@p.exclude
@p.fail_fast
@p.favor_state
@p.indirect_selection
@p.profile
@p.profiles_dir
@@ -447,12 +591,23 @@ def freshness(ctx, **kwargs):
@p.vars
@p.version_check
@requires.preflight
@requires.profile
@requires.project
@requires.runtime_config
@requires.manifest
def test(ctx, **kwargs):
"""Runs tests on data in deployed models. Run this after `dbt run`"""
click.echo(f"`{inspect.stack()[0][3]}` called\n flags: {ctx.obj['flags']}")
return None, True
task = TestTask(
ctx.obj["flags"],
ctx.obj["runtime_config"],
ctx.obj["manifest"],
)
results = task.run()
success = task.interpret_results(results)
return results, success
# Support running as a module
if __name__ == "__main__":
cli_runner()
cli()


@@ -1,7 +1,9 @@
from click import ParamType
from click import ParamType, Choice
from dbt.config.utils import parse_cli_vars
from dbt.exceptions import ValidationException
from dbt.exceptions import ValidationError
from dbt.helper_types import WarnErrorOptions
class YAML(ParamType):
@@ -15,10 +17,24 @@ class YAML(ParamType):
self.fail(f"Cannot load YAML from type {type(value)}", param, ctx)
try:
return parse_cli_vars(value)
except ValidationException:
except ValidationError:
self.fail(f"String '{value}' is not valid YAML", param, ctx)
class WarnErrorOptionsType(YAML):
"""The Click WarnErrorOptions type. Converts YAML strings into objects."""
name = "WarnErrorOptionsType"
def convert(self, value, param, ctx):
# click invokes convert() when processing this param
include_exclude = super().convert(value, param, ctx)
return WarnErrorOptions(
include=include_exclude.get("include", []), exclude=include_exclude.get("exclude", [])
)
class Truthy(ParamType):
"""The Click Truthy type. Converts strings into a "truthy" type"""
@@ -33,3 +49,13 @@ class Truthy(ParamType):
return None
else:
return value
class ChoiceTuple(Choice):
name = "CHOICE_TUPLE"
def convert(self, value, param, ctx):
for value_item in value:
super().convert(value_item, param, ctx)
return value
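A quick sketch of the conversion these types perform; the option string is an illustrative value:

# --warn-error-options '{"include": "all"}' arrives as a YAML string and
# comes out as a WarnErrorOptions instance:
opts = WarnErrorOptionsType().convert('{"include": "all"}', param=None, ctx=None)
# opts == WarnErrorOptions(include="all", exclude=[])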

core/dbt/cli/options.py (new file, 44 lines)

@@ -0,0 +1,44 @@
import click
# Implementation from: https://stackoverflow.com/a/48394004
# Note MultiOption options must be specified with type=tuple or type=ChoiceTuple (https://github.com/pallets/click/issues/2012)
class MultiOption(click.Option):
def __init__(self, *args, **kwargs):
self.save_other_options = kwargs.pop("save_other_options", True)
nargs = kwargs.pop("nargs", -1)
assert nargs == -1, "nargs, if set, must be -1 not {}".format(nargs)
super(MultiOption, self).__init__(*args, **kwargs)
self._previous_parser_process = None
self._eat_all_parser = None
def add_to_parser(self, parser, ctx):
def parser_process(value, state):
# method to hook to the parser.process
done = False
value = [value]
if self.save_other_options:
# grab everything up to the next option
while state.rargs and not done:
for prefix in self._eat_all_parser.prefixes:
if state.rargs[0].startswith(prefix):
done = True
if not done:
value.append(state.rargs.pop(0))
else:
# grab everything remaining
value += state.rargs
state.rargs[:] = []
value = tuple(value)
# call the actual process
self._previous_parser_process(value, state)
retval = super(MultiOption, self).add_to_parser(parser, ctx)
for name in self.opts:
our_parser = parser._long_opt.get(name) or parser._short_opt.get(name)
if our_parser:
self._eat_all_parser = our_parser
self._previous_parser_process = our_parser.process
our_parser.process = parser_process
break
return retval
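A hedged usage sketch: with save_other_options left at True, one occurrence of the flag consumes every following bare token up to the next option, so a single flag can carry several values. The toy command is illustrative; it mirrors how the params module wires the class up below:

import click
from dbt.cli.options import MultiOption

@click.command()
@click.option("--select", cls=MultiOption, type=tuple, help="Nodes to include.")
def ls(select):
    click.echo(select)

# $ ls --select model_a model_b
# ('model_a', 'model_b')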


@@ -1,17 +1,15 @@
from pathlib import Path, PurePath
import click
from dbt.cli.option_types import YAML
from dbt.cli.options import MultiOption
from dbt.cli.option_types import YAML, ChoiceTuple, WarnErrorOptionsType
from dbt.cli.resolvers import default_project_dir, default_profiles_dir
from dbt.version import get_version_information
# TODO: The name (reflected in flags) is a correction!
# The original name was `SEND_ANONYMOUS_USAGE_STATS` and used an env var called "DBT_SEND_ANONYMOUS_USAGE_STATS"
# Both of which break existing naming conventions (doesn't match param flag).
# This will need to be fixed before use in the main codebase and communicated as a change to the community!
anonymous_usage_stats = click.option(
"--anonymous-usage-stats/--no-anonymous-usage-stats",
envvar="DBT_ANONYMOUS_USAGE_STATS",
# TODO: Rename this to meet naming conventions (the word "send" is redundant)
send_anonymous_usage_stats = click.option(
"--send-anonymous-usage-stats/--no-send-anonymous-usage-stats",
envvar="DBT_SEND_ANONYMOUS_USAGE_STATS",
help="Send anonymous usage stats to dbt Labs.",
default=True,
)
@@ -80,7 +78,9 @@ enable_legacy_logger = click.option(
hidden=True,
)
exclude = click.option("--exclude", envvar=None, help="Specify the nodes to exclude.")
exclude = click.option(
"--exclude", envvar=None, type=tuple, cls=MultiOption, help="Specify the nodes to exclude."
)
fail_fast = click.option(
"--fail-fast/--no-fail-fast",
@@ -89,6 +89,12 @@ fail_fast = click.option(
help="Stop execution on first failure.",
)
favor_state = click.option(
"--favor-state/--no-favor-state",
envvar="DBT_FAVOR_STATE",
help="If set, defer to the argument provided to the state flag for resolving unselected nodes, even if the node(s) exist as a database object in the current environment.",
)
full_refresh = click.option(
"--full-refresh",
"-f",
@@ -101,7 +107,7 @@ indirect_selection = click.option(
"--indirect-selection",
envvar="DBT_INDIRECT_SELECTION",
help="Select all tests that are adjacent to selected resources, even if they themselves have not been explicitly selected.",
type=click.Choice(["eager", "cautious"], case_sensitive=False),
type=click.Choice(["eager", "cautious", "buildable"], case_sensitive=False),
default="eager",
)
@@ -123,7 +129,7 @@ log_path = click.option(
"--log-path",
envvar="DBT_LOG_PATH",
help="Configure the 'log-path'. Only applies this setting for the current run. Overrides the 'DBT_LOG_PATH' if it is set.",
default=lambda: Path.cwd() / "logs",
default=None,
type=click.Path(resolve_path=True, path_type=Path),
)
@@ -133,13 +139,12 @@ macro_debugging = click.option(
hidden=True,
)
output = click.option(
"--output",
envvar=None,
help="TODO: No current help text",
type=click.Choice(["json", "name", "path", "selector"], case_sensitive=False),
default="name",
default="selector",
)
output_keys = click.option(
@@ -210,6 +215,15 @@ profiles_dir = click.option(
type=click.Path(exists=True),
)
# `dbt debug` uses this because it implements custom behaviour for non-existent profiles.yml directories
profiles_dir_exists_false = click.option(
"--profiles-dir",
envvar="DBT_PROFILES_DIR",
help="Which directory to look in for the profiles.yml file. If not set, dbt will look in the current working directory first, then HOME/.dbt/",
default=default_profiles_dir,
type=click.Path(exists=False),
)
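
For context: the only difference from the sibling option above is exists=False, which makes click accept a path that does not exist yet. A standalone sketch (the command name is made up):

import click

@click.command()
@click.option("--profiles-dir", type=click.Path(exists=False))
def debug(profiles_dir):
    # click no longer rejects a missing directory, so the command itself
    # can detect and report the absent profiles.yml.
    click.echo(profiles_dir or "no profiles dir supplied")

if __name__ == "__main__":
    debug()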
project_dir = click.option(
"--project-dir",
envvar=None,
@@ -233,10 +247,11 @@ record_timing_info = click.option(
)
resource_type = click.option(
"--resource-types",
"--resource-type",
envvar=None,
help="TODO: No current help text",
type=click.Choice(
type=ChoiceTuple(
[
"metric",
"source",
@@ -251,17 +266,26 @@ resource_type = click.option(
],
case_sensitive=False,
),
default="default",
cls=MultiOption,
default=(),
)
select = click.option(
"-m",
"-s",
"select",
envvar=None,
help="Specify the nodes to include.",
multiple=True,
)
model_decls = ("-m", "--models", "--model")
select_decls = ("-s", "--select")
select_attrs = {
"envvar": None,
"help": "Specify the nodes to include.",
"cls": MultiOption,
"type": tuple,
}
# `--select` and `--models` are analogous for most commands except `dbt list` for legacy reasons.
# Most CLI arguments should use the combined `select` option that aliases `--models` to `--select`.
# However, if you need to handle these as separate options (like `dbt ls` does), use the `models` and `raw_select` options instead.
# See https://github.com/dbt-labs/dbt-core/pull/6774#issuecomment-1408476095 for more info.
models = click.option(*model_decls, **select_attrs)
raw_select = click.option(*select_decls, **select_attrs)
select = click.option(*select_decls, *model_decls, **select_attrs)
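
A hedged illustration of the aliasing defined just above; the decls mirror the diff, but the command is hypothetical and assumes dbt-core is installed:

import click

from dbt.cli.options import MultiOption  # assumes dbt-core is on the path

select_attrs = {"help": "Specify the nodes to include.", "cls": MultiOption, "type": tuple}

@click.command()
@click.option("-s", "--select", "-m", "--models", "--model", "select", **select_attrs)
def demo(select):
    # `demo --models my_model` and `demo --select my_model` both arrive
    # here as select == ("my_model",).
    click.echo(select)

if __name__ == "__main__":
    demo()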
selector = click.option(
"--selector", envvar=None, help="The selector name to use, as defined in selectors.yml"
@@ -285,7 +309,7 @@ single_threaded = click.option(
)
skip_profile_setup = click.option(
"--skip-profile-setup", "-s", envvar=None, help="Skip interative profile setup.", is_flag=True
"--skip-profile-setup", "-s", envvar=None, help="Skip interactive profile setup.", is_flag=True
)
# TODO: The env var and name (reflected in flags) are corrections!
@@ -298,10 +322,10 @@ state = click.option(
help="If set, use the given directory as the source for json files to compare with this project.",
type=click.Path(
dir_okay=True,
exists=True,
file_okay=False,
readable=True,
resolve_path=True,
path_type=Path,
),
)
@@ -334,7 +358,7 @@ threads = click.option(
"--threads",
envvar=None,
help="Specify number of threads to use while executing models. Overrides settings in profiles.yml.",
default=1,
default=None,
type=click.INT,
)
@@ -359,10 +383,23 @@ vars = click.option(
default="{}",
)
# TODO: when legacy flags are deprecated use
# click.version_option instead of a callback
def _version_callback(ctx, _param, value):
if not value or ctx.resilient_parsing:
return
click.echo(get_version_information())
ctx.exit()
version = click.option(
"--version",
callback=_version_callback,
envvar=None,
expose_value=False,
help="Show version information",
is_eager=True,
is_flag=True,
)
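
The option above leans on click's eager-parameter processing; a self-contained sketch of the same pattern (the version string is made up):

import click

def _version_callback(ctx, _param, value):
    if not value or ctx.resilient_parsing:
        return
    # Eager parameters are processed before all others, so `--version`
    # can print and exit before any other option is validated.
    click.echo("example-cli 0.0.0")
    ctx.exit()

@click.command()
@click.option("--version", is_flag=True, is_eager=True,
              expose_value=False, callback=_version_callback)
def cli():
    click.echo("normal execution")

if __name__ == "__main__":
    cli()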
@@ -374,9 +411,20 @@ version_check = click.option(
)
warn_error = click.option(
"--warn-error/--no-warn-error",
"--warn-error",
envvar="DBT_WARN_ERROR",
help="If dbt would normally warn, instead raise an exception. Examples include --models that selects nothing, deprecations, configurations with no associated models, invalid test configurations, and missing sources/refs in tests.",
help="If dbt would normally warn, instead raise an exception. Examples include --select that selects nothing, deprecations, configurations with no associated models, invalid test configurations, and missing sources/refs in tests.",
default=None,
is_flag=True,
)
warn_error_options = click.option(
"--warn-error-options",
envvar="DBT_WARN_ERROR_OPTIONS",
default="{}",
help="""If dbt would normally warn, instead raise an exception based on include/exclude configuration. Examples include --select that selects nothing, deprecations, configurations with no associated models, invalid test configurations,
and missing sources/refs in tests. This argument should be a YAML string, with keys 'include' or 'exclude'. eg. '{"include": "all", "exclude": ["NoNodesForSelectionCriteria"]}'""",
type=WarnErrorOptionsType(),
)
write_json = click.option(

View File

@@ -1,8 +1,11 @@
from dbt.adapters.factory import adapter_management
from dbt.adapters.factory import adapter_management, register_adapter
from dbt.flags import set_flags
from dbt.cli.flags import Flags
from dbt.config.runtime import load_project, load_profile
from dbt.config import RuntimeConfig
from dbt.config.runtime import load_project, load_profile, UnsetProfile
from dbt.events.functions import setup_event_logger
from dbt.exceptions import DbtProjectError
from dbt.parser.manifest import ManifestLoader, write_manifest
from dbt.profiler import profiler
from dbt.tracking import initialize_from_flags, track_run
@@ -19,9 +22,10 @@ def preflight(func):
# Flags
flags = Flags(ctx)
ctx.obj["flags"] = flags
set_flags(flags)
# Tracking
initialize_from_flags(flags.ANONYMOUS_USAGE_STATS, flags.PROFILES_DIR)
initialize_from_flags(flags.SEND_ANONYMOUS_USAGE_STATS, flags.PROFILES_DIR)
ctx.with_resource(track_run(run_command=flags.WHICH))
# Logging
@@ -45,6 +49,22 @@ def preflight(func):
return update_wrapper(wrapper, func)
# TODO: UnsetProfile is necessary for deps and clean to load a project.
# This decorator and its usage can be removed once https://github.com/dbt-labs/dbt-core/issues/6257 is closed.
def unset_profile(func):
def wrapper(*args, **kwargs):
ctx = args[0]
assert isinstance(ctx, Context)
if ctx.obj.get("profile") is None:
profile = UnsetProfile()
ctx.obj["profile"] = profile
return func(*args, **kwargs)
return update_wrapper(wrapper, func)
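
A self-contained sketch of the same placeholder trick, with a stand-in UnsetProfile so it runs without dbt internals:

from functools import update_wrapper

import click

class UnsetProfile:
    """Stand-in for dbt's UnsetProfile placeholder (illustrative only)."""

def unset_profile(func):
    def wrapper(ctx, *args, **kwargs):
        # Commands like deps and clean need *some* profile object so that
        # project loading does not fail on a missing profile.
        if ctx.obj.get("profile") is None:
            ctx.obj["profile"] = UnsetProfile()
        return func(ctx, *args, **kwargs)
    return update_wrapper(wrapper, func)

@click.command()
@click.pass_context
@unset_profile
def deps(ctx):
    click.echo(type(ctx.obj["profile"]).__name__)  # -> UnsetProfile

if __name__ == "__main__":
    deps(obj={})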
def profile(func):
def wrapper(*args, **kwargs):
ctx = args[0]
@@ -85,3 +105,70 @@ def project(func):
return func(*args, **kwargs)
return update_wrapper(wrapper, func)
def runtime_config(func):
"""A decorator used by click command functions for generating a runtime
config given a profile and project.
"""
def wrapper(*args, **kwargs):
ctx = args[0]
assert isinstance(ctx, Context)
req_strs = ["profile", "project"]
reqs = [ctx.obj.get(req_str) for req_str in req_strs]
if None in reqs:
raise DbtProjectError("profile and project required for runtime_config")
ctx.obj["runtime_config"] = RuntimeConfig.from_parts(
ctx.obj["project"],
ctx.obj["profile"],
ctx.obj["flags"],
)
return func(*args, **kwargs)
return update_wrapper(wrapper, func)
def manifest(*args0, write=True, write_perf_info=False):
"""A decorator used by click command functions for generating a manifest
given a profile, project, and runtime config. This also registers the adapter
from the runtime config and conditionally writes the manifest to disk.
"""
def outer_wrapper(func):
def wrapper(*args, **kwargs):
ctx = args[0]
assert isinstance(ctx, Context)
req_strs = ["profile", "project", "runtime_config"]
reqs = [ctx.obj.get(dep) for dep in req_strs]
if None in reqs:
raise DbtProjectError("profile, project, and runtime_config required for manifest")
runtime_config = ctx.obj["runtime_config"]
register_adapter(runtime_config)
# a manifest has already been set on the context, so don't overwrite it
if ctx.obj.get("manifest") is None:
manifest = ManifestLoader.get_full_manifest(
runtime_config, write_perf_info=write_perf_info
)
ctx.obj["manifest"] = manifest
if write and ctx.obj["flags"].write_json:
write_manifest(manifest, ctx.obj["runtime_config"].target_path)
return func(*args, **kwargs)
return update_wrapper(wrapper, func)
# if there are no args, the decorator was used without params @decorator
# otherwise, the decorator was called with params @decorator(arg)
if len(args0) == 0:
return outer_wrapper
return outer_wrapper(args0[0])
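
The closing lines are the standard optional-arguments decorator idiom; a minimal standalone illustration (the bodies are made up, only the shape matches the diff):

def manifest(*args0, write=True):
    def outer_wrapper(func):
        def wrapper(*args, **kwargs):
            print(f"loading manifest (write={write})")
            return func(*args, **kwargs)
        return wrapper
    # Bare form `@manifest` passes the function itself: args0 == (func,).
    # Called form `@manifest(write=False)` passes nothing: args0 == ().
    if len(args0) == 0:
        return outer_wrapper
    return outer_wrapper(args0[0])

@manifest
def build():
    pass

@manifest(write=False)
def docs_generate():
    pass

build()          # loading manifest (write=True)
docs_generate()  # loading manifest (write=False)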

View File

@@ -2,13 +2,13 @@ import re
from collections import namedtuple
from dbt.exceptions import (
BlockDefinitionNotAtTop,
InternalException,
MissingCloseTag,
MissingControlFlowStartTag,
NestedTags,
UnexpectedControlFlowEndTag,
UnexpectedMacroEOF,
BlockDefinitionNotAtTopError,
DbtInternalError,
MissingCloseTagError,
MissingControlFlowStartTagError,
NestedTagsError,
UnexpectedControlFlowEndTagError,
UnexpectedMacroEOFError,
)
@@ -147,7 +147,7 @@ class TagIterator:
def _expect_match(self, expected_name, *patterns, **kwargs):
match = self._first_match(*patterns, **kwargs)
if match is None:
raise UnexpectedMacroEOF(expected_name, self.data[self.pos :])
raise UnexpectedMacroEOFError(expected_name, self.data[self.pos :])
return match
def handle_expr(self, match):
@@ -261,7 +261,7 @@ class TagIterator:
elif block_type_name is not None:
yield self.handle_tag(match)
else:
raise InternalException(
raise DbtInternalError(
"Invalid regex match in next_block, expected block start, "
"expr start, or comment start"
)
@@ -317,16 +317,16 @@ class BlockIterator:
found = self.stack.pop()
else:
expected = _CONTROL_FLOW_END_TAGS[tag.block_type_name]
raise UnexpectedControlFlowEndTag(tag, expected, self.tag_parser)
raise UnexpectedControlFlowEndTagError(tag, expected, self.tag_parser)
expected = _CONTROL_FLOW_TAGS[found]
if expected != tag.block_type_name:
raise MissingControlFlowStartTag(tag, expected, self.tag_parser)
raise MissingControlFlowStartTagError(tag, expected, self.tag_parser)
if tag.block_type_name in allowed_blocks:
if self.stack:
raise BlockDefinitionNotAtTop(self.tag_parser, tag.start)
raise BlockDefinitionNotAtTopError(self.tag_parser, tag.start)
if self.current is not None:
raise NestedTags(outer=self.current, inner=tag)
raise NestedTagsError(outer=self.current, inner=tag)
if collect_raw_data:
raw_data = self.data[self.last_position : tag.start]
self.last_position = tag.start
@@ -347,7 +347,7 @@ class BlockIterator:
if self.current:
linecount = self.data[: self.current.end].count("\n") + 1
raise MissingCloseTag(self.current.block_type_name, linecount)
raise MissingCloseTagError(self.current.block_type_name, linecount)
if collect_raw_data:
raw_data = self.data[self.last_position :]

View File

@@ -7,7 +7,7 @@ import json
import dbt.utils
from typing import Iterable, List, Dict, Union, Optional, Any
from dbt.exceptions import RuntimeException
from dbt.exceptions import DbtRuntimeError
BOM = BOM_UTF8.decode("utf-8") # '\ufeff'
@@ -168,7 +168,7 @@ class ColumnTypeBuilder(Dict[str, NullableAgateType]):
return
elif not isinstance(value, type(existing_type)):
# actual type mismatch!
raise RuntimeException(
raise DbtRuntimeError(
f"Tables contain columns with the same names ({key}), "
f"but different types ({value} vs {existing_type})"
)

View File

@@ -16,8 +16,8 @@ from dbt.exceptions import (
CommandResultError,
GitCheckoutError,
GitCloningError,
GitCloningProblem,
RuntimeException,
UnknownGitCloningProblemError,
DbtRuntimeError,
)
from packaging import version
@@ -134,7 +134,7 @@ def clone_and_checkout(
err = exc.stderr
exists = re.match("fatal: destination path '(.+)' already exists", err)
if not exists:
raise GitCloningProblem(repo)
raise UnknownGitCloningProblemError(repo)
directory = None
start_sha = None
@@ -144,7 +144,7 @@ def clone_and_checkout(
else:
matches = re.match("Cloning into '(.+)'", err.decode("utf-8"))
if matches is None:
raise RuntimeException(f'Error cloning {repo} - never saw "Cloning into ..." from git')
raise DbtRuntimeError(f'Error cloning {repo} - never saw "Cloning into ..." from git')
directory = matches.group(1)
fire_event(GitProgressPullingNewDependency(dir=directory))
full_path = os.path.join(cwd, directory)

View File

@@ -28,19 +28,19 @@ from dbt.clients._jinja_blocks import BlockIterator, BlockData, BlockTag
from dbt.contracts.graph.nodes import GenericTestNode
from dbt.exceptions import (
CaughtMacroException,
CaughtMacroExceptionWithNode,
CompilationException,
InternalException,
InvalidMaterializationArg,
JinjaRenderingException,
CaughtMacroError,
CaughtMacroErrorWithNodeError,
CompilationError,
DbtInternalError,
MaterializationArgError,
JinjaRenderingError,
MacroReturn,
MaterializtionMacroNotUsed,
NoSupportedLanguagesFound,
UndefinedCompilation,
UndefinedMacroException,
MaterializtionMacroNotUsedError,
NoSupportedLanguagesFoundError,
UndefinedCompilationError,
UndefinedMacroError,
)
from dbt import flags
from dbt.flags import get_flags
from dbt.node_types import ModelLanguage
@@ -99,8 +99,9 @@ class MacroFuzzEnvironment(jinja2.sandbox.SandboxedEnvironment):
If the value is 'write', also write the files to disk.
WARNING: This can write a ton of data if you aren't careful.
"""
if filename == "<template>" and flags.MACRO_DEBUGGING:
write = flags.MACRO_DEBUGGING == "write"
macro_debugging = get_flags().MACRO_DEBUGGING
if filename == "<template>" and macro_debugging:
write = macro_debugging == "write"
filename = _linecache_inject(source, write)
return super()._compile(source, filename) # type: ignore
@@ -161,9 +162,9 @@ def quoted_native_concat(nodes):
except (ValueError, SyntaxError, MemoryError):
result = raw
if isinstance(raw, BoolMarker) and not isinstance(result, bool):
raise JinjaRenderingException(f"Could not convert value '{raw!s}' into type 'bool'")
raise JinjaRenderingError(f"Could not convert value '{raw!s}' into type 'bool'")
if isinstance(raw, NumberMarker) and not _is_number(result):
raise JinjaRenderingException(f"Could not convert value '{raw!s}' into type 'number'")
raise JinjaRenderingError(f"Could not convert value '{raw!s}' into type 'number'")
return result
@@ -241,12 +242,12 @@ class BaseMacroGenerator:
try:
yield
except (TypeError, jinja2.exceptions.TemplateRuntimeError) as e:
raise CaughtMacroException(e)
raise CaughtMacroError(e)
def call_macro(self, *args, **kwargs):
# called from __call__ methods
if self.context is None:
raise InternalException("Context is still None in call_macro!")
raise DbtInternalError("Context is still None in call_macro!")
assert self.context is not None
macro = self.get_macro()
@@ -273,7 +274,7 @@ class MacroStack(threading.local):
def pop(self, name):
got = self.call_stack.pop()
if got != name:
raise InternalException(f"popped {got}, expected {name}")
raise DbtInternalError(f"popped {got}, expected {name}")
class MacroGenerator(BaseMacroGenerator):
@@ -300,8 +301,8 @@ class MacroGenerator(BaseMacroGenerator):
try:
yield
except (TypeError, jinja2.exceptions.TemplateRuntimeError) as e:
raise CaughtMacroExceptionWithNode(exc=e, node=self.macro)
except CompilationException as e:
raise CaughtMacroErrorWithNodeError(exc=e, node=self.macro)
except CompilationError as e:
e.stack.append(self.macro)
raise e
@@ -380,7 +381,7 @@ class MaterializationExtension(jinja2.ext.Extension):
node.defaults.append(languages)
else:
raise InvalidMaterializationArg(materialization_name, target.name)
raise MaterializationArgError(materialization_name, target.name)
if SUPPORTED_LANG_ARG not in node.args:
node.args.append(SUPPORTED_LANG_ARG)
@@ -455,7 +456,7 @@ def create_undefined(node=None):
return self
def __reduce__(self):
raise UndefinedCompilation(name=self.name, node=node)
raise UndefinedCompilationError(name=self.name, node=node)
return Undefined
@@ -513,10 +514,10 @@ def catch_jinja(node=None) -> Iterator[None]:
yield
except jinja2.exceptions.TemplateSyntaxError as e:
e.translated = False
raise CompilationException(str(e), node) from e
raise CompilationError(str(e), node) from e
except jinja2.exceptions.UndefinedError as e:
raise UndefinedMacroException(str(e), node) from e
except CompilationException as exc:
raise UndefinedMacroError(str(e), node) from e
except CompilationError as exc:
exc.add_node(node)
raise
@@ -655,13 +656,13 @@ def add_rendered_test_kwargs(
def get_supported_languages(node: jinja2.nodes.Macro) -> List[ModelLanguage]:
if "materialization" not in node.name:
raise MaterializtionMacroNotUsed(node=node)
raise MaterializtionMacroNotUsedError(node=node)
no_kwargs = not node.defaults
no_langs_found = SUPPORTED_LANG_ARG not in node.args
if no_kwargs or no_langs_found:
raise NoSupportedLanguagesFound(node=node)
raise NoSupportedLanguagesFoundError(node=node)
lang_idx = node.args.index(SUPPORTED_LANG_ARG)
# indexing defaults from the end

View File

@@ -1,6 +1,6 @@
import jinja2
from dbt.clients.jinja import get_environment
from dbt.exceptions import MacroNamespaceNotString, MacroNameNotString
from dbt.exceptions import MacroNamespaceNotStringError, MacroNameNotStringError
def statically_extract_macro_calls(string, ctx, db_wrapper=None):
@@ -117,14 +117,14 @@ def statically_parse_adapter_dispatch(func_call, ctx, db_wrapper):
func_name = kwarg.value.value
possible_macro_calls.append(func_name)
else:
raise MacroNameNotString(kwarg_value=kwarg.value.value)
raise MacroNameNotStringError(kwarg_value=kwarg.value.value)
elif kwarg.key == "macro_namespace":
# This will remain to enable static resolution
kwarg_type = type(kwarg.value).__name__
if kwarg_type == "Const":
macro_namespace = kwarg.value.value
else:
raise MacroNamespaceNotString(kwarg_type)
raise MacroNamespaceNotStringError(kwarg_type)
# positional arguments
if packages_arg:

View File

@@ -20,11 +20,11 @@ from dbt.events.types import (
SystemCouldNotWrite,
SystemErrorRetrievingModTime,
SystemExecutingCmd,
SystemStdOut,
SystemStdErr,
SystemReportReturnCode,
SystemStdErrMsg,
SystemStdOutMsg,
)
from dbt.exceptions import InternalException
from dbt.exceptions import DbtInternalError
from dbt.utils import _connection_exception_retry as connection_exception_retry
from pathspec import PathSpec # type: ignore
@@ -115,7 +115,7 @@ def make_directory(path=None) -> None:
exist. This function handles the case where two threads try to create
a directory at once.
"""
raise InternalException(f"Can not create directory from {type(path)} ")
raise DbtInternalError(f"Can not create directory from {type(path)} ")
@make_directory.register
@@ -425,7 +425,7 @@ def _interpret_oserror(exc: OSError, cwd: str, cmd: List[str]) -> NoReturn:
_handle_posix_error(exc, cwd, cmd)
# this should not be reachable, raise _something_ at least!
raise dbt.exceptions.InternalException(
raise dbt.exceptions.DbtInternalError(
"Unhandled exception in _interpret_oserror: {}".format(exc)
)
@@ -454,8 +454,8 @@ def run_cmd(cwd: str, cmd: List[str], env: Optional[Dict[str, Any]] = None) -> T
except OSError as exc:
_interpret_oserror(exc, cwd, cmd)
fire_event(SystemStdOutMsg(bmsg=out))
fire_event(SystemStdErrMsg(bmsg=err))
fire_event(SystemStdOut(bmsg=out))
fire_event(SystemStdErr(bmsg=err))
if proc.returncode != 0:
fire_event(SystemReportReturnCode(returncode=proc.returncode))

View File

@@ -60,4 +60,4 @@ def load_yaml_text(contents, path=None):
else:
error = str(e)
raise dbt.exceptions.ValidationException(error)
raise dbt.exceptions.DbtValidationError(error)

View File

@@ -1,12 +1,13 @@
import os
from collections import defaultdict
from typing import List, Dict, Any, Tuple, Optional
import argparse
import networkx as nx # type: ignore
import os
import pickle
import sqlparse
from dbt import flags
from collections import defaultdict
from typing import List, Dict, Any, Tuple, Optional
from dbt.flags import get_flags
from dbt.adapters.factory import get_adapter
from dbt.clients import jinja
from dbt.clients.system import make_directory
@@ -21,9 +22,9 @@ from dbt.contracts.graph.nodes import (
SeedNode,
)
from dbt.exceptions import (
GraphDependencyNotFound,
InternalException,
RuntimeException,
GraphDependencyNotFoundError,
DbtInternalError,
DbtRuntimeError,
)
from dbt.graph import Graph
from dbt.events.functions import fire_event
@@ -32,6 +33,7 @@ from dbt.events.contextvars import get_node_info
from dbt.node_types import NodeType, ModelLanguage
from dbt.events.format import pluralize
import dbt.tracking
import dbt.task.list as list_task
graph_file_name = "graph.gpickle"
@@ -257,7 +259,7 @@ class Compiler:
inserting CTEs into the SQL.
"""
if model.compiled_code is None:
raise RuntimeException("Cannot inject ctes into an unparsed node", model)
raise DbtRuntimeError("Cannot inject ctes into an unparsed node", model)
if model.extra_ctes_injected:
return (model, model.extra_ctes)
@@ -278,7 +280,7 @@ class Compiler:
# ephemeral model.
for cte in model.extra_ctes:
if cte.id not in manifest.nodes:
raise InternalException(
raise DbtInternalError(
f"During compilation, found a cte reference that "
f"could not be resolved: {cte.id}"
)
@@ -286,7 +288,7 @@ class Compiler:
assert not isinstance(cte_model, SeedNode)
if not cte_model.is_ephemeral_model:
raise InternalException(f"{cte.id} is not ephemeral")
raise DbtInternalError(f"{cte.id} is not ephemeral")
# This model has already been compiled, so it's been
# through here before
@@ -351,13 +353,6 @@ class Compiler:
)
if node.language == ModelLanguage.python:
# TODO could we also 'minify' this code at all? just aesthetic, not functional
# quoting seems like something very specific to sql so far
# for all python implementations we are seeing there's no quoting.
# TODO try to find better way to do this, given that
original_quoting = self.config.quoting
self.config.quoting = {key: False for key in original_quoting.keys()}
context = self._create_node_context(node, manifest, extra_context)
postfix = jinja.get_rendered(
@@ -367,8 +362,6 @@ class Compiler:
)
# we should NOT jinja render the python model's 'raw code'
node.compiled_code = f"{node.raw_code}\n\n{postfix}"
# restore quoting settings in the end since context is lazy evaluated
self.config.quoting = original_quoting
else:
context = self._create_node_context(node, manifest, extra_context)
@@ -385,6 +378,7 @@ class Compiler:
def write_graph_file(self, linker: Linker, manifest: Manifest):
filename = graph_file_name
graph_path = os.path.join(self.config.target_path, filename)
flags = get_flags()
if flags.WRITE_JSON:
linker.write_graph(graph_path, manifest)
@@ -399,7 +393,7 @@ class Compiler:
elif dependency in manifest.metrics:
linker.dependency(node.unique_id, (manifest.metrics[dependency].unique_id))
else:
raise GraphDependencyNotFound(node, dependency)
raise GraphDependencyNotFoundError(node, dependency)
def link_graph(self, linker: Linker, manifest: Manifest, add_test_edges: bool = False):
for source in manifest.sources.values():
@@ -482,7 +476,13 @@ class Compiler:
if write:
self.write_graph_file(linker, manifest)
print_compile_stats(stats)
# Do not print these for ListTasks
if not (
self.config.args.__class__ == argparse.Namespace
and self.config.args.cls == list_task.ListTask
):
print_compile_stats(stats)
return Graph(linker.graph)

View File

@@ -4,18 +4,18 @@ import os
from dbt.dataclass_schema import ValidationError
from dbt import flags
from dbt.flags import get_flags
from dbt.clients.system import load_file_contents
from dbt.clients.yaml_helper import load_yaml_text
from dbt.contracts.connection import Credentials, HasCredentials
from dbt.contracts.project import ProfileConfig, UserConfig
from dbt.exceptions import (
CompilationException,
CompilationError,
DbtProfileError,
DbtProjectError,
ValidationException,
RuntimeException,
ProfileConfigInvalid,
DbtValidationError,
DbtRuntimeError,
ProfileConfigError,
)
from dbt.events.types import MissingProfileTarget
from dbt.events.functions import fire_event
@@ -32,22 +32,6 @@ dbt encountered an error while trying to read your profiles.yml file.
"""
NO_SUPPLIED_PROFILE_ERROR = """\
dbt cannot run because no profile was specified for this dbt project.
To specify a profile for this project, add a line like this to
your dbt_project.yml file:
profile: [profile name]
Here, [profile name] should be replaced with a profile name
defined in your profiles.yml file. You can find profiles.yml here:
{profiles_file}/profiles.yml
""".format(
profiles_file=flags.DEFAULT_PROFILES_DIR
)
def read_profile(profiles_dir: str) -> Dict[str, Any]:
path = os.path.join(profiles_dir, "profiles.yml")
@@ -60,9 +44,9 @@ def read_profile(profiles_dir: str) -> Dict[str, Any]:
msg = f"The profiles.yml file at {path} is empty"
raise DbtProfileError(INVALID_PROFILE_MESSAGE.format(error_string=msg))
return yaml_content
except ValidationException as e:
except DbtValidationError as e:
msg = INVALID_PROFILE_MESSAGE.format(error_string=e)
raise ValidationException(msg) from e
raise DbtValidationError(msg) from e
return {}
@@ -75,7 +59,7 @@ def read_user_config(directory: str) -> UserConfig:
if user_config is not None:
UserConfig.validate(user_config)
return UserConfig.from_dict(user_config)
except (RuntimeException, ValidationError):
except (DbtRuntimeError, ValidationError):
pass
return UserConfig()
@@ -158,7 +142,7 @@ class Profile(HasCredentials):
dct = self.to_profile_info(serialize_credentials=True)
ProfileConfig.validate(dct)
except ValidationError as exc:
raise ProfileConfigInvalid(exc) from exc
raise ProfileConfigError(exc) from exc
@staticmethod
def _credentials_from_profile(
@@ -182,8 +166,8 @@ class Profile(HasCredentials):
data = cls.translate_aliases(profile)
cls.validate(data)
credentials = cls.from_dict(data)
except (RuntimeException, ValidationError) as e:
msg = str(e) if isinstance(e, RuntimeException) else e.message
except (DbtRuntimeError, ValidationError) as e:
msg = str(e) if isinstance(e, DbtRuntimeError) else e.message
raise DbtProfileError(
'Credentials in profile "{}", target "{}" invalid: {}'.format(
profile_name, target_name, msg
@@ -197,10 +181,33 @@ class Profile(HasCredentials):
args_profile_name: Optional[str],
project_profile_name: Optional[str] = None,
) -> str:
# TODO: Duplicating this method as direct copy of the implementation in dbt.cli.resolvers
# dbt.cli.resolvers implementation can't be used because it causes a circular dependency.
# This should be removed in favor of safe default access on the Flags module once
# https://github.com/dbt-labs/dbt-core/issues/6259 is closed.
def default_profiles_dir():
from pathlib import Path
return Path.cwd() if (Path.cwd() / "profiles.yml").exists() else Path.home() / ".dbt"
profile_name = project_profile_name
if args_profile_name is not None:
profile_name = args_profile_name
if profile_name is None:
NO_SUPPLIED_PROFILE_ERROR = """\
dbt cannot run because no profile was specified for this dbt project.
To specify a profile for this project, add a line like this to
your dbt_project.yml file:
profile: [profile name]
Here, [profile name] should be replaced with a profile name
defined in your profiles.yml file. You can find profiles.yml here:
{profiles_file}/profiles.yml
""".format(
profiles_file=default_profiles_dir()
)
raise DbtProjectError(NO_SUPPLIED_PROFILE_ERROR)
return profile_name
@@ -299,7 +306,7 @@ class Profile(HasCredentials):
try:
profile_data = renderer.render_data(raw_profile_data)
except CompilationException as exc:
except CompilationError as exc:
raise DbtProfileError(str(exc)) from exc
return target_name, profile_data
@@ -423,7 +430,7 @@ class Profile(HasCredentials):
target could not be found.
:returns Profile: The new Profile object.
"""
flags = get_flags()
raw_profiles = read_profile(flags.PROFILES_DIR)
profile_name = cls.pick_profile_name(profile_name_override, project_profile_name)
return cls.from_raw_profiles(

View File

@@ -15,16 +15,17 @@ from typing_extensions import Protocol, runtime_checkable
import hashlib
import os
from dbt import flags, deprecations
from dbt.flags import get_flags
from dbt import deprecations
from dbt.clients.system import path_exists, resolve_path_from_base, load_file_contents
from dbt.clients.yaml_helper import load_yaml_text
from dbt.contracts.connection import QueryComment
from dbt.exceptions import (
DbtProjectError,
SemverException,
ProjectContractBroken,
ProjectContractInvalid,
RuntimeException,
SemverError,
ProjectContractBrokenError,
ProjectContractError,
DbtRuntimeError,
)
from dbt.graph import SelectionSpec
from dbt.helper_types import NoValue
@@ -75,6 +76,11 @@ Validator Error:
{error}
"""
MISSING_DBT_PROJECT_ERROR = """\
No dbt_project.yml found at expected path {path}
Verify that each entry within packages.yml (and their transitive dependencies) contains a file named dbt_project.yml
"""
@runtime_checkable
class IsFQNResource(Protocol):
@@ -163,9 +169,7 @@ def load_raw_project(project_root: str) -> Dict[str, Any]:
# get the project.yml contents
if not path_exists(project_yaml_filepath):
raise DbtProjectError(
"no dbt_project.yml found at expected path {}".format(project_yaml_filepath)
)
raise DbtProjectError(MISSING_DBT_PROJECT_ERROR.format(path=project_yaml_filepath))
project_dict = _load_yaml(project_yaml_filepath)
@@ -219,7 +223,7 @@ def _get_required_version(
try:
dbt_version = _parse_versions(dbt_raw_version)
except SemverException as e:
except SemverError as e:
raise DbtProjectError(str(e)) from e
if verify_version:
@@ -332,7 +336,7 @@ class PartialProject(RenderComponents):
ProjectContract.validate(rendered.project_dict)
cfg = ProjectContract.from_dict(rendered.project_dict)
except ValidationError as e:
raise ProjectContractInvalid(e) from e
raise ProjectContractError(e) from e
# name/version are required in the Project definition, so we can assume
# they are present
name = cfg.name
@@ -370,9 +374,13 @@ class PartialProject(RenderComponents):
docs_paths: List[str] = value_or(cfg.docs_paths, all_source_paths)
asset_paths: List[str] = value_or(cfg.asset_paths, [])
target_path: str = flag_or(flags.TARGET_PATH, cfg.target_path, "target")
flags = get_flags()
flag_target_path = str(flags.TARGET_PATH) if flags.TARGET_PATH else None
target_path: str = flag_or(flag_target_path, cfg.target_path, "target")
log_path: str = str(flags.LOG_PATH)
clean_targets: List[str] = value_or(cfg.clean_targets, [target_path])
log_path: str = flag_or(flags.LOG_PATH, cfg.log_path, "logs")
packages_install_path: str = value_or(cfg.packages_install_path, "dbt_packages")
# in the default case we'll populate this once we know the adapter type
# It would be nice to just pass along a Quoting here, but that would
@@ -649,7 +657,14 @@ class Project:
try:
ProjectContract.validate(self.to_project_config())
except ValidationError as e:
raise ProjectContractBroken(e) from e
raise ProjectContractBrokenError(e) from e
@classmethod
def partial_load(cls, project_root: str, *, verify_version: bool = False) -> PartialProject:
return PartialProject.from_project_root(
project_root,
verify_version=verify_version,
)
@classmethod
def from_project_root(
@@ -667,7 +682,7 @@ class Project:
def get_selector(self, name: str) -> Union[SelectionSpec, bool]:
if name not in self.selectors:
raise RuntimeException(
raise DbtRuntimeError(
f"Could not find selector named {name}, expected one of {list(self.selectors)}"
)
return self.selectors[name]["definition"]

View File

@@ -8,7 +8,7 @@ from dbt.context.target import TargetContext
from dbt.context.secret import SecretContext, SECRET_PLACEHOLDER
from dbt.context.base import BaseContext
from dbt.contracts.connection import HasCredentials
from dbt.exceptions import DbtProjectError, CompilationException, RecursionException
from dbt.exceptions import DbtProjectError, CompilationError, RecursionError
from dbt.utils import deep_map_render
@@ -40,14 +40,14 @@ class BaseRenderer:
try:
with catch_jinja():
return get_rendered(value, self.context, native=True)
except CompilationException as exc:
except CompilationError as exc:
msg = f"Could not render {value}: {exc.msg}"
raise CompilationException(msg) from exc
raise CompilationError(msg) from exc
def render_data(self, data: Dict[str, Any]) -> Dict[str, Any]:
try:
return deep_map_render(self.render_entry, data)
except RecursionException:
except RecursionError:
raise DbtProjectError(
f"Cycle detected: {self.name} input has a reference to itself", project=data
)
@@ -159,7 +159,8 @@ class DbtProjectYamlRenderer(BaseRenderer):
if first in {"seeds", "models", "snapshots", "tests"}:
keypath_parts = {(k.lstrip("+ ") if isinstance(k, str) else k) for k in keypath}
# model-level hooks
if "pre-hook" in keypath_parts or "post-hook" in keypath_parts:
late_rendered_hooks = {"pre-hook", "post-hook", "pre_hook", "post_hook"}
if keypath_parts.intersection(late_rendered_hooks):
return False
return True
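
A hedged distillation of the new check: both dash and underscore hook spellings now defer rendering (the keypaths below are hypothetical):

late_rendered_hooks = {"pre-hook", "post-hook", "pre_hook", "post_hook"}

def should_render_entry(keypath_parts: set) -> bool:
    # Hook configs are rendered later, at execution time, so the project
    # renderer skips any keypath touching a hook key in either spelling.
    return not keypath_parts.intersection(late_rendered_hooks)

print(should_render_entry({"models", "pre_hook"}))      # False: deferred
print(should_render_entry({"models", "materialized"}))  # True: render now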

View File

@@ -15,26 +15,24 @@ from typing import (
Type,
)
from dbt import flags
from dbt.flags import get_flags
from dbt.adapters.factory import get_include_paths, get_relation_class_by_name
from dbt.config.profile import read_user_config
from dbt.config.project import load_raw_project
from dbt.contracts.connection import AdapterRequiredConfig, Credentials, HasCredentials
from dbt.contracts.graph.manifest import ManifestMetadata
from dbt.contracts.project import Configuration, UserConfig
from dbt.contracts.relation import ComponentName
from dbt.dataclass_schema import ValidationError
from dbt.exceptions import (
ConfigContractBroken,
DbtProjectError,
NonUniquePackageName,
RuntimeException,
UninstalledPackagesFound,
)
from dbt.events.functions import warn_or_error
from dbt.events.types import UnusedResourceConfigPath
from dbt.exceptions import (
ConfigContractBrokenError,
DbtProjectError,
NonUniquePackageNameError,
DbtRuntimeError,
UninstalledPackagesFoundError,
)
from dbt.helper_types import DictDefaultEmptyStr, FQNPath, PathSet
from .profile import Profile
from .project import Project
from .renderer import DbtProjectYamlRenderer, ProfileRenderer
@@ -199,11 +197,10 @@ class RuntimeConfig(Project, Profile, AdapterRequiredConfig):
# load the new project and its packages. Don't pass cli variables.
renderer = DbtProjectYamlRenderer(profile)
project = Project.from_project_root(
project_root,
renderer,
verify_version=bool(flags.VERSION_CHECK),
verify_version=bool(getattr(self.args, "VERSION_CHECK", True)),
)
runtime_config = self.from_parts(
@@ -237,7 +234,7 @@ class RuntimeConfig(Project, Profile, AdapterRequiredConfig):
try:
Configuration.validate(self.serialize())
except ValidationError as e:
raise ConfigContractBroken(e) from e
raise ConfigContractBrokenError(e) from e
@classmethod
def collect_parts(cls: Type["RuntimeConfig"], args: Any) -> Tuple[Project, Profile]:
@@ -249,6 +246,7 @@ class RuntimeConfig(Project, Profile, AdapterRequiredConfig):
cli_vars,
args,
)
flags = get_flags()
project = load_project(project_root, bool(flags.VERSION_CHECK), profile, cli_vars)
return project, profile
@@ -262,7 +260,7 @@ class RuntimeConfig(Project, Profile, AdapterRequiredConfig):
:param args: The arguments as parsed from the cli.
:raises DbtProjectError: If the project is invalid or missing.
:raises DbtProfileError: If the profile is invalid or missing.
:raises ValidationException: If the cli variables are invalid.
:raises DbtValidationError: If the cli variables are invalid.
"""
project, profile = cls.collect_parts(args)
@@ -357,7 +355,7 @@ class RuntimeConfig(Project, Profile, AdapterRequiredConfig):
count_packages_specified = len(self.packages.packages) # type: ignore
count_packages_installed = len(tuple(self._get_project_directories()))
if count_packages_specified > count_packages_installed:
raise UninstalledPackagesFound(
raise UninstalledPackagesFoundError(
count_packages_specified,
count_packages_installed,
self.packages_install_path,
@@ -365,7 +363,7 @@ class RuntimeConfig(Project, Profile, AdapterRequiredConfig):
project_paths = itertools.chain(internal_packages, self._get_project_directories())
for project_name, project in self.load_projects(project_paths):
if project_name in all_projects:
raise NonUniquePackageName(project_name)
raise NonUniquePackageNameError(project_name)
all_projects[project_name] = project
self.dependencies = all_projects
return self.dependencies
@@ -430,7 +428,7 @@ class UnsetProfile(Profile):
def __getattribute__(self, name):
if name in {"profile_name", "target_name", "threads"}:
raise RuntimeException(f'Error: disallowed attribute "{name}" - no profile!')
raise DbtRuntimeError(f'Error: disallowed attribute "{name}" - no profile!')
return Profile.__getattribute__(self, name)

View File

@@ -12,7 +12,7 @@ from dbt.clients.system import (
resolve_path_from_base,
)
from dbt.contracts.selection import SelectorFile
from dbt.exceptions import DbtSelectorsError, RuntimeException
from dbt.exceptions import DbtSelectorsError, DbtRuntimeError
from dbt.graph import parse_from_selectors_definition, SelectionSpec
from dbt.graph.selector_spec import SelectionCriteria
@@ -46,7 +46,7 @@ class SelectorConfig(Dict[str, Dict[str, Union[SelectionSpec, bool]]]):
f"yaml-selectors",
result_type="invalid_selector",
) from exc
except RuntimeException as exc:
except DbtRuntimeError as exc:
raise DbtSelectorsError(
f"Could not read selector file data: {exc}",
result_type="invalid_selector",
@@ -62,7 +62,7 @@ class SelectorConfig(Dict[str, Dict[str, Union[SelectionSpec, bool]]]):
) -> "SelectorConfig":
try:
rendered = renderer.render_data(data)
except (ValidationError, RuntimeException) as exc:
except (ValidationError, DbtRuntimeError) as exc:
raise DbtSelectorsError(
f"Could not render selector data: {exc}",
result_type="invalid_selector",
@@ -77,7 +77,7 @@ class SelectorConfig(Dict[str, Dict[str, Union[SelectionSpec, bool]]]):
) -> "SelectorConfig":
try:
data = load_yaml_text(load_file_contents(str(path)))
except (ValidationError, RuntimeException) as exc:
except (ValidationError, DbtRuntimeError) as exc:
raise DbtSelectorsError(
f"Could not read selector file: {exc}",
result_type="invalid_selector",

View File

@@ -1,77 +1,24 @@
from argparse import Namespace
from typing import Any, Dict, Optional, Union
from xmlrpc.client import Boolean
from dbt.contracts.project import UserConfig
from typing import Any, Dict
import dbt.flags as flags
from dbt.clients import yaml_helper
from dbt.config import Profile, Project, read_user_config
from dbt.config.renderer import DbtProjectYamlRenderer, ProfileRenderer
from dbt.events.functions import fire_event
from dbt.events.types import InvalidVarsYAML
from dbt.exceptions import ValidationException, VarsArgNotYamlDict
from dbt.events.types import InvalidOptionYAML
from dbt.exceptions import DbtValidationError, OptionNotYamlDictError
def parse_cli_vars(var: str) -> Dict[str, Any]:
def parse_cli_vars(var_string: str) -> Dict[str, Any]:
return parse_cli_yaml_string(var_string, "vars")
def parse_cli_yaml_string(var_string: str, cli_option_name: str) -> Dict[str, Any]:
try:
cli_vars = yaml_helper.load_yaml_text(var)
cli_vars = yaml_helper.load_yaml_text(var_string)
var_type = type(cli_vars)
if var_type is dict:
return cli_vars
else:
raise VarsArgNotYamlDict(var_type)
except ValidationException:
fire_event(InvalidVarsYAML())
raise OptionNotYamlDictError(var_type, cli_option_name)
except DbtValidationError:
fire_event(InvalidOptionYAML(option_name=cli_option_name))
raise
def get_project_config(
project_path: str,
profile_name: str,
args: Namespace = Namespace(),
cli_vars: Optional[Dict[str, Any]] = None,
profile: Optional[Profile] = None,
user_config: Optional[UserConfig] = None,
return_dict: Boolean = True,
) -> Union[Project, Dict]:
"""Returns a project config (dict or object) from a given project path and profile name.
Args:
project_path: Path to project
profile_name: Name of profile
args: An argparse.Namespace that represents what would have been passed in on the
command line (optional)
cli_vars: A dict of any vars that would have been passed in on the command line (optional)
(see parse_cli_vars above for formatting details)
profile: A dbt.config.profile.Profile object (optional)
user_config: A dbt.contracts.project.UserConfig object (optional)
return_dict: Return a dict if true, return the full dbt.config.project.Project object if false
Returns:
A full project config
"""
# Generate a profile if not provided
if profile is None:
# Generate user_config if not provided
if user_config is None:
user_config = read_user_config(flags.PROFILES_DIR)
# Update flags
flags.set_from_args(args, user_config)
if cli_vars is None:
cli_vars = {}
profile = Profile.render(
ProfileRenderer(cli_vars),
profile_name,
args.THREADS,
args.TARGET,
args.PROFILE,
)
# Generate a project
project = Project.from_project_root(
project_path,
DbtProjectYamlRenderer(profile),
verify_version=bool(flags.VERSION_CHECK),
)
# Return
return project.to_project_config() if return_dict else project
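
Assuming this file is dbt's config utils module, the renamed parse_cli_yaml_string helper above might be exercised like this (import path assumed; error behavior as described in the diff):

from dbt.config.utils import parse_cli_yaml_string  # import path assumed

# A YAML mapping parses to a plain dict:
opts = parse_cli_yaml_string('{"include": "all"}', "warn-error-options")
print(opts)  # {'include': 'all'}

# Valid YAML that is not a mapping raises OptionNotYamlDictError,
# tagged with the offending option name:
parse_cli_yaml_string("[one, two]", "vars")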

View File

@@ -2,7 +2,8 @@ import json
import os
from typing import Any, Dict, NoReturn, Optional, Mapping, Iterable, Set, List
from dbt import flags
from dbt.flags import get_flags
import dbt.flags as flags_module
from dbt import tracking
from dbt import utils
from dbt.clients.jinja import get_rendered
@@ -10,12 +11,12 @@ from dbt.clients.yaml_helper import yaml, safe_load, SafeLoader, Loader, Dumper
from dbt.constants import SECRET_ENV_PREFIX, DEFAULT_ENV_PLACEHOLDER
from dbt.contracts.graph.nodes import Resource
from dbt.exceptions import (
DisallowSecretEnvVar,
EnvVarMissing,
SecretEnvVarLocationError,
EnvVarMissingError,
MacroReturn,
RequiredVarNotFound,
SetStrictWrongType,
ZipStrictWrongType,
RequiredVarNotFoundError,
SetStrictWrongTypeError,
ZipStrictWrongTypeError,
)
from dbt.events.functions import fire_event, get_invocation_id
from dbt.events.types import JinjaLogInfo, JinjaLogDebug
@@ -153,7 +154,7 @@ class Var:
return "<Configuration>"
def get_missing_var(self, var_name):
raise RequiredVarNotFound(var_name, self._merged, self._node)
raise RequiredVarNotFoundError(var_name, self._merged, self._node)
def has_var(self, var_name: str):
return var_name in self._merged
@@ -297,7 +298,7 @@ class BaseContext(metaclass=ContextMeta):
"""
return_value = None
if var.startswith(SECRET_ENV_PREFIX):
raise DisallowSecretEnvVar(var)
raise SecretEnvVarLocationError(var)
if var in os.environ:
return_value = os.environ[var]
elif default is not None:
@@ -312,7 +313,7 @@ class BaseContext(metaclass=ContextMeta):
return return_value
else:
raise EnvVarMissing(var)
raise EnvVarMissingError(var)
if os.environ.get("DBT_MACRO_DEBUGGING"):
@@ -493,7 +494,7 @@ class BaseContext(metaclass=ContextMeta):
try:
return set(value)
except TypeError as e:
raise SetStrictWrongType(e)
raise SetStrictWrongTypeError(e)
@contextmember("zip")
@staticmethod
@@ -537,7 +538,7 @@ class BaseContext(metaclass=ContextMeta):
try:
return zip(*args)
except TypeError as e:
raise ZipStrictWrongType(e)
raise ZipStrictWrongTypeError(e)
@contextmember
@staticmethod
@@ -634,9 +635,8 @@ class BaseContext(metaclass=ContextMeta):
{% endif %}
This supports all flags defined in flags submodule (core/dbt/flags.py)
TODO: Replace with object that provides read-only access to flag values
"""
return flags
return flags_module.get_flag_obj()
@contextmember
@staticmethod
@@ -652,7 +652,7 @@ class BaseContext(metaclass=ContextMeta):
{% endmacro %}"
"""
if not flags.NO_PRINT:
if not get_flags().PRINT:
print(msg)
return ""

View File

@@ -8,7 +8,7 @@ from dbt.utils import MultiDict
from dbt.context.base import contextproperty, contextmember, Var
from dbt.context.target import TargetContext
from dbt.exceptions import EnvVarMissing, DisallowSecretEnvVar
from dbt.exceptions import EnvVarMissingError, SecretEnvVarLocationError
class ConfiguredContext(TargetContext):
@@ -87,7 +87,7 @@ class SchemaYamlContext(ConfiguredContext):
def env_var(self, var: str, default: Optional[str] = None) -> str:
return_value = None
if var.startswith(SECRET_ENV_PREFIX):
raise DisallowSecretEnvVar(var)
raise SecretEnvVarLocationError(var)
if var in os.environ:
return_value = os.environ[var]
elif default is not None:
@@ -105,7 +105,7 @@ class SchemaYamlContext(ConfiguredContext):
return return_value
else:
raise EnvVarMissing(var)
raise EnvVarMissingError(var)
class MacroResolvingContext(ConfiguredContext):

View File

@@ -5,7 +5,7 @@ from typing import List, Iterator, Dict, Any, TypeVar, Generic
from dbt.config import RuntimeConfig, Project, IsFQNResource
from dbt.contracts.graph.model_config import BaseConfig, get_config_for, _listify
from dbt.exceptions import InternalException
from dbt.exceptions import DbtInternalError
from dbt.node_types import NodeType
from dbt.utils import fqn_search
@@ -89,7 +89,7 @@ class BaseContextConfigGenerator(Generic[T]):
return self._active_project
dependencies = self._active_project.load_dependencies()
if project_name not in dependencies:
raise InternalException(
raise DbtInternalError(
f"Project name {project_name} not found in dependencies "
f"(found {list(dependencies)})"
)
@@ -287,14 +287,14 @@ class ContextConfig:
elif k in BaseConfig.mergebehavior["update"]:
if not isinstance(v, dict):
raise InternalException(f"expected dict, got {v}")
raise DbtInternalError(f"expected dict, got {v}")
if k in config_call_dict and isinstance(config_call_dict[k], dict):
config_call_dict[k].update(v)
else:
config_call_dict[k] = v
elif k in BaseConfig.mergebehavior["dict_key_append"]:
if not isinstance(v, dict):
raise InternalException(f"expected dict, got {v}")
raise DbtInternalError(f"expected dict, got {v}")
if k in config_call_dict: # should always be a dict
for key, value in v.items():
extend = False

View File

@@ -1,8 +1,8 @@
from typing import Any, Dict, Union
from dbt.exceptions import (
DocTargetNotFound,
InvalidDocArgs,
DocTargetNotFoundError,
DocArgsError,
)
from dbt.config.runtime import RuntimeConfig
from dbt.contracts.graph.manifest import Manifest
@@ -52,7 +52,7 @@ class DocsRuntimeContext(SchemaYamlContext):
elif len(args) == 2:
doc_package_name, doc_name = args
else:
raise InvalidDocArgs(self.node, args)
raise DocArgsError(self.node, args)
# Documentation
target_doc = self.manifest.resolve_doc(
@@ -68,7 +68,7 @@ class DocsRuntimeContext(SchemaYamlContext):
# TODO CT-211
source_file.add_node(self.node.unique_id) # type: ignore[union-attr]
else:
raise DocTargetNotFound(
raise DocTargetNotFoundError(
node=self.node, target_doc_name=doc_name, target_doc_package=doc_package_name
)

View File

@@ -6,23 +6,23 @@ from dbt.events.helpers import env_secrets, scrub_secrets
from dbt.events.types import JinjaLogWarning
from dbt.exceptions import (
RuntimeException,
MissingConfig,
MissingMaterialization,
MissingRelation,
AmbiguousAlias,
AmbiguousCatalogMatch,
CacheInconsistency,
DataclassNotDict,
CompilationException,
DatabaseException,
DependencyNotFound,
DependencyException,
DuplicatePatchPath,
DuplicateResourceName,
InvalidPropertyYML,
NotImplementedException,
RelationWrongType,
DbtRuntimeError,
MissingConfigError,
MissingMaterializationError,
MissingRelationError,
AmbiguousAliasError,
AmbiguousCatalogMatchError,
CacheInconsistencyError,
DataclassNotDictError,
CompilationError,
DbtDatabaseError,
DependencyNotFoundError,
DependencyError,
DuplicatePatchPathError,
DuplicateResourceNameError,
PropertyYMLError,
NotImplementedError,
RelationWrongTypeError,
)
@@ -32,67 +32,69 @@ def warn(msg, node=None):
def missing_config(model, name) -> NoReturn:
raise MissingConfig(unique_id=model.unique_id, name=name)
raise MissingConfigError(unique_id=model.unique_id, name=name)
def missing_materialization(model, adapter_type) -> NoReturn:
raise MissingMaterialization(model=model, adapter_type=adapter_type)
raise MissingMaterializationError(
materialization=model.config.materialized, adapter_type=adapter_type
)
def missing_relation(relation, model=None) -> NoReturn:
raise MissingRelation(relation, model)
raise MissingRelationError(relation, model)
def raise_ambiguous_alias(node_1, node_2, duped_name=None) -> NoReturn:
raise AmbiguousAlias(node_1, node_2, duped_name)
raise AmbiguousAliasError(node_1, node_2, duped_name)
def raise_ambiguous_catalog_match(unique_id, match_1, match_2) -> NoReturn:
raise AmbiguousCatalogMatch(unique_id, match_1, match_2)
raise AmbiguousCatalogMatchError(unique_id, match_1, match_2)
def raise_cache_inconsistent(message) -> NoReturn:
raise CacheInconsistency(message)
raise CacheInconsistencyError(message)
def raise_dataclass_not_dict(obj) -> NoReturn:
raise DataclassNotDict(obj)
raise DataclassNotDictError(obj)
def raise_compiler_error(msg, node=None) -> NoReturn:
raise CompilationException(msg, node)
raise CompilationError(msg, node)
def raise_database_error(msg, node=None) -> NoReturn:
raise DatabaseException(msg, node)
raise DbtDatabaseError(msg, node)
def raise_dep_not_found(node, node_description, required_pkg) -> NoReturn:
raise DependencyNotFound(node, node_description, required_pkg)
raise DependencyNotFoundError(node, node_description, required_pkg)
def raise_dependency_error(msg) -> NoReturn:
raise DependencyException(scrub_secrets(msg, env_secrets()))
raise DependencyError(scrub_secrets(msg, env_secrets()))
def raise_duplicate_patch_name(patch_1, existing_patch_path) -> NoReturn:
raise DuplicatePatchPath(patch_1, existing_patch_path)
raise DuplicatePatchPathError(patch_1, existing_patch_path)
def raise_duplicate_resource_name(node_1, node_2) -> NoReturn:
raise DuplicateResourceName(node_1, node_2)
raise DuplicateResourceNameError(node_1, node_2)
def raise_invalid_property_yml_version(path, issue) -> NoReturn:
raise InvalidPropertyYML(path, issue)
raise PropertyYMLError(path, issue)
def raise_not_implemented(msg) -> NoReturn:
raise NotImplementedException(msg)
raise NotImplementedError(msg)
def relation_wrong_type(relation, expected_type, model=None) -> NoReturn:
raise RelationWrongType(relation, expected_type, model)
raise RelationWrongTypeError(relation, expected_type, model)
# Update this when a new function should be added to the
@@ -128,7 +130,7 @@ def wrapper(model):
def inner(*args, **kwargs):
try:
return func(*args, **kwargs)
except RuntimeException as exc:
except DbtRuntimeError as exc:
exc.add_node(model)
raise exc

View File

@@ -1,6 +1,6 @@
from typing import Dict, MutableMapping, Optional
from dbt.contracts.graph.nodes import Macro
from dbt.exceptions import DuplicateMacroName, PackageNotFoundForMacro
from dbt.exceptions import DuplicateMacroNameError, PackageNotFoundForMacroError
from dbt.include.global_project import PROJECT_NAME as GLOBAL_PROJECT_NAME
from dbt.clients.jinja import MacroGenerator
@@ -86,7 +86,7 @@ class MacroResolver:
package_namespaces[macro.package_name] = namespace
if macro.name in namespace:
raise DuplicateMacroName(macro, macro, macro.package_name)
raise DuplicateMacroNameError(macro, macro, macro.package_name)
package_namespaces[macro.package_name][macro.name] = macro
def add_macro(self, macro: Macro):
@@ -187,7 +187,7 @@ class TestMacroNamespace:
elif package_name in self.macro_resolver.packages:
macro = self.macro_resolver.packages[package_name].get(name)
else:
raise PackageNotFoundForMacro(package_name)
raise PackageNotFoundForMacroError(package_name)
if not macro:
return None
macro_func = MacroGenerator(macro, self.ctx, self.node, self.thread_ctx)

View File

@@ -3,7 +3,7 @@ from typing import Any, Dict, Iterable, Union, Optional, List, Iterator, Mapping
from dbt.clients.jinja import MacroGenerator, MacroStack
from dbt.contracts.graph.nodes import Macro
from dbt.include.global_project import PROJECT_NAME as GLOBAL_PROJECT_NAME
from dbt.exceptions import DuplicateMacroName, PackageNotFoundForMacro
from dbt.exceptions import DuplicateMacroNameError, PackageNotFoundForMacroError
FlatNamespace = Dict[str, MacroGenerator]
@@ -75,7 +75,7 @@ class MacroNamespace(Mapping):
elif package_name in self.packages:
return self.packages[package_name].get(name)
else:
raise PackageNotFoundForMacro(package_name)
raise PackageNotFoundForMacroError(package_name)
# This class builds the MacroNamespace by adding macros to
@@ -122,7 +122,7 @@ class MacroNamespaceBuilder:
hierarchy[macro.package_name] = namespace
if macro.name in namespace:
raise DuplicateMacroName(macro_func.macro, macro, macro.package_name)
raise DuplicateMacroNameError(macro_func.macro, macro, macro.package_name)
hierarchy[macro.package_name][macro.name] = macro_func
def add_macro(self, macro: Macro, ctx: Dict[str, Any]):

View File

@@ -41,28 +41,28 @@ from dbt.contracts.graph.nodes import (
from dbt.contracts.graph.metrics import MetricReference, ResolvedMetricReference
from dbt.events.functions import get_metadata_vars
from dbt.exceptions import (
CompilationException,
ConflictingConfigKeys,
DisallowSecretEnvVar,
EnvVarMissing,
InternalException,
InvalidInlineModelConfig,
InvalidNumberSourceArgs,
InvalidPersistDocsValueType,
LoadAgateTableNotSeed,
CompilationError,
ConflictingConfigKeysError,
SecretEnvVarLocationError,
EnvVarMissingError,
DbtInternalError,
InlineModelConfigError,
NumberSourceArgsError,
PersistDocsValueTypeError,
LoadAgateTableNotSeedError,
LoadAgateTableValueError,
MacroInvalidDispatchArg,
MacrosSourcesUnWriteable,
MetricInvalidArgs,
MissingConfig,
OperationsCannotRefEphemeralNodes,
PackageNotInDeps,
ParsingException,
RefBadContext,
RefInvalidArgs,
RuntimeException,
TargetNotFound,
ValidationException,
MacroDispatchArgError,
MacrosSourcesUnWriteableError,
MetricArgsError,
MissingConfigError,
OperationsCannotRefEphemeralNodesError,
PackageNotInDepsError,
ParsingError,
RefBadContextError,
RefArgsError,
DbtRuntimeError,
TargetNotFoundError,
DbtValidationError,
)
from dbt.config import IsFQNResource
from dbt.node_types import NodeType, ModelLanguage
@@ -144,10 +144,10 @@ class BaseDatabaseWrapper:
f'`adapter.dispatch("{suggest_macro_name}", '
f'macro_namespace="{suggest_macro_namespace}")`?'
)
raise CompilationException(msg)
raise CompilationError(msg)
if packages is not None:
raise MacroInvalidDispatchArg(macro_name)
raise MacroDispatchArgError(macro_name)
namespace = macro_namespace
@@ -159,7 +159,7 @@ class BaseDatabaseWrapper:
search_packages = [self.config.project_name, namespace]
else:
# Not a string and not None so must be a list
raise CompilationException(
raise CompilationError(
f"In adapter.dispatch, got a list macro_namespace argument "
f'("{macro_namespace}"), but macro_namespace should be None or a string.'
)
@@ -172,8 +172,8 @@ class BaseDatabaseWrapper:
try:
# this uses the namespace from the context
macro = self._namespace.get_from_package(package_name, search_name)
except CompilationException:
# Only raise CompilationException if macro is not found in
except CompilationError:
# Only raise CompilationError if macro is not found in
# any package
macro = None
@@ -187,7 +187,7 @@ class BaseDatabaseWrapper:
searched = ", ".join(repr(a) for a in attempts)
msg = f"In dispatch: No macro named '{macro_name}' found\n Searched for: {searched}"
raise CompilationException(msg)
raise CompilationError(msg)
class BaseResolver(metaclass=abc.ABCMeta):
@@ -223,12 +223,12 @@ class BaseRefResolver(BaseResolver):
def validate_args(self, name: str, package: Optional[str]):
if not isinstance(name, str):
raise CompilationException(
raise CompilationError(
f"The name argument to ref() must be a string, got {type(name)}"
)
if package is not None and not isinstance(package, str):
raise CompilationException(
raise CompilationError(
f"The package argument to ref() must be a string or None, got {type(package)}"
)
@@ -241,7 +241,7 @@ class BaseRefResolver(BaseResolver):
elif len(args) == 2:
package, name = args
else:
raise RefInvalidArgs(node=self.model, args=args)
raise RefArgsError(node=self.model, args=args)
self.validate_args(name, package)
return self.resolve(name, package)
@@ -253,19 +253,19 @@ class BaseSourceResolver(BaseResolver):
def validate_args(self, source_name: str, table_name: str):
if not isinstance(source_name, str):
raise CompilationException(
raise CompilationError(
f"The source name (first) argument to source() must be a "
f"string, got {type(source_name)}"
)
if not isinstance(table_name, str):
raise CompilationException(
raise CompilationError(
f"The table name (second) argument to source() must be a "
f"string, got {type(table_name)}"
)
def __call__(self, *args: str) -> RelationProxy:
if len(args) != 2:
raise InvalidNumberSourceArgs(args, node=self.model)
raise NumberSourceArgsError(args, node=self.model)
self.validate_args(args[0], args[1])
return self.resolve(args[0], args[1])
@@ -282,12 +282,12 @@ class BaseMetricResolver(BaseResolver):
def validate_args(self, name: str, package: Optional[str]):
if not isinstance(name, str):
raise CompilationException(
raise CompilationError(
f"The name argument to metric() must be a string, got {type(name)}"
)
if package is not None and not isinstance(package, str):
raise CompilationException(
raise CompilationError(
f"The package argument to metric() must be a string or None, got {type(package)}"
)
@@ -300,7 +300,7 @@ class BaseMetricResolver(BaseResolver):
elif len(args) == 2:
package, name = args
else:
raise MetricInvalidArgs(node=self.model, args=args)
raise MetricArgsError(node=self.model, args=args)
self.validate_args(name, package)
return self.resolve(name, package)
@@ -321,7 +321,7 @@ class ParseConfigObject(Config):
if oldkey in config:
newkey = oldkey.replace("_", "-")
if newkey in config:
raise ConflictingConfigKeys(oldkey, newkey, node=self.model)
raise ConflictingConfigKeysError(oldkey, newkey, node=self.model)
config[newkey] = config.pop(oldkey)
return config
@@ -331,14 +331,14 @@ class ParseConfigObject(Config):
elif len(args) == 0 and len(kwargs) > 0:
opts = kwargs
else:
raise InvalidInlineModelConfig(node=self.model)
raise InlineModelConfigError(node=self.model)
opts = self._transform_config(opts)
# it's ok to have a parse context with no context config, but you must
# not call it!
if self.context_config is None:
raise RuntimeException("At parse time, did not receive a context config")
raise DbtRuntimeError("At parse time, did not receive a context config")
self.context_config.add_config_call(opts)
return ""
@@ -379,7 +379,7 @@ class RuntimeConfigObject(Config):
else:
result = self.model.config.get(name, default)
if result is _MISSING:
raise MissingConfig(unique_id=self.model.unique_id, name=name)
raise MissingConfigError(unique_id=self.model.unique_id, name=name)
return result
def require(self, name, validator=None):
@@ -401,14 +401,14 @@ class RuntimeConfigObject(Config):
def persist_relation_docs(self) -> bool:
persist_docs = self.get("persist_docs", default={})
if not isinstance(persist_docs, dict):
raise InvalidPersistDocsValueType(persist_docs)
raise PersistDocsValueTypeError(persist_docs)
return persist_docs.get("relation", False)
def persist_column_docs(self) -> bool:
persist_docs = self.get("persist_docs", default={})
if not isinstance(persist_docs, dict):
raise InvalidPersistDocsValueType(persist_docs)
raise PersistDocsValueTypeError(persist_docs)
return persist_docs.get("columns", False)
@@ -467,7 +467,7 @@ class RuntimeRefResolver(BaseRefResolver):
)
if target_model is None or isinstance(target_model, Disabled):
raise TargetNotFound(
raise TargetNotFoundError(
node=self.model,
target_name=target_name,
target_kind="node",
@@ -489,7 +489,7 @@ class RuntimeRefResolver(BaseRefResolver):
) -> None:
if resolved.unique_id not in self.model.depends_on.nodes:
args = self._repack_args(target_name, target_package)
raise RefBadContext(node=self.model, args=args)
raise RefBadContextError(node=self.model, args=args)
class OperationRefResolver(RuntimeRefResolver):
@@ -505,7 +505,7 @@ class OperationRefResolver(RuntimeRefResolver):
if target_model.is_ephemeral_model:
# In operations, we can't ref() ephemeral nodes, because
# Macros do not support set_cte
raise OperationsCannotRefEphemeralNodes(target_model.name, node=self.model)
raise OperationsCannotRefEphemeralNodesError(target_model.name, node=self.model)
else:
return super().create_relation(target_model, name)
@@ -528,7 +528,7 @@ class RuntimeSourceResolver(BaseSourceResolver):
)
if target_source is None or isinstance(target_source, Disabled):
raise TargetNotFound(
raise TargetNotFoundError(
node=self.model,
target_name=f"{source_name}.{table_name}",
target_kind="source",
@@ -555,7 +555,7 @@ class RuntimeMetricResolver(BaseMetricResolver):
)
if target_metric is None or isinstance(target_metric, Disabled):
raise TargetNotFound(
raise TargetNotFoundError(
node=self.model,
target_name=target_name,
target_kind="metric",
@@ -584,7 +584,7 @@ class ModelConfiguredVar(Var):
if package_name != self._config.project_name:
if package_name not in dependencies:
# I don't think this is actually reachable
raise PackageNotInDeps(package_name, node=self._node)
raise PackageNotInDepsError(package_name, node=self._node)
yield dependencies[package_name]
yield self._config
@@ -674,7 +674,7 @@ class ProviderContext(ManifestContext):
context_config: Optional[ContextConfig],
) -> None:
if provider is None:
raise InternalException(f"Invalid provider given to context: {provider}")
raise DbtInternalError(f"Invalid provider given to context: {provider}")
# mypy appeasement - we know it'll be a RuntimeConfig
self.config: RuntimeConfig
self.model: Union[Macro, ManifestNode] = model
@@ -751,7 +751,7 @@ class ProviderContext(ManifestContext):
return
elif value == arg:
return
raise ValidationException(
raise DbtValidationError(
'Expected value "{}" to be one of {}'.format(value, ",".join(map(str, args)))
)
@@ -767,7 +767,7 @@ class ProviderContext(ManifestContext):
def write(self, payload: str) -> str:
# macros/source defs aren't 'writeable'.
if isinstance(self.model, (Macro, SourceDefinition)):
raise MacrosSourcesUnWriteable(node=self.model)
raise MacrosSourcesUnWriteableError(node=self.model)
self.model.build_path = self.model.write_node(self.config.target_path, "run", payload)
return ""
@@ -782,12 +782,12 @@ class ProviderContext(ManifestContext):
try:
return func(*args, **kwargs)
except Exception:
raise CompilationException(message_if_exception, self.model)
raise CompilationError(message_if_exception, self.model)
@contextmember
def load_agate_table(self) -> agate.Table:
if not isinstance(self.model, SeedNode):
raise LoadAgateTableNotSeed(self.model.resource_type, node=self.model)
raise LoadAgateTableNotSeedError(self.model.resource_type, node=self.model)
assert self.model.root_path
path = os.path.join(self.model.root_path, self.model.original_file_path)
column_types = self.model.config.column_types
@@ -1185,7 +1185,7 @@ class ProviderContext(ManifestContext):
"https://docs.getdbt.com/reference/dbt-jinja-functions/dispatch)"
" adapter_macro was called for: {macro_name}".format(macro_name=name)
)
raise CompilationException(msg)
raise CompilationError(msg)
@contextmember
def env_var(self, var: str, default: Optional[str] = None) -> str:
@@ -1196,7 +1196,7 @@ class ProviderContext(ManifestContext):
"""
return_value = None
if var.startswith(SECRET_ENV_PREFIX):
raise DisallowSecretEnvVar(var)
raise SecretEnvVarLocationError(var)
if var in os.environ:
return_value = os.environ[var]
elif default is not None:
@@ -1229,7 +1229,7 @@ class ProviderContext(ManifestContext):
source_file.env_vars.append(var) # type: ignore[union-attr]
return return_value
else:
raise EnvVarMissing(var)
raise EnvVarMissingError(var)
@contextproperty
def selected_resources(self) -> List[str]:
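The env_var hunks above preserve a three-step contract: secret-prefixed variables are rejected outright, then the process environment is consulted, then the caller's default. A runnable distillation of that contract; the literal value of SECRET_ENV_PREFIX is an assumption (dbt defines it in dbt.constants), and plain ValueError/KeyError stand in for the dbt exception classes:

import os
from typing import Optional

SECRET_ENV_PREFIX = "DBT_ENV_SECRET_"  # assumed value of dbt.constants.SECRET_ENV_PREFIX

def env_var(var: str, default: Optional[str] = None) -> str:
    if var.startswith(SECRET_ENV_PREFIX):
        # the SecretEnvVarLocationError case: secrets may not leak into rendering
        raise ValueError(f"secret env var {var!r} cannot be used here")
    if var in os.environ:
        return os.environ[var]
    if default is not None:
        return default
    raise KeyError(f"env var {var!r} is not set and no default was given")  # EnvVarMissingError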
@@ -1248,7 +1248,7 @@ class ProviderContext(ManifestContext):
and self.context_macro_stack.call_stack[1] == "macro.dbt.statement"
and "materialization" in self.context_macro_stack.call_stack[0]
):
raise RuntimeException(
raise DbtRuntimeError(
f"submit_python_job is not intended to be called here, at model {parsed_model['alias']}, with macro call_stack {self.context_macro_stack.call_stack}."
)
return self.adapter.submit_python_job(parsed_model, compiled_code)
@@ -1410,7 +1410,7 @@ def generate_runtime_macro_context(
class ExposureRefResolver(BaseResolver):
def __call__(self, *args) -> str:
if len(args) not in (1, 2):
raise RefInvalidArgs(node=self.model, args=args)
raise RefArgsError(node=self.model, args=args)
self.model.refs.append(list(args))
return ""
@@ -1418,7 +1418,7 @@ class ExposureRefResolver(BaseResolver):
class ExposureSourceResolver(BaseResolver):
def __call__(self, *args) -> str:
if len(args) != 2:
raise InvalidNumberSourceArgs(args, node=self.model)
raise NumberSourceArgsError(args, node=self.model)
self.model.sources.append(list(args))
return ""
@@ -1426,7 +1426,7 @@ class ExposureSourceResolver(BaseResolver):
class ExposureMetricResolver(BaseResolver):
def __call__(self, *args) -> str:
if len(args) not in (1, 2):
raise MetricInvalidArgs(node=self.model, args=args)
raise MetricArgsError(node=self.model, args=args)
self.model.metrics.append(list(args))
return ""
@@ -1468,14 +1468,14 @@ class MetricRefResolver(BaseResolver):
elif len(args) == 2:
package, name = args
else:
raise RefInvalidArgs(node=self.model, args=args)
raise RefArgsError(node=self.model, args=args)
self.validate_args(name, package)
self.model.refs.append(list(args))
return ""
def validate_args(self, name, package):
if not isinstance(name, str):
raise ParsingException(
raise ParsingError(
f"In a metrics section in {self.model.original_file_path} "
"the name argument to ref() must be a string"
)
@@ -1558,7 +1558,7 @@ class TestContext(ProviderContext):
def env_var(self, var: str, default: Optional[str] = None) -> str:
return_value = None
if var.startswith(SECRET_ENV_PREFIX):
raise DisallowSecretEnvVar(var)
raise SecretEnvVarLocationError(var)
if var in os.environ:
return_value = os.environ[var]
elif default is not None:
@@ -1584,7 +1584,7 @@ class TestContext(ProviderContext):
source_file.add_env_var(var, yaml_key, name) # type: ignore[union-attr]
return return_value
else:
raise EnvVarMissing(var)
raise EnvVarMissingError(var)
def generate_test_context(

View File

@@ -4,7 +4,7 @@ from typing import Any, Dict, Optional
from .base import BaseContext, contextmember
from dbt.constants import SECRET_ENV_PREFIX, DEFAULT_ENV_PLACEHOLDER
from dbt.exceptions import EnvVarMissing
from dbt.exceptions import EnvVarMissingError
SECRET_PLACEHOLDER = "$$$DBT_SECRET_START$$${}$$$DBT_SECRET_END$$$"
@@ -50,7 +50,7 @@ class SecretContext(BaseContext):
self.env_vars[var] = return_value if var in os.environ else DEFAULT_ENV_PLACEHOLDER
return return_value
else:
raise EnvVarMissing(var)
raise EnvVarMissingError(var)
def generate_secret_context(cli_vars: Dict[str, Any]) -> Dict[str, Any]:
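In the secret context, by contrast, secret-prefixed variables are permitted, and the SECRET_PLACEHOLDER constant above suggests their values are swapped for a start/end-delimited token carrying only the variable name, so a trusted layer can substitute or scrub the real value later. A sketch under that assumption:

SECRET_PLACEHOLDER = "$$$DBT_SECRET_START$$${}$$$DBT_SECRET_END$$$"

def render_secret(var: str) -> str:
    # embed the variable *name*, never the value, in rendered output
    return SECRET_PLACEHOLDER.format(var)

print(render_secret("DBT_ENV_SECRET_GIT_TOKEN"))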

View File

@@ -12,7 +12,7 @@ from typing import (
List,
Callable,
)
from dbt.exceptions import InternalException
from dbt.exceptions import DbtInternalError
from dbt.utils import translate_aliases
from dbt.events.functions import fire_event
from dbt.events.types import NewConnectionOpening
@@ -94,7 +94,7 @@ class Connection(ExtensibleDbtClassMixin, Replaceable):
# this will actually change 'self._handle'.
self._handle.resolve(self)
except RecursionError as exc:
raise InternalException(
raise DbtInternalError(
"A connection's open() method attempted to read the handle value"
) from exc
return self._handle
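The RecursionError guard above implies the handle is a lazy proxy: reading self._handle triggers resolve(), which opens the connection and replaces the proxy with the real handle. If resolve() itself reads the handle property again, the property recurses. A minimal sketch of the pattern; the names are illustrative, not dbt's actual classes:

class LazyHandle:
    """Proxy stored in place of a real handle until first use."""
    def __init__(self, opener):
        self._opener = opener

    def resolve(self, connection):
        # must assign the private field directly; reading connection.handle
        # here would recurse, which is the RecursionError case in the diff
        connection._handle = self._opener(connection)

class Connection:
    def __init__(self, opener):
        self._handle = LazyHandle(opener)

    @property
    def handle(self):
        if isinstance(self._handle, LazyHandle):
            self._handle.resolve(self)
        return self._handle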

View File

@@ -40,16 +40,16 @@ from dbt.contracts.files import SourceFile, SchemaSourceFile, FileHash, AnySourc
from dbt.contracts.util import BaseArtifactMetadata, SourceKey, ArtifactMixin, schema_version
from dbt.dataclass_schema import dbtClassMixin
from dbt.exceptions import (
CompilationException,
DuplicateResourceName,
DuplicateMacroInPackage,
DuplicateMaterializationName,
CompilationError,
DuplicateResourceNameError,
DuplicateMacroInPackageError,
DuplicateMaterializationNameError,
)
from dbt.helper_types import PathSet
from dbt.events.functions import fire_event
from dbt.events.types import MergedFromState
from dbt.node_types import NodeType
from dbt import flags
from dbt.flags import get_flags, MP_CONTEXT
from dbt import tracking
import dbt.utils
@@ -102,7 +102,7 @@ class DocLookup(dbtClassMixin):
def perform_lookup(self, unique_id: UniqueID, manifest) -> Documentation:
if unique_id not in manifest.docs:
raise dbt.exceptions.InternalException(
raise dbt.exceptions.DbtInternalError(
f"Doc {unique_id} found in cache but not found in manifest"
)
return manifest.docs[unique_id]
@@ -135,7 +135,7 @@ class SourceLookup(dbtClassMixin):
def perform_lookup(self, unique_id: UniqueID, manifest: "Manifest") -> SourceDefinition:
if unique_id not in manifest.sources:
raise dbt.exceptions.InternalException(
raise dbt.exceptions.DbtInternalError(
f"Source {unique_id} found in cache but not found in manifest"
)
return manifest.sources[unique_id]
@@ -173,7 +173,7 @@ class RefableLookup(dbtClassMixin):
def perform_lookup(self, unique_id: UniqueID, manifest) -> ManifestNode:
if unique_id not in manifest.nodes:
raise dbt.exceptions.InternalException(
raise dbt.exceptions.DbtInternalError(
f"Node {unique_id} found in cache but not found in manifest"
)
return manifest.nodes[unique_id]
@@ -206,7 +206,7 @@ class MetricLookup(dbtClassMixin):
def perform_lookup(self, unique_id: UniqueID, manifest: "Manifest") -> Metric:
if unique_id not in manifest.metrics:
raise dbt.exceptions.InternalException(
raise dbt.exceptions.DbtInternalError(
f"Metric {unique_id} found in cache but not found in manifest"
)
return manifest.metrics[unique_id]
@@ -303,7 +303,7 @@ class ManifestMetadata(BaseArtifactMetadata):
self.user_id = tracking.active_user.id
if self.send_anonymous_usage_stats is None:
self.send_anonymous_usage_stats = flags.SEND_ANONYMOUS_USAGE_STATS
self.send_anonymous_usage_stats = get_flags().SEND_ANONYMOUS_USAGE_STATS
@classmethod
def default(cls):
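This hunk is part of the broader migration visible throughout the diff, from module-level flag globals (flags.SEND_ANONYMOUS_USAGE_STATS) to an accessor, get_flags(). One plausible shape for such an accessor, sketched from the call sites in this diff rather than from dbt's actual implementation:

_FLAGS = None

def set_flags(flags) -> None:
    global _FLAGS
    _FLAGS = flags

def get_flags():
    # call sites like get_flags().SEND_ANONYMOUS_USAGE_STATS assume the
    # flags object was installed during CLI startup
    if _FLAGS is None:
        raise RuntimeError("get_flags() called before flags were initialized")
    return _FLAGS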
@@ -398,7 +398,7 @@ class MaterializationCandidate(MacroCandidate):
return NotImplemented
equal = self.specificity == other.specificity and self.locality == other.locality
if equal:
raise DuplicateMaterializationName(self.macro, other)
raise DuplicateMaterializationNameError(self.macro, other)
return equal
@@ -480,13 +480,13 @@ def _update_into(dest: MutableMapping[str, T], new_item: T):
"""
unique_id = new_item.unique_id
if unique_id not in dest:
raise dbt.exceptions.RuntimeException(
raise dbt.exceptions.DbtRuntimeError(
f"got an update_{new_item.resource_type} call with an "
f"unrecognized {new_item.resource_type}: {new_item.unique_id}"
)
existing = dest[unique_id]
if new_item.original_file_path != existing.original_file_path:
raise dbt.exceptions.RuntimeException(
raise dbt.exceptions.DbtRuntimeError(
f"cannot update a {new_item.resource_type} to have a new file path!"
)
dest[unique_id] = new_item
@@ -631,7 +631,7 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
metadata={"serialize": lambda x: None, "deserialize": lambda x: None},
)
_lock: Lock = field(
default_factory=flags.MP_CONTEXT.Lock,
default_factory=MP_CONTEXT.Lock,
metadata={"serialize": lambda x: None, "deserialize": lambda x: None},
)
@@ -643,7 +643,7 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
@classmethod
def __post_deserialize__(cls, obj):
obj._lock = flags.MP_CONTEXT.Lock()
obj._lock = MP_CONTEXT.Lock()
return obj
def sync_update_node(self, new_node: ManifestNode) -> ManifestNode:
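MP_CONTEXT.Lock() is a multiprocessing lock, which cannot be pickled; that is why the field's metadata serializes it to None and __post_deserialize__ builds a fresh one. The same idea expressed in plain pickle terms, as a sketch (the "spawn" start method is an assumption, used only to make the example concrete):

import multiprocessing

MP_CONTEXT = multiprocessing.get_context("spawn")  # "spawn" is an assumption

class HasLock:
    def __init__(self):
        self._lock = MP_CONTEXT.Lock()

    def __getstate__(self):
        state = self.__dict__.copy()
        del state["_lock"]              # locks are not picklable
        return state

    def __setstate__(self, state):
        self.__dict__.update(state)
        self._lock = MP_CONTEXT.Lock()  # fresh lock, like __post_deserialize__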
@@ -839,7 +839,7 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
return self.metrics[unique_id]
else:
# something terrible has happened
raise dbt.exceptions.InternalException(
raise dbt.exceptions.DbtInternalError(
"Expected node {} not found in manifest".format(unique_id)
)
@@ -1035,7 +1035,7 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
def add_macro(self, source_file: SourceFile, macro: Macro):
if macro.unique_id in self.macros:
# detect that the macro exists and emit an error
raise DuplicateMacroInPackage(macro=macro, macro_mapping=self.macros)
raise DuplicateMacroInPackageError(macro=macro, macro_mapping=self.macros)
self.macros[macro.unique_id] = macro
source_file.macros.append(macro.unique_id)
@@ -1213,7 +1213,7 @@ class WritableManifest(ArtifactMixin):
def _check_duplicates(value: BaseNode, src: Mapping[str, BaseNode]):
if value.unique_id in src:
raise DuplicateResourceName(value, src[value.unique_id])
raise DuplicateResourceNameError(value, src[value.unique_id])
K_T = TypeVar("K_T")
@@ -1222,7 +1222,7 @@ V_T = TypeVar("V_T")
def _expect_value(key: K_T, src: Mapping[K_T, V_T], old_file: SourceFile, name: str) -> V_T:
if key not in src:
raise CompilationException(
raise CompilationError(
'Expected to find "{}" in cached "result.{}" based '
"on cached file information: {}!".format(key, name, old_file)
)

View File

@@ -9,7 +9,7 @@ from dbt.dataclass_schema import (
)
from dbt.contracts.graph.unparsed import AdditionalPropertiesAllowed, Docs
from dbt.contracts.graph.utils import validate_color
from dbt.exceptions import InternalException, CompilationException
from dbt.exceptions import DbtInternalError, CompilationError
from dbt.contracts.util import Replaceable, list_str
from dbt import hooks
from dbt.node_types import NodeType
@@ -30,7 +30,7 @@ def _get_meta_value(cls: Type[M], fld: Field, key: str, default: Any) -> M:
try:
return cls(value)
except ValueError as exc:
raise InternalException(f"Invalid {cls} value: {value}") from exc
raise DbtInternalError(f"Invalid {cls} value: {value}") from exc
def _set_meta_value(obj: M, key: str, existing: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
@@ -140,17 +140,17 @@ def _merge_field_value(
return _listify(self_value) + _listify(other_value)
elif merge_behavior == MergeBehavior.Update:
if not isinstance(self_value, dict):
raise InternalException(f"expected dict, got {self_value}")
raise DbtInternalError(f"expected dict, got {self_value}")
if not isinstance(other_value, dict):
raise InternalException(f"expected dict, got {other_value}")
raise DbtInternalError(f"expected dict, got {other_value}")
value = self_value.copy()
value.update(other_value)
return value
elif merge_behavior == MergeBehavior.DictKeyAppend:
if not isinstance(self_value, dict):
raise InternalException(f"expected dict, got {self_value}")
raise DbtInternalError(f"expected dict, got {self_value}")
if not isinstance(other_value, dict):
raise InternalException(f"expected dict, got {other_value}")
raise DbtInternalError(f"expected dict, got {other_value}")
new_dict = {}
for key in self_value.keys():
new_dict[key] = _listify(self_value[key])
@@ -172,7 +172,7 @@ def _merge_field_value(
return new_dict
else:
raise InternalException(f"Got an invalid merge_behavior: {merge_behavior}")
raise DbtInternalError(f"Got an invalid merge_behavior: {merge_behavior}")
def insensitive_patterns(*patterns: str):
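The hunk above covers the two dict-valued merge behaviors: Update, a shallow dict.update, and DictKeyAppend, a per-key list concatenation via _listify. A sketch of DictKeyAppend's effect; the real implementation has additional cases this omits:

def _listify(value):
    return value if isinstance(value, list) else [value]

def merge_dict_key_append(self_value: dict, other_value: dict) -> dict:
    new_dict = {key: _listify(val) for key, val in self_value.items()}
    for key, val in other_value.items():
        new_dict[key] = new_dict.get(key, []) + _listify(val)
    return new_dict

# {'tags': ['a']} merged with {'tags': 'b'} -> {'tags': ['a', 'b']}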
@@ -227,7 +227,7 @@ class BaseConfig(AdditionalPropertiesAllowed, Replaceable):
msg = (
'Error, tried to delete config key "{}": Cannot delete ' "built-in keys"
).format(key)
raise CompilationException(msg)
raise CompilationError(msg)
else:
del self._extra[key]

View File

@@ -44,8 +44,9 @@ from dbt.events.types import (
SeedExceedsLimitChecksumChanged,
)
from dbt.events.contextvars import set_contextvars
from dbt import flags
from dbt.flags import get_flags
from dbt.node_types import ModelLanguage, NodeType
from dbt.utils import cast_dict_to_dict_of_strings
from .model_config import (
@@ -206,6 +207,8 @@ class NodeInfoMixin:
@property
def node_info(self):
meta = getattr(self, "meta", {})
meta_stringified = cast_dict_to_dict_of_strings(meta)
node_info = {
"node_path": getattr(self, "path", None),
"node_name": getattr(self, "name", None),
@@ -215,6 +218,7 @@ class NodeInfoMixin:
"node_status": str(self._event_status.get("node_status")),
"node_started_at": self._event_status.get("started_at"),
"node_finished_at": self._event_status.get("finished_at"),
"meta": meta_stringified,
}
node_info_msg = NodeInfo(**node_info)
return node_info_msg
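The new meta field is stringified before it enters node_info because the payload feeds structured logging, where values are expected to be strings. The helper's name suggests a straightforward cast; a plausible implementation, noting that the real one lives in dbt.utils and may differ:

def cast_dict_to_dict_of_strings(dct: dict) -> dict:
    return {str(k): str(v) for k, v in dct.items()}

# {'owner': 'analytics', 'priority': 1} -> {'owner': 'analytics', 'priority': '1'}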
@@ -553,7 +557,7 @@ class TestShouldStoreFailures:
def should_store_failures(self):
if self.config.store_failures:
return self.config.store_failures
return flags.STORE_FAILURES
return get_flags().STORE_FAILURES
@property
def is_relational(self):
@@ -976,12 +980,12 @@ class Metric(GraphNode):
description: str
label: str
calculation_method: str
timestamp: str
expression: str
filters: List[MetricFilter]
time_grains: List[str]
dimensions: List[str]
resource_type: NodeType = field(metadata={"restrict": [NodeType.Metric]})
timestamp: Optional[str] = None
window: Optional[MetricTime] = None
model: Optional[str] = None
model_unique_id: Optional[str] = None

View File

@@ -11,7 +11,7 @@ from dbt.contracts.util import (
# trigger the PathEncoder
import dbt.helper_types # noqa:F401
from dbt.exceptions import CompilationException, ParsingException
from dbt.exceptions import CompilationError, ParsingError
from dbt.dataclass_schema import dbtClassMixin, StrEnum, ExtensibleDbtClassMixin, ValidationError
@@ -222,7 +222,7 @@ class ExternalPartition(AdditionalPropertiesAllowed, Replaceable):
def __post_init__(self):
if self.name == "" or self.data_type == "":
raise CompilationException("External partition columns must have names and data types")
raise CompilationError("External partition columns must have names and data types")
@dataclass
@@ -484,9 +484,9 @@ class UnparsedMetric(dbtClassMixin, Replaceable):
name: str
label: str
calculation_method: str
timestamp: str
expression: str
description: str = ""
timestamp: Optional[str] = None
time_grains: List[str] = field(default_factory=list)
dimensions: List[str] = field(default_factory=list)
window: Optional[MetricTime] = None
@@ -514,10 +514,20 @@ class UnparsedMetric(dbtClassMixin, Replaceable):
errors.append("must contain only letters, numbers and underscores")
if errors:
raise ParsingException(
raise ParsingError(
f"The metric name '{data['name']}' is invalid. It {', '.join(e for e in errors)}"
)
if data.get("timestamp") is None and data.get("time_grains") is not None:
raise ValidationError(
f"The metric '{data['name']} has time_grains defined but is missing a timestamp dimension."
)
if data.get("timestamp") is None and data.get("window") is not None:
raise ValidationError(
f"The metric '{data['name']} has a window defined but is missing a timestamp dimension."
)
if data.get("model") is None and data.get("calculation_method") != "derived":
raise ValidationError("Non-derived metrics require a 'model' property")

Some files were not shown because too many files have changed in this diff.