Compare commits

...

41 Commits

Author SHA1 Message Date
Matt Shaver
e4f6adfdfd Code formatting fixes 2023-03-23 13:36:54 -04:00
Matt Shaver
355fad3a76 Updating docs links with current URLs 2023-03-23 10:44:45 -04:00
Quigley Malcolm
da4a90aa11 CT-1928: dbtRunner to EventManager callback support (#7214)
* add utility function to EventManager for explicitly adding callbacks

Technically these aren't necessary in their current state. We could instead
have people call `<InstantiatedEventManager>.callbacks.extend(...)` directly.
However, it's not hard to imagine a future in which extra work needs to happen
when a callback is added. Abstracting this into a utility method now means
that the invocation stays the same even as the implementation of how callbacks
are actually added changes.

* update `setup_event_logger` to optionally take in callbacks and add them to the EventManager

* update preflight decorator to check for and pass along callbacks for event logger setup

* Add `callbacks` to `dbtRunner`

On instantiation of `dbtRunner` one can now provide `callbacks`. These
callbacks are for the `EventLogger`. When `invoke` is called on a `dbtRunner`,
the `callbacks` are added to the cli context object. In the preflight
decorator these callbacks are extracted from the cli context and then
passed to `setup_event_logger`, which in turn ensures the callbacks are added
to the global `EVENT_MANAGER` (a sketch of this flow follows this commit entry).

* add test to check dbtRunner callbacks get properly set

I believe this test technically qualifies as more of an integration
test, but no other tests like it currently exist (that I could find
via a cursory search). The `tests/unit/test_dbt_runner.py` seemed like
the most intuitive spot. However, if somewhere else makes sense, I'd be
happy to move it.

* add changie documentation for CT-1928
2023-03-22 17:32:01 -07:00
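The callback flow this commit describes can be sketched as follows. This is a minimal illustration rather than confirmed API, assuming a dbt-core ~1.5 install; the `print_event_name` helper and its handling of the event object are assumptions.

```python
# Minimal sketch of the callback mechanism described above, assuming a
# dbt-core ~1.5 install; the event attribute layout is an assumption.
from dbt.cli.main import dbtRunner


def print_event_name(event) -> None:
    # Every fired event is passed to each registered callback; here we only
    # report the event object's class name.
    print(f"dbt event: {type(event).__name__}")


# Callbacks provided here ride along on the CLI context, are picked up by the
# preflight decorator, handed to setup_event_logger, and registered on the
# global EVENT_MANAGER, as described in the commit message above.
runner = dbtRunner(callbacks=[print_event_name])
result = runner.invoke(["run"])
```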
Mila Page
2cfc386773 Convert simple copy. (#7205)
* Convert simple copy.

* Adjust class names for import.

* adjust test namespacing

* Resolve test error.

---------

Co-authored-by: Mila Page <versusfacit@users.noreply.github.com>
2023-03-22 15:53:07 -07:00
Gerda Shank
ae485f996a CT 1998 use google protobuf to enable more flexible dictionaries (#7190) 2023-03-22 15:59:50 -04:00
Peter Webb
73ff497200 ct-2198: Unify constraints and check_constraints fields (#7130)
* ct-2198: clean up some type names and uses

* CT-2198: Unify constraints and constraints_check properties on columns

* Make mypy version consistently 0.981 (#7134)

* CT 1808 diff based partial parsing (#6873)

* model contracts on models materialized as views (#7120)

* first pass

* rename tests

* fix failing test

* changelog

* fix functional test

* Update core/dbt/parser/base.py

* Update core/dbt/parser/schemas.py

* Create method for env var deprecation (#7086)

* update to allow adapters to change model name resolution in py models (#7115)

* update to allow adapters to change model name resolution in py models

* add changie

* fix newline adds

* move quoting into macro

* use single quotes

* add env DBT_PROJECT_DIR support #6078 (#6659)

Co-authored-by: Jeremy Cohen <jeremy@dbtlabs.com>

* Add new index.html and changelog yaml files from dbt-docs (#7141)

* Make version configs optional (#7060)

* [CT-1584] New top level commands: interactive compile (#7008)

Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>

* CT-2198: Add changelog entry

* CT-2198: Fix tests which broke after merge

* CT-2198: Add explicit validation of constraint types w/ unit test

* CT-2198: Move access property, per code review

* CT-2198: Remove a redundant macro

* CT-1298: Rework constraints to be adapter-generated in Python code

* CT-2198: Clarify function name per review

---------

Co-authored-by: Gerda Shank <gerda@dbtlabs.com>
Co-authored-by: Emily Rockman <emily.rockman@dbtlabs.com>
Co-authored-by: Stu Kilgore <stu.kilgore@dbtlabs.com>
Co-authored-by: colin-rogers-dbt <111200756+colin-rogers-dbt@users.noreply.github.com>
Co-authored-by: Leo Schick <67712864+leo-schick@users.noreply.github.com>
Co-authored-by: Jeremy Cohen <jeremy@dbtlabs.com>
Co-authored-by: FishtownBuildBot <77737458+FishtownBuildBot@users.noreply.github.com>
Co-authored-by: dave-connors-3 <73915542+dave-connors-3@users.noreply.github.com>
Co-authored-by: Kshitij Aranke <kshitij.aranke@dbtlabs.com>
Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>
2023-03-22 13:08:06 -04:00
Quigley Malcolm
9a7305d43f CT-2049: Add CommandCompleted event (#7180)
* add protobuf message/class for new CommandCompleted event

For [CT-2049](https://github.com/dbt-labs/dbt-core/issues/6878) we
concluded that we wanted a new event type, [CommandCompleted](https://github.com/dbt-labs/dbt-core/issues/6878#issuecomment-1419718606)
with [four (4) values](https://github.com/dbt-labs/dbt-core/issues/6878#issuecomment-1426118283):
which command was run, whether the command succeeded, the timestamp at
which the command finished, and how long the command took. This commit
adds the new event proto definition, the auto-generated proto_types, and
the instantiatable event type (a sketch of consuming the event follows this
commit entry).

* begin emitting CommandCompleted event in the preflight decorator

The [preflight decorator](4186f99b74/core/dbt/cli/requires.py (L19))
runs at the start of every CLI invocation, which makes it a perfect candidate
for emitting the CommandCompleted event. This is noted in the [discussion
on CT-2049](https://github.com/dbt-labs/dbt-core/issues/6878#issuecomment-1428643539).

* add CommandCompleted event to event unit tests

* Add: changelog entry

* fire CommandCompleted event regardless of upstream exceptions

Previously, if `--fail-fast` was specified and an issue was hit, or an
unhandled issue surfaced as an exception, the CommandCompleted event
would not get fired because at this point in the stack we'd already be
handling the thrown exception. If an exception does reach this point,
we still want to fire the event and also continue to propagate the
exception; hence the bare `raise`, which re-raises the caught exception.

* Update CommandCompleted event to be a `Debug` level event

We don't actually "always" need this event to be logged. Thus we've
updated it to `Debug` level. [Discussion Context](https://github.com/dbt-labs/dbt-core/pull/7180#discussion_r1139281963)
2023-03-22 08:45:11 -07:00
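Following on from the commit above, a callback might consume the new event roughly like this. The field names `command`, `success`, `completed_at`, and `elapsed` are assumptions inferred from the four values the commit describes, not confirmed API.

```python
# Hypothetical callback that reacts only to the CommandCompleted event.
def on_command_completed(event) -> None:
    if type(event).__name__ != "CommandCompleted":
        return
    # Field names are assumed from the commit description: which command ran,
    # whether it succeeded, when it finished, and how long it took.
    data = getattr(event, "data", event)
    print(
        f"command={getattr(data, 'command', '?')} "
        f"success={getattr(data, 'success', '?')} "
        f"completed_at={getattr(data, 'completed_at', '?')} "
        f"elapsed={getattr(data, 'elapsed', '?')}s"
    )


# Registered the same way as any other callback, e.g.:
#   dbtRunner(callbacks=[on_command_completed]).invoke(["build"])
```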
Emily Rockman
ca23148908 Stop ignoring test directory for precommit (#7201)
* reformat test directory to pass formatting checks

* remove test comment
2023-03-22 08:04:13 -05:00
Jeremy Cohen
8225a009b5 Add deprecation warnings for log-path, target-path in dbt_project.yml (#7185)
* Add deprecation warnings for log-path, target-path in dbt_project.yml

* Fix tests/unit/test_events

* Fix failing tests

* PR feedback
2023-03-21 22:31:15 +01:00
Emily Rockman
9605b76178 update workflow to install dev requirements and remove action deprecations (#7203) 2023-03-21 10:41:33 -05:00
Stu Kilgore
137dd9aa1b Deprecate more env vars (#7175) 2023-03-20 11:51:32 -05:00
Gerda Shank
a203fe866a CT 2196, CT2121 constraints column order (#7161) 2023-03-19 19:24:07 -04:00
FishtownBuildBot
4186f99b74 [Automated] Merged prep-release/1.5.0b4_4438341695 into target main during release process 2023-03-16 10:30:09 -05:00
Github Build Bot
6db899eddd Bumping version to 1.5.0b4 and generate changelog 2023-03-16 14:52:41 +00:00
dependabot[bot]
8ea20b4ba2 Update pathspec requirement from <0.11,>=0.9 to >=0.9,<0.12 in /core (#6737)
* Update pathspec requirement from <0.11,>=0.9 to >=0.9,<0.12 in /core

Updates the requirements on [pathspec](https://github.com/cpburnz/python-pathspec) to permit the latest version.
- [Release notes](https://github.com/cpburnz/python-pathspec/releases)
- [Changelog](https://github.com/cpburnz/python-pathspec/blob/master/CHANGES.rst)
- [Commits](https://github.com/cpburnz/python-pathspec/compare/v0.9.0...v0.11.0)

---
updated-dependencies:
- dependency-name: pathspec
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

* Add automated changelog yaml from template for bot PR

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>
Co-authored-by: leahwicz <60146280+leahwicz@users.noreply.github.com>
2023-03-16 10:24:04 -04:00
FishtownBuildBot
3f76f82c88 Add new index.html and changelog yaml files from dbt-docs (#7174) 2023-03-15 12:21:01 -05:00
dependabot[bot]
6cbf66db58 Bump python from 3.10.7-slim-bullseye to 3.11.1-slim-bullseye in /docker (#6424)
* Bump python from 3.10.7-slim-bullseye to 3.11.1-slim-bullseye in /docker

Bumps python from 3.10.7-slim-bullseye to 3.11.1-slim-bullseye.

---
updated-dependencies:
- dependency-name: python
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

* Add automated changelog yaml from template for bot PR

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>
2023-03-14 23:05:25 -04:00
dependabot[bot]
8cd11b380f Bump black from 22.10.0 to 22.12.0 (#6425)
* Bump black from 22.10.0 to 22.12.0

Bumps [black](https://github.com/psf/black) from 22.10.0 to 22.12.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/22.10.0...22.12.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

* Add automated changelog yaml from template for bot PR

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>
2023-03-14 23:04:11 -04:00
Emily Rockman
814eb65d59 Support contract config outside model yaml (#7148)
* first pass

* next pass

* works on local project

* fix tests

* changelog

* remove comment

* update error message

* format message

* update tests
2023-03-14 10:19:14 -05:00
Emily Rockman
f24452a3ab use timezone with no DST in test (#7159) 2023-03-13 12:34:14 -05:00
Kshitij Aranke
30503697f2 [CT-1584] New top level commands: interactive compile (#7008)
Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>
2023-03-10 17:58:33 -08:00
dave-connors-3
90902689c3 Make version configs optional (#7060) 2023-03-10 10:32:29 -05:00
FishtownBuildBot
5a0e776cff Add new index.html and changelog yaml files from dbt-docs (#7141) 2023-03-09 17:23:38 -08:00
Leo Schick
9368e7a6a1 add env DBT_PROJECT_DIR support #6078 (#6659)
Co-authored-by: Jeremy Cohen <jeremy@dbtlabs.com>
2023-03-09 10:41:52 -08:00
colin-rogers-dbt
c02ddf8c0e update to allow adapters to change model name resolution in py models (#7115)
* update to allow adapters to change model name resolution in py models

* add changie

* fix newline adds

* move quoting into macro

* use single quotes
2023-03-08 15:33:32 -08:00
Stu Kilgore
64b8a12a42 Create method for env var deprecation (#7086) 2023-03-08 11:07:44 -06:00
Emily Rockman
e895fe9e4b model contracts on models materialized as views (#7120)
* first pass

* rename tests

* fix failing test

* changelog

* fix functional test

* Update core/dbt/parser/base.py

* Update core/dbt/parser/schemas.py
2023-03-07 16:27:25 -06:00
Gerda Shank
8d987521dd CT 1808 diff based partial parsing (#6873) 2023-03-07 16:37:38 -05:00
Gerda Shank
4aafc5ef4a Make mypy version consistently 0.981 (#7134) 2023-03-07 15:12:01 -05:00
Alexander Butler
24ca76ea58 [Feature] Add unix-style fqn wildcard selector method (#6599)
resolves https://github.com/dbt-labs/dbt-core/issues/6598
2023-03-05 06:36:17 -08:00
Michelle Ark
b681908ee2 get_column_schema_from_query macro (#6986)
Add adapter.get_column_schema_from_query
2023-03-03 14:21:22 -05:00
FishtownBuildBot
72076b3fe5 [Automated] Merged prep-release/1.5.0b3_4316612471 into target main during release process 2023-03-02 12:02:33 -06:00
Github Build Bot
0683c59dcd Bumping version to 1.5.0b3 and generate changelog 2023-03-02 17:31:25 +00:00
Stu Kilgore
8019498f09 Remove cli doc generation workflow (#7089) 2023-03-01 10:59:39 -06:00
FishtownBuildBot
6234aec7d2 [Automated] Merged prep-release/1.5.0b2_4298598835 into target main during release process 2023-02-28 18:43:05 -06:00
Github Build Bot
edd8059eb3 Bumping version to 1.5.0b2 and generate changelog 2023-03-01 00:14:39 +00:00
Jeremy Cohen
e3be347768 Roadmap update (Feb 2023) (#7091)
* Init roadmap

* Rework the top paragraph

* Clean-up the whole thing

* Typos and stuff

* Add a missing word

* Fix typo

* Update "when" note

* Next draft

* Propose rename

* Resolve TODOs, still needs a reread

* Being cute

* Another read through

* Fix sentence fragment

---------

Co-authored-by: Florian Eiden <florian.eiden@dbtlabs.com>
2023-02-28 18:48:29 -05:00
FishtownBuildBot
597acf1fa1 Add new index.html and changelog yaml files from dbt-docs (#7092) 2023-02-28 16:11:37 -06:00
Gerda Shank
effa1a0813 Move check of invalid groups earlier in parsing (#7090) 2023-02-28 17:05:20 -05:00
Stu Kilgore
726800be57 Make output_keys param MultiOption (#7068) 2023-02-28 09:29:58 -06:00
Sam Debruyn
8b79747908 fix: add pytz dependency (#7077) 2023-02-28 08:05:28 -06:00
240 changed files with 25957 additions and 12861 deletions

View File

@@ -1,5 +1,5 @@
[bumpversion]
current_version = 1.5.0b1
current_version = 1.5.0b4
parse = (?P<major>[\d]+) # major version number
\.(?P<minor>[\d]+) # minor version number
\.(?P<patch>[\d]+) # patch version number
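A quick illustration of the named groups in the visible portion of that `parse` pattern (the real config continues beyond this hunk so that pre-release suffixes like `b4` also parse; this sketch only covers the three groups shown):

```python
import re

# Only the major/minor/patch portion of the pattern shown in the hunk above.
pattern = re.compile(r"(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)")

match = pattern.match("1.5.0b4")
assert match is not None
assert match.groupdict() == {"major": "1", "minor": "5", "patch": "0"}
```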

.changes/1.5.0-b2.md Normal file
View File

@@ -0,0 +1,40 @@
## dbt-core 1.5.0-b2 - March 01, 2023
### Features
- Make project version optional ([#6603](https://github.com/dbt-labs/dbt-core/issues/6603))
- parse 'group' config on groupable nodes ([#6823](https://github.com/dbt-labs/dbt-core/issues/6823))
- Implemented new log cli parameters for finer-grained control. ([#6639](https://github.com/dbt-labs/dbt-core/issues/6639))
- Add access attribute to parsed nodes ([#6824](https://github.com/dbt-labs/dbt-core/issues/6824))
- Add ability to select by group resource ([#6825](https://github.com/dbt-labs/dbt-core/issues/6825))
- Disallow refing private model across groups ([#6826](https://github.com/dbt-labs/dbt-core/issues/6826))
### Fixes
- Remove trailing slashes from source paths (#6102) ([#6102](https://github.com/dbt-labs/dbt-core/issues/6102))
- Fix compilation logic for ephemeral nodes ([#6885](https://github.com/dbt-labs/dbt-core/issues/6885))
- Fix semver comparison logic by ensuring numeric values ([#7039](https://github.com/dbt-labs/dbt-core/issues/7039))
- add pytz dependency ([#7077](https://github.com/dbt-labs/dbt-core/issues/7077))
### Docs
- Improve displayed message under "Arguments" section for argumentless macro ([dbt-docs/#358](https://github.com/dbt-labs/dbt-docs/issues/358))
- Add access property to model details ([dbt-docs/#381](https://github.com/dbt-labs/dbt-docs/issues/381))
- Display model owner by name and email ([dbt-docs/#377](https://github.com/dbt-labs/dbt-docs/issues/377))
- Add view of public models sorted by group to left navigation ([dbt-docs/#379](https://github.com/dbt-labs/dbt-docs/issues/379))
### Under the Hood
- Rename "constraint_enabled" to "contract" ([#6748](https://github.com/dbt-labs/dbt-core/issues/6748))
- Make output_keys click param multi-option instead of a string ([#6676](https://github.com/dbt-labs/dbt-core/issues/6676))
- Move validation of group earlier ([#7087](https://github.com/dbt-labs/dbt-core/issues/7087))
### Dependency
- Bump mypy from 0.971 to 0.981 ([#4904](https://github.com/dbt-labs/dbt-core/issues/4904))
### Contributors
- [@MartinGuindon](https://github.com/MartinGuindon) ([#358](https://github.com/dbt-labs/dbt-core/issues/358))
- [@jmg-duarte](https://github.com/jmg-duarte) ([#6102](https://github.com/dbt-labs/dbt-core/issues/6102))
- [@sdebruyn](https://github.com/sdebruyn) ([#7077](https://github.com/dbt-labs/dbt-core/issues/7077))
- [@seub](https://github.com/seub) ([#6603](https://github.com/dbt-labs/dbt-core/issues/6603))

.changes/1.5.0-b3.md Normal file
View File

@@ -0,0 +1,5 @@
## dbt-core 1.5.0-b3 - March 02, 2023
### Under the Hood
- Remove cli doc generation workflow ([#7088](https://github.com/dbt-labs/dbt-core/issues/7088))

.changes/1.5.0-b4.md Normal file
View File

@@ -0,0 +1,39 @@
## dbt-core 1.5.0-b4 - March 16, 2023
### Features
- ✨ add unix-style wildcard selector method ([#6598](https://github.com/dbt-labs/dbt-core/issues/6598))
- add support for DBT_PROJECT_DIR env var ([#6078](https://github.com/dbt-labs/dbt-core/issues/6078))
- Enable diff based partial parsing ([#6592](https://github.com/dbt-labs/dbt-core/issues/6592))
- Enforce contracts on models materialized as tables and views ([#6751](https://github.com/dbt-labs/dbt-core/issues/6751), [#7034](https://github.com/dbt-labs/dbt-core/issues/7034), [#6756](https://github.com/dbt-labs/dbt-core/issues/6756))
- make version configs optional ([#7054](https://github.com/dbt-labs/dbt-core/issues/7054))
- [CT-1584] New top level commands: interactive compile ([#6358](https://github.com/dbt-labs/dbt-core/issues/6358))
### Fixes
- allow adapters to change model name resolution in py models ([#7114](https://github.com/dbt-labs/dbt-core/issues/7114))
### Docs
- Distiguish node "access" in the DAG with node borders & opacity. ([dbt-docs/#378](https://github.com/dbt-labs/dbt-docs/issues/378))
- Fix JSON path to package overview docs ([dbt-docs/#390](https://github.com/dbt-labs/dbt-docs/issues/390))
- Add selection by group to DAG ([dbt-docs/#380](https://github.com/dbt-labs/dbt-docs/issues/380))
### Under the Hood
- Add deprecation warning for DBT_NO_PRINT ([#6960](https://github.com/dbt-labs/dbt-core/issues/6960))
### Dependencies
- Update pathspec requirement from <0.11,>=0.9 to >=0.9,<0.12 in /core ([#6737](https://github.com/dbt-labs/dbt-core/pull/6737))
### Dependency
- Bump python from 3.10.7-slim-bullseye to 3.11.1-slim-bullseye in /docker ([#4904](https://github.com/dbt-labs/dbt-core/issues/4904))
- Bump black from 22.10.0 to 22.12.0 ([#4904](https://github.com/dbt-labs/dbt-core/issues/4904))
### Contributors
- [@dave-connors-3](https://github.com/dave-connors-3) ([#7054](https://github.com/dbt-labs/dbt-core/issues/7054))
- [@leo-schick](https://github.com/leo-schick) ([#6078](https://github.com/dbt-labs/dbt-core/issues/6078))
- [@rlh1994](https://github.com/rlh1994) ([#390](https://github.com/dbt-labs/dbt-core/issues/390))
- [@z3z1ma](https://github.com/z3z1ma) ([#6598](https://github.com/dbt-labs/dbt-core/issues/6598))

View File

@@ -0,0 +1,6 @@
kind: "Dependencies"
body: "Update pathspec requirement from <0.11,>=0.9 to >=0.9,<0.12 in /core"
time: 2023-01-26T00:02:37.00000Z
custom:
Author: dependabot[bot]
PR: 6737

View File

@@ -0,0 +1,7 @@
kind: "Dependency"
body: "Bump python from 3.10.7-slim-bullseye to 3.11.1-slim-bullseye in /docker"
time: 2022-12-12T00:02:40.00000Z
custom:
Author: dependabot[bot]
Issue: 4904
PR: 6424

View File

@@ -0,0 +1,7 @@
kind: "Dependency"
body: "Bump black from 22.10.0 to 22.12.0"
time: 2022-12-12T00:26:59.00000Z
custom:
Author: dependabot[bot]
Issue: 4904
PR: 6425

View File

@@ -0,0 +1,6 @@
kind: Docs
body: Add access property to model details
time: 2023-02-27T11:45:10.424513-06:00
custom:
Author: emmyoop
Issue: "381"

View File

@@ -0,0 +1,6 @@
kind: Docs
body: Display model owner by name and email
time: 2023-02-27T14:14:24.630816-06:00
custom:
Author: emmyoop
Issue: "377"

View File

@@ -0,0 +1,6 @@
kind: Docs
body: Add view of public models sorted by group to left navigation
time: 2023-02-28T10:09:32.015415-06:00
custom:
Author: emmyoop
Issue: "379"

View File

@@ -0,0 +1,6 @@
kind: Docs
body: Distiguish node "access" in the DAG with node borders & opacity.
time: 2023-03-07T10:42:19.044231-06:00
custom:
Author: emmyoop jtcohen6
Issue: "378"

View File

@@ -0,0 +1,6 @@
kind: Docs
body: Fix JSON path to package overview docs
time: 2023-03-07T17:49:51.256097-07:00
custom:
Author: rlh1994 dbeatty10
Issue: "390"

View File

@@ -0,0 +1,6 @@
kind: Docs
body: Add selection by group to DAG
time: 2023-03-08T09:37:36.78968-06:00
custom:
Author: emmyoop
Issue: "380"

View File

@@ -0,0 +1,6 @@
kind: Features
body: ✨ add unix-style wildcard selector method
time: 2023-01-12T19:17:05.841918-07:00
custom:
Author: z3z1ma
Issue: "6598"

View File

@@ -0,0 +1,6 @@
kind: Features
body: add support for DBT_PROJECT_DIR env var
time: 2023-01-19T14:11:56.638325919+01:00
custom:
Author: leo-schick
Issue: "6078"

View File

@@ -0,0 +1,6 @@
kind: Features
body: Enable diff based partial parsing
time: 2023-02-06T08:47:49.688889-05:00
custom:
Author: gshank
Issue: "6592"

View File

@@ -0,0 +1,6 @@
kind: Features
body: Enforce contracts on models materialized as tables and views
time: 2023-02-22T13:06:32.583743-05:00
custom:
Author: jtcohen6 michelleark emmyoop
Issue: 6751 7034 6756

View File

@@ -0,0 +1,6 @@
kind: Features
body: make version configs optional
time: 2023-02-27T09:13:16.104386-06:00
custom:
Author: dave-connors-3
Issue: "7054"

View File

@@ -0,0 +1,6 @@
kind: Features
body: '[CT-1584] New top level commands: interactive compile'
time: 2023-03-06T17:02:51.240582-08:00
custom:
Author: aranke
Issue: "6358"

View File

@@ -0,0 +1,6 @@
kind: Fixes
body: add pytz dependency
time: 2023-02-28T13:03:18.353468+01:00
custom:
Author: sdebruyn
Issue: "7077"

View File

@@ -0,0 +1,6 @@
kind: Fixes
body: allow adapters to change model name resolution in py models
time: 2023-03-03T11:25:19.276637-08:00
custom:
Author: colin-rogers-dbt
Issue: "7114"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Add deprecation warning for DBT_NO_PRINT
time: 2023-02-24T13:28:11.295561-06:00
custom:
Author: stu-k
Issue: "6960"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Make output_keys click param multi-option instead of a string
time: 2023-02-27T10:20:16.16233-06:00
custom:
Author: stu-k
Issue: "6676"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Remove cli doc generation workflow
time: 2023-02-28T13:04:35.20038-06:00
custom:
Author: stu-k
Issue: "7088"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Move validation of group earlier
time: 2023-02-28T14:04:40.978022-05:00
custom:
Author: gshank
Issue: "7087"

View File

@@ -0,0 +1,9 @@
kind: Breaking Changes
body: Specifying "log-path" and "target-path" in "dbt_project.yml" is deprecated.
This functionality will be removed in a future version of dbt-core. If you need
to specify a custom path for logs or artifacts, please set via CLI flag or env var
instead.
time: 2023-03-17T11:00:33.448472+01:00
custom:
Author: jtcohen6
Issue: "6882"

View File

@@ -0,0 +1,6 @@
kind: Features
body: Make model contracts agnostic to ordering
time: 2023-03-13T13:59:17.255368-04:00
custom:
Author: gshank
Issue: 6975 7064

View File

@@ -0,0 +1,6 @@
kind: Features
body: Unified constraints and check_constraints properties for columns and models
time: 2023-03-15T13:51:08.259624-04:00
custom:
Author: peterallenwebb
Issue: "7066"

View File

@@ -0,0 +1,7 @@
kind: Features
body: Switch from betterproto to google protobuf and enable more flexible meta dictionary
in logs
time: 2023-03-18T16:43:26.782738-04:00
custom:
Author: gshank
Issue: "6832"

View File

@@ -0,0 +1,6 @@
kind: Features
body: Support setting of callbacks for programmatic uses of `dbtRunner`
time: 2023-03-22T12:46:15.877884-07:00
custom:
Author: QMalcolm
Issue: "6763"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Deprecate additional environment variables
time: 2023-03-15T12:27:23.194686-05:00
custom:
Author: stu-k
Issue: "6903"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Add CommandCompleted event, and fire it upon completion of every command
time: 2023-03-16T12:33:05.696752-07:00
custom:
Author: QMalcolm
Issue: "6878"

.gitattributes vendored
View File

@@ -1,2 +1,3 @@
core/dbt/include/index.html binary
tests/functional/artifacts/data/state/*/manifest.json binary
core/dbt/events/types_pb2.py binary

View File

@@ -18,7 +18,7 @@ body:
For "big ideas" about future capabilities of dbt, we ask that you open a
[discussion](https://github.com/dbt-labs/dbt-core/discussions) in the "Ideas" category instead.
options:
- label: I have read the [expectations for open source contributors](https://docs.getdbt.com/docs/contributing/oss-expectations)
- label: I have read the [expectations for open source contributors](https://docs.getdbt.com/community/resources/oss-expectations)
required: true
- label: I have searched the existing issues, and I could not find an existing issue for this feature
required: true

View File

@@ -16,7 +16,7 @@ resolves #
### Checklist
- [ ] I have read [the contributing guide](https://github.com/dbt-labs/dbt-core/blob/main/CONTRIBUTING.md) and understand what's expected of me
- [ ] I have signed the [CLA](https://docs.getdbt.com/docs/contributor-license-agreements)
- [ ] I have signed the [CLA](https://docs.getdbt.com/community/resources/contributor-license-agreements)
- [ ] I have run this code in development and it appears to resolve the stated issue
- [ ] This PR includes tests, or tests are not required/relevant for this PR
- [ ] I have [opened an issue to add/update docs](https://github.com/dbt-labs/docs.getdbt.com/issues/new/choose), or docs changes are not required/relevant for this PR

View File

@@ -1,165 +0,0 @@
# **what?**
# On push, if anything in core/dbt/docs or core/dbt/cli has been
# created or modified, regenerate the CLI API docs using sphinx.
# **why?**
# We watch for changes in core/dbt/cli because the CLI API docs rely on click
# and all supporting flags/params to be generated. We watch for changes in
# core/dbt/docs since any changes to sphinx configuration or any of the
# .rst files there could result in a differently build final index.html file.
# **when?**
# Whenever a change has been pushed to a branch, and only if there is a diff
# between the PR branch and main's core/dbt/cli and or core/dbt/docs dirs.
# TODO: add bot comment to PR informing contributor that the docs have been committed
# TODO: figure out why github action triggered pushes cause github to fail to report
# the status of jobs
name: Generate CLI API docs
on:
pull_request:
permissions:
contents: write
pull-requests: write
env:
CLI_DIR: ${{ github.workspace }}/core/dbt/cli
DOCS_DIR: ${{ github.workspace }}/core/dbt/docs
DOCS_BUILD_DIR: ${{ github.workspace }}/core/dbt/docs/build
jobs:
check_gen:
name: check if generation needed
runs-on: ubuntu-latest
if: ${{ github.event.pull_request.head.repo.fork == false }}
outputs:
cli_dir_changed: ${{ steps.check_cli.outputs.cli_dir_changed }}
docs_dir_changed: ${{ steps.check_docs.outputs.docs_dir_changed }}
steps:
- name: "[DEBUG] print variables"
run: |
echo "env.CLI_DIR: ${{ env.CLI_DIR }}"
echo "env.DOCS_BUILD_DIR: ${{ env.DOCS_BUILD_DIR }}"
echo "env.DOCS_DIR: ${{ env.DOCS_DIR }}"
- name: git checkout
uses: actions/checkout@v3
with:
fetch-depth: 0
ref: ${{ github.head_ref }}
- name: set shas
id: set_shas
run: |
THIS_SHA=$(git rev-parse @)
LAST_SHA=$(git rev-parse @~1)
echo "this sha: $THIS_SHA"
echo "last sha: $LAST_SHA"
echo "this_sha=$THIS_SHA" >> $GITHUB_OUTPUT
echo "last_sha=$LAST_SHA" >> $GITHUB_OUTPUT
- name: check for changes in core/dbt/cli
id: check_cli
run: |
CLI_DIR_CHANGES=$(git diff \
${{ steps.set_shas.outputs.last_sha }} \
${{ steps.set_shas.outputs.this_sha }} \
-- ${{ env.CLI_DIR }})
if [ -n "$CLI_DIR_CHANGES" ]; then
echo "changes found"
echo $CLI_DIR_CHANGES
echo "cli_dir_changed=true" >> $GITHUB_OUTPUT
exit 0
fi
echo "cli_dir_changed=false" >> $GITHUB_OUTPUT
echo "no changes found"
- name: check for changes in core/dbt/docs
id: check_docs
if: steps.check_cli.outputs.cli_dir_changed == 'false'
run: |
DOCS_DIR_CHANGES=$(git diff --name-only \
${{ steps.set_shas.outputs.last_sha }} \
${{ steps.set_shas.outputs.this_sha }} \
-- ${{ env.DOCS_DIR }} ':!${{ env.DOCS_BUILD_DIR }}')
DOCS_BUILD_DIR_CHANGES=$(git diff --name-only \
${{ steps.set_shas.outputs.last_sha }} \
${{ steps.set_shas.outputs.this_sha }} \
-- ${{ env.DOCS_BUILD_DIR }})
if [ -n "$DOCS_DIR_CHANGES" ] && [ -z "$DOCS_BUILD_DIR_CHANGES" ]; then
echo "changes found"
echo $DOCS_DIR_CHANGES
echo "docs_dir_changed=true" >> $GITHUB_OUTPUT
exit 0
fi
echo "docs_dir_changed=false" >> $GITHUB_OUTPUT
echo "no changes found"
gen_docs:
name: generate docs
runs-on: ubuntu-latest
needs: [check_gen]
if: |
needs.check_gen.outputs.cli_dir_changed == 'true'
|| needs.check_gen.outputs.docs_dir_changed == 'true'
steps:
- name: "[DEBUG] print variables"
run: |
echo "env.DOCS_DIR: ${{ env.DOCS_DIR }}"
echo "github head_ref: ${{ github.head_ref }}"
- name: git checkout
uses: actions/checkout@v3
with:
ref: ${{ github.head_ref }}
- name: install python
uses: actions/setup-python@v4.3.0
with:
python-version: 3.8
- name: install dev requirements
run: |
python3 -m venv env
source env/bin/activate
python -m pip install --upgrade pip
pip install -r requirements.txt -r dev-requirements.txt
- name: generate docs
run: |
source env/bin/activate
cd ${{ env.DOCS_DIR }}
echo "cleaning existing docs"
make clean
echo "creating docs"
make html
- name: debug
run: |
echo ">>>>> status"
git status
echo ">>>>> remotes"
git remote -v
echo ">>>>> branch"
git branch -v
echo ">>>>> log"
git log --pretty=oneline | head -5
- name: commit docs
run: |
git config user.name 'Github Build Bot'
git config user.email 'buildbot@fishtownanalytics.com'
git commit -am "Add generated CLI API docs"
git push -u origin ${{ github.head_ref }}

View File

@@ -42,7 +42,7 @@ jobs:
steps:
- name: Check out the repository
uses: actions/checkout@v2
uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v4.3.0
@@ -53,12 +53,8 @@ jobs:
run: |
python -m pip install --user --upgrade pip
python -m pip --version
python -m pip install pre-commit
pre-commit --version
python -m pip install mypy==0.942
make dev
mypy --version
python -m pip install -r requirements.txt
python -m pip install -r dev-requirements.txt
dbt --version
- name: Run pre-commit hooks
@@ -81,7 +77,7 @@ jobs:
steps:
- name: Check out the repository
uses: actions/checkout@v2
uses: actions/checkout@v3
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4.3.0
@@ -105,7 +101,7 @@ jobs:
CURRENT_DATE=$(date +'%Y-%m-%dT%H_%M_%S') # no colons allowed for artifacts
echo "date=$CURRENT_DATE" >> $GITHUB_OUTPUT
- uses: actions/upload-artifact@v2
- uses: actions/upload-artifact@v3
if: always()
with:
name: unit_results_${{ matrix.python-version }}-${{ steps.date.outputs.date }}.csv
@@ -138,7 +134,7 @@ jobs:
steps:
- name: Check out the repository
uses: actions/checkout@v2
uses: actions/checkout@v3
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4.3.0
@@ -174,13 +170,13 @@ jobs:
CURRENT_DATE=$(date +'%Y-%m-%dT%H_%M_%S') # no colons allowed for artifacts
echo "date=$CURRENT_DATE" >> $GITHUB_OUTPUT
- uses: actions/upload-artifact@v2
- uses: actions/upload-artifact@v3
if: always()
with:
name: logs_${{ matrix.python-version }}_${{ matrix.os }}_${{ steps.date.outputs.date }}
path: ./logs
- uses: actions/upload-artifact@v2
- uses: actions/upload-artifact@v3
if: always()
with:
name: integration_results_${{ matrix.python-version }}_${{ matrix.os }}_${{ steps.date.outputs.date }}.csv
@@ -193,7 +189,7 @@ jobs:
steps:
- name: Check out the repository
uses: actions/checkout@v2
uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v4.3.0

View File

@@ -1,8 +1,7 @@
# Configuration for pre-commit hooks (see https://pre-commit.com/).
# Eventually the hooks described here will be run as tests before merging each PR.
# TODO: remove global exclusion of tests when testing overhaul is complete
exclude: ^(test/|core/dbt/docs/build/)
exclude: ^(core/dbt/docs/build/|core/dbt/events/types_pb2.py)
# Force all unspecified python hooks to run python 3.8
default_language_version:
@@ -38,7 +37,7 @@ repos:
alias: flake8-check
stages: [manual]
- repo: https://github.com/pre-commit/mirrors-mypy
rev: v0.942
rev: v0.981
hooks:
- id: mypy
# N.B.: Mypy is... a bit fragile.

View File

@@ -5,6 +5,94 @@
- "Breaking changes" listed under a version may require action from end users or external maintainers when upgrading to that version.
- Do not edit this file directly. This file is auto-generated using [changie](https://github.com/miniscruff/changie). For details on how to document a change, see [the contributing guide](https://github.com/dbt-labs/dbt-core/blob/main/CONTRIBUTING.md#adding-changelog-entry)
## dbt-core 1.5.0-b4 - March 16, 2023
### Features
- ✨ add unix-style wildcard selector method ([#6598](https://github.com/dbt-labs/dbt-core/issues/6598))
- add support for DBT_PROJECT_DIR env var ([#6078](https://github.com/dbt-labs/dbt-core/issues/6078))
- Enable diff based partial parsing ([#6592](https://github.com/dbt-labs/dbt-core/issues/6592))
- Enforce contracts on models materialized as tables and views ([#6751](https://github.com/dbt-labs/dbt-core/issues/6751), [#7034](https://github.com/dbt-labs/dbt-core/issues/7034), [#6756](https://github.com/dbt-labs/dbt-core/issues/6756))
- make version configs optional ([#7054](https://github.com/dbt-labs/dbt-core/issues/7054))
- [CT-1584] New top level commands: interactive compile ([#6358](https://github.com/dbt-labs/dbt-core/issues/6358))
### Fixes
- allow adapters to change model name resolution in py models ([#7114](https://github.com/dbt-labs/dbt-core/issues/7114))
### Docs
- Distiguish node "access" in the DAG with node borders & opacity. ([dbt-docs/#378](https://github.com/dbt-labs/dbt-docs/issues/378))
- Fix JSON path to package overview docs ([dbt-docs/#390](https://github.com/dbt-labs/dbt-docs/issues/390))
- Add selection by group to DAG ([dbt-docs/#380](https://github.com/dbt-labs/dbt-docs/issues/380))
### Under the Hood
- Add deprecation warning for DBT_NO_PRINT ([#6960](https://github.com/dbt-labs/dbt-core/issues/6960))
### Dependencies
- Update pathspec requirement from <0.11,>=0.9 to >=0.9,<0.12 in /core ([#6737](https://github.com/dbt-labs/dbt-core/pull/6737))
### Dependency
- Bump python from 3.10.7-slim-bullseye to 3.11.1-slim-bullseye in /docker ([#4904](https://github.com/dbt-labs/dbt-core/issues/4904))
- Bump black from 22.10.0 to 22.12.0 ([#4904](https://github.com/dbt-labs/dbt-core/issues/4904))
### Contributors
- [@dave-connors-3](https://github.com/dave-connors-3) ([#7054](https://github.com/dbt-labs/dbt-core/issues/7054))
- [@leo-schick](https://github.com/leo-schick) ([#6078](https://github.com/dbt-labs/dbt-core/issues/6078))
- [@rlh1994](https://github.com/rlh1994) ([#390](https://github.com/dbt-labs/dbt-core/issues/390))
- [@z3z1ma](https://github.com/z3z1ma) ([#6598](https://github.com/dbt-labs/dbt-core/issues/6598))
## dbt-core 1.5.0-b3 - March 02, 2023
### Under the Hood
- Remove cli doc generation workflow ([#7088](https://github.com/dbt-labs/dbt-core/issues/7088))
## dbt-core 1.5.0-b2 - March 01, 2023
### Features
- Make project version optional ([#6603](https://github.com/dbt-labs/dbt-core/issues/6603))
- parse 'group' config on groupable nodes ([#6823](https://github.com/dbt-labs/dbt-core/issues/6823))
- Implemented new log cli parameters for finer-grained control. ([#6639](https://github.com/dbt-labs/dbt-core/issues/6639))
- Add access attribute to parsed nodes ([#6824](https://github.com/dbt-labs/dbt-core/issues/6824))
- Add ability to select by group resource ([#6825](https://github.com/dbt-labs/dbt-core/issues/6825))
- Disallow refing private model across groups ([#6826](https://github.com/dbt-labs/dbt-core/issues/6826))
### Fixes
- Remove trailing slashes from source paths (#6102) ([#6102](https://github.com/dbt-labs/dbt-core/issues/6102))
- Fix compilation logic for ephemeral nodes ([#6885](https://github.com/dbt-labs/dbt-core/issues/6885))
- Fix semver comparison logic by ensuring numeric values ([#7039](https://github.com/dbt-labs/dbt-core/issues/7039))
- add pytz dependency ([#7077](https://github.com/dbt-labs/dbt-core/issues/7077))
### Docs
- Improve displayed message under "Arguments" section for argumentless macro ([dbt-docs/#358](https://github.com/dbt-labs/dbt-docs/issues/358))
- Add access property to model details ([dbt-docs/#381](https://github.com/dbt-labs/dbt-docs/issues/381))
- Display model owner by name and email ([dbt-docs/#377](https://github.com/dbt-labs/dbt-docs/issues/377))
- Add view of public models sorted by group to left navigation ([dbt-docs/#379](https://github.com/dbt-labs/dbt-docs/issues/379))
### Under the Hood
- Rename "constraint_enabled" to "contract" ([#6748](https://github.com/dbt-labs/dbt-core/issues/6748))
- Make output_keys click param multi-option instead of a string ([#6676](https://github.com/dbt-labs/dbt-core/issues/6676))
- Move validation of group earlier ([#7087](https://github.com/dbt-labs/dbt-core/issues/7087))
### Dependency
- Bump mypy from 0.971 to 0.981 ([#4904](https://github.com/dbt-labs/dbt-core/issues/4904))
### Contributors
- [@MartinGuindon](https://github.com/MartinGuindon) ([#358](https://github.com/dbt-labs/dbt-core/issues/358))
- [@jmg-duarte](https://github.com/jmg-duarte) ([#6102](https://github.com/dbt-labs/dbt-core/issues/6102))
- [@sdebruyn](https://github.com/sdebruyn) ([#7077](https://github.com/dbt-labs/dbt-core/issues/7077))
- [@seub](https://github.com/seub) ([#6603](https://github.com/dbt-labs/dbt-core/issues/6603))
## dbt-core 1.5.0-b1 - February 17, 2023
### Features
@@ -86,7 +174,6 @@
- [@ryancharris](https://github.com/ryancharris) ([#None](https://github.com/dbt-labs/dbt-core/issues/None))
- [@sungchun12](https://github.com/sungchun12) ([#6079](https://github.com/dbt-labs/dbt-core/issues/6079))
## Previous Releases
For information on prior major and minor releases, see their changelogs:

View File

@@ -13,7 +13,7 @@
## About this document
There are many ways to contribute to the ongoing development of `dbt-core`, such as by participating in discussions and issues. We encourage you to first read our higher-level document: ["Expectations for Open Source Contributors"](https://docs.getdbt.com/docs/contributing/oss-expectations).
There are many ways to contribute to the ongoing development of `dbt-core`, such as by participating in discussions and issues. We encourage you to first read our higher-level document: ["Expectations for Open Source Contributors"](https://docs.getdbt.com/community/resources/oss-expectations).
The rest of this document serves as a more granular guide for contributing code changes to `dbt-core` (this repository). It is not intended as a guide for using `dbt-core`, and some pieces assume a level of familiarity with Python development (virtualenvs, `pip`, etc). Specific code snippets in this guide assume you are using macOS or Linux and are comfortable with the command line.
@@ -21,8 +21,8 @@ If you get stuck, we're happy to help! Drop us a line in the `#dbt-core-developm
### Notes
- **Adapters:** Is your issue or proposed code change related to a specific [database adapter](https://docs.getdbt.com/docs/available-adapters)? If so, please open issues, PRs, and discussions in that adapter's repository instead. The sole exception is Postgres; the `dbt-postgres` plugin lives in this repository (`dbt-core`).
- **CLA:** Please note that anyone contributing code to `dbt-core` must sign the [Contributor License Agreement](https://docs.getdbt.com/docs/contributor-license-agreements). If you are unable to sign the CLA, the `dbt-core` maintainers will unfortunately be unable to merge any of your Pull Requests. We welcome you to participate in discussions, open issues, and comment on existing ones.
- **Adapters:** Is your issue or proposed code change related to a specific [database adapter](https://docs.getdbt.com/docs/supported-data-platforms)? If so, please open issues, PRs, and discussions in that adapter's repository instead. The sole exception is Postgres; the `dbt-postgres` plugin lives in this repository (`dbt-core`).
- **CLA:** Please note that anyone contributing code to `dbt-core` must sign the [Contributor License Agreement](https://docs.getdbt.com/community/resources/contributor-license-agreements). If you are unable to sign the CLA, the `dbt-core` maintainers will unfortunately be unable to merge any of your Pull Requests. We welcome you to participate in discussions, open issues, and comment on existing ones.
- **Branches:** All pull requests from community contributors should target the `main` branch (default). If the change is needed as a patch for a minor version of dbt that has already been released (or is already a release candidate), a maintainer will backport the changes in your PR to the relevant "latest" release branch (`1.0.latest`, `1.1.latest`, ...). If an issue fix applies to a release branch, that fix should be first committed to the development branch and then to the release branch (rarely release-branch fixes may not apply to `main`).
- **Releases**: Before releasing a new minor version of Core, we prepare a series of alphas and release candidates to allow users (especially employees of dbt Labs!) to test the new version in live environments. This is an important quality assurance step, as it exposes the new code to a wide variety of complicated deployments and can surface bugs before official release. Releases are accessible via pip, homebrew, and dbt Cloud.
@@ -113,7 +113,7 @@ When installed in this way, any changes you make to your local copy of the sourc
With your virtualenv activated, the `dbt` script should point back to the source code you've cloned on your machine. You can verify this by running `which dbt`. This command should show you a path to an executable in your virtualenv.
Configure your [profile](https://docs.getdbt.com/docs/configure-your-profile) as necessary to connect to your target databases. It may be a good idea to add a new profile pointing to a local Postgres instance, or a specific test sandbox within your data warehouse if appropriate.
Configure your [profile](https://docs.getdbt.com/docs/core/connection-profiles) as necessary to connect to your target databases. It may be a good idea to add a new profile pointing to a local Postgres instance, or a specific test sandbox within your data warehouse if appropriate.
## Testing

View File

@@ -37,6 +37,10 @@ dev: dev_req ## Installs dbt-* packages in develop mode along with development d
@\
pre-commit install
.PHONY: proto_types
proto_types: ## generates google protobuf python file from types.proto
protoc -I=./core/dbt/events --python_out=./core/dbt/events ./core/dbt/events/types.proto
.PHONY: mypy
mypy: .env ## Runs mypy against staged changes for static type checking.
@\

View File

@@ -15,14 +15,14 @@
Analysts using dbt can transform their data by simply writing select statements, while dbt handles turning these statements into tables and views in a data warehouse.
These select statements, or "models", form a dbt project. Models frequently build on top of one another – dbt makes it easy to [manage relationships](https://docs.getdbt.com/docs/ref) between models, and [visualize these relationships](https://docs.getdbt.com/docs/documentation), as well as assure the quality of your transformations through [testing](https://docs.getdbt.com/docs/testing).
These select statements, or "models", form a dbt project. Models frequently build on top of one another – dbt makes it easy to [manage relationships](https://docs.getdbt.com/reference/dbt-jinja-functions/ref) between models, and [visualize these relationships](https://docs.getdbt.com/docs/collaborate/documentation), as well as assure the quality of your transformations through [testing](https://docs.getdbt.com/docs/build/tests).
![dbt dag](https://raw.githubusercontent.com/dbt-labs/dbt-core/6c6649f9129d5d108aa3b0526f634cd8f3a9d1ed/etc/dbt-dag.png)
## Getting started
- [Install dbt](https://docs.getdbt.com/docs/get-started/installation)
- Read the [introduction](https://docs.getdbt.com/docs/introduction/) and [viewpoint](https://docs.getdbt.com/docs/about/viewpoint/)
- Read the [introduction](https://docs.getdbt.com/docs/introduction/) and [viewpoint](https://docs.getdbt.com/community/resources/viewpoint)
## Join the dbt Community

View File

@@ -15,14 +15,14 @@
Analysts using dbt can transform their data by simply writing select statements, while dbt handles turning these statements into tables and views in a data warehouse.
These select statements, or "models", form a dbt project. Models frequently build on top of one another – dbt makes it easy to [manage relationships](https://docs.getdbt.com/docs/ref) between models, and [visualize these relationships](https://docs.getdbt.com/docs/documentation), as well as assure the quality of your transformations through [testing](https://docs.getdbt.com/docs/testing).
These select statements, or "models", form a dbt project. Models frequently build on top of one another – dbt makes it easy to [manage relationships](https://docs.getdbt.com/reference/dbt-jinja-functions/ref) between models, and [visualize these relationships](https://docs.getdbt.com/docs/collaborate/documentation), as well as assure the quality of your transformations through [testing](https://docs.getdbt.com/docs/build/tests).
![dbt dag](https://raw.githubusercontent.com/dbt-labs/dbt-core/6c6649f9129d5d108aa3b0526f634cd8f3a9d1ed/etc/dbt-dag.png)
## Getting started
- [Install dbt](https://docs.getdbt.com/docs/installation)
- Read the [introduction](https://docs.getdbt.com/docs/introduction/) and [viewpoint](https://docs.getdbt.com/docs/about/viewpoint/)
- [Install dbt](https://docs.getdbt.com/docs/core/installation)
- Read the [introduction](https://docs.getdbt.com/docs/introduction/) and [viewpoint](https://docs.getdbt.com/community/resources/viewpoint)
## Join the dbt Community

View File

@@ -5,42 +5,43 @@ from datetime import datetime
import time
from itertools import chain
from typing import (
Optional,
Tuple,
Callable,
Iterable,
Type,
Dict,
Any,
Callable,
Dict,
Iterable,
Iterator,
List,
Mapping,
Iterator,
Optional,
Set,
Tuple,
Type,
)
from dbt.contracts.graph.nodes import ColumnLevelConstraint, ConstraintType
import agate
import pytz
from dbt.exceptions import (
DbtInternalError,
DbtRuntimeError,
DbtValidationError,
MacroArgTypeError,
MacroResultError,
QuoteConfigTypeError,
NotImplementedError,
NullRelationCacheAttemptedError,
NullRelationDropAttemptedError,
QuoteConfigTypeError,
RelationReturnedMultipleResultsError,
RenameToNoneAttemptedError,
DbtRuntimeError,
SnapshotTargetIncompleteError,
SnapshotTargetNotSnapshotTableError,
UnexpectedNullError,
UnexpectedNonTimestampError,
UnexpectedNullError,
)
from dbt.adapters.protocol import (
AdapterConfig,
ConnectionManagerProtocol,
)
from dbt.adapters.protocol import AdapterConfig, ConnectionManagerProtocol
from dbt.clients.agate_helper import empty_table, merge_tables, table_from_rows
from dbt.clients.jinja import MacroGenerator
from dbt.contracts.graph.manifest import Manifest, MacroManifest
@@ -65,7 +66,7 @@ from dbt.adapters.base.relation import (
)
from dbt.adapters.base import Column as BaseColumn
from dbt.adapters.base import Credentials
from dbt.adapters.cache import RelationsCache, _make_ref_key_msg
from dbt.adapters.cache import RelationsCache, _make_ref_key_dict
GET_CATALOG_MACRO_NAME = "get_catalog"
@@ -176,6 +177,7 @@ class BaseAdapter(metaclass=AdapterMeta):
- truncate_relation
- rename_relation
- get_columns_in_relation
- get_column_schema_from_query
- expand_column_types
- list_relations_without_caching
- is_cancelable
@@ -268,6 +270,19 @@ class BaseAdapter(metaclass=AdapterMeta):
"""
return self.connections.execute(sql=sql, auto_begin=auto_begin, fetch=fetch)
@available.parse(lambda *a, **k: [])
def get_column_schema_from_query(self, sql: str) -> List[BaseColumn]:
"""Get a list of the Columns with names and data types from the given sql."""
_, cursor = self.connections.add_select_query(sql)
columns = [
self.Column.create(
column_name, self.connections.data_type_code_to_name(column_type_code)
)
# https://peps.python.org/pep-0249/#description
for column_name, column_type_code, *_ in cursor.description
]
return columns
@available.parse(lambda *a, **k: ("", empty_table()))
def get_partitions_metadata(self, table: str) -> Tuple[agate.Table]:
"""Obtain partitions metadata for a BigQuery partitioned table.
@@ -707,7 +722,7 @@ class BaseAdapter(metaclass=AdapterMeta):
ListRelations(
database=cast_to_str(database),
schema=schema,
relations=[_make_ref_key_msg(x) for x in relations],
relations=[_make_ref_key_dict(x) for x in relations],
)
)
@@ -1250,6 +1265,39 @@ class BaseAdapter(metaclass=AdapterMeta):
# This returns a callable macro
return model_context[macro_name]
@classmethod
def _parse_column_constraint(cls, raw_constraint: Dict[str, Any]) -> ColumnLevelConstraint:
try:
ColumnLevelConstraint.validate(raw_constraint)
return ColumnLevelConstraint.from_dict(raw_constraint)
except Exception:
raise DbtValidationError(f"Could not parse constraint: {raw_constraint}")
@available
@classmethod
def render_raw_column_constraint(cls, raw_constraint: Dict[str, Any]) -> str:
constraint = cls._parse_column_constraint(raw_constraint)
return cls.render_column_constraint(constraint)
@classmethod
def render_column_constraint(cls, constraint: ColumnLevelConstraint) -> str:
"""Render the given constraint as DDL text. Should be overriden by adapters which need custom constraint
rendering."""
if constraint.type == ConstraintType.check and constraint.expression:
return f"check {constraint.expression}"
elif constraint.type == ConstraintType.not_null:
return "not null"
elif constraint.type == ConstraintType.unique:
return "unique"
elif constraint.type == ConstraintType.primary_key:
return "primary key"
elif constraint.type == ConstraintType.foreign_key:
return "foreign key"
elif constraint.type == ConstraintType.custom and constraint.expression:
return constraint.expression
else:
return ""
COLUMNS_EQUAL_SQL = """
with diff_count as (
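The constraint-rendering helpers added in this diff are classmethods, so they can be exercised without a live connection; `get_column_schema_from_query`, by contrast, needs a configured adapter because it actually runs the query. A minimal sketch, assuming the import paths shown in the diff:

```python
# Sketch: rendering column constraints via the new BaseAdapter classmethods.
from dbt.adapters.base.impl import BaseAdapter
from dbt.contracts.graph.nodes import ColumnLevelConstraint, ConstraintType

not_null = ColumnLevelConstraint(type=ConstraintType.not_null)
check = ColumnLevelConstraint(type=ConstraintType.check, expression="id > 0")

print(BaseAdapter.render_column_constraint(not_null))  # "not null"
print(BaseAdapter.render_column_constraint(check))     # "check id > 0"

# render_raw_column_constraint validates a raw dict first, then renders it.
print(BaseAdapter.render_raw_column_constraint({"type": "unique"}))  # "unique"

# get_column_schema_from_query is called on a live adapter instance, e.g.
#   columns = adapter.get_column_schema_from_query("select 1 as id")
# and relies on the connection manager's data_type_code_to_name.
```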

View File

@@ -4,8 +4,7 @@ from typing import Any, Dict, Iterable, List, Optional, Set, Tuple
from dbt.adapters.reference_keys import (
_make_ref_key,
_make_ref_key_msg,
_make_msg_from_ref_key,
_make_ref_key_dict,
_ReferenceKey,
)
from dbt.exceptions import (
@@ -230,7 +229,7 @@ class RelationsCache:
# self.relations or any cache entry's referenced_by during iteration
# it's a runtime error!
with self.lock:
return {dot_separated(k): v.dump_graph_entry() for k, v in self.relations.items()}
return {dot_separated(k): str(v.dump_graph_entry()) for k, v in self.relations.items()}
def _setdefault(self, relation: _CachedRelation):
"""Add a relation to the cache, or return it if it already exists.
@@ -290,8 +289,8 @@ class RelationsCache:
# a link - we will never drop the referenced relation during a run.
fire_event(
CacheAction(
ref_key=_make_msg_from_ref_key(ref_key),
ref_key_2=_make_msg_from_ref_key(dep_key),
ref_key=ref_key._asdict(),
ref_key_2=dep_key._asdict(),
)
)
return
@@ -306,8 +305,8 @@ class RelationsCache:
fire_event(
CacheAction(
action="add_link",
ref_key=_make_msg_from_ref_key(dep_key),
ref_key_2=_make_msg_from_ref_key(ref_key),
ref_key=dep_key._asdict(),
ref_key_2=ref_key._asdict(),
)
)
with self.lock:
@@ -325,7 +324,7 @@ class RelationsCache:
flags.LOG_CACHE_EVENTS,
lambda: CacheDumpGraph(before_after="before", action="adding", dump=self.dump_graph()),
)
fire_event(CacheAction(action="add_relation", ref_key=_make_ref_key_msg(cached)))
fire_event(CacheAction(action="add_relation", ref_key=_make_ref_key_dict(cached)))
with self.lock:
self._setdefault(cached)
@@ -359,7 +358,7 @@ class RelationsCache:
:param str identifier: The identifier of the relation to drop.
"""
dropped_key = _make_ref_key(relation)
dropped_key_msg = _make_ref_key_msg(relation)
dropped_key_msg = _make_ref_key_dict(relation)
fire_event(CacheAction(action="drop_relation", ref_key=dropped_key_msg))
with self.lock:
if dropped_key not in self.relations:
@@ -367,7 +366,7 @@ class RelationsCache:
return
consequences = self.relations[dropped_key].collect_consequences()
# convert from a list of _ReferenceKeys to a list of ReferenceKeyMsgs
consequence_msgs = [_make_msg_from_ref_key(key) for key in consequences]
consequence_msgs = [key._asdict() for key in consequences]
fire_event(
CacheAction(
action="drop_cascade", ref_key=dropped_key_msg, ref_list=consequence_msgs
@@ -397,9 +396,9 @@ class RelationsCache:
fire_event(
CacheAction(
action="update_reference",
ref_key=_make_ref_key_msg(old_key),
ref_key_2=_make_ref_key_msg(new_key),
ref_key_3=_make_ref_key_msg(cached.key()),
ref_key=_make_ref_key_dict(old_key),
ref_key_2=_make_ref_key_dict(new_key),
ref_key_3=_make_ref_key_dict(cached.key()),
)
)
@@ -430,9 +429,7 @@ class RelationsCache:
raise TruncatedModelNameCausedCollisionError(new_key, self.relations)
if old_key not in self.relations:
fire_event(
CacheAction(action="temporary_relation", ref_key=_make_msg_from_ref_key(old_key))
)
fire_event(CacheAction(action="temporary_relation", ref_key=old_key._asdict()))
return False
return True
@@ -453,8 +450,8 @@ class RelationsCache:
fire_event(
CacheAction(
action="rename_relation",
ref_key=_make_msg_from_ref_key(old_key),
ref_key_2=_make_msg_from_ref_key(new),
ref_key=old_key._asdict(),
ref_key_2=new_key._asdict(),
)
)
flags = get_flags()

View File

@@ -2,7 +2,6 @@
from collections import namedtuple
from typing import Any, Optional
from dbt.events.proto_types import ReferenceKeyMsg
_ReferenceKey = namedtuple("_ReferenceKey", "database schema identifier")
@@ -30,11 +29,9 @@ def _make_ref_key(relation: Any) -> _ReferenceKey:
)
def _make_ref_key_msg(relation: Any):
return _make_msg_from_ref_key(_make_ref_key(relation))
def _make_msg_from_ref_key(ref_key: _ReferenceKey) -> ReferenceKeyMsg:
return ReferenceKeyMsg(
database=ref_key.database, schema=ref_key.schema, identifier=ref_key.identifier
)
def _make_ref_key_dict(relation: Any):
return {
"database": relation.database,
"schema": relation.schema,
"identifier": relation.identifier,
}
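Since `_ReferenceKey` is a plain namedtuple, the switch above from `ReferenceKeyMsg` messages to dictionaries amounts to `_asdict()` or a literal dict; a tiny self-contained illustration with made-up values:

```python
from collections import namedtuple

_ReferenceKey = namedtuple("_ReferenceKey", "database schema identifier")

key = _ReferenceKey("analytics", "staging", "orders")

# Cache events now carry plain dicts rather than ReferenceKeyMsg messages.
assert key._asdict() == {
    "database": "analytics",
    "schema": "staging",
    "identifier": "orders",
}
```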

View File

@@ -1,6 +1,6 @@
import abc
import time
from typing import List, Optional, Tuple, Any, Iterable, Dict
from typing import List, Optional, Tuple, Any, Iterable, Dict, Union
import agate
@@ -52,6 +52,7 @@ class SQLConnectionManager(BaseConnectionManager):
bindings: Optional[Any] = None,
abridge_sql_log: bool = False,
) -> Tuple[Connection, Any]:
connection = self.get_thread_connection()
if auto_begin and connection.transaction_open is False:
self.begin()
@@ -128,6 +129,14 @@ class SQLConnectionManager(BaseConnectionManager):
return dbt.clients.agate_helper.table_from_data_flat(data, column_names)
@classmethod
def data_type_code_to_name(cls, type_code: Union[int, str]) -> str:
"""Get the string representation of the data type from the type_code."""
# https://peps.python.org/pep-0249/#type-objects
raise dbt.exceptions.NotImplementedError(
"`data_type_code_to_name` is not implemented for this adapter!"
)
def execute(
self, sql: str, auto_begin: bool = False, fetch: bool = False
) -> Tuple[AdapterResponse, agate.Table]:
@@ -146,6 +155,10 @@ class SQLConnectionManager(BaseConnectionManager):
def add_commit_query(self):
return self.add_query("COMMIT", auto_begin=False)
def add_select_query(self, sql: str) -> Tuple[Connection, Any]:
sql = self._add_query_comment(sql)
return self.add_query(sql, auto_begin=False)
def begin(self):
connection = self.get_thread_connection()
if connection.transaction_open is True:
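Because the base `data_type_code_to_name` above deliberately raises `NotImplementedError`, each adapter's connection manager is expected to supply the mapping that `get_column_schema_from_query` relies on. A hypothetical override, with made-up type codes rather than any real driver's values:

```python
from typing import Union

from dbt.adapters.sql import SQLConnectionManager


class ExampleConnectionManager(SQLConnectionManager):
    TYPE = "example"  # adapter name, illustrative only

    # Hypothetical mapping from a driver's PEP 249 cursor.description
    # type codes to human-readable type names.
    _TYPE_CODE_NAMES = {1: "integer", 2: "text", 3: "timestamp"}

    @classmethod
    def data_type_code_to_name(cls, type_code: Union[int, str]) -> str:
        # Called once per column when building the schema from a query.
        return cls._TYPE_CODE_NAMES.get(type_code, "unknown")
```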

View File

@@ -4,7 +4,7 @@ from typing import Any, Optional, Tuple, Type, List
from dbt.contracts.connection import Connection
from dbt.exceptions import RelationTypeNullError
from dbt.adapters.base import BaseAdapter, available
from dbt.adapters.cache import _make_ref_key_msg
from dbt.adapters.cache import _make_ref_key_dict
from dbt.adapters.sql import SQLConnectionManager
from dbt.events.functions import fire_event
from dbt.events.types import ColTypeChange, SchemaCreation, SchemaDrop
@@ -109,7 +109,7 @@ class SQLAdapter(BaseAdapter):
ColTypeChange(
orig_type=target_column.data_type,
new_type=new_type,
table=_make_ref_key_msg(current),
table=_make_ref_key_dict(current),
)
)
@@ -152,7 +152,7 @@ class SQLAdapter(BaseAdapter):
def create_schema(self, relation: BaseRelation) -> None:
relation = relation.without_identifier()
fire_event(SchemaCreation(relation=_make_ref_key_msg(relation)))
fire_event(SchemaCreation(relation=_make_ref_key_dict(relation)))
kwargs = {
"relation": relation,
}
@@ -163,7 +163,7 @@ class SQLAdapter(BaseAdapter):
def drop_schema(self, relation: BaseRelation) -> None:
relation = relation.without_identifier()
fire_event(SchemaDrop(relation=_make_ref_key_msg(relation)))
fire_event(SchemaDrop(relation=_make_ref_key_dict(relation)))
kwargs = {
"relation": relation,
}

View File

@@ -5,13 +5,14 @@ from dataclasses import dataclass
from importlib import import_module
from multiprocessing import get_context
from pprint import pformat as pf
from typing import Set, List
from typing import Callable, Dict, List, Set
from click import Context, get_current_context, BadOptionUsage
from click.core import ParameterSource, Command, Group
from dbt.config.profile import read_user_config
from dbt.contracts.project import UserConfig
from dbt.deprecations import renamed_env_var
from dbt.helper_types import WarnErrorOptions
from dbt.cli.resolvers import default_project_dir, default_log_path
@@ -29,6 +30,7 @@ FLAGS_DEFAULTS = {
"FULL_REFRESH": False,
"STRICT_MODE": False,
"STORE_FAILURES": False,
"INTROSPECT": True,
}
@@ -42,6 +44,7 @@ EXPECTED_DUPLICATE_PARAMS = [
"fail_fast",
"indirect_selection",
"store_failures",
"introspect",
]
@@ -76,6 +79,14 @@ def args_to_context(args: List[str]) -> Context:
return sub_command_ctx
DEPRECATED_PARAMS = {
"deprecated_defer": "defer",
"deprecated_favor_state": "favor_state",
"deprecated_print": "print",
"deprecated_state": "state",
}
@dataclass(frozen=True)
class Flags:
def __init__(self, ctx: Context = None, user_config: UserConfig = None) -> None:
@@ -87,7 +98,7 @@ class Flags:
if ctx is None:
ctx = get_current_context()
def assign_params(ctx, params_assigned_from_default):
def assign_params(ctx, params_assigned_from_default, deprecated_env_vars):
"""Recursively adds all click params to flag object"""
for param_name, param_value in ctx.params.items():
# TODO: this is to avoid duplicate params being defined in two places (version_check in run and cli)
@@ -97,6 +108,10 @@ class Flags:
# when using frozen dataclasses.
# https://docs.python.org/3/library/dataclasses.html#frozen-instances
if hasattr(self, param_name.upper()):
if param_name in deprecated_env_vars:
# param already set via its deprecated but still respected env var
continue
if param_name not in EXPECTED_DUPLICATE_PARAMS:
raise Exception(
f"Duplicate flag names found in click command: {param_name}"
@@ -107,15 +122,64 @@ class Flags:
if ctx.get_parameter_source(param_name) != ParameterSource.DEFAULT:
object.__setattr__(self, param_name.upper(), param_value)
else:
object.__setattr__(self, param_name.upper(), param_value)
if ctx.get_parameter_source(param_name) == ParameterSource.DEFAULT:
params_assigned_from_default.add(param_name)
# handle deprecated env vars while still respecting old values
# e.g. DBT_NO_PRINT -> DBT_PRINT if DBT_NO_PRINT is set, it is
# respected over DBT_PRINT or --print
if param_name in DEPRECATED_PARAMS:
# deprecated env vars can only be set via env var.
# we use the deprecated option in click to serialize the value
# from the env var string
param_source = ctx.get_parameter_source(param_name)
if param_source == ParameterSource.DEFAULT:
continue
elif param_source != ParameterSource.ENVIRONMENT:
raise BadOptionUsage(
param_name,
"Deprecated parameters can only be set via environment variables",
)
# rename for clarity
dep_name = param_name
new_name = DEPRECATED_PARAMS.get(dep_name)
# find param objects for their envvar name
try:
dep_opt = [x for x in ctx.command.params if x.name == dep_name][0]
new_opt = [x for x in ctx.command.params if x.name == new_name][0]
except IndexError:
raise Exception(
f"No deprecated param name match from {dep_name} to {new_name}"
)
# remove param from defaulted set since the deprecated
# value is not set from default, but from an env var
if new_name in params_assigned_from_default:
params_assigned_from_default.remove(new_name)
# adding the deprecation warning function to the set
deprecated_env_vars[new_name] = renamed_env_var(
old_name=dep_opt.envvar,
new_name=new_opt.envvar,
)
object.__setattr__(self, new_name.upper(), param_value)
else:
object.__setattr__(self, param_name.upper(), param_value)
if ctx.get_parameter_source(param_name) == ParameterSource.DEFAULT:
params_assigned_from_default.add(param_name)
if ctx.parent:
assign_params(ctx.parent, params_assigned_from_default)
assign_params(ctx.parent, params_assigned_from_default, deprecated_env_vars)
params_assigned_from_default = set() # type: Set[str]
assign_params(ctx, params_assigned_from_default)
deprecated_env_vars: Dict[str, Callable] = {}
assign_params(ctx, params_assigned_from_default, deprecated_env_vars)
# set deprecated_env_var_warnings to be fired later after events have been init
object.__setattr__(
self, "deprecated_env_var_warnings", [x for x in deprecated_env_vars.values()]
)
# Get the invoked command flags
invoked_subcommand_name = (
@@ -126,7 +190,9 @@ class Flags:
invoked_subcommand.allow_extra_args = True
invoked_subcommand.ignore_unknown_options = True
invoked_subcommand_ctx = invoked_subcommand.make_context(None, sys.argv)
assign_params(invoked_subcommand_ctx, params_assigned_from_default)
assign_params(
invoked_subcommand_ctx, params_assigned_from_default, deprecated_env_vars
)
if not user_config:
profiles_dir = getattr(self, "PROFILES_DIR", None)
@@ -156,6 +222,8 @@ class Flags:
self._override_if_set("LOG_FORMAT", "LOG_FORMAT_FILE", params_assigned_from_default)
# Default LOG_PATH from PROJECT_DIR, if available.
# Starting in v1.5, if `log-path` is set in `dbt_project.yml`, it will raise a deprecation warning,
# with the possibility of removing it in a future release.
if getattr(self, "LOG_PATH", None) is None:
project_dir = getattr(self, "PROJECT_DIR", default_project_dir())
version_check = getattr(self, "VERSION_CHECK", True)
@@ -202,3 +270,9 @@ class Flags:
)
elif flag_set_by_user:
set_flag = flag
def fire_deprecations(self):
[dep_fn() for dep_fn in self.deprecated_env_var_warnings]
# it is necessary to remove this attr from the class so it does
# not get pickled when written to disk as json
object.__delattr__(self, "deprecated_env_var_warnings")
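
A standalone sketch of the precedence rule implemented above, using only the standard library: a deprecated environment variable (here `DBT_NO_PRINT`, superseded by `DBT_PRINT`) still wins when set, and the warning callback is queued so it can fire after the event logger is configured.

```python
import os
from typing import Callable, Dict


def renamed_env_var_warning(old_name: str, new_name: str) -> Callable[[], None]:
    # Illustrative stand-in for the renamed_env_var deprecation helper.
    def cb() -> None:
        print(f"{old_name} is deprecated; set {new_name} instead")
    return cb


deprecated_env_vars: Dict[str, Callable[[], None]] = {}
if "DBT_NO_PRINT" in os.environ:
    # The deprecated variable is still respected over DBT_PRINT / --print,
    # and a warning is remembered for later.
    deprecated_env_vars["print"] = renamed_env_var_warning("DBT_NO_PRINT", "DBT_PRINT")

for warn in deprecated_env_vars.values():  # fired once logging is ready
    warn()
```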

View File

@@ -1,11 +1,12 @@
from copy import copy
from typing import List, Tuple, Optional
from typing import Callable, List, Tuple, Optional
import click
from dbt.cli import requires, params as p
from dbt.config.project import Project
from dbt.config.profile import Profile
from dbt.contracts.graph.manifest import Manifest
from dbt.events.base_types import EventMsg
from dbt.task.clean import CleanTask
from dbt.task.compile import CompileTask
from dbt.task.deps import DepsTask
@@ -34,11 +35,16 @@ class dbtInternalException(Exception):
# Programmatic invocation
class dbtRunner:
def __init__(
self, project: Project = None, profile: Profile = None, manifest: Manifest = None
self,
project: Project = None,
profile: Profile = None,
manifest: Manifest = None,
callbacks: List[Callable[[EventMsg], None]] = [],
):
self.project = project
self.profile = profile
self.manifest = manifest
self.callbacks = callbacks
def invoke(self, args: List[str]) -> Tuple[Optional[List], bool]:
try:
@@ -47,6 +53,7 @@ class dbtRunner:
"project": self.project,
"profile": self.profile,
"manifest": self.manifest,
"callbacks": self.callbacks,
}
return cli.invoke(dbt_ctx)
except click.exceptions.Exit as e:
@@ -80,6 +87,7 @@ class dbtRunner:
@p.macro_debugging
@p.partial_parse
@p.print
@p.deprecated_print
@p.printer_width
@p.quiet
@p.record_timing_info
@@ -103,9 +111,11 @@ def cli(ctx, **kwargs):
@cli.command("build")
@click.pass_context
@p.defer
@p.deprecated_defer
@p.exclude
@p.fail_fast
@p.favor_state
@p.deprecated_favor_state
@p.full_refresh
@p.indirect_selection
@p.profile
@@ -116,6 +126,7 @@ def cli(ctx, **kwargs):
@p.selector
@p.show
@p.state
@p.deprecated_state
@p.store_failures
@p.target
@p.target_path
@@ -172,14 +183,17 @@ def docs(ctx, **kwargs):
@click.pass_context
@p.compile_docs
@p.defer
@p.deprecated_defer
@p.exclude
@p.favor_state
@p.deprecated_favor_state
@p.profile
@p.profiles_dir
@p.project_dir
@p.select
@p.selector
@p.state
@p.deprecated_state
@p.target
@p.target_path
@p.threads
@@ -235,16 +249,22 @@ def docs_serve(ctx, **kwargs):
@cli.command("compile")
@click.pass_context
@p.defer
@p.deprecated_defer
@p.exclude
@p.favor_state
@p.deprecated_favor_state
@p.full_refresh
@p.indirect_selection
@p.introspect
@p.parse_only
@p.profile
@p.profiles_dir
@p.project_dir
@p.select
@p.selector
@p.inline
@p.state
@p.deprecated_state
@p.target
@p.target_path
@p.threads
@@ -347,6 +367,7 @@ def init(ctx, **kwargs):
@p.raw_select
@p.selector
@p.state
@p.deprecated_state
@p.target
@p.vars
@requires.preflight
@@ -401,7 +422,9 @@ def parse(ctx, **kwargs):
@cli.command("run")
@click.pass_context
@p.defer
@p.deprecated_defer
@p.favor_state
@p.deprecated_favor_state
@p.exclude
@p.fail_fast
@p.full_refresh
@@ -411,6 +434,7 @@ def parse(ctx, **kwargs):
@p.select
@p.selector
@p.state
@p.deprecated_state
@p.target
@p.target_path
@p.threads
@@ -474,6 +498,7 @@ def run_operation(ctx, **kwargs):
@p.selector
@p.show
@p.state
@p.deprecated_state
@p.target
@p.target_path
@p.threads
@@ -500,14 +525,17 @@ def seed(ctx, **kwargs):
@cli.command("snapshot")
@click.pass_context
@p.defer
@p.deprecated_defer
@p.exclude
@p.favor_state
@p.deprecated_favor_state
@p.profile
@p.profiles_dir
@p.project_dir
@p.select
@p.selector
@p.state
@p.deprecated_state
@p.target
@p.threads
@p.vars
@@ -547,6 +575,7 @@ def source(ctx, **kwargs):
@p.select
@p.selector
@p.state
@p.deprecated_state
@p.target
@p.threads
@p.vars
@@ -578,9 +607,11 @@ cli.commands["source"].add_command(snapshot_freshness, "snapshot-freshness") #
@cli.command("test")
@click.pass_context
@p.defer
@p.deprecated_defer
@p.exclude
@p.fail_fast
@p.favor_state
@p.deprecated_favor_state
@p.indirect_selection
@p.profile
@p.profiles_dir
@@ -588,6 +619,7 @@ cli.commands["source"].add_command(snapshot_freshness, "snapshot-freshness") #
@p.select
@p.selector
@p.state
@p.deprecated_state
@p.store_failures
@p.target
@p.target_path
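
A hedged usage sketch for the `callbacks` parameter added to `dbtRunner` above, assuming the runner is importable from `dbt.cli.main` and that the invocation happens inside a dbt project:

```python
from dbt.cli.main import dbtRunner


def log_event_names(event) -> None:
    # Each callback receives an EventMsg; printing its name is enough for a demo.
    print(event.info.name)


runner = dbtRunner(callbacks=[log_event_names])
results, success = runner.invoke(["parse"])  # invoke returns (results, success)
```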

View File

@@ -6,13 +6,6 @@ from dbt.cli.option_types import YAML, ChoiceTuple, WarnErrorOptionsType
from dbt.cli.resolvers import default_project_dir, default_profiles_dir
from dbt.version import get_version_information
# TODO: Rename this to meet naming conventions (the word "send" is redundant)
send_anonymous_usage_stats = click.option(
"--send-anonymous-usage-stats/--no-send-anonymous-usage-stats",
envvar="DBT_SEND_ANONYMOUS_USAGE_STATS",
help="Send anonymous usage stats to dbt Labs.",
default=True,
)
args = click.option(
"--args",
@@ -34,10 +27,17 @@ cache_selected_only = click.option(
help="Pre cache database objects relevant to selected resource only.",
)
introspect = click.option(
"--introspect/--no-introspect",
envvar="DBT_INTROSPECT",
help="Whether to scaffold introspective queries as part of compilation",
default=True,
)
compile_docs = click.option(
"--compile/--no-compile",
envvar=None,
help="Wether or not to run 'dbt compile' as part of docs generation",
help="Whether or not to run 'dbt compile' as part of docs generation",
default=True,
)
@@ -62,16 +62,21 @@ debug = click.option(
help="Display debug logging during dbt execution. Useful for debugging and making bug reports.",
)
# TODO: The env var and name (reflected in flags) are corrections!
# The original name was `DEFER_MODE` and used an env var called "DBT_DEFER_TO_STATE"
# Both of which break existing naming conventions.
# This will need to be fixed before use in the main codebase and communicated as a change to the community!
# flag was previously named DEFER_MODE
defer = click.option(
"--defer/--no-defer",
envvar="DBT_DEFER",
help="If set, defer to the state variable for resolving unselected nodes.",
)
deprecated_defer = click.option(
"--deprecated-defer",
envvar="DBT_DEFER_TO_STATE",
help="Internal flag for deprecating old env var.",
default=False,
hidden=True,
)
enable_legacy_logger = click.option(
"--enable-legacy-logger/--no-enable-legacy-logger",
envvar="DBT_ENABLE_LEGACY_LOGGER",
@@ -95,6 +100,12 @@ favor_state = click.option(
help="If set, defer to the argument provided to the state flag for resolving unselected nodes, even if the node(s) exist as a database object in the current environment.",
)
deprecated_favor_state = click.option(
"--deprecated-favor-state",
envvar="DBT_FAVOR_STATE_MODE",
help="Internal flag for deprecating old env var.",
)
full_refresh = click.option(
"--full-refresh",
"-f",
@@ -107,7 +118,7 @@ indirect_selection = click.option(
"--indirect-selection",
envvar="DBT_INDIRECT_SELECTION",
help="Select all tests that are adjacent to selected resources, even if they those resources have been explicitly selected.",
type=click.Choice(["eager", "cautious", "buildable"], case_sensitive=False),
type=click.Choice(["eager", "cautious", "buildable", "empty"], case_sensitive=False),
default="eager",
)
@@ -172,7 +183,15 @@ output = click.option(
)
output_keys = click.option(
"--output-keys", envvar=None, help="TODO: No current help text", type=click.STRING
"--output-keys",
envvar=None,
help=(
"Space-delimited listing of node properties to include as custom keys for JSON output "
"(e.g. `--output json --output-keys name resource_type description`)"
),
type=list,
cls=MultiOption,
default=[],
)
output_path = click.option(
@@ -206,10 +225,6 @@ port = click.option(
type=click.INT,
)
# TODO: The env var and name (reflected in flags) are corrections!
# The original name was `NO_PRINT` and used the env var `DBT_NO_PRINT`.
# Both of which break existing naming conventions.
# This will need to be fixed before use in the main codebase and communicated as a change to the community!
print = click.option(
"--print/--no-print",
envvar="DBT_PRINT",
@@ -217,6 +232,15 @@ print = click.option(
default=True,
)
deprecated_print = click.option(
"--deprecated-print/--deprecated-no-print",
envvar="DBT_NO_PRINT",
help="Internal flag for deprecating old env var.",
default=True,
hidden=True,
callback=lambda ctx, param, value: not value,
)
printer_width = click.option(
"--printer-width",
envvar="DBT_PRINTER_WIDTH",
@@ -250,7 +274,7 @@ profiles_dir_exists_false = click.option(
project_dir = click.option(
"--project-dir",
envvar=None,
envvar="DBT_PROJECT_DIR",
help="Which directory to look in for the dbt_project.yml file. Default is the current working directory and its parents.",
default=default_project_dir,
type=click.Path(exists=True),
@@ -303,6 +327,8 @@ select_attrs = {
"type": tuple,
}
inline = click.option("--inline", envvar=None, help="Pass SQL inline to dbt compile and preview")
# `--select` and `--models` are analogous for most commands except `dbt list` for legacy reasons.
# Most CLI arguments should use the combined `select` option that aliases `--models` to `--select`.
# However, if you need to split out these separators (like `dbt ls`), use the `models` and `raw_select` options instead.
@@ -315,6 +341,13 @@ selector = click.option(
"--selector", envvar=None, help="The selector name to use, as defined in selectors.yml"
)
send_anonymous_usage_stats = click.option(
"--send-anonymous-usage-stats/--no-send-anonymous-usage-stats",
envvar="DBT_SEND_ANONYMOUS_USAGE_STATS",
help="Send anonymous usage stats to dbt Labs.",
default=True,
)
show = click.option(
"--show", envvar=None, help="Show a sample of the loaded data in the terminal", is_flag=True
)
@@ -336,10 +369,6 @@ skip_profile_setup = click.option(
"--skip-profile-setup", "-s", envvar=None, help="Skip interactive profile setup.", is_flag=True
)
# TODO: The env var and name (reflected in flags) are corrections!
# The original name was `ARTIFACT_STATE_PATH` and used the env var `DBT_ARTIFACT_STATE_PATH`.
# Both of which break existing naming conventions.
# This will need to be fixed before use in the main codebase and communicated as a change to the community!
state = click.option(
"--state",
envvar="DBT_STATE",
@@ -353,6 +382,20 @@ state = click.option(
),
)
deprecated_state = click.option(
"--deprecated-state",
envvar="DBT_ARTIFACT_STATE_PATH",
help="Internal flag for deprecating old env var.",
hidden=True,
type=click.Path(
dir_okay=True,
file_okay=False,
readable=True,
resolve_path=True,
path_type=Path,
),
)
static_parser = click.option(
"--static-parser/--no-static-parser",
envvar="DBT_STATIC_PARSER",

View File

@@ -5,7 +5,12 @@ from dbt.cli.flags import Flags
from dbt.config import RuntimeConfig
from dbt.config.runtime import load_project, load_profile, UnsetProfile
from dbt.events.functions import setup_event_logger, fire_event, LOG_VERSION
from dbt.events.types import MainReportVersion, MainReportArgs, MainTrackingUserState
from dbt.events.types import (
CommandCompleted,
MainReportVersion,
MainReportArgs,
MainTrackingUserState,
)
from dbt.exceptions import DbtProjectError
from dbt.parser.manifest import ManifestLoader, write_manifest
from dbt.profiler import profiler
@@ -13,7 +18,9 @@ from dbt.tracking import active_user, initialize_from_flags, track_run
from dbt.utils import cast_dict_to_dict_of_strings
from click import Context
import datetime
from functools import update_wrapper
import time
def preflight(func):
@@ -33,13 +40,17 @@ def preflight(func):
# Logging
# N.B. Legacy logger is not supported
setup_event_logger(flags)
callbacks = ctx.obj.get("callbacks", [])
setup_event_logger(flags=flags, callbacks=callbacks)
# Now that we have our logger, fire away!
fire_event(MainReportVersion(version=str(installed_version), log_version=LOG_VERSION))
flags_dict_str = cast_dict_to_dict_of_strings(get_flag_dict())
fire_event(MainReportArgs(args=flags_dict_str))
# Deprecation warnings
flags.fire_deprecations()
if active_user is not None: # mypy appeasement, always true
fire_event(MainTrackingUserState(user_state=active_user.state()))
@@ -50,7 +61,33 @@ def preflight(func):
# Adapter management
ctx.with_resource(adapter_management())
return func(*args, **kwargs)
start_func = time.perf_counter()
try:
(results, success) = func(*args, **kwargs)
fire_event(
CommandCompleted(
command=ctx.command_path,
success=success,
completed_at=datetime.datetime.utcnow(),
elapsed=time.perf_counter() - start_func,
)
)
# Bare except because we really do want to catch ALL exceptions,
# i.e. we want to fire this event in ALL cases.
except: # noqa
fire_event(
CommandCompleted(
command=ctx.command_path,
success=False,
completed_at=datetime.datetime.utcnow(),
elapsed=time.perf_counter() - start_func,
)
)
raise
return (results, success)
return update_wrapper(wrapper, func)
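
The timing and `CommandCompleted` handling above boils down to a wrap-and-report pattern; a reduced sketch with print statements standing in for `fire_event`:

```python
import time
from functools import update_wrapper


def with_completion_report(func):
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            results, success = func(*args, **kwargs)
        except:  # noqa: E722 -- report even unexpected failures, then re-raise
            print(f"completed success=False elapsed={time.perf_counter() - start:.3f}s")
            raise
        print(f"completed success={success} elapsed={time.perf_counter() - start:.3f}s")
        return results, success
    return update_wrapper(wrapper, func)


@with_completion_report
def do_work():
    return ([], True)


do_work()
```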

View File

@@ -483,7 +483,7 @@ def get_environment(
native: bool = False,
) -> jinja2.Environment:
args: Dict[str, List[Union[str, Type[jinja2.ext.Extension]]]] = {
"extensions": ["jinja2.ext.do"]
"extensions": ["jinja2.ext.do", "jinja2.ext.loopcontrols"]
}
if capture_macros:
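
Enabling `jinja2.ext.loopcontrols` above lets templates use `{% break %}` and `{% continue %}` inside for-loops; a quick standalone check:

```python
import jinja2

env = jinja2.Environment(extensions=["jinja2.ext.do", "jinja2.ext.loopcontrols"])
template = env.from_string(
    "{% for n in values %}{% if n > 2 %}{% break %}{% endif %}{{ n }} {% endfor %}"
)
print(template.render(values=[1, 2, 3, 4]))  # "1 2 "
```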

View File

@@ -18,7 +18,6 @@ import requests
from dbt.events.functions import fire_event
from dbt.events.types import (
SystemCouldNotWrite,
SystemErrorRetrievingModTime,
SystemExecutingCmd,
SystemStdOut,
SystemStdErr,
@@ -77,11 +76,7 @@ def find_matching(
relative_path = os.path.relpath(absolute_path, absolute_path_to_search)
relative_path_to_root = os.path.join(relative_path_to_search, relative_path)
modification_time = 0.0
try:
modification_time = os.path.getmtime(absolute_path)
except OSError:
fire_event(SystemErrorRetrievingModTime(path=absolute_path))
modification_time = os.path.getmtime(absolute_path)
if reobj.match(local_file) and (
not ignore_spec or not ignore_spec.match_file(relative_path_to_root)
):
@@ -454,8 +449,8 @@ def run_cmd(cwd: str, cmd: List[str], env: Optional[Dict[str, Any]] = None) -> T
except OSError as exc:
_interpret_oserror(exc, cwd, cmd)
fire_event(SystemStdOut(bmsg=out))
fire_event(SystemStdErr(bmsg=err))
fire_event(SystemStdOut(bmsg=str(out)))
fire_event(SystemStdErr(bmsg=str(err)))
if proc.returncode != 0:
fire_event(SystemReportReturnCode(returncode=proc.returncode))

View File

@@ -69,7 +69,7 @@ The packages.yml file in this project is malformed. Please double check
the contents of this file and fix any errors before retrying.
You can find more information on the syntax for this file here:
https://docs.getdbt.com/docs/package-management
https://docs.getdbt.com/docs/build/packages
Validator Error:
{error}
@@ -298,23 +298,27 @@ class PartialProject(RenderComponents):
raise DbtProjectError("Package dbt_project.yml must have a name!")
return ProjectPackageMetadata(self.project_name, packages_config.packages)
def check_config_path(self, project_dict, deprecated_path, exp_path):
def check_config_path(
self, project_dict, deprecated_path, expected_path=None, default_value=None
):
if deprecated_path in project_dict:
if exp_path in project_dict:
if expected_path in project_dict:
msg = (
"{deprecated_path} and {exp_path} cannot both be defined. The "
"`{deprecated_path}` config has been deprecated in favor of `{exp_path}`. "
"{deprecated_path} and {expected_path} cannot both be defined. The "
"`{deprecated_path}` config has been deprecated in favor of `{expected_path}`. "
"Please update your `dbt_project.yml` configuration to reflect this "
"change."
)
raise DbtProjectError(
msg.format(deprecated_path=deprecated_path, exp_path=exp_path)
msg.format(deprecated_path=deprecated_path, expected_path=expected_path)
)
# this field is no longer supported, but many projects may specify it with the default value
# if so, let's only raise this deprecation warning if they set a custom value
if not default_value or project_dict[deprecated_path] != default_value:
deprecations.warn(
f"project-config-{deprecated_path}",
deprecated_path=deprecated_path,
)
deprecations.warn(
f"project-config-{deprecated_path}",
deprecated_path=deprecated_path,
exp_path=exp_path,
)
def create_project(self, rendered: RenderComponents) -> "Project":
unrendered = RenderComponents(
@@ -329,6 +333,8 @@ class PartialProject(RenderComponents):
self.check_config_path(rendered.project_dict, "source-paths", "model-paths")
self.check_config_path(rendered.project_dict, "data-paths", "seed-paths")
self.check_config_path(rendered.project_dict, "log-path", default_value="logs")
self.check_config_path(rendered.project_dict, "target-path", default_value="target")
try:
ProjectContract.validate(rendered.project_dict)
@@ -499,13 +505,6 @@ class PartialProject(RenderComponents):
) -> "PartialProject":
project_root = os.path.normpath(project_root)
project_dict = load_raw_project(project_root)
config_version = project_dict.get("config-version", 1)
if config_version != 2:
raise DbtProjectError(
f"Invalid config version: {config_version}, expected 2",
path=os.path.join(project_root, "dbt_project.yml"),
)
packages_dict = package_data_from_root(project_root)
selectors_dict = selector_data_from_root(project_root)
return cls.from_dicts(

View File

@@ -21,7 +21,7 @@ The selectors.yml file in this project is malformed. Please double check
the contents of this file and fix any errors before retrying.
You can find more information on the syntax for this file here:
https://docs.getdbt.com/docs/package-management
https://docs.getdbt.com/docs/build/packages
Validator Error:
{error}

View File

@@ -5,6 +5,4 @@ METADATA_ENV_PREFIX = "DBT_ENV_CUSTOM_ENV_"
MAXIMUM_SEED_SIZE = 1 * 1024 * 1024
MAXIMUM_SEED_SIZE_NAME = "1MB"
PIN_PACKAGE_URL = (
"https://docs.getdbt.com/docs/package-management#section-specifying-package-versions"
)
PIN_PACKAGE_URL = "https://docs.getdbt.com/docs/build/packages#section-specifying-package-versions"

View File

@@ -23,6 +23,8 @@ from dbt.exceptions import (
PropertyYMLError,
NotImplementedError,
RelationWrongTypeError,
ContractError,
ColumnTypeMissingError,
)
@@ -65,6 +67,10 @@ def raise_compiler_error(msg, node=None) -> NoReturn:
raise CompilationError(msg, node)
def raise_contract_error(yaml_columns, sql_columns) -> NoReturn:
raise ContractError(yaml_columns, sql_columns)
def raise_database_error(msg, node=None) -> NoReturn:
raise DbtDatabaseError(msg, node)
@@ -97,6 +103,10 @@ def relation_wrong_type(relation, expected_type, model=None) -> NoReturn:
raise RelationWrongTypeError(relation, expected_type, model)
def column_type_missing(column_names) -> NoReturn:
raise ColumnTypeMissingError(column_names)
# Update this when a new function should be added to the
# dbt context's `exceptions` key!
CONTEXT_EXPORTS = {
@@ -119,6 +129,8 @@ CONTEXT_EXPORTS = {
raise_invalid_property_yml_version,
raise_not_implemented,
relation_wrong_type,
raise_contract_error,
column_type_missing,
]
}

View File

@@ -61,8 +61,6 @@ class FilePath(dbtClassMixin):
@property
def original_file_path(self) -> str:
# this is mostly used for reporting errors. It doesn't show the project
# name, should it?
return os.path.join(self.searched_path, self.relative_path)
def seed_too_large(self) -> bool:

View File

@@ -1,6 +1,8 @@
import os
import time
from dataclasses import dataclass, field
from enum import Enum
from mashumaro.types import SerializableType
from typing import (
Optional,
@@ -35,7 +37,6 @@ from dbt.contracts.graph.unparsed import (
MetricTime,
)
from dbt.contracts.util import Replaceable, AdditionalPropertiesMixin
from dbt.events.proto_types import NodeInfo
from dbt.events.functions import warn_or_error
from dbt.exceptions import ParsingError, InvalidAccessTypeError
from dbt.events.types import (
@@ -48,7 +49,6 @@ from dbt.events.types import (
from dbt.events.contextvars import set_contextvars
from dbt.flags import get_flags
from dbt.node_types import ModelLanguage, NodeType, AccessType
from dbt.utils import cast_dict_to_dict_of_strings
from .model_config import (
@@ -140,6 +140,36 @@ class GraphNode(BaseNode):
return self.fqn == other.fqn
class ConstraintType(str, Enum):
check = "check"
not_null = "not_null"
unique = "unique"
primary_key = "primary_key"
foreign_key = "foreign_key"
custom = "custom"
@classmethod
def is_valid(cls, item):
try:
cls(item)
except ValueError:
return False
return True
@dataclass
class ColumnLevelConstraint(dbtClassMixin):
type: ConstraintType
name: Optional[str] = None
expression: Optional[str] = None
warn_unenforced: bool = (
True # Warn if constraint cannot be enforced by platform but will be in DDL
)
warn_unsupported: bool = (
True # Warn if constraint is not supported by the platform and won't be in DDL
)
@dataclass
class ColumnInfo(AdditionalPropertiesMixin, ExtensibleDbtClassMixin, Replaceable):
"""Used in all ManifestNodes and SourceDefinition"""
@@ -148,8 +178,7 @@ class ColumnInfo(AdditionalPropertiesMixin, ExtensibleDbtClassMixin, Replaceable
description: str = ""
meta: Dict[str, Any] = field(default_factory=dict)
data_type: Optional[str] = None
constraints: Optional[List[str]] = None
constraints_check: Optional[str] = None
constraints: List[ColumnLevelConstraint] = field(default_factory=list)
quote: Optional[bool] = None
tags: List[str] = field(default_factory=list)
_extra: Dict[str, Any] = field(default_factory=dict)
@@ -212,8 +241,6 @@ class NodeInfoMixin:
@property
def node_info(self):
meta = getattr(self, "meta", {})
meta_stringified = cast_dict_to_dict_of_strings(meta)
node_info = {
"node_path": getattr(self, "path", None),
"node_name": getattr(self, "name", None),
@@ -223,10 +250,9 @@ class NodeInfoMixin:
"node_status": str(self._event_status.get("node_status")),
"node_started_at": self._event_status.get("started_at"),
"node_finished_at": self._event_status.get("finished_at"),
"meta": meta_stringified,
"meta": getattr(self, "meta", {}),
}
node_info_msg = NodeInfo(**node_info)
return node_info_msg
return node_info
def update_event_status(self, **kwargs):
for k, v in kwargs.items():
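
A small sketch of the structured column constraints introduced above, assuming this file is `core/dbt/contracts/graph/nodes.py` (the import path is an assumption):

```python
from dbt.contracts.graph.nodes import ColumnLevelConstraint, ConstraintType

# Unknown constraint types are rejected by the explicit validator.
assert ConstraintType.is_valid("primary_key")
assert not ConstraintType.is_valid("indexed")

constraint = ColumnLevelConstraint(type=ConstraintType.not_null, name="orders_id_not_null")
print(constraint.warn_unenforced, constraint.warn_unsupported)  # True True
```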

View File

@@ -88,15 +88,13 @@ class Docs(dbtClassMixin, Replaceable):
@dataclass
class HasDocs(AdditionalPropertiesMixin, ExtensibleDbtClassMixin, Replaceable):
class HasColumnProps(AdditionalPropertiesMixin, ExtensibleDbtClassMixin, Replaceable):
name: str
description: str = ""
meta: Dict[str, Any] = field(default_factory=dict)
data_type: Optional[str] = None
constraints: Optional[List[str]] = None
constraints_check: Optional[str] = None
constraints: List[Dict[str, Any]] = field(default_factory=list)
docs: Docs = field(default_factory=Docs)
access: Optional[str] = None
_extra: Dict[str, Any] = field(default_factory=dict)
@@ -104,7 +102,7 @@ TestDef = Union[Dict[str, Any], str]
@dataclass
class HasTests(HasDocs):
class HasColumnAndTestProps(HasColumnProps):
tests: Optional[List[TestDef]] = None
def __post_init__(self):
@@ -113,18 +111,18 @@ class HasTests(HasDocs):
@dataclass
class UnparsedColumn(HasTests):
class UnparsedColumn(HasColumnAndTestProps):
quote: Optional[bool] = None
tags: List[str] = field(default_factory=list)
@dataclass
class HasColumnDocs(dbtClassMixin, Replaceable):
columns: Sequence[HasDocs] = field(default_factory=list)
columns: Sequence[HasColumnProps] = field(default_factory=list)
@dataclass
class HasColumnTests(HasColumnDocs):
class HasColumnTests(dbtClassMixin, Replaceable):
columns: Sequence[UnparsedColumn] = field(default_factory=list)
@@ -145,13 +143,14 @@ class HasConfig:
@dataclass
class UnparsedAnalysisUpdate(HasConfig, HasColumnDocs, HasDocs, HasYamlMetadata):
pass
class UnparsedAnalysisUpdate(HasConfig, HasColumnDocs, HasColumnProps, HasYamlMetadata):
access: Optional[str] = None
@dataclass
class UnparsedNodeUpdate(HasConfig, HasColumnTests, HasTests, HasYamlMetadata):
class UnparsedNodeUpdate(HasConfig, HasColumnTests, HasColumnAndTestProps, HasYamlMetadata):
quote_columns: Optional[bool] = None
access: Optional[str] = None
@dataclass
@@ -162,7 +161,7 @@ class MacroArgument(dbtClassMixin):
@dataclass
class UnparsedMacroUpdate(HasConfig, HasDocs, HasYamlMetadata):
class UnparsedMacroUpdate(HasConfig, HasColumnProps, HasYamlMetadata):
arguments: List[MacroArgument] = field(default_factory=list)
@@ -249,7 +248,7 @@ class Quoting(dbtClassMixin, Mergeable):
@dataclass
class UnparsedSourceTableDefinition(HasColumnTests, HasTests):
class UnparsedSourceTableDefinition(HasColumnTests, HasColumnAndTestProps):
config: Dict[str, Any] = field(default_factory=dict)
loaded_at_field: Optional[str] = None
identifier: Optional[str] = None

View File

@@ -184,7 +184,7 @@ BANNED_PROJECT_NAMES = {
@dataclass
class Project(HyphenatedDbtClassMixin, Replaceable):
name: Identifier
config_version: int
config_version: Optional[int] = 2
version: Optional[Union[SemverString, float]] = None
project_root: Optional[str] = None
source_paths: Optional[List[str]] = None

View File

@@ -10,10 +10,10 @@ from dbt.contracts.util import (
from dbt.exceptions import DbtInternalError
from dbt.events.functions import fire_event
from dbt.events.types import TimingInfoCollected
from dbt.events.proto_types import RunResultMsg, TimingInfoMsg
from dbt.events.contextvars import get_node_info
from dbt.events.helpers import datetime_to_json_string
from dbt.logger import TimingProcessor
from dbt.utils import lowercase, cast_to_str, cast_to_int, cast_dict_to_dict_of_strings
from dbt.utils import lowercase, cast_to_str, cast_to_int
from dbt.dataclass_schema import dbtClassMixin, StrEnum
import agate
@@ -45,11 +45,13 @@ class TimingInfo(dbtClassMixin):
def end(self):
self.completed_at = datetime.utcnow()
def to_msg(self):
timsg = TimingInfoMsg(
name=self.name, started_at=self.started_at, completed_at=self.completed_at
)
return timsg
def to_msg_dict(self):
msg_dict = {"name": self.name}
if self.started_at:
msg_dict["started_at"] = datetime_to_json_string(self.started_at)
if self.completed_at:
msg_dict["completed_at"] = datetime_to_json_string(self.completed_at)
return msg_dict
# This is a context manager
@@ -67,7 +69,7 @@ class collect_timing_info:
with TimingProcessor(self.timing_info):
fire_event(
TimingInfoCollected(
timing_info=self.timing_info.to_msg(), node_info=get_node_info()
timing_info=self.timing_info.to_msg_dict(), node_info=get_node_info()
)
)
@@ -129,16 +131,17 @@ class BaseResult(dbtClassMixin):
data["failures"] = None
return data
def to_msg(self):
msg = RunResultMsg()
msg.status = str(self.status)
msg.message = cast_to_str(self.message)
msg.thread = self.thread_id
msg.execution_time = self.execution_time
msg.num_failures = cast_to_int(self.failures)
msg.timing_info = [ti.to_msg() for ti in self.timing]
msg.adapter_response = cast_dict_to_dict_of_strings(self.adapter_response)
return msg
def to_msg_dict(self):
msg_dict = {
"status": str(self.status),
"message": cast_to_str(self.message),
"thread": self.thread_id,
"execution_time": self.execution_time,
"num_failures": cast_to_int(self.failures),
"timing_info": [ti.to_msg_dict() for ti in self.timing],
"adapter_response": self.adapter_response,
}
return msg_dict
@dataclass

View File

@@ -81,6 +81,31 @@ class ExposureNameDeprecation(DBTDeprecation):
_event = "ExposureNameDeprecation"
class ConfigLogPathDeprecation(DBTDeprecation):
_name = "project-config-log-path"
_event = "ConfigLogPathDeprecation"
class ConfigTargetPathDeprecation(DBTDeprecation):
_name = "project-config-target-path"
_event = "ConfigTargetPathDeprecation"
def renamed_env_var(old_name: str, new_name: str):
class EnvironmentVariableRenamed(DBTDeprecation):
_name = f"environment-variable-renamed:{old_name}"
_event = "EnvironmentVariableRenamed"
dep = EnvironmentVariableRenamed()
deprecations_list.append(dep)
deprecations[dep.name] = dep
def cb():
dep.show(old_name=old_name, new_name=new_name)
return cb
def warn(name, *args, **kwargs):
if name not in deprecations:
# this should (hopefully) never happen
@@ -101,6 +126,8 @@ deprecations_list: List[DBTDeprecation] = [
ConfigDataPathDeprecation(),
MetricAttributesRenamed(),
ExposureNameDeprecation(),
ConfigLogPathDeprecation(),
ConfigTargetPathDeprecation(),
]
deprecations: Dict[str, DBTDeprecation] = {d.name: d for d in deprecations_list}
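
A minimal sketch of the `renamed_env_var` helper above, using the `DBT_DEFER_TO_STATE` → `DBT_DEFER` rename from the CLI options and assuming the module is `dbt.deprecations`:

```python
from dbt.deprecations import renamed_env_var

# Registers an EnvironmentVariableRenamed deprecation and returns a callback
# that fires the warning when invoked (ideally after event logging is set up).
warn = renamed_env_var(old_name="DBT_DEFER_TO_STATE", new_name="DBT_DEFER")
warn()
```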

View File

@@ -1,11 +1,11 @@
# Deps README
The deps module is responsible for installing dbt packages into dbt projects. A dbt package is a standalone dbt project with models and macros that solve a specific problem area. More specific information on dbt packages is available on the [docs site](https://docs.getdbt.com/docs/building-a-dbt-project/package-management).
The deps module is responsible for installing dbt packages into dbt projects. A dbt package is a standalone dbt project with models and macros that solve a specific problem area. More specific information on dbt packages is available on the [docs site](https://docs.getdbt.com/docs/build/packages).
# What's a package?
See [How do I specify a package?](https://docs.getdbt.com/docs/building-a-dbt-project/package-management#how-do-i-specify-a-package) on the docs site for a detailed explanation of the different types of packages supported and expected formats.
See [How do I specify a package?](https://docs.getdbt.com/docs/build/packages#how-do-i-specify-a-package) on the docs site for a detailed explanation of the different types of packages supported and expected formats.
# Files

Binary file not shown.

Binary file not shown.

View File

@@ -1,4 +1,4 @@
# Sphinx build info version 1
# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
config: e27d6c1c419f2f0af393858cdf674109
config: 0d25ef12a43286020bcd8b805064f01c
tags: 645f666f9bcd5a90fca523b33c5a78b7

View File

@@ -0,0 +1,134 @@
/*
* _sphinx_javascript_frameworks_compat.js
* ~~~~~~~~~~
*
* Compatability shim for jQuery and underscores.js.
*
* WILL BE REMOVED IN Sphinx 6.0
* xref RemovedInSphinx60Warning
*
*/
/**
* select a different prefix for underscore
*/
$u = _.noConflict();
/**
* small helper function to urldecode strings
*
* See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/decodeURIComponent#Decoding_query_parameters_from_a_URL
*/
jQuery.urldecode = function(x) {
if (!x) {
return x
}
return decodeURIComponent(x.replace(/\+/g, ' '));
};
/**
* small helper function to urlencode strings
*/
jQuery.urlencode = encodeURIComponent;
/**
* This function returns the parsed url parameters of the
* current request. Multiple values per key are supported,
* it will always return arrays of strings for the value parts.
*/
jQuery.getQueryParameters = function(s) {
if (typeof s === 'undefined')
s = document.location.search;
var parts = s.substr(s.indexOf('?') + 1).split('&');
var result = {};
for (var i = 0; i < parts.length; i++) {
var tmp = parts[i].split('=', 2);
var key = jQuery.urldecode(tmp[0]);
var value = jQuery.urldecode(tmp[1]);
if (key in result)
result[key].push(value);
else
result[key] = [value];
}
return result;
};
/**
* highlight a given string on a jquery object by wrapping it in
* span elements with the given class name.
*/
jQuery.fn.highlightText = function(text, className) {
function highlight(node, addItems) {
if (node.nodeType === 3) {
var val = node.nodeValue;
var pos = val.toLowerCase().indexOf(text);
if (pos >= 0 &&
!jQuery(node.parentNode).hasClass(className) &&
!jQuery(node.parentNode).hasClass("nohighlight")) {
var span;
var isInSVG = jQuery(node).closest("body, svg, foreignObject").is("svg");
if (isInSVG) {
span = document.createElementNS("http://www.w3.org/2000/svg", "tspan");
} else {
span = document.createElement("span");
span.className = className;
}
span.appendChild(document.createTextNode(val.substr(pos, text.length)));
node.parentNode.insertBefore(span, node.parentNode.insertBefore(
document.createTextNode(val.substr(pos + text.length)),
node.nextSibling));
node.nodeValue = val.substr(0, pos);
if (isInSVG) {
var rect = document.createElementNS("http://www.w3.org/2000/svg", "rect");
var bbox = node.parentElement.getBBox();
rect.x.baseVal.value = bbox.x;
rect.y.baseVal.value = bbox.y;
rect.width.baseVal.value = bbox.width;
rect.height.baseVal.value = bbox.height;
rect.setAttribute('class', className);
addItems.push({
"parent": node.parentNode,
"target": rect});
}
}
}
else if (!jQuery(node).is("button, select, textarea")) {
jQuery.each(node.childNodes, function() {
highlight(this, addItems);
});
}
}
var addItems = [];
var result = this.each(function() {
highlight(this, addItems);
});
for (var i = 0; i < addItems.length; ++i) {
jQuery(addItems[i].parent).before(addItems[i].target);
}
return result;
};
/*
* backward compatibility for jQuery.browser
* This will be supported until firefox bug is fixed.
*/
if (!jQuery.browser) {
jQuery.uaMatch = function(ua) {
ua = ua.toLowerCase();
var match = /(chrome)[ \/]([\w.]+)/.exec(ua) ||
/(webkit)[ \/]([\w.]+)/.exec(ua) ||
/(opera)(?:.*version|)[ \/]([\w.]+)/.exec(ua) ||
/(msie) ([\w.]+)/.exec(ua) ||
ua.indexOf("compatible") < 0 && /(mozilla)(?:.*? rv:([\w.]+)|)/.exec(ua) ||
[];
return {
browser: match[ 1 ] || "",
version: match[ 2 ] || "0"
};
};
jQuery.browser = {};
jQuery.browser[jQuery.uaMatch(navigator.userAgent).browser] = true;
}

View File

@@ -419,9 +419,7 @@ table.footnote td {
}
dl {
margin-left: 0;
margin-right: 0;
margin-top: 0;
margin: 0;
padding: 0;
}

View File

@@ -4,7 +4,7 @@
*
* Sphinx stylesheet -- basic theme.
*
* :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS.
* :copyright: Copyright 2007-2022 by the Sphinx team, see AUTHORS.
* :license: BSD, see LICENSE for details.
*
*/
@@ -324,7 +324,6 @@ aside.sidebar {
p.sidebar-title {
font-weight: bold;
}
nav.contents,
aside.topic,
div.admonition, div.topic, blockquote {
@@ -332,7 +331,6 @@ div.admonition, div.topic, blockquote {
}
/* -- topics ---------------------------------------------------------------- */
nav.contents,
aside.topic,
div.topic {
@@ -608,7 +606,6 @@ ol.simple p,
ul.simple p {
margin-bottom: 0;
}
aside.footnote > span,
div.citation > span {
float: left;

View File

@@ -4,7 +4,7 @@
*
* Base JavaScript utilities for all Sphinx HTML documentation.
*
* :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS.
* :copyright: Copyright 2007-2022 by the Sphinx team, see AUTHORS.
* :license: BSD, see LICENSE for details.
*
*/

File diff suppressed because it is too large

File diff suppressed because one or more lines are too long

View File

@@ -5,7 +5,7 @@
* This script contains the language-specific data used by searchtools.js,
* namely the list of stopwords, stemmer, scorer and splitter.
*
* :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS.
* :copyright: Copyright 2007-2022 by the Sphinx team, see AUTHORS.
* :license: BSD, see LICENSE for details.
*
*/

View File

@@ -54,7 +54,6 @@ span.linenos.special { color: #000000; background-color: #ffffc0; padding-left:
.highlight .nt { color: #004461; font-weight: bold } /* Name.Tag */
.highlight .nv { color: #000000 } /* Name.Variable */
.highlight .ow { color: #004461; font-weight: bold } /* Operator.Word */
.highlight .pm { color: #000000; font-weight: bold } /* Punctuation.Marker */
.highlight .w { color: #f8f8f8; text-decoration: underline } /* Text.Whitespace */
.highlight .mb { color: #990000 } /* Literal.Number.Bin */
.highlight .mf { color: #990000 } /* Literal.Number.Float */

View File

@@ -4,7 +4,7 @@
*
* Sphinx JavaScript utilities for the full-text search.
*
* :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS.
* :copyright: Copyright 2007-2022 by the Sphinx team, see AUTHORS.
* :license: BSD, see LICENSE for details.
*
*/

File diff suppressed because it is too large

File diff suppressed because one or more lines are too long

View File

@@ -9,6 +9,9 @@
<link rel="stylesheet" type="text/css" href="_static/pygments.css" />
<link rel="stylesheet" type="text/css" href="_static/alabaster.css" />
<script data-url_root="./" id="documentation_options" src="_static/documentation_options.js"></script>
<script src="_static/jquery.js"></script>
<script src="_static/underscore.js"></script>
<script src="_static/_sphinx_javascript_frameworks_compat.js"></script>
<script src="_static/doctools.js"></script>
<script src="_static/sphinx_highlight.js"></script>
<link rel="index" title="Index" href="#" />
@@ -87,8 +90,8 @@
&copy;2022, dbt Labs.
|
Powered by <a href="http://sphinx-doc.org/">Sphinx 6.1.3</a>
&amp; <a href="https://github.com/bitprophet/alabaster">Alabaster 0.7.13</a>
Powered by <a href="http://sphinx-doc.org/">Sphinx 5.2.3</a>
&amp; <a href="https://github.com/bitprophet/alabaster">Alabaster 0.7.12</a>
</div>

View File

@@ -10,6 +10,9 @@
<link rel="stylesheet" type="text/css" href="_static/pygments.css" />
<link rel="stylesheet" type="text/css" href="_static/alabaster.css" />
<script data-url_root="./" id="documentation_options" src="_static/documentation_options.js"></script>
<script src="_static/jquery.js"></script>
<script src="_static/underscore.js"></script>
<script src="_static/_sphinx_javascript_frameworks_compat.js"></script>
<script src="_static/doctools.js"></script>
<script src="_static/sphinx_highlight.js"></script>
<link rel="index" title="Index" href="genindex.html" />
@@ -208,6 +211,11 @@
<p>Type: boolean</p>
<p>If specified, dbt will drop incremental models and fully-recalculate the incremental table from the model definition.</p>
</section>
<section id="compile|introspect">
<h4>introspect<a class="headerlink" href="#compile|introspect" title="Permalink to this heading"></a></h4>
<p>Type: boolean</p>
<p>Whether to scaffold introspective queries as part of compilation</p>
</section>
<section id="compile|parse_only">
<h4>parse_only<a class="headerlink" href="#compile|parse_only" title="Permalink to this heading"></a></h4>
<p>Type: boolean</p>
@@ -389,8 +397,8 @@
</section>
<section id="list|output_keys">
<h4>output_keys<a class="headerlink" href="#list|output_keys" title="Permalink to this heading"></a></h4>
<p>Type: string</p>
<p>TODO: No current help text</p>
<p>Type: unknown</p>
<p>Space-delimited listing of node properties to include as custom keys for JSON output (e.g. `output json output-keys name resource_type description`)</p>
</section>
<section id="list|profile">
<h4>profile<a class="headerlink" href="#list|profile" title="Permalink to this heading"></a></h4>
@@ -460,8 +468,8 @@
</section>
<section id="list|output_keys">
<h4>output_keys<a class="headerlink" href="#list|output_keys" title="Permalink to this heading"></a></h4>
<p>Type: string</p>
<p>TODO: No current help text</p>
<p>Type: unknown</p>
<p>Space-delimited listing of node properties to include as custom keys for JSON output (e.g. `output json output-keys name resource_type description`)</p>
</section>
<section id="list|profile">
<h4>profile<a class="headerlink" href="#list|profile" title="Permalink to this heading"></a></h4>
@@ -949,8 +957,8 @@
&copy;2022, dbt Labs.
|
Powered by <a href="http://sphinx-doc.org/">Sphinx 6.1.3</a>
&amp; <a href="https://github.com/bitprophet/alabaster">Alabaster 0.7.13</a>
Powered by <a href="http://sphinx-doc.org/">Sphinx 5.2.3</a>
&amp; <a href="https://github.com/bitprophet/alabaster">Alabaster 0.7.12</a>
|
<a href="_sources/index.rst.txt"

View File

@@ -10,6 +10,9 @@
<link rel="stylesheet" type="text/css" href="_static/alabaster.css" />
<script data-url_root="./" id="documentation_options" src="_static/documentation_options.js"></script>
<script src="_static/jquery.js"></script>
<script src="_static/underscore.js"></script>
<script src="_static/_sphinx_javascript_frameworks_compat.js"></script>
<script src="_static/doctools.js"></script>
<script src="_static/sphinx_highlight.js"></script>
<script src="_static/searchtools.js"></script>
@@ -106,8 +109,8 @@
&copy;2022, dbt Labs.
|
Powered by <a href="http://sphinx-doc.org/">Sphinx 6.1.3</a>
&amp; <a href="https://github.com/bitprophet/alabaster">Alabaster 0.7.13</a>
Powered by <a href="http://sphinx-doc.org/">Sphinx 5.2.3</a>
&amp; <a href="https://github.com/bitprophet/alabaster">Alabaster 0.7.12</a>
</div>

File diff suppressed because one or more lines are too long

View File

@@ -8,13 +8,14 @@ The event module provides types that represent what is happening in dbt in `even
When events are processed via `fire_event`, nearly everything is logged. Whether or not the user has enabled the debug flag, all debug messages are still logged to the file. However, some events are particularly time-consuming to construct because they return a huge amount of data. Today, the only messages in this category are cache events, and they are only logged if the `--log-cache-events` flag is on. This is important because these messages should not be created unless they are going to be logged, since they cause a noticeable performance degradation. These events use the `fire_event_if` functions.
# Adding a New Event
* Add a new message in types.proto with an EventInfo field first
* run the protoc compiler to update proto_types.py: ```protoc --python_betterproto_out . types.proto```
* Add a wrapping class in core/dbt/event/types.py with a Level superclass and the superclass from proto_types.py, plus code and message methods
* Add a new message in types.proto, and a second message with the same name + "Msg". The "Msg" message should have two fields, an "info" field of EventInfo, and a "data" field referring to the message name without "Msg"
* run the protoc compiler to update types_pb2.py: make proto_types
* Add a wrapping class in core/dbt/events/types.py with a Level superclass plus code and message methods
* Add the class to tests/unit/test_events.py
Note that no attributes can exist in these event classes except for fields defined in the protobuf definitions, because the betterproto metaclass will throw an error. Betterproto provides a to_dict() method to convert the generated classes to a dictionary and from that to json. However some attributes will successfully convert to dictionaries but not to serialized protobufs, so we need to test both output formats.
We have switched from using betterproto to using google protobuf, because of a lack of support for Struct fields in betterproto.
The google protobuf interface is janky and very much non-Pythonic. The "generated" classes in types_pb2.py do not resemble regular Python classes. They do not have normal constructors; they can only be constructed empty. They can be "filled" by setting fields individually or using a json_format method like ParseDict. We have wrapped the logging events with a class (in types.py) which allows using a constructor -- keywords only, no positional parameters.
## Required for Every Event
@@ -24,8 +25,7 @@ Note that no attributes can exist in these event classes except for fields defin
Example
```
@dataclass
class PartialParsingDeletedExposure(DebugLevel, pt.PartialParsingDeletedExposure):
class PartialParsingDeletedExposure(DebugLevel):
def code(self):
return "I049"
@@ -50,4 +50,8 @@ logger = AdapterLogger("<database name>")
## Compiling types.proto
After adding a new message in types.proto, in the core/dbt/events directory: ```protoc --python_betterproto_out . types.proto```
After adding a new message in types.proto, run the Makefile target `make proto_types` from the repository root, or run `protoc -I=. --python_out=. types.proto` from the core/dbt/events directory.
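
A hedged sketch of the constructor-style wrappers described above, using the example event from this README (the `unique_id` field name is assumed from the partial-parsing events):

```python
from dbt.events.types import PartialParsingDeletedExposure

event = PartialParsingDeletedExposure(unique_id="exposure.my_project.weekly_report")
print(event.code())     # "I049", per the example above
print(event.to_dict())  # the underlying protobuf message rendered as a plain dict
```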

View File

@@ -17,38 +17,42 @@ class AdapterLogger:
def debug(self, msg, *args):
event = AdapterEventDebug(
name=self.name, base_msg=msg, args=list(args), node_info=get_node_info()
name=self.name, base_msg=str(msg), args=list(args), node_info=get_node_info()
)
fire_event(event)
def info(self, msg, *args):
event = AdapterEventInfo(
name=self.name, base_msg=msg, args=list(args), node_info=get_node_info()
name=self.name, base_msg=str(msg), args=list(args), node_info=get_node_info()
)
fire_event(event)
def warning(self, msg, *args):
event = AdapterEventWarning(
name=self.name, base_msg=msg, args=list(args), node_info=get_node_info()
name=self.name, base_msg=str(msg), args=list(args), node_info=get_node_info()
)
fire_event(event)
def error(self, msg, *args):
event = AdapterEventError(
name=self.name, base_msg=msg, args=list(args), node_info=get_node_info()
name=self.name, base_msg=str(msg), args=list(args), node_info=get_node_info()
)
fire_event(event)
# The default exc_info=True is what makes this method different
def exception(self, msg, *args):
exc_info = str(traceback.format_exc())
event = AdapterEventError(
name=self.name, base_msg=msg, args=list(args), node_info=get_node_info()
name=self.name,
base_msg=str(msg),
args=list(args),
node_info=get_node_info(),
exc_info=exc_info,
)
event.exc_info = traceback.format_exc()
fire_event(event)
def critical(self, msg, *args):
event = AdapterEventError(
name=self.name, base_msg=msg, args=list(args), node_info=get_node_info()
name=self.name, base_msg=str(msg), args=list(args), node_info=get_node_info()
)
fire_event(event)
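
A minimal usage sketch of `AdapterLogger` after the `str()` coercion above; the import path is the one adapter plugins conventionally use and is an assumption here:

```python
from dbt.events import AdapterLogger

logger = AdapterLogger("postgres")
logger.debug("opened connection %s", "conn-1")
logger.info({"rows_affected": 42})  # non-string messages are coerced with str()
```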

View File

@@ -1,27 +1,24 @@
from dataclasses import dataclass
from enum import Enum
import os
import threading
from datetime import datetime
import dbt.events.proto_types as pt
from dbt.events import types_pb2
import sys
from google.protobuf.json_format import ParseDict, MessageToDict, MessageToJson
from google.protobuf.message import Message
from dbt.events.helpers import get_json_string_utcnow
if sys.version_info >= (3, 8):
from typing import Protocol
else:
from typing_extensions import Protocol
# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
# These base types define the _required structure_ for the concrete event #
# types defined in types.py #
# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
class Cache:
# Events with this class will only be logged when the `--log-cache-events` flag is passed
pass
def get_global_metadata_vars() -> dict:
from dbt.events.functions import get_metadata_vars
@@ -39,13 +36,6 @@ def get_pid() -> int:
return os.getpid()
# preformatted time stamp
def get_ts_rfc3339() -> str:
ts = datetime.utcnow()
ts_rfc3339 = ts.strftime("%Y-%m-%dT%H:%M:%S.%fZ")
return ts_rfc3339
# in theory threads can change so we don't cache them.
def get_thread_name() -> str:
return threading.current_thread().name
@@ -61,24 +51,53 @@ class EventLevel(str, Enum):
ERROR = "error"
@dataclass
class BaseEvent:
"""BaseEvent for proto message generated python events"""
# def __post_init__(self):
# super().__post_init__()
# if not self.info.level:
# self.info.level = self.level_tag()
# assert self.info.level in ["info", "warn", "error", "debug", "test"]
# if not hasattr(self.info, "msg") or not self.info.msg:
# self.info.msg = self.message()
# self.info.invocation_id = get_invocation_id()
# self.info.extra = get_global_metadata_vars()
# self.info.ts = datetime.utcnow()
# self.info.pid = get_pid()
# self.info.thread = get_thread_name()
# self.info.code = self.code()
# self.info.name = type(self).__name__
def __init__(self, *args, **kwargs):
class_name = type(self).__name__
msg_cls = getattr(types_pb2, class_name)
if class_name == "Formatting" and len(args) > 0:
kwargs["msg"] = args[0]
args = ()
assert (
len(args) == 0
), f"[{class_name}] Don't use positional arguments when constructing logging events"
if "base_msg" in kwargs:
kwargs["base_msg"] = str(kwargs["base_msg"])
if "msg" in kwargs:
kwargs["msg"] = str(kwargs["msg"])
try:
self.pb_msg = ParseDict(kwargs, msg_cls())
except Exception:
# Imports need to be here to avoid circular imports
from dbt.events.types import Note
from dbt.events.functions import fire_event
fire_event(Note(msg=f"[{class_name}]: Unable to parse dict {kwargs}"))
self.pb_msg = msg_cls()
def __setattr__(self, key, value):
if key == "pb_msg":
super().__setattr__(key, value)
else:
super().__getattribute__("pb_msg").__setattr__(key, value)
def __getattr__(self, key):
if key == "pb_msg":
return super().__getattribute__(key)
else:
return super().__getattribute__("pb_msg").__getattribute__(key)
def to_dict(self):
return MessageToDict(
self.pb_msg, preserving_proto_field_name=True, including_default_value_fields=True
)
def to_json(self):
return MessageToJson(
self.pb_msg, preserving_proto_field_name=True, including_default_value_fields=True
)
def level_tag(self) -> EventLevel:
return EventLevel.DEBUG
@@ -90,42 +109,48 @@ class BaseEvent:
raise Exception("code() not implemented for event")
class EventInfo(Protocol):
level: str
name: str
ts: str
code: str
class EventMsg(Protocol):
info: pt.EventInfo
data: BaseEvent
info: EventInfo
data: Message
def msg_from_base_event(event: BaseEvent, level: EventLevel = None):
msg_class_name = f"{type(event).__name__}Msg"
msg_cls = getattr(pt, msg_class_name)
msg_cls = getattr(types_pb2, msg_class_name)
# level in EventInfo must be a string, not an EventLevel
msg_level: str = level.value if level else event.level_tag().value
assert msg_level is not None
event_info = pt.EventInfo(
level=msg_level,
msg=event.message(),
invocation_id=get_invocation_id(),
extra=get_global_metadata_vars(),
ts=datetime.utcnow(),
pid=get_pid(),
thread=get_thread_name(),
code=event.code(),
name=type(event).__name__,
)
new_event = msg_cls(data=event, info=event_info)
event_info = {
"level": msg_level,
"msg": event.message(),
"invocation_id": get_invocation_id(),
"extra": get_global_metadata_vars(),
"ts": get_json_string_utcnow(),
"pid": get_pid(),
"thread": get_thread_name(),
"code": event.code(),
"name": type(event).__name__,
}
new_event = ParseDict({"info": event_info}, msg_cls())
new_event.data.CopyFrom(event.pb_msg)
return new_event
# DynamicLevel requires that the level be supplied on the
# event construction call using the "info" function from functions.py
@dataclass # type: ignore[misc]
class DynamicLevel(BaseEvent):
pass
@dataclass
class TestLevel(BaseEvent):
__test__ = False
@@ -133,54 +158,21 @@ class TestLevel(BaseEvent):
return EventLevel.TEST
@dataclass # type: ignore[misc]
class DebugLevel(BaseEvent):
def level_tag(self) -> EventLevel:
return EventLevel.DEBUG
@dataclass # type: ignore[misc]
class InfoLevel(BaseEvent):
def level_tag(self) -> EventLevel:
return EventLevel.INFO
@dataclass # type: ignore[misc]
class WarnLevel(BaseEvent):
def level_tag(self) -> EventLevel:
return EventLevel.WARN
@dataclass # type: ignore[misc]
class ErrorLevel(BaseEvent):
def level_tag(self) -> EventLevel:
return EventLevel.ERROR
# Included to ensure classes with str-type message members are initialized correctly.
@dataclass # type: ignore[misc]
class AdapterEventStringFunctor:
def __post_init__(self):
super().__post_init__()
if not isinstance(self.base_msg, str):
self.base_msg = str(self.base_msg)
@dataclass # type: ignore[misc]
class EventStringFunctor:
def __post_init__(self):
super().__post_init__()
if not isinstance(self.msg, str):
self.msg = str(self.msg)
# prevents an event from going to the file
# This should rarely be used in core code. It is currently
# only used in integration tests and for the 'clean' command.
class NoFile:
pass
# prevents an event from going to stdout
class NoStdOut:
pass

View File

@@ -2,7 +2,6 @@ import contextlib
import contextvars
from typing import Any, Generator, Mapping, Dict
from dbt.events.proto_types import NodeInfo
LOG_PREFIX = "log_"
@@ -27,7 +26,7 @@ def get_node_info():
if "node_info" in cvars:
return cvars["node_info"]
else:
return NodeInfo()
return {}
def clear_contextvars() -> None:

View File

@@ -144,12 +144,10 @@ class _TextLogger(_Logger):
log_line: str = ""
# Create a separator if this is the beginning of an invocation
# TODO: This is an ugly hack, get rid of it if we can
ts: str = timestamp_to_datetime_string(msg.info.ts)
if msg.info.name == "MainReportVersion":
separator = 30 * "="
log_line = (
f"\n\n{separator} {msg.info.ts} | {self.event_manager.invocation_id} {separator}\n"
)
ts: str = msg.info.ts.strftime("%H:%M:%S.%f")
log_line = f"\n\n{separator} {ts} | {self.event_manager.invocation_id} {separator}\n"
scrubbed_msg: str = self.scrubber(msg.info.msg) # type: ignore
level = msg.info.level
log_line += (
@@ -192,7 +190,7 @@ class EventManager:
if os.environ.get("DBT_TEST_BINARY_SERIALIZATION"):
print(f"--- {msg.info.name}")
try:
bytes(msg)
msg.SerializeToString()
except Exception as exc:
raise Exception(
f"{msg.info.name} is not serializable to binary. Originating exception: {exc}, {traceback.format_exc()}"
@@ -214,6 +212,15 @@ class EventManager:
logger.event_manager = self
self.loggers.append(logger)
def add_callbacks(self, callbacks: List[Callable[[EventMsg], None]]):
"""Helper for adding a sequence of Callbacks to the EventManager"""
self.callbacks.extend(callbacks)
def flush(self):
for logger in self.loggers:
logger.flush()
def timestamp_to_datetime_string(ts):
timestamp_dt = datetime.fromtimestamp(ts.seconds + ts.nanos / 1e9)
return timestamp_dt.strftime("%H:%M:%S.%f")

Some files were not shown because too many files have changed in this diff.