Compare commits

86 Commits

Author SHA1 Message Date
Github Build Bot
49309e7c9e Add generated CLI API docs 2022-12-14 19:13:24 +00:00
Colin
e4843e8da3 Merge branch 'main' into feature/support-incremental-predicates 2022-12-14 11:11:54 -08:00
Peter Webb
5e4e917de5 CT-1685: Restore certain aspects of legacy logging behavior important… (#6443)
* CT-1685: Restore certain aspects of legacy logging behavior important to dbt-rpc

* CT-1658: Add changelog entry
2022-12-14 11:13:34 -05:00
github-actions[bot]
05dc0212e7 Bumping version to 1.4.0b1 and generate changelog (#6440)
* Bumping version to 1.4.0b1 and generate CHANGELOG

* Updating date

* Updating date

Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>
Co-authored-by: leahwicz <60146280+leahwicz@users.noreply.github.com>
2022-12-13 20:18:11 -05:00
Gerda Shank
c00052cbfb Add Optional back on "database" field of HasRelationMetadata (#6439) 2022-12-13 18:15:25 -05:00
Kshitij Aranke
3d54a83822 [CT-1284] Change Python model default materialization to table (#6432) 2022-12-13 15:07:56 -08:00
Gerda Shank
fafd5edbda CT 1644 node cleanup (#6427)
* Remove unneeded SQL compilation attributes from SeedNode

* Fix various places that referenced removed attributes

* Cleanup a few Unions

* More formatting in nodes.py

* Mypy passing. Untested.

* Unit tests working

* use "doc" in documentation unique_ids

* update some doc_ids

* Fix some artifact tests. Still need previous version.

* Update manifest/v8.json

* Move relation_names to parsing

* Fix a couple of tests

* Update some artifacts. snapshot_seed has wrong schema.

* Changie

* Tweak NodeType.Documentation

* Put store_failures property in the right place

* Fix setting relation_name
2022-12-13 12:39:35 -05:00
Josh Devlin
8478262580 Update docker README (#6423) 2022-12-13 11:12:34 -05:00
Kshitij Aranke
83b1fee062 Add aranke to core committers (#6431) 2022-12-12 15:13:03 -08:00
Emily Rockman
0fbbc896b2 Remove PR from most changelog kinds (#6374)
* update changie to require issue or pr, and allow multiple

* remove extraneous data from changelog files.

* allow for multiple PR/issues to be entered

* update contributing guide

* remove issue number from bot changelogs

* update format of PR

* fix dependency changelogs

* remove extra line

* remove extra lines, tweak contributor wording

* Update CONTRIBUTING.md

Co-authored-by: Doug Beatty <44704949+dbeatty10@users.noreply.github.com>

Co-authored-by: Doug Beatty <44704949+dbeatty10@users.noreply.github.com>
2022-12-12 13:18:15 -06:00
Ian Knox
0544b08543 Add support for Python 3.11 (#6326)
* Get running with Python 3.11

* More tests passing, mypy still unhappy

* Upgrade to 3.11, and bump mashumaro

* patch importlib.import_module last

* lambda: Policy() default_factory on include and quote policy

* Add changelog entry

* Put a lambda on it

* Fix text formatting for log file

* Handle variant type return from e.log_level()

Co-authored-by: Jeremy Cohen <jeremy@dbtlabs.com>
Co-authored-by: Josh Taylor <joshuataylorx@gmail.com>
Co-authored-by: Michelle Ark <michelle.ark@dbtlabs.com>
2022-12-08 18:34:03 +01:00
bruno messias
bef6edb942 Fix dbt.config.get default values (python-model) (#6317)
* feat: add a list of default values to the ctx manager

* tests: dbt.get.config default values

* feat: validate the num of args in config.get

* feat: jinja template for dbt.config.get default values

* docs: changie yaml

* fix: typo in error message

Co-authored-by: Chenyu Li <chenyulee777@gmail.com>

Co-authored-by: Chenyu Li <chenyulee777@gmail.com>
2022-12-07 15:52:38 -08:00
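The default-value support added here means dbt.config.get can take a second argument that is returned when the key was never configured. A minimal sketch of a Python model using it (model and key names are hypothetical, not from this PR):

def model(dbt, session):
    # falls back to "id" when unique_key was never set in the model's config
    unique_key = dbt.config.get("unique_key", "id")
    df = dbt.ref("upstream_model")
    return df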
timle
99f27de934 Feature/dbt deps tarball (#4689)
* v0 - new dbt deps type: tarball url

in support of
https://github.com/dbt-labs/dbt-core/issues/4205

* flake8 fixes

* adding max size tarball condition

* clean up imports

* typing

* adding sha1 and subdirectory options; improve logging feedback

sha1: allow the user to specify a sha1 in packages.yaml; the package is installed only if it matches
subdirectory: allow the user to specify a subdirectory of the package within the tarfile, for packages with a non-standard structure (like the git subdirectory option)

* simple tests added

* flake fixes

* changes to support tests; adding exceptions; fire_event logging

* new logging events

* tarball exceptions added

* build out tests

* removing in memory tarball test

* update type codes to M - Misc

* adding new events to test_events

* fix spacing for flake

* add retry download code - as used in registry calls

* clean

* remove saving tar in memory inside tarfile object

will hit url multiple times instead

* remove duplicative code after refactor

* black updates

* black formatting

* black formatting

* refactor - no more in-memory tarfile - all as file operations now

- remove tarfile passing, always use tempfile instead
- reorganize system.* functions, removing duplicative code
- more notes on current flow and structure, especially the need for the pattern of 1) unpack, 2) scan for the package dir, 3) copy to destination.
- cleaning

* cleaning and sync to new tarball code

* cleaning and sync to new tarball code

* requested changes from PR

https://github.com/dbt-labs/dbt-core/pull/4689#discussion_r812970847

* reversions from revision 2

removing sha1 check to simplify/mirror hub install pattern

* simplify/mirror hub install pattern

to simplify/mirror hub install pattern
- removing sha1 check
- supply name/version to act as our 'metadata' source

* simplify/mirror hub install pattern

simplify with goal of mirroring hub install pattern
- supporting subfolders like git packages, and sha1 checks are removed
- existing code from RegistryPinnedPackage (install() and download_and_untar()) performs the operations
- RegistryPinnedPackage's install() and download_and_untar() are not currently set up as functions that can be used across classes; this should be moved to dbt.deps.base or a dbt.deps.common file. Need dbt Labs feedback on how to proceed (or leave as is)

* remove revisions, no longer doing package check

* slim down to basic tests

more complex features have been removed (sha1, subfolder) so testing is much simpler!

* fix naming to match hubs behavior

remove version from package folder name

* refactor install and download to upstream PinnedPackage class

I'm on the fence about whether this is the right approach, but it seems the most sensible after some thought

* Create Features-20221107-105018.yaml

* fix flake, black, mypy errors

* additional flake/black fixes

* Update .changes/unreleased/Features-20221107-105018.yaml

fix username on changelog

Co-authored-by: Emily Rockman <ebuschang@gmail.com>

* change to fstring

Co-authored-by: Emily Rockman <ebuschang@gmail.com>

* cleaning - remove comment

* remove comment/question for dbt team

* in support of issuecomment 1334055944

https://github.com/dbt-labs/dbt-core/pull/4689#issuecomment-1334055944

* in support of issuecomment 1334118433

https://github.com/dbt-labs/dbt-core/pull/4689#issuecomment-1334118433

* black fixes; remove debug bits

* remove `.format` & add 'tarball' as version

'tarball' as version so that the temp files format nicely:
[tempfile_location]/dbt_utils_2..tar.gz # old
vs
[tempfile_location]/dbt_utils_1.tarball.tar.gz # current

* port os.path refs in `PinnedPackage._install` to pathlib

* lowercase as per PR feedback

* update tests after removing version arg

goes along with 8787ba41af

Co-authored-by: Emily Rockman <ebuschang@gmail.com>
2022-12-07 15:48:17 -06:00
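The end state of this PR mirrors the hub install pattern: a tarball entry in packages.yml with a url plus a user-supplied name acting as the metadata source. A hedged sketch of the resulting syntax (the URL is illustrative):

packages:
  - tarball: https://github.com/dbt-labs/dbt-utils/archive/refs/tags/0.9.2.tar.gz
    name: dbt_utils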
Jeremy Cohen
9c91f3a7bd Adjust tox passenv to be multiline (#6405) 2022-12-07 22:47:51 +01:00
Gerda Shank
1b6fed2ffd CT 1604 remove compiled classes (#6384)
* removed Compiled versions of nodes

* Remove compiled fields from dictionary if not compiled

* check compiled is False instead of attribute existence in env_var
processing

* Update artifacts test (CompiledSnapshotNode did not have SnapshotConfig)

* Changie

* more complicated 'compiling' check in env_var

* Update test_exit_codes.py
2022-12-07 15:21:05 -05:00
dependabot[bot]
0721f2c1b7 Bump mashumaro[msgpack] from 3.1.1 to 3.2 in /core (#6375)
* Bump mashumaro[msgpack] from 3.1.1 to 3.2 in /core

Bumps [mashumaro[msgpack]](https://github.com/Fatal1ty/mashumaro) from 3.1.1 to 3.2.
- [Release notes](https://github.com/Fatal1ty/mashumaro/releases)
- [Commits](https://github.com/Fatal1ty/mashumaro/compare/v3.1.1...v3.2)

---
updated-dependencies:
- dependency-name: mashumaro[msgpack]
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

* Add automated changelog yaml from template for bot PR

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>
2022-12-07 10:16:31 -05:00
Doug Beatty
b9a35da118 Fix intermittent database connection failure in Windows CI test (#6395)
* Fix intermittent database connection failure in Windows CI test

* Changelog entry
2022-12-06 18:34:39 -07:00
Peter Webb
60f80056b1 CT-1405: Refactor event logging code (#6291)
* CT-1405: Refactor event logging code

* CT-1405: Add changelog entry

* CT-1405: Add code to protect against using closed streams from past tests.

* CT-1405: Restore unit test which was only failing locally

* CT-1405: Document a hack with issue # to resolve it in the future

* CT-1405: Make black happy

* CT-1405: Get rid of confusing factory function and duplicated function

* CT-1405: Remove unused event from types.proto and auto-gen'd file
2022-12-06 15:51:52 -05:00
Stu Kilgore
540c3b79aa Prevent docs gen workflow on forks (#6390) 2022-12-06 11:14:20 -06:00
Gerda Shank
16f529e1d4 CT 1477 enrich logging events with data similar to legacy logger (#6325) 2022-12-02 19:29:25 -05:00
Doug Beatty
ebfcf2a9ef Update core/dbt/README.md to match current (#6371)
* Update core/dbt/README.md to match current

Add missing files/folders and alphabetize

* Changelog entry
2022-12-02 15:45:53 -07:00
Alexander Butler
67a8138b65 [fix] Fix the partial parse write path (#6081)
* Fix the partial parse path

Partial parse should use the project root or it does not resolve to the correct path.
E.g. `target-path: ../some/dir/target`, if not run from the root, creates an erroneous folder.

* Run pre-commit

* Changie

Co-authored-by: Gerda Shank <gerda@dbtlabs.com>
2022-12-02 17:23:37 -05:00
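The fix boils down to anchoring a relative target-path at the project root instead of the current working directory. A minimal standalone sketch of the idea (paths are illustrative, not dbt's own code):

import os

project_root = "/home/user/my_project"
target_path = "../some/dir/target"
# resolving against the project root yields the same folder no matter where dbt is invoked from
print(os.path.abspath(os.path.join(project_root, target_path)))
# /home/user/some/dir/target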
leahwicz
85d0b5afc7 Reverting back to older ubuntu image (#6363)
* Reverting back to older ubuntu image

* Updating the structured logging workflow as well
2022-12-02 12:09:46 -05:00
Matthew McKnight
1fbcaa4484 reformatting of test after some spike investigation (#6314)
* reformatting of test after some spike investigation

* reformat code to pull tests back into base class definition, move a test to more appropriate spot
2022-12-01 16:54:58 -06:00
justbldwn
481235a943 clarify error log for number of allowed models in a Python file (#6251) 2022-12-01 14:43:36 -05:00
Michelle Ark
2289e45571 Exposures support metrics (#6342)
* exposures support metrics
2022-12-01 11:01:16 -05:00
Dave Connors
3a173874d9 test for predicates keyword 2022-12-01 09:49:53 -06:00
Dave Connors
bfe46b1deb handle predicates config for backwards compatibility 2022-12-01 09:45:15 -06:00
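Taken together, the predicates commits on this branch let an incremental model append arbitrary filters to the generated merge/delete+insert statements. A hedged sketch of the kind of model config under test (the key is still named predicates at this point in the branch, and the filter is illustrative):

{{ config(
    materialized = 'incremental',
    incremental_strategy = 'delete+insert',
    predicates = ["session_start > dateadd(day, -7, current_date)"]
) }}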
dependabot[bot]
b5d303f12a Bump mashumaro[msgpack] from 3.0.4 to 3.1 in /core (#6108)
* Bump mashumaro[msgpack] from 3.0.4 to 3.1.1 in /core

Bumps [mashumaro[msgpack]](https://github.com/Fatal1ty/mashumaro) from 3.0.4 to 3.1.
- [Release notes](https://github.com/Fatal1ty/mashumaro/releases)
- [Commits](https://github.com/Fatal1ty/mashumaro/compare/v3.0.4...v3.1)

---
updated-dependencies:
- dependency-name: mashumaro[msgpack]
  dependency-type: direct:production
  update-type: version-update:semver-minor
...
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>
Co-authored-by: Michelle Ark <michelle.ark@dbtlabs.com>
2022-11-30 17:32:47 -05:00
Mila Page
c3be975783 Ct 288/convert 070 incremental test (#6330)
* Convert incremental schema tests.

* Drop the old test.

* Bad git add. My disappointment is immeasurable and my day has been ruined.

* Adjustments for flake8.

Co-authored-by: Mila Page <versusfacit@users.noreply.github.com>
2022-11-29 12:47:20 -08:00
Mila Page
47c2edb42a Ct 1518/convert 063 relation names tests (#6304)
* Convert old test.

Add documentation. Adapt and reenable previously skipped test.

* Convert test and adapt and comment for current standards.

* Remove old versions of tests.

Co-authored-by: Mila Page <versusfacit@users.noreply.github.com>
2022-11-29 12:25:36 -08:00
Stu Kilgore
b3440417ad Add GHA workflow to build CLI API docs (#6187) 2022-11-29 13:30:47 -06:00
leahwicz
020f639c7a Update stale.yml (#6258) 2022-11-29 09:40:59 -05:00
Mila Page
55db15aba8 Convert test 067. (#6305)
* Convert test 067. One bug outstanding.

* Test now working! Schema needed renaming to avoid 63 char max problems

* Remove old test.

* Add some docs and rewrite.

* Add exception for when audit tables' schema runs over the db limit.

* Code cleanup.

* Revert exception.

* Round out comments.

* Rename what shouldn't be a base class.

Co-authored-by: Mila Page <versusfacit@users.noreply.github.com>
2022-11-29 00:06:07 -08:00
Itamar Hartstein
bce0e7c096 BaseContext: expose md5 function in context (#6247)
* BaseContext: expose md5 function in context

* BaseContext: add return value type

* Add changie entry

* rename "md5" to "local_md5"

* fix test_context.py
2022-11-28 10:23:40 -05:00
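With this change a model or macro can call the new context member directly from Jinja. A minimal hedged example (column alias and input string are illustrative):

select '{{ local_md5("example_value") }}' as example_hash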
Gerda Shank
7d7066466d CT 1537 fix event test and rename a couple of fields (#6293)
* Rename MacroEvent to JinjaLog

* Rename ConnectionClosed/2

* Fix LogSeedResult

* Rename ConnectionLeftOpen events, fix test_events.py

* Update events README.md, add "category" to EventInfo

* Rename GeneralMacroWarning to JinjaLogWarning
2022-11-22 14:54:20 -05:00
Emily Rockman
517576c088 add back in conditional node length check (#6298) 2022-11-21 21:20:55 -08:00
leahwicz
987764858b Revert "Bump python from 3.10.7-slim-bullseye to 3.11.0-slim-bullseye in /docker (#6180)" (#6281)
This reverts commit 8e28f5906e.
2022-11-17 09:14:22 -05:00
FishtownBuildBot
a235abd176 Add new index.html and changelog yaml files from dbt-docs (#6265) 2022-11-16 17:00:33 +01:00
dependabot[bot]
9297e4d55c Update pathspec requirement from ~=0.9.0 to >=0.9,<0.11 in /core (#5917)
* Update pathspec requirement from ~=0.9.0 to >=0.9,<0.11 in /core

Updates the requirements on [pathspec](https://github.com/cpburnz/python-pathspec) to permit the latest version.
- [Release notes](https://github.com/cpburnz/python-pathspec/releases)
- [Changelog](https://github.com/cpburnz/python-pathspec/blob/master/CHANGES.rst)
- [Commits](https://github.com/cpburnz/python-pathspec/compare/v0.9.0...v0.10.1)

---
updated-dependencies:
- dependency-name: pathspec
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

* Add automated changelog yaml from template for bot PR

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>
2022-11-15 22:02:37 -05:00
Michelle Ark
eae98677b9 s/gitlab/github for flake8 precommit repo (#6252) 2022-11-15 10:30:00 -05:00
Matthew McKnight
66ac107409 [CT-1262] Convert dbt_debug (#6125)
* init pr for dbt_debug test conversion

* removal of old test

* minor test format change

* add new Base class and Test classes

* reformatting test, new method for capsys and error message to check, TODO: fix badproject

* reformatting tests, ready for review

* checking yaml file, and small reformat

* modifying since update wasn't working in CI/CD
2022-11-14 14:22:48 -06:00
Michelle Ark
39c5c42215 converting 044_test_run_operations (#6122)
* converting 044_test_run_operations
2022-11-14 10:39:57 -05:00
dependabot[bot]
9f280a8469 Update colorama requirement from <0.4.6,>=0.3.9 to >=0.3.9,<0.4.7 in /core (#6144)
* Update colorama requirement in /core

Updates the requirements on [colorama](https://github.com/tartley/colorama) to permit the latest version.
- [Release notes](https://github.com/tartley/colorama/releases)
- [Changelog](https://github.com/tartley/colorama/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/tartley/colorama/compare/0.3.9...0.4.6)

---
updated-dependencies:
- dependency-name: colorama
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

* Add automated changelog yaml from template for bot PR

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>
2022-11-13 09:57:33 -05:00
Joe Berni
73116fb816 feature/favor-state-node (#5859) 2022-11-09 10:58:01 -06:00
Stu Kilgore
f02243506d Convert postgres index tests (#6228) 2022-11-08 15:30:29 -06:00
Stu Kilgore
d5e9ce1797 Convert color tests to pytest (#6230) 2022-11-08 15:25:57 -06:00
Stu Kilgore
4e786184d2 Convert threading tests to pytest (#6226) 2022-11-08 08:56:10 -06:00
Chenyu Li
930bd3541e properly track hook running (#6059) 2022-11-07 10:44:29 -06:00
Gerda Shank
6c76137da4 CT 1443 remove root path (#6172)
* Remove root_path

* Bump manifest schema to 8

* Update tests and compatibility utility for v8, root_path removal
2022-11-04 16:38:26 -04:00
Gerda Shank
68d06d8a9c Combine various print result log events with different levels (#6174)
* Combine various print result log events with different levels

* Changie

* more merge cleanup

* Specify DynamicLevel for event classes that must specify level
2022-11-04 14:26:37 -04:00
Dave Connors
86464070c9 update test structure for inheritance 2022-11-04 13:18:25 -05:00
Rachel
d0543c9242 Updates lib to use new profile name functionality (#6202)
* Updates lib to use new profile name functionality

* Adds changie entry

* Fixes formatting
2022-11-04 10:05:24 -07:00
Dave Connors
c3a49e41e9 Merge branch 'main' into feature/support-incremental-predicates 2022-11-04 10:02:48 -05:00
Dave Connors
54da843e83 add test for incremental predicates delete and insert postgres 2022-11-04 09:58:02 -05:00
Michelle Ark
cfad27f963 add typing to DepsTask.run (#6192) 2022-11-03 17:35:16 -04:00
Dave Connors
ce28ab3111 comma in test config 2022-11-03 09:48:05 -05:00
Emily Rockman
c3ccbe3357 add python version and upgrade action (#6204) 2022-11-03 09:13:00 -05:00
Dave Connors
c79ae61429 add functional test to adapter zone 2022-11-02 15:52:35 -05:00
Dave Connors
560ccbe4be Merge branch 'main' into feature/support-incremental-predicates 2022-11-02 15:06:48 -05:00
dependabot[bot]
8e28f5906e Bump python from 3.10.7-slim-bullseye to 3.11.0-slim-bullseye in /docker (#6180)
* Bump python from 3.10.7-slim-bullseye to 3.11.0-slim-bullseye in /docker

Bumps python from 3.10.7-slim-bullseye to 3.11.0-slim-bullseye.

---
updated-dependencies:
- dependency-name: python
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

* Add automated changelog yaml from template for bot PR

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>
2022-11-02 08:40:51 -07:00
FishtownBuildBot
d23285b4ba Add new index.html and changelog yaml files from dbt-docs (#6112) 2022-11-02 08:36:56 -07:00
Michelle Ark
a42748433d converting 023_exit_codes_tests (#6105)
* converting 023_exit_codes_tests

* use packages fixture, clean up test names
2022-11-01 16:26:12 -04:00
Emily Rockman
be4a91a0fe Convert messages to struct logs (#6064)
* Initial structured logging changes

* remove "this" from core/dbt/events/functions.py

* CT-1047: Fix execution_time definitions to use float

* CT-1047: Revert unintended checking of changes to functions.py

* WIP

* first pass to resolve circular deps

* more circular dep resolution

* remove a bunch of duplication

* move message into log line

* update comments

* fix field that went missing during rebase

* remove double import

* remove some comments and extra code

* fix pre-commit

* rework deprecations

* WIP converting messages

* WIP converting messages

* remove stray comment

* WIP more message conversion

* WIP more message conversion

* tweak the messages

* convert last message

* rename

* remove warn_or_raise as never used

* add fake calls to all new events

* fix some tests

* put back deprecation

* restore deprecation fully

* fix unit test

* fix log levels

* remove some skipped ids

* fix macro log function

* fix how messages are built to match expected outcome

* fix expected test message

* small fixes from reviews

* fix conflict resolution in UI

Co-authored-by: Gerda Shank <gerda@dbtlabs.com>
Co-authored-by: Peter Allen Webb <peter.webb@dbtlabs.com>
2022-10-31 12:04:56 -05:00
Emily Rockman
8145eed603 revert to community action (#6163) 2022-10-27 16:10:58 -05:00
Emily Rockman
fc00239f36 point to correct workflow (#6161)
* point to correct workflow

* add inputs
2022-10-27 14:05:09 -05:00
Ian Knox
77dfec7214 more ergonomic profile name handling (#6157) 2022-10-27 10:49:27 -05:00
Emily Rockman
7b73264ec8 switch out to use internal action for triage labels (#6120)
* switch out to use our action

* point to main
2022-10-27 08:33:15 -05:00
Mila Page
1916784287 Ct 1167/030 statement tests conversion (#6109)
* Convert test to functional set.

* Remove old statement tests from integration test set.

* Nix whitespace

Co-authored-by: Mila Page <versusfacit@users.noreply.github.com>
2022-10-26 03:37:44 -07:00
Ian Knox
c2856017a1 [BUGFIX] Force tox to update pip (fixes psycopg2-binary @ 2.9.5) (#6134) 2022-10-25 13:01:38 -05:00
Michelle Ark
17b82661d2 convert 027 cycle test (#6094)
* convert 027 cycle test

* remove no-op expect_pass=False

* remove postgres from test names
2022-10-21 11:41:51 -04:00
Michelle Ark
6c8609499a Add 'michelleark' to changie's core_team list (#6084) 2022-10-20 14:41:41 -04:00
Peter Webb
53ae325576 CT-1099: Migrate test 071_commented_yaml_regression_3568_tests (#6106) 2022-10-20 12:43:30 -04:00
Mila Page
a7670a3ab9 Add unit tests for recent stringifier functors added to events library. (#6095)
Co-authored-by: Mila Page <versusfacit@users.noreply.github.com>
2022-10-19 22:52:32 -07:00
Mila Page
ff2f1f42c3 Working solution serialization bug. (#5874)
* Create functors to initialize event types with str-type member attributes. Before this change, the spec of various classes expected base_msg and msg params to be str. This assumption did not always hold true. post_init hooks ensure the spec is obeyed.
* Add new changelog.
* Add msg type change functor to a few other events that could use it.

Co-authored-by: Mila Page <versusfacit@users.noreply.github.com>
2022-10-18 12:20:30 -07:00
Luke Bassett
35f7975d8f Updated string formatting on non-f-strings. (#6086)
* Updated string formatting on non-f-strings.

Found all cases of strings separated by whitespace on a single line and removed the whitespace separation. E.g. "hello " "world" -> "hello world".

* add changelog entry
2022-10-17 15:58:31 -05:00
Eve Johns
a9c8bc0e0a f-string cleanup #6068 (#6082)
* fix f string issue

* removed one space

* Add changelog

* fixed return format

Co-authored-by: Leah Antkiewicz <leah.antkiewicz@fishtownanalytics.com>
2022-10-17 16:58:04 -04:00
Peter Webb
73aebd8159 CT-625: Fail with clear message for invalid materialized vals (#6025)
* CT-625: Fail with clear message for invalid materialized vals

* CT-625: Increase test coverage, run pre-commit checks

* CT-625: run black on problem file

* CT-625: Add changelog entry

* CT-625: Remove test that didn't make sense
2022-10-14 16:14:32 -04:00
Gerda Shank
9b84b6e2e8 Initial structured logging changes (#5954)
Co-authored-by: Peter Allen Webb <peter.webb@dbtlabs.com>
Co-authored-by: Emily Rockman <emily.rockman@dbtlabs.com>
2022-10-14 13:57:46 -04:00
colin-rogers-dbt
095997913e migrate 034 integration test to functional (#6054)
* migrate 034 integration test to functional

* formatting fixes

* move to adapter directory

* add extra args
2022-10-12 15:24:51 -07:00
pgoslatara
6de1d29cf9 Allowing partitions in external table to be a list (#5930)
* Allowing partitions in external table to be a list

* Adding changelog entry
2022-10-12 15:23:58 -07:00
Dave Connors
b3df2ca7ed changie 2022-08-23 08:57:37 -05:00
Dave Connors
ca25734abe update to use arbitrary list of predicates, not dictionaries, merge and delete 2022-08-22 14:30:54 -05:00
Dave Connors
b8fd4bb319 merge with predicates 2022-08-10 09:49:17 -05:00
Dave Connors
39e636ad41 postgres delete and insert 2022-08-10 08:35:24 -05:00
Dave Connors
607d5b01b6 pass predicates to merge strategy 2022-08-10 08:18:03 -05:00
373 changed files with 33991 additions and 7468 deletions

View File

@@ -1,5 +1,5 @@
[bumpversion]
current_version = 1.4.0a1
current_version = 1.4.0b1
parse = (?P<major>\d+)
\.(?P<minor>\d+)
\.(?P<patch>\d+)
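The parse value is a multiline regex with named groups; the hunk shows only its first lines, and the full pattern presumably continues with a prerelease group so versions like 1.4.0b1 can parse. A quick standalone sketch of what the visible fragment captures (a hypothetical check, not bumpversion's own code):

import re

pattern = re.compile(r"(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)")
print(pattern.match("1.4.0b1").groupdict())
# {'major': '1', 'minor': '4', 'patch': '0'}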

87
.changes/1.4.0-b1.md Normal file
View File

@@ -0,0 +1,87 @@
## dbt-core 1.4.0-b1 - December 14, 2022
### Features
- Added favor-state flag to optionally favor state nodes even if unselected node exists ([#2968](https://github.com/dbt-labs/dbt-core/issues/2968))
- Update structured logging. Convert to using protobuf messages. Ensure events are enriched with node_info. ([#5610](https://github.com/dbt-labs/dbt-core/issues/5610))
- Friendlier error messages when packages.yml is malformed ([#5486](https://github.com/dbt-labs/dbt-core/issues/5486))
- Migrate dbt-utils current_timestamp macros into core + adapters ([#5521](https://github.com/dbt-labs/dbt-core/issues/5521))
- Allow partitions in external tables to be supplied as a list ([#5929](https://github.com/dbt-labs/dbt-core/issues/5929))
- extend -f flag shorthand for seed command ([#5990](https://github.com/dbt-labs/dbt-core/issues/5990))
- This pulls the profile name from args when constructing a RuntimeConfig in lib.py, enabling the dbt-server to override the value that's in the dbt_project.yml ([#6201](https://github.com/dbt-labs/dbt-core/issues/6201))
- Adding tarball install method for packages. Allowing package tarball to be specified via url in the packages.yaml. ([#4205](https://github.com/dbt-labs/dbt-core/issues/4205))
- Added an md5 function to the base context ([#6246](https://github.com/dbt-labs/dbt-core/issues/6246))
- Exposures support metrics in lineage ([#6057](https://github.com/dbt-labs/dbt-core/issues/6057))
- Add support for Python 3.11 ([#6147](https://github.com/dbt-labs/dbt-core/issues/6147))
### Fixes
- Account for disabled flags on models in schema files more completely ([#3992](https://github.com/dbt-labs/dbt-core/issues/3992))
- Add validation of enabled config for metrics, exposures and sources ([#6030](https://github.com/dbt-labs/dbt-core/issues/6030))
- check length of args of python model function before accessing it ([#6041](https://github.com/dbt-labs/dbt-core/issues/6041))
- Add functors to ensure event types with str-type attributes are initialized to spec, even when provided non-str type params. ([#5436](https://github.com/dbt-labs/dbt-core/issues/5436))
- Allow hooks to fail without halting execution flow ([#5625](https://github.com/dbt-labs/dbt-core/issues/5625))
- Clarify Error Message for how many models are allowed in a Python file ([#6245](https://github.com/dbt-labs/dbt-core/issues/6245))
- After this, it will be possible to use default values for dbt.config.get ([#6309](https://github.com/dbt-labs/dbt-core/issues/6309))
- Use full path for writing manifest ([#6055](https://github.com/dbt-labs/dbt-core/issues/6055))
### Docs
- minor doc correction ([dbt-docs/#5791](https://github.com/dbt-labs/dbt-docs/issues/5791))
- Generate API docs for new CLI interface ([dbt-docs/#5528](https://github.com/dbt-labs/dbt-docs/issues/5528))
- ([dbt-docs/#5880](https://github.com/dbt-labs/dbt-docs/issues/5880))
- Fix rendering of sample code for metrics ([dbt-docs/#323](https://github.com/dbt-labs/dbt-docs/issues/323))
- Alphabetize `core/dbt/README.md` ([dbt-docs/#6368](https://github.com/dbt-labs/dbt-docs/issues/6368))
### Under the Hood
- Put black config in explicit config ([#5946](https://github.com/dbt-labs/dbt-core/issues/5946))
- Added flat_graph attribute to the Manifest class's deepcopy() coverage ([#5809](https://github.com/dbt-labs/dbt-core/issues/5809))
- Add mypy configs so `mypy` passes from CLI ([#5983](https://github.com/dbt-labs/dbt-core/issues/5983))
- Exception message cleanup. ([#6023](https://github.com/dbt-labs/dbt-core/issues/6023))
- Add dmypy cache to gitignore ([#6028](https://github.com/dbt-labs/dbt-core/issues/6028))
- Provide useful errors when the value of 'materialized' is invalid ([#5229](https://github.com/dbt-labs/dbt-core/issues/5229))
- Clean up string formatting ([#6068](https://github.com/dbt-labs/dbt-core/issues/6068))
- Fixed extra whitespace in strings introduced by black. ([#1350](https://github.com/dbt-labs/dbt-core/issues/1350))
- Remove the 'root_path' field from most nodes ([#6171](https://github.com/dbt-labs/dbt-core/issues/6171))
- Combine certain logging events with different levels ([#6173](https://github.com/dbt-labs/dbt-core/issues/6173))
- Convert threading tests to pytest ([#5942](https://github.com/dbt-labs/dbt-core/issues/5942))
- Convert postgres index tests to pytest ([#5770](https://github.com/dbt-labs/dbt-core/issues/5770))
- Convert use color tests to pytest ([#5771](https://github.com/dbt-labs/dbt-core/issues/5771))
- Add github actions workflow to generate high level CLI API docs ([#5942](https://github.com/dbt-labs/dbt-core/issues/5942))
- Functionality-neutral refactor of event logging system to improve encapsulation and modularity. ([#6139](https://github.com/dbt-labs/dbt-core/issues/6139))
- Consolidate ParsedNode and CompiledNode classes ([#6383](https://github.com/dbt-labs/dbt-core/issues/6383))
- Prevent doc gen workflow from running on forks ([#6386](https://github.com/dbt-labs/dbt-core/issues/6386))
- Fix intermittent database connection failure in Windows CI test ([#6394](https://github.com/dbt-labs/dbt-core/issues/6394))
- Refactor and clean up manifest nodes ([#6426](https://github.com/dbt-labs/dbt-core/issues/6426))
### Dependencies
- Update pathspec requirement from ~=0.9.0 to >=0.9,<0.11 in /core ([#5917](https://github.com/dbt-labs/dbt-core/pull/5917))
- Bump black from 22.8.0 to 22.10.0 ([#6019](https://github.com/dbt-labs/dbt-core/pull/6019))
- Bump mashumaro[msgpack] from 3.0.4 to 3.1.1 in /core ([#6108](https://github.com/dbt-labs/dbt-core/pull/6108))
- Update colorama requirement from <0.4.6,>=0.3.9 to >=0.3.9,<0.4.7 in /core ([#6144](https://github.com/dbt-labs/dbt-core/pull/6144))
### Dependency
- Bump mashumaro[msgpack] from 3.1.1 to 3.2 in /core ([#4904](https://github.com/dbt-labs/dbt-core/issues/4904))
### Contributors
- [@andy-clapson](https://github.com/andy-clapson) ([dbt-docs/#5791](https://github.com/dbt-labs/dbt-docs/issues/5791))
- [@chamini2](https://github.com/chamini2) ([#6041](https://github.com/dbt-labs/dbt-core/issues/6041))
- [@daniel-murray](https://github.com/daniel-murray) ([#2968](https://github.com/dbt-labs/dbt-core/issues/2968))
- [@dave-connors-3](https://github.com/dave-connors-3) ([#5990](https://github.com/dbt-labs/dbt-core/issues/5990))
- [@dbeatty10](https://github.com/dbeatty10) ([dbt-docs/#6368](https://github.com/dbt-labs/dbt-docs/issues/6368), [#6394](https://github.com/dbt-labs/dbt-core/issues/6394))
- [@devmessias](https://github.com/devmessias) ([#6309](https://github.com/dbt-labs/dbt-core/issues/6309))
- [@eve-johns](https://github.com/eve-johns) ([#6068](https://github.com/dbt-labs/dbt-core/issues/6068))
- [@haritamar](https://github.com/haritamar) ([#6246](https://github.com/dbt-labs/dbt-core/issues/6246))
- [@jared-rimmer](https://github.com/jared-rimmer) ([#5486](https://github.com/dbt-labs/dbt-core/issues/5486))
- [@josephberni](https://github.com/josephberni) ([#2968](https://github.com/dbt-labs/dbt-core/issues/2968))
- [@joshuataylor](https://github.com/joshuataylor) ([#6147](https://github.com/dbt-labs/dbt-core/issues/6147))
- [@justbldwn](https://github.com/justbldwn) ([#6245](https://github.com/dbt-labs/dbt-core/issues/6245))
- [@luke-bassett](https://github.com/luke-bassett) ([#1350](https://github.com/dbt-labs/dbt-core/issues/1350))
- [@max-sixty](https://github.com/max-sixty) ([#5946](https://github.com/dbt-labs/dbt-core/issues/5946), [#5983](https://github.com/dbt-labs/dbt-core/issues/5983), [#6028](https://github.com/dbt-labs/dbt-core/issues/6028))
- [@paulbenschmidt](https://github.com/paulbenschmidt) ([dbt-docs/#5880](https://github.com/dbt-labs/dbt-docs/issues/5880))
- [@pgoslatara](https://github.com/pgoslatara) ([#5929](https://github.com/dbt-labs/dbt-core/issues/5929))
- [@racheldaniel](https://github.com/racheldaniel) ([#6201](https://github.com/dbt-labs/dbt-core/issues/6201))
- [@timle2](https://github.com/timle2) ([#4205](https://github.com/dbt-labs/dbt-core/issues/4205))

View File

@@ -0,0 +1,6 @@
kind: "Dependencies"
body: "Update pathspec requirement from ~=0.9.0 to >=0.9,<0.11 in /core"
time: 2022-09-23T00:06:46.00000Z
custom:
Author: dependabot[bot]
PR: "5917"

View File

@@ -1,7 +1,6 @@
kind: "Dependency"
kind: "Dependencies"
body: "Bump black from 22.8.0 to 22.10.0"
time: 2022-10-07T00:08:48.00000Z
custom:
Author: dependabot[bot]
Issue: 4904
PR: 6019
PR: "6019"

View File

@@ -0,0 +1,6 @@
kind: "Dependencies"
body: "Bump mashumaro[msgpack] from 3.0.4 to 3.1.1 in /core"
time: 2022-10-20T00:07:53.00000Z
custom:
Author: dependabot[bot]
PR: "6108"

View File

@@ -0,0 +1,6 @@
kind: "Dependencies"
body: "Update colorama requirement from <0.4.6,>=0.3.9 to >=0.3.9,<0.4.7 in /core"
time: 2022-10-26T00:09:10.00000Z
custom:
Author: dependabot[bot]
PR: "6144"

View File

@@ -0,0 +1,7 @@
kind: "Dependency"
body: "Bump mashumaro[msgpack] from 3.1.1 to 3.2 in /core"
time: 2022-12-05T00:21:18.00000Z
custom:
Author: dependabot[bot]
Issue: 4904
PR: 6375

View File

@@ -4,4 +4,3 @@ time: 2022-09-08T15:41:57.689162-04:00
custom:
Author: andy-clapson
Issue: "5791"
PR: "5684"

View File

@@ -4,4 +4,3 @@ time: 2022-10-07T09:06:56.446078-05:00
custom:
Author: stu-k
Issue: "5528"
PR: "6022"

View File

@@ -0,0 +1,5 @@
kind: Docs
time: 2022-10-17T17:14:11.715348-05:00
custom:
Author: paulbenschmidt
Issue: "5880"

View File

@@ -0,0 +1,6 @@
kind: Docs
body: Fix rendering of sample code for metrics
time: 2022-11-16T15:57:43.204201+01:00
custom:
Author: jtcohen6
Issue: "323"

View File

@@ -0,0 +1,6 @@
kind: Docs
body: Alphabetize `core/dbt/README.md`
time: 2022-12-02T15:05:23.695333-07:00
custom:
Author: dbeatty10
Issue: "6368"

View File

@@ -0,0 +1,7 @@
kind: Features
body: Added favor-state flag to optionally favor state nodes even if unselected node
exists
time: 2022-04-08T16:54:59.696564+01:00
custom:
Author: daniel-murray josephberni
Issue: "2968"

View File

@@ -0,0 +1,6 @@
kind: Features
body: Update structured logging. Convert to using protobuf messages. Ensure events are enriched with node_info.
time: 2022-08-17T15:48:57.225267-04:00
custom:
Author: gshank
Issue: "5610"

View File

@@ -4,4 +4,3 @@ time: 2022-09-12T12:59:35.121188+01:00
custom:
Author: jared-rimmer
Issue: "5486"
PR: "5812"

View File

@@ -4,4 +4,3 @@ time: 2022-09-14T09:56:25.97818-07:00
custom:
Author: colin-rogers-dbt
Issue: "5521"
PR: "5838"

View File

@@ -0,0 +1,6 @@
kind: Features
body: Allow partitions in external tables to be supplied as a list
time: 2022-09-25T21:16:51.051239654+02:00
custom:
Author: pgoslatara
Issue: "5929"

View File

@@ -4,4 +4,3 @@ time: 2022-10-03T11:07:05.381632-05:00
custom:
Author: dave-connors-3
Issue: "5990"
PR: "5991"

View File

@@ -0,0 +1,7 @@
kind: Features
body: This pulls the profile name from args when constructing a RuntimeConfig in lib.py,
enabling the dbt-server to override the value that's in the dbt_project.yml
time: 2022-11-02T15:00:03.000805-05:00
custom:
Author: racheldaniel
Issue: "6201"

View File

@@ -0,0 +1,8 @@
kind: Features
body: Adding tarball install method for packages. Allowing package tarball to be specified
via url in the packages.yaml.
time: 2022-11-07T10:50:18.464545-05:00
custom:
Author: timle2
Issue: "4205"
PR: "4689"

View File

@@ -0,0 +1,6 @@
kind: Features
body: Added an md5 function to the base context
time: 2022-11-14T18:52:07.788593+02:00
custom:
Author: haritamar
Issue: "6246"

View File

@@ -0,0 +1,6 @@
kind: Features
body: Exposures support metrics in lineage
time: 2022-11-30T11:29:13.256034-05:00
custom:
Author: michelleark
Issue: "6057"

View File

@@ -0,0 +1,7 @@
kind: Features
body: Add support for Python 3.11
time: 2022-12-06T15:07:04.753127+01:00
custom:
Author: joshuataylor MichelleArk jtcohen6
Issue: "6147"
PR: "6326"

View File

@@ -4,4 +4,3 @@ time: 2022-09-16T10:48:54.162273-05:00
custom:
Author: emmyoop
Issue: "3992"
PR: "5868"

View File

@@ -4,4 +4,3 @@ time: 2022-10-10T11:32:18.752322-05:00
custom:
Author: emmyoop
Issue: "6030"
PR: "6038"

View File

@@ -4,4 +4,3 @@ time: 2022-10-11T16:07:15.464093-04:00
custom:
Author: chamini2
Issue: "6041"
PR: "6042"

View File

@@ -0,0 +1,7 @@
kind: Fixes
body: Add functors to ensure event types with str-type attributes are initialized
to spec, even when provided non-str type params.
time: 2022-10-16T17:37:42.846683-07:00
custom:
Author: versusfacit
Issue: "5436"

View File

@@ -0,0 +1,6 @@
kind: Fixes
body: Allow hooks to fail without halting execution flow
time: 2022-11-07T09:53:14.340257-06:00
custom:
Author: ChenyuLInx
Issue: "5625"

View File

@@ -0,0 +1,6 @@
kind: Fixes
body: Clarify Error Message for how many models are allowed in a Python file
time: 2022-11-15T08:10:21.527884-05:00
custom:
Author: justbldwn
Issue: "6245"

View File

@@ -0,0 +1,7 @@
kind: Fixes
body: After this, it will be possible to use default values for dbt.config.get
time: 2022-11-24T16:34:19.039512764-03:00
custom:
Author: devmessias
Issue: "6309"
PR: "6317"

View File

@@ -0,0 +1,6 @@
kind: Fixes
body: Use full path for writing manifest
time: 2022-12-02T16:48:59.029519-05:00
custom:
Author: gshank
Issue: "6055"

View File

@@ -4,4 +4,3 @@ time: 2022-09-27T19:42:59.241433-07:00
custom:
Author: max-sixty
Issue: "5946"
PR: "5947"

View File

@@ -4,4 +4,3 @@ time: 2022-09-29T13:44:06.275941-04:00
custom:
Author: peterallenwebb
Issue: "5809"
PR: "5975"

View File

@@ -4,4 +4,3 @@ time: 2022-10-05T12:03:10.061263-07:00
custom:
Author: max-sixty
Issue: "5983"
PR: "5983"

View File

@@ -4,4 +4,3 @@ time: 2022-10-07T09:46:27.682872-05:00
custom:
Author: emmyoop
Issue: "6023"
PR: "6024"

View File

@@ -4,4 +4,3 @@ time: 2022-10-07T14:00:44.227644-07:00
custom:
Author: max-sixty
Issue: "6028"
PR: "5978"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Provide useful errors when the value of 'materialized' is invalid
time: 2022-10-13T18:19:12.167548-04:00
custom:
Author: peterallenwebb
Issue: "5229"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Fixed extra whitespace in strings introduced by black.
time: 2022-10-17T15:15:11.499246-05:00
custom:
Author: luke-bassett
Issue: "1350"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Clean up string formatting
time: 2022-10-17T15:58:44.676549-04:00
custom:
Author: eve-johns
Issue: "6068"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Remove the 'root_path' field from most nodes
time: 2022-10-28T10:48:37.687886-04:00
custom:
Author: gshank
Issue: "6171"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Combine certain logging events with different levels
time: 2022-10-28T11:03:44.887836-04:00
custom:
Author: gshank
Issue: "6173"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Convert threading tests to pytest
time: 2022-11-08T07:45:50.589147-06:00
custom:
Author: stu-k
Issue: "5942"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Convert postgres index tests to pytest
time: 2022-11-08T11:56:33.743042-06:00
custom:
Author: stu-k
Issue: "5770"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Convert use color tests to pytest
time: 2022-11-08T13:31:04.788547-06:00
custom:
Author: stu-k
Issue: "5771"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Add github actions workflow to generate high level CLI API docs
time: 2022-11-16T13:00:37.916202-06:00
custom:
Author: stu-k
Issue: "5942"

View File

@@ -0,0 +1,8 @@
kind: Under the Hood
body: Functionality-neutral refactor of event logging system to improve encapsulation
and modularity.
time: 2022-11-18T14:57:17.792622-05:00
custom:
Author: peterallenwebb
Issue: "6139"
PR: "6291"

View File

@@ -0,0 +1,7 @@
kind: Under the Hood
body: Consolidate ParsedNode and CompiledNode classes
time: 2022-12-05T16:49:48.563583-05:00
custom:
Author: gshank
Issue: "6383"
PR: "6384"

View File

@@ -0,0 +1,7 @@
kind: Under the Hood
body: Prevent doc gen workflow from running on forks
time: 2022-12-06T09:40:15.301984-06:00
custom:
Author: stu-k
Issue: "6386"
PR: "6390"

View File

@@ -0,0 +1,7 @@
kind: Under the Hood
body: Fix intermittent database connection failure in Windows CI test
time: 2022-12-06T11:30:53.166009-07:00
custom:
Author: MichelleArk dbeatty10
Issue: "6394"
PR: "6395"

View File

@@ -0,0 +1,7 @@
kind: Under the Hood
body: Refactor and clean up manifest nodes
time: 2022-12-11T21:42:40.560074-05:00
custom:
Author: gshank
Issue: "6426"
PR: "6427"

View File

@@ -0,0 +1,7 @@
kind: Features
body: incremental predicates
time: 2022-08-23T08:57:27.640804-05:00
custom:
Author: dave-connors-3
Issue: "5680"
PR: "5702"

View File

@@ -0,0 +1,6 @@
kind: Fixes
body: '[CT-1284] Change Python model default materialization to table'
time: 2022-12-13T11:26:20.550017-08:00
custom:
Author: aranke
Issue: "6345"

View File

@@ -0,0 +1,7 @@
kind: Under the Hood
body: Restore important legacy logging behaviors, following refactor which removed
them
time: 2022-12-13T21:41:06.815133-05:00
custom:
Author: peterallenwebb
Issue: "6437"

View File

@@ -6,19 +6,67 @@ changelogPath: CHANGELOG.md
versionExt: md
versionFormat: '## dbt-core {{.Version}} - {{.Time.Format "January 02, 2006"}}'
kindFormat: '### {{.Kind}}'
changeFormat: '- {{.Body}} ([#{{.Custom.Issue}}](https://github.com/dbt-labs/dbt-core/issues/{{.Custom.Issue}}), [#{{.Custom.PR}}](https://github.com/dbt-labs/dbt-core/pull/{{.Custom.PR}}))'
changeFormat: |-
{{- $IssueList := list }}
{{- $changes := splitList " " $.Custom.Issue }}
{{- range $issueNbr := $changes }}
{{- $changeLink := "[#nbr](https://github.com/dbt-labs/dbt-core/issues/nbr)" | replace "nbr" $issueNbr }}
{{- $IssueList = append $IssueList $changeLink }}
{{- end -}}
- {{.Body}} ({{ range $index, $element := $IssueList }}{{if $index}}, {{end}}{{$element}}{{end}})
kinds:
- label: Breaking Changes
- label: Features
- label: Fixes
- label: Docs
changeFormat: '- {{.Body}} ([dbt-docs/#{{.Custom.Issue}}](https://github.com/dbt-labs/dbt-docs/issues/{{.Custom.Issue}}), [dbt-docs/#{{.Custom.PR}}](https://github.com/dbt-labs/dbt-docs/pull/{{.Custom.PR}}))'
changeFormat: |-
{{- $IssueList := list }}
{{- $changes := splitList " " $.Custom.Issue }}
{{- range $issueNbr := $changes }}
{{- $changeLink := "[dbt-docs/#nbr](https://github.com/dbt-labs/dbt-docs/issues/nbr)" | replace "nbr" $issueNbr }}
{{- $IssueList = append $IssueList $changeLink }}
{{- end -}}
- {{.Body}} ({{ range $index, $element := $IssueList }}{{if $index}}, {{end}}{{$element}}{{end}})
- label: Under the Hood
- label: Dependencies
changeFormat: '- {{.Body}} ({{if ne .Custom.Issue ""}}[#{{.Custom.Issue}}](https://github.com/dbt-labs/dbt-core/issues/{{.Custom.Issue}}), {{end}}[#{{.Custom.PR}}](https://github.com/dbt-labs/dbt-core/pull/{{.Custom.PR}}))'
changeFormat: |-
{{- $PRList := list }}
{{- $changes := splitList " " $.Custom.PR }}
{{- range $pullrequest := $changes }}
{{- $changeLink := "[#nbr](https://github.com/dbt-labs/dbt-core/pull/nbr)" | replace "nbr" $pullrequest }}
{{- $PRList = append $PRList $changeLink }}
{{- end -}}
- {{.Body}} ({{ range $index, $element := $PRList }}{{if $index}}, {{end}}{{$element}}{{end}})
skipGlobalChoices: true
additionalChoices:
- key: Author
label: GitHub Username(s) (separated by a single space if multiple)
type: string
minLength: 3
- key: PR
label: GitHub Pull Request Number (separated by a single space if multiple)
type: string
minLength: 1
- label: Security
changeFormat: '- {{.Body}} ({{if ne .Custom.Issue ""}}[#{{.Custom.Issue}}](https://github.com/dbt-labs/dbt-core/issues/{{.Custom.Issue}}), {{end}}[#{{.Custom.PR}}](https://github.com/dbt-labs/dbt-core/pull/{{.Custom.PR}}))'
changeFormat: |-
{{- $PRList := list }}
{{- $changes := splitList " " $.Custom.PR }}
{{- range $pullrequest := $changes }}
{{- $changeLink := "[#nbr](https://github.com/dbt-labs/dbt-core/pull/nbr)" | replace "nbr" $pullrequest }}
{{- $PRList = append $PRList $changeLink }}
{{- end -}}
- {{.Body}} ({{ range $index, $element := $PRList }}{{if $index}}, {{end}}{{$element}}{{end}})
skipGlobalChoices: true
additionalChoices:
- key: Author
label: GitHub Username(s) (separated by a single space if multiple)
type: string
minLength: 3
- key: PR
label: GitHub Pull Request Number (separated by a single space if multiple)
type: string
minLength: 1
newlines:
afterChangelogHeader: 1
@@ -33,42 +81,41 @@ custom:
type: string
minLength: 3
- key: Issue
label: GitHub Issue Number
type: int
minInt: 1
- key: PR
label: GitHub Pull Request Number
type: int
minInt: 1
label: GitHub Issue Number (separated by a single space if multiple)
type: string
minLength: 1
footerFormat: |
{{- $contributorDict := dict }}
{{- /* any names added to this list should be all lowercase for later matching purposes */}}
{{- $core_team := list "peterallenwebb" "emmyoop" "nathaniel-may" "gshank" "leahwicz" "chenyulinx" "stu-k" "iknox-fa" "versusfacit" "mcknight-42" "jtcohen6" "dependabot[bot]" "snyk-bot" "colin-rogers-dbt" }}
{{- $core_team := list "michelleark" "peterallenwebb" "emmyoop" "nathaniel-may" "gshank" "leahwicz" "chenyulinx" "stu-k" "iknox-fa" "versusfacit" "mcknight-42" "jtcohen6" "aranke" "dependabot[bot]" "snyk-bot" "colin-rogers-dbt" }}
{{- range $change := .Changes }}
{{- $authorList := splitList " " $change.Custom.Author }}
{{- /* loop through all authors for a PR */}}
{{- /* loop through all authors for a single changelog */}}
{{- range $author := $authorList }}
{{- $authorLower := lower $author }}
{{- /* we only want to include non-core team contributors */}}
{{- if not (has $authorLower $core_team)}}
{{- /* Docs kind link back to dbt-docs instead of dbt-core PRs */}}
{{- $prLink := $change.Kind }}
{{- if eq $change.Kind "Docs" }}
{{- $prLink = "[dbt-docs/#pr](https://github.com/dbt-labs/dbt-docs/pull/pr)" | replace "pr" $change.Custom.PR }}
{{- else }}
{{- $prLink = "[#pr](https://github.com/dbt-labs/dbt-core/pull/pr)" | replace "pr" $change.Custom.PR }}
{{- end }}
{{- /* check if this contributor has other PRs associated with them already */}}
{{- if hasKey $contributorDict $author }}
{{- $prList := get $contributorDict $author }}
{{- $prList = append $prList $prLink }}
{{- $contributorDict := set $contributorDict $author $prList }}
{{- else }}
{{- $prList := list $prLink }}
{{- $contributorDict := set $contributorDict $author $prList }}
{{- end }}
{{- end}}
{{- $changeList := splitList " " $change.Custom.Author }}
{{- /* Docs kind link back to dbt-docs instead of dbt-core issues */}}
{{- $changeLink := $change.Kind }}
{{- if or (eq $change.Kind "Dependencies") (eq $change.Kind "Security") }}
{{- $changeLink = "[#nbr](https://github.com/dbt-labs/dbt-core/pull/nbr)" | replace "nbr" $change.Custom.PR }}
{{- else if eq $change.Kind "Docs"}}
{{- $changeLink = "[dbt-docs/#nbr](https://github.com/dbt-labs/dbt-docs/issues/nbr)" | replace "nbr" $change.Custom.Issue }}
{{- else }}
{{- $changeLink = "[#nbr](https://github.com/dbt-labs/dbt-core/issues/nbr)" | replace "nbr" $change.Custom.Issue }}
{{- end }}
{{- /* check if this contributor has other changes associated with them already */}}
{{- if hasKey $contributorDict $author }}
{{- $contributionList := get $contributorDict $author }}
{{- $contributionList = append $contributionList $changeLink }}
{{- $contributorDict := set $contributorDict $author $contributionList }}
{{- else }}
{{- $contributionList := list $changeLink }}
{{- $contributorDict := set $contributorDict $author $contributionList }}
{{- end }}
{{- end}}
{{- end}}
{{- end }}
{{- /* no indentation here for formatting so the final markdown doesn't have unneeded indentations */}}
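Net effect of the reworked templates: Custom.Issue (or Custom.PR for the Dependencies and Security kinds) may now hold several space-separated numbers, and each becomes its own link. For a hypothetical entry such as:

kind: Fixes
body: Example fix
custom:
  Author: someuser
  Issue: "123 456"

the changeFormat above would render roughly:

- Example fix ([#123](https://github.com/dbt-labs/dbt-core/issues/123), [#456](https://github.com/dbt-labs/dbt-core/issues/456))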

View File

@@ -40,7 +40,7 @@ jobs:
matrix:
include:
- label: "dependencies"
changie_kind: "Dependency"
changie_kind: "Dependencies"
- label: "snyk"
changie_kind: "Security"
runs-on: ubuntu-latest
@@ -58,4 +58,4 @@ jobs:
commit_message: "Add automated changelog yaml from template for bot PR"
changie_kind: ${{ matrix.changie_kind }}
label: ${{ matrix.label }}
custom_changelog_string: "custom:\n Author: ${{ github.event.pull_request.user.login }}\n Issue: 4904\n PR: ${{ github.event.pull_request.number }}"
custom_changelog_string: "custom:\n Author: ${{ github.event.pull_request.user.login }}\n PR: ${{ github.event.pull_request.number }}"

View File

@@ -0,0 +1,165 @@
# **what?**
# On push, if anything in core/dbt/docs or core/dbt/cli has been
# created or modified, regenerate the CLI API docs using sphinx.
# **why?**
# We watch for changes in core/dbt/cli because the CLI API docs rely on click
# and all supporting flags/params to be generated. We watch for changes in
# core/dbt/docs since any changes to sphinx configuration or any of the
# .rst files there could result in a differently built final index.html file.
# **when?**
# Whenever a change has been pushed to a branch, and only if there is a diff
# between the PR branch and main's core/dbt/cli and or core/dbt/docs dirs.
# TODO: add bot comment to PR informing contributor that the docs have been committed
# TODO: figure out why github action triggered pushes cause github to fail to report
# the status of jobs
name: Generate CLI API docs
on:
pull_request:
permissions:
contents: write
pull-requests: write
env:
CLI_DIR: ${{ github.workspace }}/core/dbt/cli
DOCS_DIR: ${{ github.workspace }}/core/dbt/docs
DOCS_BUILD_DIR: ${{ github.workspace }}/core/dbt/docs/build
jobs:
check_gen:
name: check if generation needed
runs-on: ubuntu-latest
if: ${{ github.event.pull_request.head.repo.fork == false }}
outputs:
cli_dir_changed: ${{ steps.check_cli.outputs.cli_dir_changed }}
docs_dir_changed: ${{ steps.check_docs.outputs.docs_dir_changed }}
steps:
- name: "[DEBUG] print variables"
run: |
echo "env.CLI_DIR: ${{ env.CLI_DIR }}"
echo "env.DOCS_BUILD_DIR: ${{ env.DOCS_BUILD_DIR }}"
echo "env.DOCS_DIR: ${{ env.DOCS_DIR }}"
- name: git checkout
uses: actions/checkout@v3
with:
fetch-depth: 0
ref: ${{ github.head_ref }}
- name: set shas
id: set_shas
run: |
THIS_SHA=$(git rev-parse @)
LAST_SHA=$(git rev-parse @~1)
echo "this sha: $THIS_SHA"
echo "last sha: $LAST_SHA"
echo "this_sha=$THIS_SHA" >> $GITHUB_OUTPUT
echo "last_sha=$LAST_SHA" >> $GITHUB_OUTPUT
- name: check for changes in core/dbt/cli
id: check_cli
run: |
CLI_DIR_CHANGES=$(git diff \
${{ steps.set_shas.outputs.last_sha }} \
${{ steps.set_shas.outputs.this_sha }} \
-- ${{ env.CLI_DIR }})
if [ -n "$CLI_DIR_CHANGES" ]; then
echo "changes found"
echo $CLI_DIR_CHANGES
echo "cli_dir_changed=true" >> $GITHUB_OUTPUT
exit 0
fi
echo "cli_dir_changed=false" >> $GITHUB_OUTPUT
echo "no changes found"
- name: check for changes in core/dbt/docs
id: check_docs
if: steps.check_cli.outputs.cli_dir_changed == 'false'
run: |
DOCS_DIR_CHANGES=$(git diff --name-only \
${{ steps.set_shas.outputs.last_sha }} \
${{ steps.set_shas.outputs.this_sha }} \
-- ${{ env.DOCS_DIR }} ':!${{ env.DOCS_BUILD_DIR }}')
DOCS_BUILD_DIR_CHANGES=$(git diff --name-only \
${{ steps.set_shas.outputs.last_sha }} \
${{ steps.set_shas.outputs.this_sha }} \
-- ${{ env.DOCS_BUILD_DIR }})
if [ -n "$DOCS_DIR_CHANGES" ] && [ -z "$DOCS_BUILD_DIR_CHANGES" ]; then
echo "changes found"
echo $DOCS_DIR_CHANGES
echo "docs_dir_changed=true" >> $GITHUB_OUTPUT
exit 0
fi
echo "docs_dir_changed=false" >> $GITHUB_OUTPUT
echo "no changes found"
gen_docs:
name: generate docs
runs-on: ubuntu-latest
needs: [check_gen]
if: |
needs.check_gen.outputs.cli_dir_changed == 'true'
|| needs.check_gen.outputs.docs_dir_changed == 'true'
steps:
- name: "[DEBUG] print variables"
run: |
echo "env.DOCS_DIR: ${{ env.DOCS_DIR }}"
echo "github head_ref: ${{ github.head_ref }}"
- name: git checkout
uses: actions/checkout@v3
with:
ref: ${{ github.head_ref }}
- name: install python
uses: actions/setup-python@v4.3.0
with:
python-version: 3.8
- name: install dev requirements
run: |
python3 -m venv env
source env/bin/activate
python -m pip install --upgrade pip
pip install -r requirements.txt -r dev-requirements.txt
- name: generate docs
run: |
source env/bin/activate
cd ${{ env.DOCS_DIR }}
echo "cleaning existing docs"
make clean
echo "creating docs"
make html
- name: debug
run: |
echo ">>>>> status"
git status
echo ">>>>> remotes"
git remote -v
echo ">>>>> branch"
git branch -v
echo ">>>>> log"
git log --pretty=oneline | head -5
- name: commit docs
run: |
git config user.name 'Github Build Bot'
git config user.email 'buildbot@fishtownanalytics.com'
git commit -am "Add generated CLI API docs"
git push -u origin ${{ github.head_ref }}

View File

@@ -45,7 +45,9 @@ jobs:
uses: actions/checkout@v2
- name: Set up Python
uses: actions/setup-python@v2
uses: actions/setup-python@v4.3.0
with:
python-version: '3.8'
- name: Install python dependencies
run: |
@@ -71,7 +73,7 @@ jobs:
strategy:
fail-fast: false
matrix:
python-version: ["3.7", "3.8", "3.9", "3.10"]
python-version: ["3.7", "3.8", "3.9", "3.10", "3.11"]
env:
TOXENV: "unit"
@@ -82,7 +84,7 @@ jobs:
uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
uses: actions/setup-python@v4.3.0
with:
python-version: ${{ matrix.python-version }}
@@ -116,8 +118,8 @@ jobs:
strategy:
fail-fast: false
matrix:
python-version: ["3.7", "3.8", "3.9", "3.10"]
os: [ubuntu-latest]
python-version: ["3.7", "3.8", "3.9", "3.10", "3.11"]
os: [ubuntu-20.04]
include:
- python-version: 3.8
os: windows-latest
@@ -137,7 +139,7 @@ jobs:
uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
uses: actions/setup-python@v4.3.0
with:
python-version: ${{ matrix.python-version }}
@@ -190,9 +192,9 @@ jobs:
uses: actions/checkout@v2
- name: Set up Python
uses: actions/setup-python@v2
uses: actions/setup-python@v4.3.0
with:
python-version: 3.8
python-version: '3.8'
- name: Install python dependencies
run: |

View File

@@ -9,13 +9,4 @@ permissions:
jobs:
stale:
runs-on: ubuntu-latest
steps:
# pinned at v4 (https://github.com/actions/stale/releases/tag/v4.0.0)
- uses: actions/stale@cdf15f641adb27a71842045a94023bef6945e3aa
with:
stale-issue-message: "This issue has been marked as Stale because it has been open for 180 days with no activity. If you would like the issue to remain open, please remove the stale label or comment on the issue, or it will be closed in 7 days."
stale-pr-message: "This PR has been marked as Stale because it has been open for 180 days with no activity. If you would like the PR to remain open, please remove the stale label or comment on the PR, or it will be closed in 7 days."
close-issue-message: "Although we are closing this issue as stale, it's not gone forever. Issues can be reopened if there is renewed community interest; add a comment to notify the maintainers."
# mark issues/PRs stale when they haven't seen activity in 180 days
days-before-stale: 180
uses: dbt-labs/actions/.github/workflows/stale-bot-matrix.yml@main

View File

@@ -22,7 +22,7 @@ jobs:
# run the performance measurements on the current or default branch
test-schema:
name: Test Log Schema
runs-on: ubuntu-latest
runs-on: ubuntu-20.04
env:
# turns warnings into errors
RUSTFLAGS: "-D warnings"
@@ -46,12 +46,6 @@ jobs:
with:
python-version: "3.8"
- uses: actions-rs/toolchain@v1
with:
profile: minimal
toolchain: stable
override: true
- name: Install python dependencies
run: |
pip install --user --upgrade pip
@@ -69,10 +63,3 @@ jobs:
# we actually care if these pass, because the normal test run doesn't usually include many json log outputs
- name: Run integration tests
run: tox -e integration -- -nauto
# apply our schema tests to every log event from the previous step
# skips any output that isn't valid json
- uses: actions-rs/cargo@v1
with:
command: run
args: --manifest-path test/interop/log_parsing/Cargo.toml

1
.gitignore vendored
View File

@@ -11,6 +11,7 @@ __pycache__/
env*/
dbt_env/
build/
!core/dbt/docs/build
develop-eggs/
dist/
downloads/

View File

@@ -2,7 +2,7 @@
# Eventually the hooks described here will be run as tests before merging each PR.
# TODO: remove global exclusion of tests when testing overhaul is complete
exclude: ^test/
exclude: ^(test/|core/dbt/docs/build/)
# Force all unspecified python hooks to run python 3.8
default_language_version:
@@ -30,7 +30,7 @@ repos:
args:
- "--check"
- "--diff"
- repo: https://gitlab.com/pycqa/flake8
- repo: https://github.com/pycqa/flake8
rev: 4.0.1
hooks:
- id: flake8

View File

@@ -5,6 +5,94 @@
- "Breaking changes" listed under a version may require action from end users or external maintainers when upgrading to that version.
- Do not edit this file directly. This file is auto-generated using [changie](https://github.com/miniscruff/changie). For details on how to document a change, see [the contributing guide](https://github.com/dbt-labs/dbt-core/blob/main/CONTRIBUTING.md#adding-changelog-entry)
## dbt-core 1.4.0-b1 - December 14, 2022
### Features
- Added favor-state flag to optionally favor state nodes even if unselected node exists ([#2968](https://github.com/dbt-labs/dbt-core/issues/2968))
- Update structured logging. Convert to using protobuf messages. Ensure events are enriched with node_info. ([#5610](https://github.com/dbt-labs/dbt-core/issues/5610))
- Friendlier error messages when packages.yml is malformed ([#5486](https://github.com/dbt-labs/dbt-core/issues/5486))
- Migrate dbt-utils current_timestamp macros into core + adapters ([#5521](https://github.com/dbt-labs/dbt-core/issues/5521))
- Allow partitions in external tables to be supplied as a list ([#5929](https://github.com/dbt-labs/dbt-core/issues/5929))
- extend -f flag shorthand for seed command ([#5990](https://github.com/dbt-labs/dbt-core/issues/5990))
- This pulls the profile name from args when constructing a RuntimeConfig in lib.py, enabling the dbt-server to override the value that's in the dbt_project.yml ([#6201](https://github.com/dbt-labs/dbt-core/issues/6201))
- Adding tarball install method for packages. Allowing package tarball to be specified via url in the packages.yaml. ([#4205](https://github.com/dbt-labs/dbt-core/issues/4205))
- Added an md5 function to the base context ([#6246](https://github.com/dbt-labs/dbt-core/issues/6246))
- Exposures support metrics in lineage ([#6057](https://github.com/dbt-labs/dbt-core/issues/6057))
- Add support for Python 3.11 ([#6147](https://github.com/dbt-labs/dbt-core/issues/6147))
### Fixes
- Account for disabled flags on models in schema files more completely ([#3992](https://github.com/dbt-labs/dbt-core/issues/3992))
- Add validation of enabled config for metrics, exposures and sources ([#6030](https://github.com/dbt-labs/dbt-core/issues/6030))
- check length of args of python model function before accessing it ([#6041](https://github.com/dbt-labs/dbt-core/issues/6041))
- Add functors to ensure event types with str-type attributes are initialized to spec, even when provided non-str type params. ([#5436](https://github.com/dbt-labs/dbt-core/issues/5436))
- Allow hooks to fail without halting execution flow ([#5625](https://github.com/dbt-labs/dbt-core/issues/5625))
- Clarify Error Message for how many models are allowed in a Python file ([#6245](https://github.com/dbt-labs/dbt-core/issues/6245))
- Make it possible to use default values for dbt.config.get ([#6309](https://github.com/dbt-labs/dbt-core/issues/6309))
- Use full path for writing manifest ([#6055](https://github.com/dbt-labs/dbt-core/issues/6055))
### Docs
- minor doc correction ([dbt-docs/#5791](https://github.com/dbt-labs/dbt-docs/issues/5791))
- Generate API docs for new CLI interface ([dbt-docs/#5528](https://github.com/dbt-labs/dbt-docs/issues/5528))
- ([dbt-docs/#5880](https://github.com/dbt-labs/dbt-docs/issues/5880))
- Fix rendering of sample code for metrics ([dbt-docs/#323](https://github.com/dbt-labs/dbt-docs/issues/323))
- Alphabetize `core/dbt/README.md` ([dbt-docs/#6368](https://github.com/dbt-labs/dbt-docs/issues/6368))
### Under the Hood
- Put black config in explicit config ([#5946](https://github.com/dbt-labs/dbt-core/issues/5946))
- Added flat_graph attribute to the Manifest class's deepcopy() coverage ([#5809](https://github.com/dbt-labs/dbt-core/issues/5809))
- Add mypy configs so `mypy` passes from CLI ([#5983](https://github.com/dbt-labs/dbt-core/issues/5983))
- Exception message cleanup. ([#6023](https://github.com/dbt-labs/dbt-core/issues/6023))
- Add dmypy cache to gitignore ([#6028](https://github.com/dbt-labs/dbt-core/issues/6028))
- Provide useful errors when the value of 'materialized' is invalid ([#5229](https://github.com/dbt-labs/dbt-core/issues/5229))
- Clean up string formatting ([#6068](https://github.com/dbt-labs/dbt-core/issues/6068))
- Fixed extra whitespace in strings introduced by black. ([#1350](https://github.com/dbt-labs/dbt-core/issues/1350))
- Remove the 'root_path' field from most nodes ([#6171](https://github.com/dbt-labs/dbt-core/issues/6171))
- Combine certain logging events with different levels ([#6173](https://github.com/dbt-labs/dbt-core/issues/6173))
- Convert threading tests to pytest ([#5942](https://github.com/dbt-labs/dbt-core/issues/5942))
- Convert postgres index tests to pytest ([#5770](https://github.com/dbt-labs/dbt-core/issues/5770))
- Convert use color tests to pytest ([#5771](https://github.com/dbt-labs/dbt-core/issues/5771))
- Add github actions workflow to generate high level CLI API docs ([#5942](https://github.com/dbt-labs/dbt-core/issues/5942))
- Functionality-neutral refactor of event logging system to improve encapsulation and modularity. ([#6139](https://github.com/dbt-labs/dbt-core/issues/6139))
- Consolidate ParsedNode and CompiledNode classes ([#6383](https://github.com/dbt-labs/dbt-core/issues/6383))
- Prevent doc gen workflow from running on forks ([#6386](https://github.com/dbt-labs/dbt-core/issues/6386))
- Fix intermittent database connection failure in Windows CI test ([#6394](https://github.com/dbt-labs/dbt-core/issues/6394))
- Refactor and clean up manifest nodes ([#6426](https://github.com/dbt-labs/dbt-core/issues/6426))
### Dependencies
- Update pathspec requirement from ~=0.9.0 to >=0.9,<0.11 in /core ([#5917](https://github.com/dbt-labs/dbt-core/pull/5917))
- Bump black from 22.8.0 to 22.10.0 ([#6019](https://github.com/dbt-labs/dbt-core/pull/6019))
- Bump mashumaro[msgpack] from 3.0.4 to 3.1.1 in /core ([#6108](https://github.com/dbt-labs/dbt-core/pull/6108))
- Update colorama requirement from <0.4.6,>=0.3.9 to >=0.3.9,<0.4.7 in /core ([#6144](https://github.com/dbt-labs/dbt-core/pull/6144))
### Dependency
- Bump mashumaro[msgpack] from 3.1.1 to 3.2 in /core ([#4904](https://github.com/dbt-labs/dbt-core/issues/4904))
### Contributors
- [@andy-clapson](https://github.com/andy-clapson) ([dbt-docs/#5791](https://github.com/dbt-labs/dbt-docs/issues/5791))
- [@chamini2](https://github.com/chamini2) ([#6041](https://github.com/dbt-labs/dbt-core/issues/6041))
- [@daniel-murray](https://github.com/daniel-murray) ([#2968](https://github.com/dbt-labs/dbt-core/issues/2968))
- [@dave-connors-3](https://github.com/dave-connors-3) ([#5990](https://github.com/dbt-labs/dbt-core/issues/5990))
- [@dbeatty10](https://github.com/dbeatty10) ([dbt-docs/#6368](https://github.com/dbt-labs/dbt-docs/issues/6368), [#6394](https://github.com/dbt-labs/dbt-core/issues/6394))
- [@devmessias](https://github.com/devmessias) ([#6309](https://github.com/dbt-labs/dbt-core/issues/6309))
- [@eve-johns](https://github.com/eve-johns) ([#6068](https://github.com/dbt-labs/dbt-core/issues/6068))
- [@haritamar](https://github.com/haritamar) ([#6246](https://github.com/dbt-labs/dbt-core/issues/6246))
- [@jared-rimmer](https://github.com/jared-rimmer) ([#5486](https://github.com/dbt-labs/dbt-core/issues/5486))
- [@josephberni](https://github.com/josephberni) ([#2968](https://github.com/dbt-labs/dbt-core/issues/2968))
- [@joshuataylor](https://github.com/joshuataylor) ([#6147](https://github.com/dbt-labs/dbt-core/issues/6147))
- [@justbldwn](https://github.com/justbldwn) ([#6245](https://github.com/dbt-labs/dbt-core/issues/6245))
- [@luke-bassett](https://github.com/luke-bassett) ([#1350](https://github.com/dbt-labs/dbt-core/issues/1350))
- [@max-sixty](https://github.com/max-sixty) ([#5946](https://github.com/dbt-labs/dbt-core/issues/5946), [#5983](https://github.com/dbt-labs/dbt-core/issues/5983), [#6028](https://github.com/dbt-labs/dbt-core/issues/6028))
- [@paulbenschmidt](https://github.com/paulbenschmidt) ([dbt-docs/#5880](https://github.com/dbt-labs/dbt-docs/issues/5880))
- [@pgoslatara](https://github.com/pgoslatara) ([#5929](https://github.com/dbt-labs/dbt-core/issues/5929))
- [@racheldaniel](https://github.com/racheldaniel) ([#6201](https://github.com/dbt-labs/dbt-core/issues/6201))
- [@timle2](https://github.com/timle2) ([#4205](https://github.com/dbt-labs/dbt-core/issues/4205))
## Previous Releases

View File

@@ -56,7 +56,7 @@ There are some tools that will be helpful to you in developing locally. While th
These are the tools used in `dbt-core` development and testing:
- [`tox`](https://tox.readthedocs.io/en/latest/) to manage virtualenvs across python versions. We currently target the latest patch releases for Python 3.7, 3.8, 3.9, and 3.10
- [`tox`](https://tox.readthedocs.io/en/latest/) to manage virtualenvs across python versions. We currently target the latest patch releases for Python 3.7, 3.8, 3.9, 3.10, and 3.11
- [`pytest`](https://docs.pytest.org/en/latest/) to define, discover, and run tests
- [`flake8`](https://flake8.pycqa.org/en/latest/) for code linting
- [`black`](https://github.com/psf/black) for code formatting
@@ -160,7 +160,7 @@ suites.
#### `tox`
[`tox`](https://tox.readthedocs.io/en/latest/) takes care of managing virtualenvs and install dependencies in order to run tests. You can also run tests in parallel, for example, you can run unit tests for Python 3.7, Python 3.8, Python 3.9, and Python 3.10 checks in parallel with `tox -p`. Also, you can run unit tests for specific python versions with `tox -e py37`. The configuration for these tests in located in `tox.ini`.
[`tox`](https://tox.readthedocs.io/en/latest/) takes care of managing virtualenvs and installing dependencies in order to run tests. You can also run tests in parallel; for example, you can run unit tests for Python 3.7, Python 3.8, Python 3.9, Python 3.10, and Python 3.11 in parallel with `tox -p`. Also, you can run unit tests for specific python versions with `tox -e py37`. The configuration for these tests is located in `tox.ini`.
#### `pytest`
@@ -201,13 +201,21 @@ Here are some general rules for adding tests:
* Sometimes flake8 complains about lines that are actually fine, in which case you can put a comment on the line such as: # noqa or # noqa: ANNN, where ANNN is the error code that flake8 issues.
* To collect output for `CProfile`, run dbt with the `-r` option and the name of an output file, i.e. `dbt -r dbt.cprof run`. If you just want to profile parsing, you can do: `dbt -r dbt.cprof parse`. `pip` install `snakeviz` to view the output. Run `snakeviz dbt.cprof` and output will be rendered in a browser window.
## Adding a CHANGELOG Entry
## Adding or modifying a CHANGELOG Entry
We use [changie](https://changie.dev) to generate `CHANGELOG` entries. **Note:** Do not edit the `CHANGELOG.md` directly. Your modifications will be lost.
Follow the steps to [install `changie`](https://changie.dev/guide/installation/) for your system.
Once changie is installed and your PR is created, simply run `changie new` and changie will walk you through the process of creating a changelog entry. Commit the file that's created and your changelog entry is complete!
Once changie is installed and your PR is created for a new feature, simply run the following command and changie will walk you through the process of creating a changelog entry:
```shell
changie new
```
Commit the file that's created and your changelog entry is complete!
If you are contributing to a feature already in progress, you will modify the changie yaml file in dbt/.changes/unreleased/ related to your change. If you need help finding this file, please ask within the discussion for the pull request!
You don't need to worry about which `dbt-core` version your change will go into. Just create the changelog entry with `changie`, and open your PR against the `main` branch. All merged changes will be included in the next minor version of `dbt-core`. The Core maintainers _may_ choose to "backport" specific changes in order to patch older minor versions. In that case, a maintainer will take care of that backport after merging your PR, before releasing the new version of `dbt-core`.

View File
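One of the flake8 tips in the hunk above is easier to see in code: a `# noqa` comment silences a check on a single line, and naming the specific error code keeps the suppression narrow. A minimal, hypothetical example:

```python
# flake8 would flag this line as E501 (line too long), but wrapping the URL
# would make it harder to copy, so the specific check is suppressed:
DOCS_URL = "https://docs.getdbt.com/docs/package-management#section-specifying-package-versions"  # noqa: E501

# A bare "# noqa" (no code) silences *every* check on the line; prefer the
# narrower form above unless you really mean that.
value = 1  # noqa
```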

@@ -49,6 +49,9 @@ RUN apt-get update \
python3.10 \
python3.10-dev \
python3.10-venv \
python3.11 \
python3.11-dev \
python3.11-venv \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*

View File

@@ -2,50 +2,59 @@
## The following are individual files in this directory.
### deprecations.py
### flags.py
### main.py
### tracking.py
### version.py
### lib.py
### node_types.py
### helper_types.py
### links.py
### semver.py
### ui.py
### compilation.py
### constants.py
### dataclass_schema.py
### deprecations.py
### exceptions.py
### flags.py
### helper_types.py
### hooks.py
### lib.py
### links.py
### logger.py
### main.py
### node_types.py
### profiler.py
### selected_resources.py
### semver.py
### tracking.py
### ui.py
### utils.py
### version.py
## The subdirectories will be documented in a README in the subdirectory
* config
* include
* adapters
* context
* deps
* graph
* task
* cli
* clients
* config
* context
* contracts
* deps
* docs
* events
* graph
* include
* parser
* task
* tests

View File

@@ -2,6 +2,7 @@ import abc
import os
from time import sleep
import sys
import traceback
# multiprocessing.RLock is a function returning this type
from multiprocessing.synchronize import RLock
@@ -40,14 +41,16 @@ from dbt.events.functions import fire_event
from dbt.events.types import (
NewConnection,
ConnectionReused,
ConnectionLeftOpenInCleanup,
ConnectionLeftOpen,
ConnectionLeftOpen2,
ConnectionClosedInCleanup,
ConnectionClosed,
ConnectionClosed2,
Rollback,
RollbackFailed,
)
from dbt.events.contextvars import get_node_info
from dbt import flags
from dbt.utils import cast_to_str
SleepTime = Union[int, float] # As taken by time.sleep.
AdapterHandle = Any # Adapter connection handle objects can be any class.
@@ -167,7 +170,9 @@ class BaseConnectionManager(metaclass=abc.ABCMeta):
if conn.name == conn_name and conn.state == "open":
return conn
fire_event(NewConnection(conn_name=conn_name, conn_type=self.TYPE))
fire_event(
NewConnection(conn_name=conn_name, conn_type=self.TYPE, node_info=get_node_info())
)
if conn.state == "open":
fire_event(ConnectionReused(conn_name=conn_name))
@@ -304,9 +309,9 @@ class BaseConnectionManager(metaclass=abc.ABCMeta):
with self.lock:
for connection in self.thread_connections.values():
if connection.state not in {"closed", "init"}:
fire_event(ConnectionLeftOpen(conn_name=connection.name))
fire_event(ConnectionLeftOpenInCleanup(conn_name=cast_to_str(connection.name)))
else:
fire_event(ConnectionClosed(conn_name=connection.name))
fire_event(ConnectionClosedInCleanup(conn_name=cast_to_str(connection.name)))
self.close(connection)
# garbage collect these connections
@@ -332,17 +337,29 @@ class BaseConnectionManager(metaclass=abc.ABCMeta):
try:
connection.handle.rollback()
except Exception:
fire_event(RollbackFailed(conn_name=connection.name))
fire_event(
RollbackFailed(
conn_name=cast_to_str(connection.name),
exc_info=traceback.format_exc(),
node_info=get_node_info(),
)
)
@classmethod
def _close_handle(cls, connection: Connection) -> None:
"""Perform the actual close operation."""
# On windows, sometimes connection handles don't have a close() attr.
if hasattr(connection.handle, "close"):
fire_event(ConnectionClosed2(conn_name=connection.name))
fire_event(
ConnectionClosed(conn_name=cast_to_str(connection.name), node_info=get_node_info())
)
connection.handle.close()
else:
fire_event(ConnectionLeftOpen2(conn_name=connection.name))
fire_event(
ConnectionLeftOpen(
conn_name=cast_to_str(connection.name), node_info=get_node_info()
)
)
@classmethod
def _rollback(cls, connection: Connection) -> None:
@@ -353,7 +370,7 @@ class BaseConnectionManager(metaclass=abc.ABCMeta):
f'"{connection.name}", but it does not have one open!'
)
fire_event(Rollback(conn_name=connection.name))
fire_event(Rollback(conn_name=cast_to_str(connection.name), node_info=get_node_info()))
cls._rollback_handle(connection)
connection.transaction_open = False
@@ -365,7 +382,7 @@ class BaseConnectionManager(metaclass=abc.ABCMeta):
return connection
if connection.transaction_open and connection.handle:
fire_event(Rollback(conn_name=connection.name))
fire_event(Rollback(conn_name=cast_to_str(connection.name), node_info=get_node_info()))
cls._rollback_handle(connection)
connection.transaction_open = False

View File
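The connections hunks above enrich each event with `node_info=get_node_info()` instead of threading a node object through every call site. A minimal sketch of how a contextvar-backed helper in that spirit can work; the names and payload shape here are illustrative, not dbt's actual implementation:

```python
import contextvars
from typing import Any, Dict

# Holds metadata about whichever node is currently executing on this
# thread/task; an empty dict means "no node context".
_NODE_INFO: contextvars.ContextVar[Dict[str, Any]] = contextvars.ContextVar(
    "node_info", default={}
)

def set_node_info(info: Dict[str, Any]) -> contextvars.Token:
    """Record the active node before running it; returns a token for reset()."""
    return _NODE_INFO.set(info)

def get_node_info() -> Dict[str, Any]:
    return _NODE_INFO.get()

# Any event fired while the node is active picks up the context for free:
token = set_node_info({"unique_id": "model.my_project.my_model"})
try:
    print({"event": "NewConnection", "node_info": get_node_info()})
finally:
    _NODE_INFO.reset(token)
```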

@@ -15,7 +15,6 @@ from typing import (
List,
Mapping,
Iterator,
Union,
Set,
)
@@ -38,18 +37,17 @@ from dbt.adapters.protocol import (
)
from dbt.clients.agate_helper import empty_table, merge_tables, table_from_rows
from dbt.clients.jinja import MacroGenerator
from dbt.contracts.graph.compiled import CompileResultNode, CompiledSeedNode
from dbt.contracts.graph.manifest import Manifest, MacroManifest
from dbt.contracts.graph.parsed import ParsedSeedNode
from dbt.exceptions import warn_or_error
from dbt.events.functions import fire_event
from dbt.contracts.graph.nodes import ResultNode
from dbt.events.functions import fire_event, warn_or_error
from dbt.events.types import (
CacheMiss,
ListRelations,
CodeExecution,
CodeExecutionStatus,
CatalogGenerationError,
)
from dbt.utils import filter_null_values, executor
from dbt.utils import filter_null_values, executor, cast_to_str
from dbt.adapters.base.connections import Connection, AdapterResponse
from dbt.adapters.base.meta import AdapterMeta, available
@@ -61,10 +59,7 @@ from dbt.adapters.base.relation import (
)
from dbt.adapters.base import Column as BaseColumn
from dbt.adapters.base import Credentials
from dbt.adapters.cache import RelationsCache, _make_key
SeedModel = Union[ParsedSeedNode, CompiledSeedNode]
from dbt.adapters.cache import RelationsCache, _make_ref_key_msg
GET_CATALOG_MACRO_NAME = "get_catalog"
@@ -243,9 +238,7 @@ class BaseAdapter(metaclass=AdapterMeta):
return conn.name
@contextmanager
def connection_named(
self, name: str, node: Optional[CompileResultNode] = None
) -> Iterator[None]:
def connection_named(self, name: str, node: Optional[ResultNode] = None) -> Iterator[None]:
try:
if self.connections.query_header is not None:
self.connections.query_header.set(name, node)
@@ -257,7 +250,7 @@ class BaseAdapter(metaclass=AdapterMeta):
self.connections.query_header.reset()
@contextmanager
def connection_for(self, node: CompileResultNode) -> Iterator[None]:
def connection_for(self, node: ResultNode) -> Iterator[None]:
with self.connection_named(node.unique_id, node):
yield
@@ -343,7 +336,7 @@ class BaseAdapter(metaclass=AdapterMeta):
fire_event(
CacheMiss(
conn_name=self.nice_connection_name(),
database=database,
database=cast_to_str(database),
schema=schema,
)
)
@@ -372,7 +365,7 @@ class BaseAdapter(metaclass=AdapterMeta):
lowercase strings.
"""
info_schema_name_map = SchemaSearchMap()
nodes: Iterator[CompileResultNode] = chain(
nodes: Iterator[ResultNode] = chain(
[
node
for node in manifest.nodes.values()
@@ -581,7 +574,7 @@ class BaseAdapter(metaclass=AdapterMeta):
:rtype: List[self.Relation]
"""
raise NotImplementedException(
"`list_relations_without_caching` is not implemented for this " "adapter!"
"`list_relations_without_caching` is not implemented for this adapter!"
)
###
@@ -726,9 +719,9 @@ class BaseAdapter(metaclass=AdapterMeta):
relations = self.list_relations_without_caching(schema_relation)
fire_event(
ListRelations(
database=database,
database=cast_to_str(database),
schema=schema,
relations=[_make_key(x) for x in relations],
relations=[_make_ref_key_msg(x) for x in relations],
)
)
@@ -1327,7 +1320,7 @@ def catch_as_completed(
elif isinstance(exc, KeyboardInterrupt) or not isinstance(exc, Exception):
raise exc
else:
warn_or_error(f"Encountered an error while generating catalog: {str(exc)}")
warn_or_error(CatalogGenerationError(exc=str(exc)))
# exc is not None, derives from Exception, and isn't ctrl+c
exceptions.append(exc)
return merge_tables(tables), exceptions

View File
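The final hunk above swaps a pre-formatted warning string for a typed event, `CatalogGenerationError(exc=str(exc))`. A rough sketch of the pattern, with a plain dataclass standing in for dbt's protobuf-backed event types:

```python
from dataclasses import dataclass

@dataclass
class CatalogGenerationError:
    # A structured field instead of a baked-in string, so downstream log
    # consumers can filter on event type and payload.
    exc: str

    def message(self) -> str:
        return f"Encountered an error while generating catalog: {self.exc}"

def warn_or_error(event, warn_error: bool = False) -> None:
    # Under warn-error semantics, warnings are promoted to failures.
    if warn_error:
        raise RuntimeError(event.message())
    print(f"[WARNING] {event.message()}")

warn_or_error(CatalogGenerationError(exc="division by zero"))
```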

@@ -5,7 +5,7 @@ from dbt.clients.jinja import QueryStringGenerator
from dbt.context.manifest import generate_query_header_context
from dbt.contracts.connection import AdapterRequiredConfig, QueryComment
from dbt.contracts.graph.compiled import CompileResultNode
from dbt.contracts.graph.nodes import ResultNode
from dbt.contracts.graph.manifest import Manifest
from dbt.exceptions import RuntimeException
@@ -90,7 +90,7 @@ class MacroQueryStringSetter:
def reset(self):
self.set("master", None)
def set(self, name: str, node: Optional[CompileResultNode]):
def set(self, name: str, node: Optional[ResultNode]):
wrapped: Optional[NodeWrapper] = None
if node is not None:
wrapped = NodeWrapper(node)

View File

@@ -1,9 +1,8 @@
from collections.abc import Hashable
from dataclasses import dataclass
from typing import Optional, TypeVar, Any, Type, Dict, Union, Iterator, Tuple, Set
from dataclasses import dataclass, field
from typing import Optional, TypeVar, Any, Type, Dict, Iterator, Tuple, Set
from dbt.contracts.graph.compiled import CompiledNode
from dbt.contracts.graph.parsed import ParsedSourceDefinition, ParsedNode
from dbt.contracts.graph.nodes import SourceDefinition, ManifestNode, ResultNode, ParsedNode
from dbt.contracts.relation import (
RelationType,
ComponentName,
@@ -27,8 +26,10 @@ class BaseRelation(FakeAPIObject, Hashable):
path: Path
type: Optional[RelationType] = None
quote_character: str = '"'
include_policy: Policy = Policy()
quote_policy: Policy = Policy()
# Python 3.11 requires that these use default_factory instead of simple default
# ValueError: mutable default <class 'dbt.contracts.relation.Policy'> for field include_policy is not allowed: use default_factory
include_policy: Policy = field(default_factory=lambda: Policy())
quote_policy: Policy = field(default_factory=lambda: Policy())
dbt_created: bool = False
def _is_exactish_match(self, field: ComponentName, value: str) -> bool:
@@ -39,9 +40,9 @@ class BaseRelation(FakeAPIObject, Hashable):
@classmethod
def _get_field_named(cls, field_name):
for field, _ in cls._get_fields():
if field.name == field_name:
return field
for f, _ in cls._get_fields():
if f.name == field_name:
return f
# this should be unreachable
raise ValueError(f"BaseRelation has no {field_name} field!")
@@ -52,11 +53,11 @@ class BaseRelation(FakeAPIObject, Hashable):
@classmethod
def get_default_quote_policy(cls) -> Policy:
return cls._get_field_named("quote_policy").default
return cls._get_field_named("quote_policy").default_factory()
@classmethod
def get_default_include_policy(cls) -> Policy:
return cls._get_field_named("include_policy").default
return cls._get_field_named("include_policy").default_factory()
def get(self, key, default=None):
"""Override `.get` to return a metadata object so we don't break
@@ -184,7 +185,7 @@ class BaseRelation(FakeAPIObject, Hashable):
)
@classmethod
def create_from_source(cls: Type[Self], source: ParsedSourceDefinition, **kwargs: Any) -> Self:
def create_from_source(cls: Type[Self], source: SourceDefinition, **kwargs: Any) -> Self:
source_quoting = source.quoting.to_dict(omit_none=True)
source_quoting.pop("column", None)
quote_policy = deep_merge(
@@ -209,7 +210,7 @@ class BaseRelation(FakeAPIObject, Hashable):
def create_ephemeral_from_node(
cls: Type[Self],
config: HasQuoting,
node: Union[ParsedNode, CompiledNode],
node: ManifestNode,
) -> Self:
# Note that ephemeral models are based on the name.
identifier = cls.add_ephemeral_prefix(node.name)
@@ -222,7 +223,7 @@ class BaseRelation(FakeAPIObject, Hashable):
def create_from_node(
cls: Type[Self],
config: HasQuoting,
node: Union[ParsedNode, CompiledNode],
node: ManifestNode,
quote_policy: Optional[Dict[str, bool]] = None,
**kwargs: Any,
) -> Self:
@@ -243,20 +244,20 @@ class BaseRelation(FakeAPIObject, Hashable):
def create_from(
cls: Type[Self],
config: HasQuoting,
node: Union[CompiledNode, ParsedNode, ParsedSourceDefinition],
node: ResultNode,
**kwargs: Any,
) -> Self:
if node.resource_type == NodeType.Source:
if not isinstance(node, ParsedSourceDefinition):
if not isinstance(node, SourceDefinition):
raise InternalException(
"type mismatch, expected ParsedSourceDefinition but got {}".format(type(node))
"type mismatch, expected SourceDefinition but got {}".format(type(node))
)
return cls.create_from_source(node, **kwargs)
else:
if not isinstance(node, (ParsedNode, CompiledNode)):
# Can't use ManifestNode here because of parameterized generics
if not isinstance(node, (ParsedNode)):
raise InternalException(
"type mismatch, expected ParsedNode or CompiledNode but "
"got {}".format(type(node))
f"type mismatch, expected ManifestNode but got {type(node)}"
)
return cls.create_from_node(config, node, **kwargs)

View File
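The `default_factory` change above is easy to reproduce outside dbt. Python 3.11 rejects unhashable dataclass defaults at class-definition time, and instances of ordinary `eq=True` dataclasses are unhashable, so `Policy()` has to move behind a factory. A minimal reproduction with a stand-in `Policy`:

```python
from dataclasses import dataclass, field

@dataclass
class Policy:  # stand-in; dbt's Policy lives in dbt.contracts.relation
    database: bool = True
    schema: bool = True
    identifier: bool = True

# On Python 3.11 the commented-out class below raises:
#   ValueError: mutable default <class 'Policy'> for field include_policy
#   is not allowed: use default_factory
#
# @dataclass
# class BaseRelation:
#     include_policy: Policy = Policy()

@dataclass
class BaseRelation:
    # default_factory defers construction until instantiation, so each
    # instance gets a fresh Policy and 3.11's check is satisfied.
    include_policy: Policy = field(default_factory=lambda: Policy())
    quote_policy: Policy = field(default_factory=lambda: Policy())

print(BaseRelation().include_policy)
```

This is also why `get_default_quote_policy` in the same file now reads the field's `default_factory()` instead of its `default`.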

@@ -3,9 +3,14 @@ import threading
from copy import deepcopy
from typing import Any, Dict, Iterable, List, Optional, Set, Tuple
from dbt.adapters.reference_keys import _make_key, _ReferenceKey
from dbt.adapters.reference_keys import (
_make_ref_key,
_make_ref_key_msg,
_make_msg_from_ref_key,
_ReferenceKey,
)
import dbt.exceptions
from dbt.events.functions import fire_event
from dbt.events.functions import fire_event, fire_event_if
from dbt.events.types import (
AddLink,
AddRelation,
@@ -21,8 +26,8 @@ from dbt.events.types import (
UncachedRelation,
UpdateReference,
)
import dbt.flags as flags
from dbt.utils import lowercase
from dbt.helper_types import Lazy
def dot_separated(key: _ReferenceKey) -> str:
@@ -82,7 +87,7 @@ class _CachedRelation:
:return _ReferenceKey: A key for this relation.
"""
return _make_key(self)
return _make_ref_key(self)
def add_reference(self, referrer: "_CachedRelation"):
"""Add a reference from referrer to self, indicating that if this node
@@ -294,13 +299,18 @@ class RelationsCache:
:param BaseRelation dependent: The dependent model.
:raises InternalError: If either entry does not exist.
"""
ref_key = _make_key(referenced)
dep_key = _make_key(dependent)
ref_key = _make_ref_key(referenced)
dep_key = _make_ref_key(dependent)
if (ref_key.database, ref_key.schema) not in self:
# if we have not cached the referenced schema at all, we must be
# referring to a table outside our control. There's no need to make
# a link - we will never drop the referenced relation during a run.
fire_event(UncachedRelation(dep_key=dep_key, ref_key=ref_key))
fire_event(
UncachedRelation(
dep_key=_make_msg_from_ref_key(dep_key),
ref_key=_make_msg_from_ref_key(ref_key),
)
)
return
if ref_key not in self.relations:
# Insert a dummy "external" relation.
@@ -310,7 +320,11 @@ class RelationsCache:
# Insert a dummy "external" relation.
dependent = dependent.replace(type=referenced.External)
self.add(dependent)
fire_event(AddLink(dep_key=dep_key, ref_key=ref_key))
fire_event(
AddLink(
dep_key=_make_msg_from_ref_key(dep_key), ref_key=_make_msg_from_ref_key(ref_key)
)
)
with self.lock:
self._add_link(ref_key, dep_key)
@@ -321,12 +335,12 @@ class RelationsCache:
:param BaseRelation relation: The underlying relation.
"""
cached = _CachedRelation(relation)
fire_event(AddRelation(relation=_make_key(cached)))
fire_event(DumpBeforeAddGraph(dump=Lazy.defer(lambda: self.dump_graph())))
fire_event(AddRelation(relation=_make_ref_key_msg(cached)))
fire_event_if(flags.LOG_CACHE_EVENTS, lambda: DumpBeforeAddGraph(dump=self.dump_graph()))
with self.lock:
self._setdefault(cached)
fire_event(DumpAfterAddGraph(dump=Lazy.defer(lambda: self.dump_graph())))
fire_event_if(flags.LOG_CACHE_EVENTS, lambda: DumpAfterAddGraph(dump=self.dump_graph()))
def _remove_refs(self, keys):
"""Removes all references to all entries in keys. This does not
@@ -341,19 +355,6 @@ class RelationsCache:
for cached in self.relations.values():
cached.release_references(keys)
def _drop_cascade_relation(self, dropped_key):
"""Drop the given relation and cascade it appropriately to all
dependent relations.
:param _CachedRelation dropped: An existing _CachedRelation to drop.
"""
if dropped_key not in self.relations:
fire_event(DropMissingRelation(relation=dropped_key))
return
consequences = self.relations[dropped_key].collect_consequences()
fire_event(DropCascade(dropped=dropped_key, consequences=consequences))
self._remove_refs(consequences)
def drop(self, relation):
"""Drop the named relation and cascade it appropriately to all
dependent relations.
@@ -365,10 +366,19 @@ class RelationsCache:
:param str schema: The schema of the relation to drop.
:param str identifier: The identifier of the relation to drop.
"""
dropped_key = _make_key(relation)
fire_event(DropRelation(dropped=dropped_key))
dropped_key = _make_ref_key(relation)
dropped_key_msg = _make_ref_key_msg(relation)
fire_event(DropRelation(dropped=dropped_key_msg))
with self.lock:
self._drop_cascade_relation(dropped_key)
if dropped_key not in self.relations:
fire_event(DropMissingRelation(relation=dropped_key_msg))
return
consequences = self.relations[dropped_key].collect_consequences()
# convert from a list of _ReferenceKeys to a list of ReferenceKeyMsgs
consequence_msgs = [_make_msg_from_ref_key(key) for key in consequences]
fire_event(DropCascade(dropped=dropped_key_msg, consequences=consequence_msgs))
self._remove_refs(consequences)
def _rename_relation(self, old_key, new_relation):
"""Rename a relation named old_key to new_key, updating references.
@@ -390,7 +400,11 @@ class RelationsCache:
for cached in self.relations.values():
if cached.is_referenced_by(old_key):
fire_event(
UpdateReference(old_key=old_key, new_key=new_key, cached_key=cached.key())
UpdateReference(
old_key=_make_ref_key_msg(old_key),
new_key=_make_ref_key_msg(new_key),
cached_key=_make_ref_key_msg(cached.key()),
)
)
cached.rename_key(old_key, new_key)
@@ -436,7 +450,7 @@ class RelationsCache:
)
if old_key not in self.relations:
fire_event(TemporaryRelation(key=old_key))
fire_event(TemporaryRelation(key=_make_msg_from_ref_key(old_key)))
return False
return True
@@ -452,11 +466,17 @@ class RelationsCache:
:param BaseRelation new: The new relation name information.
:raises InternalError: If the new key is already present.
"""
old_key = _make_key(old)
new_key = _make_key(new)
fire_event(RenameSchema(old_key=old_key, new_key=new_key))
old_key = _make_ref_key(old)
new_key = _make_ref_key(new)
fire_event(
RenameSchema(
old_key=_make_msg_from_ref_key(old_key), new_key=_make_msg_from_ref_key(new)
)
)
fire_event(DumpBeforeRenameSchema(dump=Lazy.defer(lambda: self.dump_graph())))
fire_event_if(
flags.LOG_CACHE_EVENTS, lambda: DumpBeforeRenameSchema(dump=self.dump_graph())
)
with self.lock:
if self._check_rename_constraints(old_key, new_key):
@@ -464,7 +484,9 @@ class RelationsCache:
else:
self._setdefault(_CachedRelation(new))
fire_event(DumpAfterRenameSchema(dump=Lazy.defer(lambda: self.dump_graph())))
fire_event_if(
flags.LOG_CACHE_EVENTS, lambda: DumpAfterRenameSchema(dump=self.dump_graph())
)
def get_relations(self, database: Optional[str], schema: Optional[str]) -> List[Any]:
"""Case-insensitively yield all relations matching the given schema.
@@ -512,6 +534,6 @@ class RelationsCache:
"""
for relation in to_remove:
# it may have been cascaded out already
drop_key = _make_key(relation)
drop_key = _make_ref_key(relation)
if drop_key in self.relations:
self.drop(drop_key)

View File
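The cache hunks replace eager `Lazy.defer(...)` dumps with `fire_event_if`, so the potentially expensive `dump_graph()` call only happens when `LOG_CACHE_EVENTS` is set. A minimal sketch of that helper, assuming a boolean flag and a zero-argument event factory:

```python
from typing import Any, Callable

def fire_event(event: Any) -> None:
    print(f"event: {event}")

def fire_event_if(conditional: bool, lazy_e: Callable[[], Any]) -> None:
    # The callable is only invoked when the flag is set, so building a
    # large payload costs nothing in the common case.
    if conditional:
        fire_event(lazy_e())

LOG_CACHE_EVENTS = False

def dump_graph() -> dict:
    print("building expensive dump...")  # never runs unless the flag is on
    return {"relations": []}

fire_event_if(LOG_CACHE_EVENTS, lambda: {"DumpBeforeAddGraph": dump_graph()})
```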

@@ -1,4 +1,5 @@
import threading
import traceback
from contextlib import contextmanager
from importlib import import_module
from pathlib import Path
@@ -63,7 +64,7 @@ class AdapterContainer:
# otherwise, the error had to have come from some underlying
# library. Log the stack trace.
fire_event(PluginLoadError())
fire_event(PluginLoadError(exc_info=traceback.format_exc()))
raise
plugin: AdapterPlugin = mod.Plugin
plugin_type = plugin.adapter.type()

View File

@@ -8,7 +8,6 @@ from typing import (
Generic,
TypeVar,
Tuple,
Union,
Dict,
Any,
)
@@ -17,8 +16,7 @@ from typing_extensions import Protocol
import agate
from dbt.contracts.connection import Connection, AdapterRequiredConfig, AdapterResponse
from dbt.contracts.graph.compiled import CompiledNode, ManifestNode, NonSourceCompiledNode
from dbt.contracts.graph.parsed import ParsedNode, ParsedSourceDefinition
from dbt.contracts.graph.nodes import ResultNode, ManifestNode
from dbt.contracts.graph.model_config import BaseConfig
from dbt.contracts.graph.manifest import Manifest
from dbt.contracts.relation import Policy, HasQuoting
@@ -48,11 +46,7 @@ class RelationProtocol(Protocol):
...
@classmethod
def create_from(
cls: Type[Self],
config: HasQuoting,
node: Union[CompiledNode, ParsedNode, ParsedSourceDefinition],
) -> Self:
def create_from(cls: Type[Self], config: HasQuoting, node: ResultNode) -> Self:
...
@@ -65,7 +59,7 @@ class CompilerProtocol(Protocol):
node: ManifestNode,
manifest: Manifest,
extra_context: Optional[Dict[str, Any]] = None,
) -> NonSourceCompiledNode:
) -> ManifestNode:
...

View File

@@ -2,6 +2,7 @@
from collections import namedtuple
from typing import Any, Optional
from dbt.events.proto_types import ReferenceKeyMsg
_ReferenceKey = namedtuple("_ReferenceKey", "database schema identifier")
@@ -14,7 +15,12 @@ def lowercase(value: Optional[str]) -> Optional[str]:
return value.lower()
# For backwards compatibility. New code should use _make_ref_key
def _make_key(relation: Any) -> _ReferenceKey:
return _make_ref_key(relation)
def _make_ref_key(relation: Any) -> _ReferenceKey:
"""Make _ReferenceKeys with lowercase values for the cache so we don't have
to keep track of quoting
"""
@@ -22,3 +28,13 @@ def _make_key(relation: Any) -> _ReferenceKey:
return _ReferenceKey(
lowercase(relation.database), lowercase(relation.schema), lowercase(relation.identifier)
)
def _make_ref_key_msg(relation: Any):
return _make_msg_from_ref_key(_make_ref_key(relation))
def _make_msg_from_ref_key(ref_key: _ReferenceKey) -> ReferenceKeyMsg:
return ReferenceKeyMsg(
database=ref_key.database, schema=ref_key.schema, identifier=ref_key.identifier
)

View File
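The helpers above separate the cache key from its wire format: `_make_ref_key` lowercases every component so cache lookups ignore quoting and case, and `_make_msg_from_ref_key` converts the result into a protobuf message for structured logging. A self-contained sketch, with a plain dataclass standing in for the generated `ReferenceKeyMsg`:

```python
from collections import namedtuple
from dataclasses import dataclass
from typing import Any, Optional

_ReferenceKey = namedtuple("_ReferenceKey", "database schema identifier")

def lowercase(value: Optional[str]) -> Optional[str]:
    return None if value is None else value.lower()

def _make_ref_key(relation: Any) -> _ReferenceKey:
    # Lowercase each component so "MY_TABLE" and my_table hit the same entry.
    return _ReferenceKey(
        lowercase(relation.database),
        lowercase(relation.schema),
        lowercase(relation.identifier),
    )

@dataclass
class ReferenceKeyMsg:  # stand-in for the proto-generated type
    database: Optional[str]
    schema: Optional[str]
    identifier: Optional[str]

def _make_msg_from_ref_key(ref_key: _ReferenceKey) -> ReferenceKeyMsg:
    return ReferenceKeyMsg(*ref_key)

class FakeRelation:
    database, schema, identifier = "Analytics", "PUBLIC", "My_Table"

print(_make_msg_from_ref_key(_make_ref_key(FakeRelation)))
```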

@@ -10,6 +10,8 @@ from dbt.adapters.base import BaseConnectionManager
from dbt.contracts.connection import Connection, ConnectionState, AdapterResponse
from dbt.events.functions import fire_event
from dbt.events.types import ConnectionUsed, SQLQuery, SQLCommit, SQLQueryStatus
from dbt.events.contextvars import get_node_info
from dbt.utils import cast_to_str
class SQLConnectionManager(BaseConnectionManager):
@@ -55,7 +57,13 @@ class SQLConnectionManager(BaseConnectionManager):
connection = self.get_thread_connection()
if auto_begin and connection.transaction_open is False:
self.begin()
fire_event(ConnectionUsed(conn_type=self.TYPE, conn_name=connection.name))
fire_event(
ConnectionUsed(
conn_type=self.TYPE,
conn_name=cast_to_str(connection.name),
node_info=get_node_info(),
)
)
with self.exception_handler(sql):
if abridge_sql_log:
@@ -63,7 +71,11 @@ class SQLConnectionManager(BaseConnectionManager):
else:
log_sql = sql
fire_event(SQLQuery(conn_name=connection.name, sql=log_sql))
fire_event(
SQLQuery(
conn_name=cast_to_str(connection.name), sql=log_sql, node_info=get_node_info()
)
)
pre = time.time()
cursor = connection.handle.cursor()
@@ -71,7 +83,9 @@ class SQLConnectionManager(BaseConnectionManager):
fire_event(
SQLQueryStatus(
status=str(self.get_response(cursor)), elapsed=round((time.time() - pre), 2)
status=str(self.get_response(cursor)),
elapsed=round((time.time() - pre)),
node_info=get_node_info(),
)
)
@@ -155,7 +169,7 @@ class SQLConnectionManager(BaseConnectionManager):
"it does not have one open!".format(connection.name)
)
fire_event(SQLCommit(conn_name=connection.name))
fire_event(SQLCommit(conn_name=connection.name, node_info=get_node_info()))
self.add_commit_query()
connection.transaction_open = False

View File
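One detail in the `add_query` hunk above: when `abridge_sql_log` is set, only a prefix of the SQL is attached to the `SQLQuery` event, which keeps bulk inserts from bloating the logs. A sketch of that truncation step; the 512-character cutoff is an assumption for illustration:

```python
def abridged(sql: str, max_len: int = 512) -> str:
    # Seed INSERTs and similar statements can run to megabytes; log only
    # a prefix when abridging is requested.
    return sql if len(sql) <= max_len else sql[:max_len] + "..."

abridge_sql_log = True
sql = "insert into my_table values " + ", ".join(f"({i})" for i in range(10_000))
log_sql = abridged(sql) if abridge_sql_log else sql
print(len(sql), "->", len(log_sql))
```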

@@ -5,7 +5,7 @@ import dbt.clients.agate_helper
from dbt.contracts.connection import Connection
import dbt.exceptions
from dbt.adapters.base import BaseAdapter, available
from dbt.adapters.cache import _make_key
from dbt.adapters.cache import _make_ref_key_msg
from dbt.adapters.sql import SQLConnectionManager
from dbt.events.functions import fire_event
from dbt.events.types import ColTypeChange, SchemaCreation, SchemaDrop
@@ -110,7 +110,7 @@ class SQLAdapter(BaseAdapter):
ColTypeChange(
orig_type=target_column.data_type,
new_type=new_type,
table=_make_key(current),
table=_make_ref_key_msg(current),
)
)
@@ -155,7 +155,7 @@ class SQLAdapter(BaseAdapter):
def create_schema(self, relation: BaseRelation) -> None:
relation = relation.without_identifier()
fire_event(SchemaCreation(relation=_make_key(relation)))
fire_event(SchemaCreation(relation=_make_ref_key_msg(relation)))
kwargs = {
"relation": relation,
}
@@ -166,7 +166,7 @@ class SQLAdapter(BaseAdapter):
def drop_schema(self, relation: BaseRelation) -> None:
relation = relation.without_identifier()
fire_event(SchemaDrop(relation=_make_key(relation)))
fire_event(SchemaDrop(relation=_make_ref_key_msg(relation)))
kwargs = {
"relation": relation,
}

View File

@@ -31,7 +31,6 @@ def cli_runner():
@p.cache_selected_only
@p.debug
@p.enable_legacy_logger
@p.event_buffer_size
@p.fail_fast
@p.log_cache_events
@p.log_format

View File

@@ -80,14 +80,6 @@ enable_legacy_logger = click.option(
hidden=True,
)
event_buffer_size = click.option(
"--event-buffer-size",
envvar="DBT_EVENT_BUFFER_SIZE",
help="Sets the max number of events to buffer in EVENT_HISTORY.",
default=100000,
type=click.INT,
)
exclude = click.option("--exclude", envvar=None, help="Specify the nodes to exclude.")
fail_fast = click.option(

View File

@@ -367,9 +367,9 @@ class BlockIterator:
if self.current:
linecount = self.data[: self.current.end].count("\n") + 1
dbt.exceptions.raise_compiler_error(
(
"Reached EOF without finding a close tag for " "{} (searched from line {})"
).format(self.current.block_type_name, linecount)
("Reached EOF without finding a close tag for {} (searched from line {})").format(
self.current.block_type_name, linecount
)
)
if collect_raw_data:

View File

@@ -25,8 +25,7 @@ from dbt.utils import (
)
from dbt.clients._jinja_blocks import BlockIterator, BlockData, BlockTag
from dbt.contracts.graph.compiled import CompiledGenericTestNode
from dbt.contracts.graph.parsed import ParsedGenericTestNode
from dbt.contracts.graph.nodes import GenericTestNode
from dbt.exceptions import (
InternalException,
@@ -620,7 +619,7 @@ GENERIC_TEST_KWARGS_NAME = "_dbt_generic_test_kwargs"
def add_rendered_test_kwargs(
context: Dict[str, Any],
node: Union[ParsedGenericTestNode, CompiledGenericTestNode],
node: GenericTestNode,
capture_macros: bool = False,
) -> None:
"""Render each of the test kwargs in the given context using the native

View File

@@ -3,9 +3,9 @@ from typing import Any, Dict, List
import requests
from dbt.events.functions import fire_event
from dbt.events.types import (
RegistryProgressMakingGETRequest,
RegistryProgressGETRequest,
RegistryProgressGETResponse,
RegistryIndexProgressMakingGETRequest,
RegistryIndexProgressGETRequest,
RegistryIndexProgressGETResponse,
RegistryResponseUnexpectedType,
RegistryResponseMissingTopKeys,
@@ -38,7 +38,7 @@ def _get_with_retries(package_name, registry_base_url=None):
def _get(package_name, registry_base_url=None):
url = _get_url(package_name, registry_base_url)
fire_event(RegistryProgressMakingGETRequest(url=url))
fire_event(RegistryProgressGETRequest(url=url))
# all exceptions from requests get caught in the retry logic so no need to wrap this here
resp = requests.get(url, timeout=30)
fire_event(RegistryProgressGETResponse(url=url, resp_code=resp.status_code))
@@ -162,7 +162,7 @@ def get_compatible_versions(package_name, dbt_version, should_version_check) ->
def _get_index(registry_base_url=None):
url = _get_url("index", registry_base_url)
fire_event(RegistryIndexProgressMakingGETRequest(url=url))
fire_event(RegistryIndexProgressGETRequest(url=url))
# all exceptions from requests get caught in the retry logic so no need to wrap this here
resp = requests.get(url, timeout=30)
fire_event(RegistryIndexProgressGETResponse(url=url, resp_code=resp.status_code))

View File

@@ -1,6 +1,6 @@
import os
from collections import defaultdict
from typing import List, Dict, Any, Tuple, cast, Optional
from typing import List, Dict, Any, Tuple, Optional
import networkx as nx # type: ignore
import pickle
@@ -12,15 +12,14 @@ from dbt.clients import jinja
from dbt.clients.system import make_directory
from dbt.context.providers import generate_runtime_model_context
from dbt.contracts.graph.manifest import Manifest, UniqueID
from dbt.contracts.graph.compiled import (
COMPILED_TYPES,
CompiledGenericTestNode,
from dbt.contracts.graph.nodes import (
ManifestNode,
ManifestSQLNode,
GenericTestNode,
GraphMemberNode,
InjectedCTE,
ManifestNode,
NonSourceCompiledNode,
SeedNode,
)
from dbt.contracts.graph.parsed import ParsedNode
from dbt.exceptions import (
dependency_not_found,
InternalException,
@@ -28,7 +27,8 @@ from dbt.exceptions import (
)
from dbt.graph import Graph
from dbt.events.functions import fire_event
from dbt.events.types import FoundStats, CompilingNode, WritingInjectedSQLForNode
from dbt.events.types import FoundStats, WritingInjectedSQLForNode
from dbt.events.contextvars import get_node_info
from dbt.node_types import NodeType, ModelLanguage
from dbt.events.format import pluralize
import dbt.tracking
@@ -36,14 +36,6 @@ import dbt.tracking
graph_file_name = "graph.gpickle"
def _compiled_type_for(model: ParsedNode):
if type(model) not in COMPILED_TYPES:
raise InternalException(
f"Asked to compile {type(model)} node, but it has no compiled form"
)
return COMPILED_TYPES[type(model)]
def print_compile_stats(stats):
names = {
NodeType.Model: "model",
@@ -176,7 +168,7 @@ class Compiler:
# a dict for jinja rendering of SQL
def _create_node_context(
self,
node: NonSourceCompiledNode,
node: ManifestSQLNode,
manifest: Manifest,
extra_context: Dict[str, Any],
) -> Dict[str, Any]:
@@ -184,7 +176,7 @@ class Compiler:
context = generate_runtime_model_context(node, self.config, manifest)
context.update(extra_context)
if isinstance(node, CompiledGenericTestNode):
if isinstance(node, GenericTestNode):
# for test nodes, add a special keyword args value to the context
jinja.add_rendered_test_kwargs(context, node)
@@ -195,14 +187,6 @@ class Compiler:
relation_cls = adapter.Relation
return relation_cls.add_ephemeral_prefix(name)
def _get_relation_name(self, node: ParsedNode):
relation_name = None
if node.is_relational and not node.is_ephemeral_model:
adapter = get_adapter(self.config)
relation_cls = adapter.Relation
relation_name = str(relation_cls.create_from(self.config, node))
return relation_name
def _inject_ctes_into_sql(self, sql: str, ctes: List[InjectedCTE]) -> str:
"""
`ctes` is a list of InjectedCTEs like:
@@ -261,10 +245,10 @@ class Compiler:
def _recursively_prepend_ctes(
self,
model: NonSourceCompiledNode,
model: ManifestSQLNode,
manifest: Manifest,
extra_context: Optional[Dict[str, Any]],
) -> Tuple[NonSourceCompiledNode, List[InjectedCTE]]:
) -> Tuple[ManifestSQLNode, List[InjectedCTE]]:
"""This method is called by the 'compile_node' method. Starting
from the node that it is passed in, it will recursively call
itself using the 'extra_ctes'. The 'ephemeral' models do
@@ -279,7 +263,8 @@ class Compiler:
# Just to make it plain that nothing is actually injected for this case
if not model.extra_ctes:
model.extra_ctes_injected = True
if not isinstance(model, SeedNode):
model.extra_ctes_injected = True
manifest.update_node(model)
return (model, model.extra_ctes)
@@ -298,6 +283,7 @@ class Compiler:
f"could not be resolved: {cte.id}"
)
cte_model = manifest.nodes[cte.id]
assert not isinstance(cte_model, SeedNode)
if not cte_model.is_ephemeral_model:
raise InternalException(f"{cte.id} is not ephemeral")
@@ -305,8 +291,6 @@ class Compiler:
# This model has already been compiled, so it's been
# through here before
if getattr(cte_model, "compiled", False):
assert isinstance(cte_model, tuple(COMPILED_TYPES.values()))
cte_model = cast(NonSourceCompiledNode, cte_model)
new_prepended_ctes = cte_model.extra_ctes
# if the cte_model isn't compiled, i.e. first time here
@@ -343,21 +327,19 @@ class Compiler:
return model, prepended_ctes
# creates a compiled_node from the ManifestNode passed in,
# Sets compiled fields in the ManifestSQLNode passed in,
# creates a "context" dictionary for jinja rendering,
# and then renders the "compiled_code" using the node, the
# raw_code and the context.
def _compile_node(
self,
node: ManifestNode,
node: ManifestSQLNode,
manifest: Manifest,
extra_context: Optional[Dict[str, Any]] = None,
) -> NonSourceCompiledNode:
) -> ManifestSQLNode:
if extra_context is None:
extra_context = {}
fire_event(CompilingNode(unique_id=node.unique_id))
data = node.to_dict(omit_none=True)
data.update(
{
@@ -367,9 +349,8 @@ class Compiler:
"extra_ctes": [],
}
)
compiled_node = _compiled_type_for(node).from_dict(data)
if compiled_node.language == ModelLanguage.python:
if node.language == ModelLanguage.python:
# TODO could we also 'minify' this code at all? just aesthetic, not functional
# quoating seems like something very specific to sql so far
@@ -377,7 +358,7 @@ class Compiler:
# TODO try to find better way to do this, given that
original_quoting = self.config.quoting
self.config.quoting = {key: False for key in original_quoting.keys()}
context = self._create_node_context(compiled_node, manifest, extra_context)
context = self._create_node_context(node, manifest, extra_context)
postfix = jinja.get_rendered(
"{{ py_script_postfix(model) }}",
@@ -385,23 +366,21 @@ class Compiler:
node,
)
# we should NOT jinja render the python model's 'raw code'
compiled_node.compiled_code = f"{node.raw_code}\n\n{postfix}"
node.compiled_code = f"{node.raw_code}\n\n{postfix}"
# restore quoting settings in the end since context is lazy evaluated
self.config.quoting = original_quoting
else:
context = self._create_node_context(compiled_node, manifest, extra_context)
compiled_node.compiled_code = jinja.get_rendered(
context = self._create_node_context(node, manifest, extra_context)
node.compiled_code = jinja.get_rendered(
node.raw_code,
context,
node,
)
compiled_node.relation_name = self._get_relation_name(node)
node.compiled = True
compiled_node.compiled = True
return compiled_node
return node
def write_graph_file(self, linker: Linker, manifest: Manifest):
filename = graph_file_name
@@ -508,10 +487,13 @@ class Compiler:
return Graph(linker.graph)
# writes the "compiled_code" into the target/compiled directory
def _write_node(self, node: NonSourceCompiledNode) -> ManifestNode:
if not node.extra_ctes_injected or node.resource_type == NodeType.Snapshot:
def _write_node(self, node: ManifestSQLNode) -> ManifestSQLNode:
if not node.extra_ctes_injected or node.resource_type in (
NodeType.Snapshot,
NodeType.Seed,
):
return node
fire_event(WritingInjectedSQLForNode(unique_id=node.unique_id))
fire_event(WritingInjectedSQLForNode(node_info=get_node_info()))
if node.compiled_code:
node.compiled_path = node.write_node(
@@ -521,11 +503,11 @@ class Compiler:
def compile_node(
self,
node: ManifestNode,
node: ManifestSQLNode,
manifest: Manifest,
extra_context: Optional[Dict[str, Any]] = None,
write: bool = True,
) -> NonSourceCompiledNode:
) -> ManifestSQLNode:
"""This is the main entry point into this code. It's called by
CompileRunner.compile, GenericRPCRunner.compile, and
RunTask.get_hook_sql. It calls '_compile_node' to convert

View File
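The compiler hunks keep CTE injection itself intact; only the node types change. For readers following `_recursively_prepend_ctes`, a toy version of the final prepend step may help. It is deliberately naive: it assumes the model's SQL has no existing `with` clause, which the real compiler must handle:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class InjectedCTE:
    id: str   # unique_id of the ephemeral model supplying the CTE
    sql: str  # "name as (select ...)" fragment

def inject_ctes_into_sql(sql: str, ctes: List[InjectedCTE]) -> str:
    if not ctes:
        return sql
    with_clause = "with " + ",\n".join(cte.sql for cte in ctes)
    return f"{with_clause}\n{sql}"

ctes = [InjectedCTE(
    id="model.demo.ephemeral_users",
    sql="__dbt__cte__ephemeral_users as (select * from raw.users)",
)]
print(inject_ctes_into_sql("select count(*) from __dbt__cte__ephemeral_users", ctes))
```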

@@ -248,7 +248,7 @@ class PartialProject(RenderComponents):
project_name: Optional[str] = field(
metadata=dict(
description=(
"The name of the project. This should always be set and will not " "be rendered"
"The name of the project. This should always be set and will not be rendered"
)
)
)
@@ -668,7 +668,7 @@ class Project:
def get_selector(self, name: str) -> Union[SelectionSpec, bool]:
if name not in self.selectors:
raise RuntimeException(
f"Could not find selector named {name}, expected one of " f"{list(self.selectors)}"
f"Could not find selector named {name}, expected one of {list(self.selectors)}"
)
return self.selectors[name]["definition"]

View File

@@ -3,31 +3,41 @@ import os
from copy import deepcopy
from dataclasses import dataclass, field
from pathlib import Path
from typing import Dict, Any, Optional, Mapping, Iterator, Iterable, Tuple, List, MutableSet, Type
from typing import (
Any,
Dict,
Iterable,
Iterator,
Mapping,
MutableSet,
Optional,
Tuple,
Type,
Union,
)
from .profile import Profile
from .project import Project
from .renderer import DbtProjectYamlRenderer, ProfileRenderer
from .utils import parse_cli_vars
from dbt import flags
from dbt.adapters.factory import get_relation_class_by_name, get_include_paths
from dbt.helper_types import FQNPath, PathSet, DictDefaultEmptyStr
from dbt.adapters.factory import get_include_paths, get_relation_class_by_name
from dbt.config.profile import read_user_config
from dbt.contracts.connection import AdapterRequiredConfig, Credentials
from dbt.contracts.graph.manifest import ManifestMetadata
from dbt.contracts.relation import ComponentName
from dbt.ui import warning_tag
from dbt.contracts.project import Configuration, UserConfig
from dbt.exceptions import (
RuntimeException,
DbtProjectError,
validator_error_message,
warn_or_error,
raise_compiler_error,
)
from dbt.contracts.relation import ComponentName
from dbt.dataclass_schema import ValidationError
from dbt.exceptions import (
DbtProjectError,
RuntimeException,
raise_compiler_error,
validator_error_message,
)
from dbt.events.functions import warn_or_error
from dbt.events.types import UnusedResourceConfigPath
from dbt.helper_types import DictDefaultEmptyStr, FQNPath, PathSet
from .profile import Profile
from .project import Project, PartialProject
from .renderer import DbtProjectYamlRenderer, ProfileRenderer
from .utils import parse_cli_vars
def _project_quoting_dict(proj: Project, profile: Profile) -> Dict[ComponentName, bool]:
@@ -190,28 +200,52 @@ class RuntimeConfig(Project, Profile, AdapterRequiredConfig):
@classmethod
def collect_parts(cls: Type["RuntimeConfig"], args: Any) -> Tuple[Project, Profile]:
# profile_name from the project
project_root = args.project_dir if args.project_dir else os.getcwd()
version_check = bool(flags.VERSION_CHECK)
partial = Project.partial_load(project_root, verify_version=version_check)
# build the profile using the base renderer and the one fact we know
# Note: only the named profile section is rendered. The rest of the
# profile is ignored.
cli_vars: Dict[str, Any] = parse_cli_vars(getattr(args, "vars", "{}"))
profile = cls.collect_profile(args=args)
project_renderer = DbtProjectYamlRenderer(profile, cli_vars)
project = cls.collect_project(args=args, project_renderer=project_renderer)
assert type(project) is Project
return (project, profile)
@classmethod
def collect_profile(
cls: Type["RuntimeConfig"], args: Any, profile_name: Optional[str] = None
) -> Profile:
cli_vars: Dict[str, Any] = parse_cli_vars(getattr(args, "vars", "{}"))
profile_renderer = ProfileRenderer(cli_vars)
profile_name = partial.render_profile_name(profile_renderer)
# build the profile using the base renderer and the one fact we know
if profile_name is None:
# Note: only the named profile section is rendered here. The rest of the
# profile is ignored.
partial = cls.collect_project(args)
assert type(partial) is PartialProject
profile_name = partial.render_profile_name(profile_renderer)
profile = cls._get_rendered_profile(args, profile_renderer, profile_name)
# Save env_vars encountered in rendering for partial parsing
profile.profile_env_vars = profile_renderer.ctx_obj.env_vars
return profile
# get a new renderer using our target information and render the
# project
project_renderer = DbtProjectYamlRenderer(profile, cli_vars)
project = partial.render(project_renderer)
# Save env_vars encountered in rendering for partial parsing
project.project_env_vars = project_renderer.ctx_obj.env_vars
return (project, profile)
@classmethod
def collect_project(
cls: Type["RuntimeConfig"],
args: Any,
project_renderer: Optional[DbtProjectYamlRenderer] = None,
) -> Union[Project, PartialProject]:
project_root = args.project_dir if args.project_dir else os.getcwd()
version_check = bool(flags.VERSION_CHECK)
partial = Project.partial_load(project_root, verify_version=version_check)
if project_renderer is None:
return partial
else:
project = partial.render(project_renderer)
project.project_env_vars = project_renderer.ctx_obj.env_vars
return project
# Called in main.py, lib.py, task/base.py
@classmethod
@@ -280,11 +314,11 @@ class RuntimeConfig(Project, Profile, AdapterRequiredConfig):
"exposures": self._get_config_paths(self.exposures),
}
def get_unused_resource_config_paths(
def warn_for_unused_resource_config_paths(
self,
resource_fqns: Mapping[str, PathSet],
disabled: PathSet,
) -> List[FQNPath]:
) -> None:
"""Return a list of lists of strings, where each inner list of strings
represents a type + FQN path of a resource configuration that is not
used.
@@ -298,23 +332,13 @@ class RuntimeConfig(Project, Profile, AdapterRequiredConfig):
for config_path in config_paths:
if not _is_config_used(config_path, fqns):
unused_resource_config_paths.append((resource_type,) + config_path)
return unused_resource_config_paths
resource_path = ".".join(i for i in ((resource_type,) + config_path))
unused_resource_config_paths.append(resource_path)
def warn_for_unused_resource_config_paths(
self,
resource_fqns: Mapping[str, PathSet],
disabled: PathSet,
) -> None:
unused = self.get_unused_resource_config_paths(resource_fqns, disabled)
if len(unused) == 0:
if len(unused_resource_config_paths) == 0:
return
msg = UNUSED_RESOURCE_CONFIGURATION_PATH_MESSAGE.format(
len(unused), "\n".join("- {}".format(".".join(u)) for u in unused)
)
warn_or_error(msg, log_fmt=warning_tag("{}"))
warn_or_error(UnusedResourceConfigPath(unused_config_paths=unused_resource_config_paths))
def load_dependencies(self, base_only=False) -> Mapping[str, "RuntimeConfig"]:
if self.dependencies is None:
@@ -591,14 +615,6 @@ class UnsetProfileConfig(RuntimeConfig):
return cls.from_parts(project=project, profile=profile, args=args)
UNUSED_RESOURCE_CONFIGURATION_PATH_MESSAGE = """\
Configuration paths exist in your dbt_project.yml file which do not \
apply to any resources.
There are {} unused configuration paths:
{}
"""
def _is_config_used(path, fqns):
if fqns:
for fqn in fqns:

View File
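The refactor above splits `collect_parts` into `collect_profile` and `collect_project`, so a caller such as dbt-server can resolve the profile on its own and override the profile name without touching `dbt_project.yml`. A stubbed-out sketch of the new two-phase flow; every class here is a stand-in for the real dbt object:

```python
from typing import Any, Optional

class Profile:
    def __init__(self, name: str) -> None:
        self.profile_name = name

class PartialProject:
    """A loaded but not-yet-rendered dbt_project.yml."""
    def render_profile_name(self) -> str:
        return "profile_from_dbt_project_yml"
    def render(self, renderer: Any) -> str:
        return "fully rendered Project"

def collect_profile(profile_name: Optional[str] = None) -> Profile:
    if profile_name is None:
        # Only load the project far enough to read its profile name.
        profile_name = PartialProject().render_profile_name()
    return Profile(profile_name)

def collect_project(project_renderer: Optional[Any] = None):
    partial = PartialProject()
    # With no renderer, the caller gets the unrendered PartialProject back.
    return partial if project_renderer is None else partial.render(project_renderer)

print(collect_profile("override_from_server").profile_name)  # server override
print(collect_project())           # PartialProject instance
print(collect_project(object()))   # "fully rendered Project"
```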

@@ -1,2 +1,10 @@
SECRET_ENV_PREFIX = "DBT_ENV_SECRET_"
DEFAULT_ENV_PLACEHOLDER = "DBT_DEFAULT_PLACEHOLDER"
METADATA_ENV_PREFIX = "DBT_ENV_CUSTOM_ENV_"
MAXIMUM_SEED_SIZE = 1 * 1024 * 1024
MAXIMUM_SEED_SIZE_NAME = "1MB"
PIN_PACKAGE_URL = (
"https://docs.getdbt.com/docs/package-management#section-specifying-package-versions"
)

View File

@@ -4,10 +4,11 @@ from typing import Any, Dict, NoReturn, Optional, Mapping, Iterable, Set, List
from dbt import flags
from dbt import tracking
from dbt import utils
from dbt.clients.jinja import get_rendered
from dbt.clients.yaml_helper import yaml, safe_load, SafeLoader, Loader, Dumper # noqa: F401
from dbt.constants import SECRET_ENV_PREFIX, DEFAULT_ENV_PLACEHOLDER
from dbt.contracts.graph.compiled import CompiledResource
from dbt.contracts.graph.nodes import Resource
from dbt.exceptions import (
CompilationException,
MacroReturn,
@@ -16,7 +17,8 @@ from dbt.exceptions import (
disallow_secret_env_var,
)
from dbt.events.functions import fire_event, get_invocation_id
from dbt.events.types import MacroEventInfo, MacroEventDebug
from dbt.events.types import JinjaLogInfo, JinjaLogDebug
from dbt.events.contextvars import get_node_info
from dbt.version import __version__ as dbt_version
# These modules are added to the context. Consider alternative
@@ -126,18 +128,18 @@ class ContextMeta(type):
class Var:
UndefinedVarError = "Required var '{}' not found in config:\nVars " "supplied to {} = {}"
UndefinedVarError = "Required var '{}' not found in config:\nVars supplied to {} = {}"
_VAR_NOTSET = object()
def __init__(
self,
context: Mapping[str, Any],
cli_vars: Mapping[str, Any],
node: Optional[CompiledResource] = None,
node: Optional[Resource] = None,
) -> None:
self._context: Mapping[str, Any] = context
self._cli_vars: Mapping[str, Any] = cli_vars
self._node: Optional[CompiledResource] = node
self._node: Optional[Resource] = node
self._merged: Mapping[str, Any] = self._generate_merged()
def _generate_merged(self) -> Mapping[str, Any]:
@@ -557,9 +559,9 @@ class BaseContext(metaclass=ContextMeta):
{% endmacro %}"
"""
if info:
fire_event(MacroEventInfo(msg=msg))
fire_event(JinjaLogInfo(msg=msg, node_info=get_node_info()))
else:
fire_event(MacroEventDebug(msg=msg))
fire_event(JinjaLogDebug(msg=msg, node_info=get_node_info()))
return ""
@contextproperty
@@ -687,6 +689,19 @@ class BaseContext(metaclass=ContextMeta):
dict_diff.update({k: dict_a[k]})
return dict_diff
@contextmember
@staticmethod
def local_md5(value: str) -> str:
"""Calculates an MD5 hash of the given string.
It's called "local_md5" to emphasize that it runs locally in dbt (in jinja context) and not an MD5 SQL command.
:param value: The value to hash
Usage:
{% set value_hash = local_md5("hello world") %}
"""
return utils.md5(value)
def generate_base_context(cli_vars: Dict[str, Any]) -> Dict[str, Any]:
ctx = BaseContext(cli_vars)

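The new local_md5 context member delegates to dbt.utils.md5; assuming that helper is the usual hashlib wrapper, the behavior is equivalent to:

import hashlib

def local_md5(value: str) -> str:
    # hex digest of the UTF-8 encoded input, computed in-process
    return hashlib.md5(value.encode("utf-8")).hexdigest()

print(local_md5("hello world"))  # 5eb63bbbe01eeed093cb22bb8f5acdc3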

@@ -5,9 +5,8 @@ from dbt.exceptions import (
doc_target_not_found,
)
from dbt.config.runtime import RuntimeConfig
from dbt.contracts.graph.compiled import CompileResultNode
from dbt.contracts.graph.manifest import Manifest
from dbt.contracts.graph.parsed import ParsedMacro
from dbt.contracts.graph.nodes import Macro, ResultNode
from dbt.context.base import contextmember
from dbt.context.configured import SchemaYamlContext
@@ -17,7 +16,7 @@ class DocsRuntimeContext(SchemaYamlContext):
def __init__(
self,
config: RuntimeConfig,
node: Union[ParsedMacro, CompileResultNode],
node: Union[Macro, ResultNode],
manifest: Manifest,
current_project: str,
) -> None:
@@ -55,7 +54,7 @@ class DocsRuntimeContext(SchemaYamlContext):
else:
doc_invalid_args(self.node, args)
# ParsedDocumentation
# Documentation
target_doc = self.manifest.resolve_doc(
doc_name,
doc_package_name,


@@ -1,10 +1,10 @@
from typing import Dict, MutableMapping, Optional
from dbt.contracts.graph.parsed import ParsedMacro
from dbt.contracts.graph.nodes import Macro
from dbt.exceptions import raise_duplicate_macro_name, raise_compiler_error
from dbt.include.global_project import PROJECT_NAME as GLOBAL_PROJECT_NAME
from dbt.clients.jinja import MacroGenerator
MacroNamespace = Dict[str, ParsedMacro]
MacroNamespace = Dict[str, Macro]
# This class builds the MacroResolver by adding macros
@@ -21,7 +21,7 @@ MacroNamespace = Dict[str, ParsedMacro]
class MacroResolver:
def __init__(
self,
macros: MutableMapping[str, ParsedMacro],
macros: MutableMapping[str, Macro],
root_project_name: str,
internal_package_names,
) -> None:
@@ -77,7 +77,7 @@ class MacroResolver:
def _add_macro_to(
self,
package_namespaces: Dict[str, MacroNamespace],
macro: ParsedMacro,
macro: Macro,
):
if macro.package_name in package_namespaces:
namespace = package_namespaces[macro.package_name]
@@ -89,7 +89,7 @@ class MacroResolver:
raise_duplicate_macro_name(macro, macro, macro.package_name)
package_namespaces[macro.package_name][macro.name] = macro
def add_macro(self, macro: ParsedMacro):
def add_macro(self, macro: Macro):
macro_name: str = macro.name
# internal macros (from plugins) will be processed separately from


@@ -1,7 +1,7 @@
from typing import Any, Dict, Iterable, Union, Optional, List, Iterator, Mapping, Set
from dbt.clients.jinja import MacroGenerator, MacroStack
from dbt.contracts.graph.parsed import ParsedMacro
from dbt.contracts.graph.nodes import Macro
from dbt.include.global_project import PROJECT_NAME as GLOBAL_PROJECT_NAME
from dbt.exceptions import raise_duplicate_macro_name, raise_compiler_error
@@ -112,7 +112,7 @@ class MacroNamespaceBuilder:
def _add_macro_to(
self,
hierarchy: Dict[str, FlatNamespace],
macro: ParsedMacro,
macro: Macro,
macro_func: MacroGenerator,
):
if macro.package_name in hierarchy:
@@ -125,7 +125,7 @@ class MacroNamespaceBuilder:
raise_duplicate_macro_name(macro_func.macro, macro, macro.package_name)
hierarchy[macro.package_name][macro.name] = macro_func
def add_macro(self, macro: ParsedMacro, ctx: Dict[str, Any]):
def add_macro(self, macro: Macro, ctx: Dict[str, Any]):
macro_name: str = macro.name
# MacroGenerator is in clients/jinja.py
@@ -147,13 +147,11 @@ class MacroNamespaceBuilder:
elif macro.package_name == self.root_package:
self.globals[macro_name] = macro_func
def add_macros(self, macros: Iterable[ParsedMacro], ctx: Dict[str, Any]):
def add_macros(self, macros: Iterable[Macro], ctx: Dict[str, Any]):
for macro in macros:
self.add_macro(macro, ctx)
def build_namespace(
self, macros: Iterable[ParsedMacro], ctx: Dict[str, Any]
) -> MacroNamespace:
def build_namespace(self, macros: Iterable[Macro], ctx: Dict[str, Any]) -> MacroNamespace:
self.add_macros(macros, ctx)
# Iterate in reverse-order and overwrite: the packages that are first

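The truncated comment above describes the precedence rule: namespaces are merged in reverse so that packages earlier in the list win. A toy illustration with plain dicts (package names hypothetical):

# ordered highest-priority first: root project, then an imported package
namespaces = [{"my_macro": "root_project"}, {"my_macro": "imported_pkg"}]

merged = {}
for ns in reversed(namespaces):
    merged.update(ns)  # earlier namespaces are applied last, so they win

print(merged)  # {'my_macro': 'root_project'}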

@@ -28,20 +28,17 @@ from .macros import MacroNamespaceBuilder, MacroNamespace
from .manifest import ManifestContext
from dbt.contracts.connection import AdapterResponse
from dbt.contracts.graph.manifest import Manifest, Disabled
from dbt.contracts.graph.compiled import (
CompiledResource,
CompiledSeedNode,
from dbt.contracts.graph.nodes import (
Macro,
Exposure,
Metric,
SeedNode,
SourceDefinition,
Resource,
ManifestNode,
)
from dbt.contracts.graph.parsed import (
ParsedMacro,
ParsedExposure,
ParsedMetric,
ParsedSeedNode,
ParsedSourceDefinition,
)
from dbt.contracts.graph.metrics import MetricReference, ResolvedMetricReference
from dbt.contracts.util import get_metadata_env
from dbt.events.functions import get_metadata_vars
from dbt.exceptions import (
CompilationException,
ParsingException,
@@ -53,7 +50,6 @@ from dbt.exceptions import (
raise_compiler_error,
ref_invalid_args,
metric_invalid_args,
ref_target_not_found,
target_not_found,
ref_bad_context,
wrapped_exports,
@@ -182,7 +178,7 @@ class BaseDatabaseWrapper:
return macro
searched = ", ".join(repr(a) for a in attempts)
msg = f"In dispatch: No macro named '{macro_name}' found\n" f" Searched for: {searched}"
msg = f"In dispatch: No macro named '{macro_name}' found\n Searched for: {searched}"
raise CompilationException(msg)
@@ -220,12 +216,12 @@ class BaseRefResolver(BaseResolver):
def validate_args(self, name: str, package: Optional[str]):
if not isinstance(name, str):
raise CompilationException(
f"The name argument to ref() must be a string, got " f"{type(name)}"
f"The name argument to ref() must be a string, got {type(name)}"
)
if package is not None and not isinstance(package, str):
raise CompilationException(
f"The package argument to ref() must be a string or None, got " f"{type(package)}"
f"The package argument to ref() must be a string or None, got {type(package)}"
)
def __call__(self, *args: str) -> RelationProxy:
@@ -476,10 +472,11 @@ class RuntimeRefResolver(BaseRefResolver):
)
if target_model is None or isinstance(target_model, Disabled):
ref_target_not_found(
self.model,
target_name,
target_package,
target_not_found(
node=self.model,
target_name=target_name,
target_kind="node",
target_package=target_package,
disabled=isinstance(target_model, Disabled),
)
self.validate(target_model, target_name, target_package)
@@ -512,7 +509,7 @@ class OperationRefResolver(RuntimeRefResolver):
def create_relation(self, target_model: ManifestNode, name: str) -> RelationProxy:
if target_model.is_ephemeral_model:
# In operations, we can't ref() ephemeral nodes, because
# ParsedMacros do not support set_cte
# Macros do not support set_cte
raise_compiler_error(
"Operations can not ref() ephemeral nodes, but {} is ephemeral".format(
target_model.name
@@ -584,9 +581,9 @@ class ModelConfiguredVar(Var):
self,
context: Dict[str, Any],
config: RuntimeConfig,
node: CompiledResource,
node: Resource,
) -> None:
self._node: CompiledResource
self._node: Resource
self._config: RuntimeConfig = config
super().__init__(context, config.cli_vars, node=node)
@@ -690,7 +687,7 @@ class ProviderContext(ManifestContext):
raise InternalException(f"Invalid provider given to context: {provider}")
# mypy appeasement - we know it'll be a RuntimeConfig
self.config: RuntimeConfig
self.model: Union[ParsedMacro, ManifestNode] = model
self.model: Union[Macro, ManifestNode] = model
super().__init__(config, manifest, model.package_name)
self.sql_results: Dict[str, AttrDict] = {}
self.context_config: Optional[ContextConfig] = context_config
@@ -713,7 +710,7 @@ class ProviderContext(ManifestContext):
@contextproperty
def dbt_metadata_envs(self) -> Dict[str, str]:
return get_metadata_env()
return get_metadata_vars()
@contextproperty
def invocation_args_dict(self):
@@ -779,7 +776,7 @@ class ProviderContext(ManifestContext):
@contextmember
def write(self, payload: str) -> str:
# macros/source defs aren't 'writeable'.
if isinstance(self.model, (ParsedMacro, ParsedSourceDefinition)):
if isinstance(self.model, (Macro, SourceDefinition)):
raise_compiler_error('cannot "write" macros or sources')
self.model.build_path = self.model.write_node(self.config.target_path, "run", payload)
return ""
@@ -799,10 +796,11 @@ class ProviderContext(ManifestContext):
@contextmember
def load_agate_table(self) -> agate.Table:
if not isinstance(self.model, (ParsedSeedNode, CompiledSeedNode)):
if not isinstance(self.model, SeedNode):
raise_compiler_error(
"can only load_agate_table for seeds (got a {})".format(self.model.resource_type)
)
assert self.model.root_path
path = os.path.join(self.model.root_path, self.model.original_file_path)
column_types = self.model.config.column_types
try:
@@ -1219,7 +1217,13 @@ class ProviderContext(ManifestContext):
if return_value is not None:
# Save the env_var value in the manifest and the var name in the source_file.
# If this is compiling, do not save because it's irrelevant to parsing.
if self.model and not hasattr(self.model, "compiled"):
compiling = (
True
if hasattr(self.model, "compiled")
and getattr(self.model, "compiled", False) is True
else False
)
if self.model and not compiling:
# If the environment variable is set from a default, store a string indicating
# that so we can skip partial parsing. Otherwise the file will be scheduled for
# reparsing. If the default changes, the file will have been updated and therefore
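The new parse-vs-compile guard above is deliberately explicit; a compact equivalent (a sketch, with model standing in for self.model) is:

def is_compiling(model):
    # a node only carries a truthy "compiled" attribute once compilation ran
    return getattr(model, "compiled", False) is True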
@@ -1274,7 +1278,7 @@ class MacroContext(ProviderContext):
def __init__(
self,
model: ParsedMacro,
model: Macro,
config: RuntimeConfig,
manifest: Manifest,
provider: Provider,
@@ -1389,7 +1393,7 @@ def generate_parser_model_context(
def generate_generate_name_macro_context(
macro: ParsedMacro,
macro: Macro,
config: RuntimeConfig,
manifest: Manifest,
) -> Dict[str, Any]:
@@ -1407,7 +1411,7 @@ def generate_runtime_model_context(
def generate_runtime_macro_context(
macro: ParsedMacro,
macro: Macro,
config: RuntimeConfig,
manifest: Manifest,
package_name: Optional[str],
@@ -1434,8 +1438,16 @@ class ExposureSourceResolver(BaseResolver):
return ""
class ExposureMetricResolver(BaseResolver):
def __call__(self, *args) -> str:
if len(args) not in (1, 2):
metric_invalid_args(self.model, args)
self.model.metrics.append(list(args))
return ""
def generate_parse_exposure(
exposure: ParsedExposure,
exposure: Exposure,
config: RuntimeConfig,
manifest: Manifest,
package_name: str,
@@ -1454,6 +1466,12 @@ def generate_parse_exposure(
project,
manifest,
),
"metric": ExposureMetricResolver(
None,
exposure,
project,
manifest,
),
}
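ExposureMetricResolver does no resolution at parse time: it validates the argument count and records the metric reference on the exposure for later lookup. A simplified standalone sketch of that shape (the real resolver also receives the config and manifest):

class ExposureMetricResolver:
    def __init__(self, exposure):
        self.model = exposure  # the exposure being parsed

    def __call__(self, *args):
        if len(args) not in (1, 2):  # metric("name") or metric("package", "name")
            # stand-in for dbt's metric_invalid_args
            raise ValueError(f"metric() takes 1 or 2 arguments, got {len(args)}")
        self.model.metrics.append(list(args))
        return ""

class _Exposure:  # minimal stub
    def __init__(self):
        self.metrics = []

exposure = _Exposure()
resolver = ExposureMetricResolver(exposure)
resolver("revenue")                 # records ['revenue']
resolver("finance_pkg", "revenue")  # records ['finance_pkg', 'revenue']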
@@ -1479,7 +1497,7 @@ class MetricRefResolver(BaseResolver):
def generate_parse_metrics(
metric: ParsedMetric,
metric: Metric,
config: RuntimeConfig,
manifest: Manifest,
package_name: str,


@@ -16,6 +16,7 @@ from dbt.exceptions import InternalException
from dbt.utils import translate_aliases
from dbt.events.functions import fire_event
from dbt.events.types import NewConnectionOpening
from dbt.events.contextvars import get_node_info
from typing_extensions import Protocol
from dbt.dataclass_schema import (
dbtClassMixin,
@@ -94,7 +95,7 @@ class Connection(ExtensibleDbtClassMixin, Replaceable):
self._handle.resolve(self)
except RecursionError as exc:
raise InternalException(
"A connection's open() method attempted to read the " "handle value"
"A connection's open() method attempted to read the handle value"
) from exc
return self._handle
@@ -112,7 +113,9 @@ class LazyHandle:
self.opener = opener
def resolve(self, connection: Connection) -> Connection:
fire_event(NewConnectionOpening(connection_state=connection.state))
fire_event(
NewConnectionOpening(connection_state=connection.state, node_info=get_node_info())
)
return self.opener(connection)


@@ -1,18 +1,16 @@
import hashlib
import os
from dataclasses import dataclass, field
from mashumaro.types import SerializableType
from typing import List, Optional, Union, Dict, Any
from dbt.constants import MAXIMUM_SEED_SIZE
from dbt.dataclass_schema import dbtClassMixin, StrEnum
from .util import SourceKey
MAXIMUM_SEED_SIZE = 1 * 1024 * 1024
MAXIMUM_SEED_SIZE_NAME = "1MB"
class ParseFileType(StrEnum):
Macro = "macro"
Model = "model"


@@ -1,235 +0,0 @@
from dbt.contracts.graph.parsed import (
HasTestMetadata,
ParsedNode,
ParsedAnalysisNode,
ParsedSingularTestNode,
ParsedHookNode,
ParsedModelNode,
ParsedExposure,
ParsedMetric,
ParsedResource,
ParsedRPCNode,
ParsedSqlNode,
ParsedGenericTestNode,
ParsedSeedNode,
ParsedSnapshotNode,
ParsedSourceDefinition,
SeedConfig,
TestConfig,
same_seeds,
)
from dbt.node_types import NodeType
from dbt.contracts.util import Replaceable
from dbt.dataclass_schema import dbtClassMixin
from dataclasses import dataclass, field
from typing import Optional, List, Union, Dict, Type
@dataclass
class InjectedCTE(dbtClassMixin, Replaceable):
id: str
sql: str
@dataclass
class CompiledNodeMixin(dbtClassMixin):
# this is a special mixin class to provide a required argument. If a node
# is missing a `compiled` flag entirely, it must not be a CompiledNode.
compiled: bool
@dataclass
class CompiledNode(ParsedNode, CompiledNodeMixin):
compiled_code: Optional[str] = None
extra_ctes_injected: bool = False
extra_ctes: List[InjectedCTE] = field(default_factory=list)
relation_name: Optional[str] = None
_pre_injected_sql: Optional[str] = None
def set_cte(self, cte_id: str, sql: str):
"""This is the equivalent of what self.extra_ctes[cte_id] = sql would
do if extra_ctes were an OrderedDict
"""
for cte in self.extra_ctes:
if cte.id == cte_id:
cte.sql = sql
break
else:
self.extra_ctes.append(InjectedCTE(id=cte_id, sql=sql))
def __post_serialize__(self, dct):
dct = super().__post_serialize__(dct)
if "_pre_injected_sql" in dct:
del dct["_pre_injected_sql"]
return dct
@dataclass
class CompiledAnalysisNode(CompiledNode):
resource_type: NodeType = field(metadata={"restrict": [NodeType.Analysis]})
@dataclass
class CompiledHookNode(CompiledNode):
resource_type: NodeType = field(metadata={"restrict": [NodeType.Operation]})
index: Optional[int] = None
@dataclass
class CompiledModelNode(CompiledNode):
resource_type: NodeType = field(metadata={"restrict": [NodeType.Model]})
# TODO: rm?
@dataclass
class CompiledRPCNode(CompiledNode):
resource_type: NodeType = field(metadata={"restrict": [NodeType.RPCCall]})
@dataclass
class CompiledSqlNode(CompiledNode):
resource_type: NodeType = field(metadata={"restrict": [NodeType.SqlOperation]})
@dataclass
class CompiledSeedNode(CompiledNode):
# keep this in sync with ParsedSeedNode!
resource_type: NodeType = field(metadata={"restrict": [NodeType.Seed]})
config: SeedConfig = field(default_factory=SeedConfig)
@property
def empty(self):
"""Seeds are never empty"""
return False
def same_body(self, other) -> bool:
return same_seeds(self, other)
@dataclass
class CompiledSnapshotNode(CompiledNode):
resource_type: NodeType = field(metadata={"restrict": [NodeType.Snapshot]})
@dataclass
class CompiledSingularTestNode(CompiledNode):
resource_type: NodeType = field(metadata={"restrict": [NodeType.Test]})
# Was not able to make mypy happy and keep the code working. We need to
# refactor the various configs.
config: TestConfig = field(default_factory=TestConfig) # type:ignore
@dataclass
class CompiledGenericTestNode(CompiledNode, HasTestMetadata):
# keep this in sync with ParsedGenericTestNode!
resource_type: NodeType = field(metadata={"restrict": [NodeType.Test]})
column_name: Optional[str] = None
file_key_name: Optional[str] = None
# Was not able to make mypy happy and keep the code working. We need to
# refactor the various configs.
config: TestConfig = field(default_factory=TestConfig) # type:ignore
def same_contents(self, other) -> bool:
if other is None:
return False
return self.same_config(other) and self.same_fqn(other) and True
CompiledTestNode = Union[CompiledSingularTestNode, CompiledGenericTestNode]
PARSED_TYPES: Dict[Type[CompiledNode], Type[ParsedResource]] = {
CompiledAnalysisNode: ParsedAnalysisNode,
CompiledModelNode: ParsedModelNode,
CompiledHookNode: ParsedHookNode,
CompiledRPCNode: ParsedRPCNode,
CompiledSqlNode: ParsedSqlNode,
CompiledSeedNode: ParsedSeedNode,
CompiledSnapshotNode: ParsedSnapshotNode,
CompiledSingularTestNode: ParsedSingularTestNode,
CompiledGenericTestNode: ParsedGenericTestNode,
}
COMPILED_TYPES: Dict[Type[ParsedResource], Type[CompiledNode]] = {
ParsedAnalysisNode: CompiledAnalysisNode,
ParsedModelNode: CompiledModelNode,
ParsedHookNode: CompiledHookNode,
ParsedRPCNode: CompiledRPCNode,
ParsedSqlNode: CompiledSqlNode,
ParsedSeedNode: CompiledSeedNode,
ParsedSnapshotNode: CompiledSnapshotNode,
ParsedSingularTestNode: CompiledSingularTestNode,
ParsedGenericTestNode: CompiledGenericTestNode,
}
# for some types, the compiled type is the parsed type, so make this easy
CompiledType = Union[Type[CompiledNode], Type[ParsedResource]]
CompiledResource = Union[ParsedResource, CompiledNode]
def compiled_type_for(parsed: ParsedNode) -> CompiledType:
if type(parsed) in COMPILED_TYPES:
return COMPILED_TYPES[type(parsed)]
else:
return type(parsed)
def parsed_instance_for(compiled: CompiledNode) -> ParsedResource:
cls = PARSED_TYPES.get(type(compiled))
if cls is None:
# how???
raise ValueError("invalid resource_type: {}".format(compiled.resource_type))
return cls.from_dict(compiled.to_dict(omit_none=True))
NonSourceCompiledNode = Union[
CompiledAnalysisNode,
CompiledSingularTestNode,
CompiledModelNode,
CompiledHookNode,
CompiledRPCNode,
CompiledSqlNode,
CompiledGenericTestNode,
CompiledSeedNode,
CompiledSnapshotNode,
]
NonSourceParsedNode = Union[
ParsedAnalysisNode,
ParsedSingularTestNode,
ParsedHookNode,
ParsedModelNode,
ParsedRPCNode,
ParsedSqlNode,
ParsedGenericTestNode,
ParsedSeedNode,
ParsedSnapshotNode,
]
# This is anything that can be in manifest.nodes.
ManifestNode = Union[
NonSourceCompiledNode,
NonSourceParsedNode,
]
# We allow either parsed or compiled nodes, or parsed sources, as some
# 'compile()' calls in the runner actually just return the original parsed
# node they were given.
CompileResultNode = Union[
ManifestNode,
ParsedSourceDefinition,
]
# anything that participates in the graph: sources, exposures, metrics,
# or manifest nodes
GraphMemberNode = Union[
CompileResultNode,
ParsedExposure,
ParsedMetric,
]

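This entire module is deleted in 1.4: the parallel Parsed*/Compiled* class hierarchies collapse into the single node classes in dbt.contracts.graph.nodes, with compilation state carried as plain fields. A simplified sketch of the old parsed-to-compiled conversion that goes away (stand-in dataclasses, not the real node classes):

from dataclasses import dataclass, asdict

@dataclass
class ParsedModelNode:
    name: str

@dataclass
class CompiledModelNode:
    name: str
    compiled: bool = True
    compiled_code: str = ""

COMPILED_TYPES = {ParsedModelNode: CompiledModelNode}

def compiled_type_for(parsed):
    # fall back to the parsed type when there is no compiled counterpart
    return COMPILED_TYPES.get(type(parsed), type(parsed))

node = ParsedModelNode(name="my_model")
compiled = compiled_type_for(node)(**asdict(node))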

@@ -16,29 +16,24 @@ from typing import (
TypeVar,
Callable,
Generic,
cast,
AbstractSet,
ClassVar,
)
from typing_extensions import Protocol
from uuid import UUID
from dbt.contracts.graph.compiled import (
CompileResultNode,
ManifestNode,
NonSourceCompiledNode,
GraphMemberNode,
)
from dbt.contracts.graph.parsed import (
ParsedMacro,
ParsedDocumentation,
ParsedSourceDefinition,
ParsedGenericTestNode,
ParsedExposure,
ParsedMetric,
HasUniqueID,
from dbt.contracts.graph.nodes import (
Macro,
Documentation,
SourceDefinition,
GenericTestNode,
Exposure,
Metric,
UnpatchedSourceDefinition,
ManifestNodes,
ManifestNode,
GraphMemberNode,
ResultNode,
BaseNode,
)
from dbt.contracts.graph.unparsed import SourcePatch
from dbt.contracts.files import SourceFile, SchemaSourceFile, FileHash, AnySourceFile
@@ -96,7 +91,7 @@ class DocLookup(dbtClassMixin):
return self.perform_lookup(unique_id, manifest)
return None
def add_doc(self, doc: ParsedDocumentation):
def add_doc(self, doc: Documentation):
if doc.name not in self.storage:
self.storage[doc.name] = {}
self.storage[doc.name][doc.package_name] = doc.unique_id
@@ -105,7 +100,7 @@ class DocLookup(dbtClassMixin):
for doc in manifest.docs.values():
self.add_doc(doc)
def perform_lookup(self, unique_id: UniqueID, manifest) -> ParsedDocumentation:
def perform_lookup(self, unique_id: UniqueID, manifest) -> Documentation:
if unique_id not in manifest.docs:
raise dbt.exceptions.InternalException(
f"Doc {unique_id} found in cache but not found in manifest"
@@ -127,7 +122,7 @@ class SourceLookup(dbtClassMixin):
return self.perform_lookup(unique_id, manifest)
return None
def add_source(self, source: ParsedSourceDefinition):
def add_source(self, source: SourceDefinition):
if source.search_name not in self.storage:
self.storage[source.search_name] = {}
@@ -138,7 +133,7 @@ class SourceLookup(dbtClassMixin):
if hasattr(source, "source_name"):
self.add_source(source)
def perform_lookup(self, unique_id: UniqueID, manifest: "Manifest") -> ParsedSourceDefinition:
def perform_lookup(self, unique_id: UniqueID, manifest: "Manifest") -> SourceDefinition:
if unique_id not in manifest.sources:
raise dbt.exceptions.InternalException(
f"Source {unique_id} found in cache but not found in manifest"
@@ -198,7 +193,7 @@ class MetricLookup(dbtClassMixin):
return self.perform_lookup(unique_id, manifest)
return None
def add_metric(self, metric: ParsedMetric):
def add_metric(self, metric: Metric):
if metric.search_name not in self.storage:
self.storage[metric.search_name] = {}
@@ -209,7 +204,7 @@ class MetricLookup(dbtClassMixin):
if hasattr(metric, "name"):
self.add_metric(metric)
def perform_lookup(self, unique_id: UniqueID, manifest: "Manifest") -> ParsedMetric:
def perform_lookup(self, unique_id: UniqueID, manifest: "Manifest") -> Metric:
if unique_id not in manifest.metrics:
raise dbt.exceptions.InternalException(
f"Metric {unique_id} found in cache but not found in manifest"
@@ -325,7 +320,7 @@ def _sort_values(dct):
def build_node_edges(nodes: List[ManifestNode]):
"""Build the forward and backward edges on the given list of ParsedNodes
"""Build the forward and backward edges on the given list of ManifestNodes
and return them as two separate dictionaries, each mapping unique IDs to
lists of edges.
"""
@@ -343,10 +338,10 @@ def build_node_edges(nodes: List[ManifestNode]):
# Build a map of children of macros and generic tests
def build_macro_edges(nodes: List[Any]):
forward_edges: Dict[str, List[str]] = {
n.unique_id: [] for n in nodes if n.unique_id.startswith("macro") or n.depends_on.macros
n.unique_id: [] for n in nodes if n.unique_id.startswith("macro") or n.depends_on_macros
}
for node in nodes:
for unique_id in node.depends_on.macros:
for unique_id in node.depends_on_macros:
if unique_id in forward_edges.keys():
forward_edges[unique_id].append(node.unique_id)
return _sort_values(forward_edges)
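build_macro_edges now reads a flattened depends_on_macros attribute instead of reaching through depends_on.macros. A self-contained sketch of the edge construction, using dicts in place of node objects (IDs hypothetical):

nodes = [
    {"unique_id": "macro.pkg.render", "depends_on_macros": []},
    {"unique_id": "model.pkg.users", "depends_on_macros": ["macro.pkg.render"]},
]
forward_edges = {
    n["unique_id"]: []
    for n in nodes
    if n["unique_id"].startswith("macro") or n["depends_on_macros"]
}
for node in nodes:
    for unique_id in node["depends_on_macros"]:
        if unique_id in forward_edges:
            forward_edges[unique_id].append(node["unique_id"])
# forward_edges == {"macro.pkg.render": ["model.pkg.users"],
#                   "model.pkg.users": []}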
@@ -365,7 +360,7 @@ class Locality(enum.IntEnum):
@dataclass
class MacroCandidate:
locality: Locality
macro: ParsedMacro
macro: Macro
def __eq__(self, other: object) -> bool:
if not isinstance(other, MacroCandidate):
@@ -430,16 +425,14 @@ M = TypeVar("M", bound=MacroCandidate)
class CandidateList(List[M]):
def last(self) -> Optional[ParsedMacro]:
def last(self) -> Optional[Macro]:
if not self:
return None
self.sort()
return self[-1].macro
def _get_locality(
macro: ParsedMacro, root_project_name: str, internal_packages: Set[str]
) -> Locality:
def _get_locality(macro: Macro, root_project_name: str, internal_packages: Set[str]) -> Locality:
if macro.package_name == root_project_name:
return Locality.Root
elif macro.package_name in internal_packages:
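Macro resolution ranks candidates by locality, and CandidateList.last() sorts and takes the final element, so the highest locality wins; the find_macro_by_name docstring further down confirms that locally defined macros take priority. A sketch of that ordering (enum values assumed to match dbt's Core < Imported < Root ranking):

import enum

class Locality(enum.IntEnum):  # assumed ordering: Root outranks Imported and Core
    Core = 1
    Imported = 2
    Root = 3

candidates = [("dbt", Locality.Core), ("my_project", Locality.Root)]
candidates.sort(key=lambda pair: pair[1])
best = candidates[-1]  # ('my_project', <Locality.Root: 3>) wins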
@@ -465,16 +458,16 @@ class Disabled(Generic[D]):
target: D
MaybeMetricNode = Optional[Union[ParsedMetric, Disabled[ParsedMetric]]]
MaybeMetricNode = Optional[Union[Metric, Disabled[Metric]]]
MaybeDocumentation = Optional[ParsedDocumentation]
MaybeDocumentation = Optional[Documentation]
MaybeParsedSource = Optional[
Union[
ParsedSourceDefinition,
Disabled[ParsedSourceDefinition],
SourceDefinition,
Disabled[SourceDefinition],
]
]
@@ -499,7 +492,7 @@ def _update_into(dest: MutableMapping[str, T], new_item: T):
existing = dest[unique_id]
if new_item.original_file_path != existing.original_file_path:
raise dbt.exceptions.RuntimeException(
f"cannot update a {new_item.resource_type} to have a new file " f"path!"
f"cannot update a {new_item.resource_type} to have a new file path!"
)
dest[unique_id] = new_item
@@ -514,7 +507,7 @@ class MacroMethods:
def find_macro_by_name(
self, name: str, root_project_name: str, package: Optional[str]
) -> Optional[ParsedMacro]:
) -> Optional[Macro]:
"""Find a macro in the graph by its name and package name, or None for
any package. The root project name is used to determine priority:
- locally defined macros come first
@@ -537,7 +530,7 @@ class MacroMethods:
def find_generate_macro_by_name(
self, component: str, root_project_name: str
) -> Optional[ParsedMacro]:
) -> Optional[Macro]:
"""
The `generate_X_name` macros are similar to regular ones, but ignore
imported packages.
@@ -606,11 +599,11 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
# is added it must all be added in the __reduce_ex__ method in the
# args tuple in the right position.
nodes: MutableMapping[str, ManifestNode] = field(default_factory=dict)
sources: MutableMapping[str, ParsedSourceDefinition] = field(default_factory=dict)
macros: MutableMapping[str, ParsedMacro] = field(default_factory=dict)
docs: MutableMapping[str, ParsedDocumentation] = field(default_factory=dict)
exposures: MutableMapping[str, ParsedExposure] = field(default_factory=dict)
metrics: MutableMapping[str, ParsedMetric] = field(default_factory=dict)
sources: MutableMapping[str, SourceDefinition] = field(default_factory=dict)
macros: MutableMapping[str, Macro] = field(default_factory=dict)
docs: MutableMapping[str, Documentation] = field(default_factory=dict)
exposures: MutableMapping[str, Exposure] = field(default_factory=dict)
metrics: MutableMapping[str, Metric] = field(default_factory=dict)
selectors: MutableMapping[str, Any] = field(default_factory=dict)
files: MutableMapping[str, AnySourceFile] = field(default_factory=dict)
metadata: ManifestMetadata = field(default_factory=ManifestMetadata)
@@ -658,7 +651,7 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
obj._lock = flags.MP_CONTEXT.Lock()
return obj
def sync_update_node(self, new_node: NonSourceCompiledNode) -> NonSourceCompiledNode:
def sync_update_node(self, new_node: ManifestNode) -> ManifestNode:
"""update the node with a lock. The only time we should want to lock is
when compiling an ephemeral ancestor of a node at runtime, because
multiple threads could be just-in-time compiling the same ephemeral
@@ -671,21 +664,21 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
with self._lock:
existing = self.nodes[new_node.unique_id]
if getattr(existing, "compiled", False):
# already compiled -> must be a NonSourceCompiledNode
return cast(NonSourceCompiledNode, existing)
# already compiled
return existing
_update_into(self.nodes, new_node)
return new_node
def update_exposure(self, new_exposure: ParsedExposure):
def update_exposure(self, new_exposure: Exposure):
_update_into(self.exposures, new_exposure)
def update_metric(self, new_metric: ParsedMetric):
def update_metric(self, new_metric: Metric):
_update_into(self.metrics, new_metric)
def update_node(self, new_node: ManifestNode):
_update_into(self.nodes, new_node)
def update_source(self, new_source: ParsedSourceDefinition):
def update_source(self, new_source: SourceDefinition):
_update_into(self.sources, new_source)
def build_flat_graph(self):
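sync_update_node now works directly on ManifestNode: with the parsed/compiled split gone there is no cast, just a lock-guarded check of the compiled flag. The concurrency pattern, sketched with a plain dict standing in for the manifest internals:

import threading

nodes = {}
lock = threading.Lock()

def sync_update_node(new_node):
    with lock:
        existing = nodes.get(new_node["unique_id"])
        if existing is not None and existing.get("compiled"):
            return existing  # another thread already compiled this node
        nodes[new_node["unique_id"]] = new_node
        return new_node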
@@ -738,7 +731,7 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
def find_materialization_macro_by_name(
self, project_name: str, materialization_name: str, adapter_type: str
) -> Optional[ParsedMacro]:
) -> Optional[Macro]:
candidates: CandidateList = CandidateList(
chain.from_iterable(
self._materialization_candidates_for(
@@ -943,8 +936,8 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
search_name = f"{target_source_name}.{target_table_name}"
candidates = _search_packages(current_project, node_package)
source: Optional[ParsedSourceDefinition] = None
disabled: Optional[List[ParsedSourceDefinition]] = None
source: Optional[SourceDefinition] = None
disabled: Optional[List[SourceDefinition]] = None
for pkg in candidates:
source = self.source_lookup.find(search_name, pkg, self)
@@ -968,8 +961,8 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
node_package: str,
) -> MaybeMetricNode:
metric: Optional[ParsedMetric] = None
disabled: Optional[List[ParsedMetric]] = None
metric: Optional[Metric] = None
disabled: Optional[List[Metric]] = None
candidates = _search_packages(current_project, node_package, target_metric_package)
for pkg in candidates:
@@ -992,7 +985,7 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
package: Optional[str],
current_project: str,
node_package: str,
) -> Optional[ParsedDocumentation]:
) -> Optional[Documentation]:
"""Resolve the given documentation. This follows the same algorithm as
resolve_ref except the is_enabled checks are unnecessary as docs are
always enabled.
@@ -1011,6 +1004,7 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
adapter,
other: "WritableManifest",
selected: AbstractSet[UniqueID],
favor_state: bool = False,
) -> None:
"""Given the selected unique IDs and a writable manifest, update this
manifest by replacing any unselected nodes with their counterpart.
@@ -1025,7 +1019,10 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
node.resource_type in refables
and not node.is_ephemeral
and unique_id not in selected
and not adapter.get_relation(current.database, current.schema, current.identifier)
and (
not adapter.get_relation(current.database, current.schema, current.identifier)
or favor_state
)
):
merged.add(unique_id)
self.nodes[unique_id] = node.replace(deferred=True)
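merge_from_artifact grows a favor_state flag: when true, an unselected refable node is deferred to the state manifest even if its relation already exists in the target warehouse. The decision, sketched with a hypothetical relation_exists boolean and dict fields standing in for node attributes:

def should_defer(node, selected, relation_exists, favor_state=False):
    # refables are models, seeds, and snapshots
    return (
        node["resource_type"] in ("model", "seed", "snapshot")
        and not node["is_ephemeral"]
        and node["unique_id"] not in selected
        and (not relation_exists or favor_state)
    )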
@@ -1036,11 +1033,11 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
# log up to 5 items
sample = list(islice(merged, 5))
fire_event(MergedFromState(nbr_merged=len(merged), sample=sample))
fire_event(MergedFromState(num_merged=len(merged), sample=sample))
# Methods that were formerly in ParseResult
def add_macro(self, source_file: SourceFile, macro: ParsedMacro):
def add_macro(self, source_file: SourceFile, macro: Macro):
if macro.unique_id in self.macros:
# detect that the macro exists and emit an error
other_path = self.macros[macro.unique_id].original_file_path
@@ -1082,30 +1079,30 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
self.sources[source.unique_id] = source # type: ignore
source_file.sources.append(source.unique_id)
def add_node_nofile(self, node: ManifestNodes):
def add_node_nofile(self, node: ManifestNode):
# nodes can't be overwritten!
_check_duplicates(node, self.nodes)
self.nodes[node.unique_id] = node
def add_node(self, source_file: AnySourceFile, node: ManifestNodes, test_from=None):
def add_node(self, source_file: AnySourceFile, node: ManifestNode, test_from=None):
self.add_node_nofile(node)
if isinstance(source_file, SchemaSourceFile):
if isinstance(node, ParsedGenericTestNode):
if isinstance(node, GenericTestNode):
assert test_from
source_file.add_test(node.unique_id, test_from)
if isinstance(node, ParsedMetric):
if isinstance(node, Metric):
source_file.metrics.append(node.unique_id)
if isinstance(node, ParsedExposure):
if isinstance(node, Exposure):
source_file.exposures.append(node.unique_id)
else:
source_file.nodes.append(node.unique_id)
def add_exposure(self, source_file: SchemaSourceFile, exposure: ParsedExposure):
def add_exposure(self, source_file: SchemaSourceFile, exposure: Exposure):
_check_duplicates(exposure, self.exposures)
self.exposures[exposure.unique_id] = exposure
source_file.exposures.append(exposure.unique_id)
def add_metric(self, source_file: SchemaSourceFile, metric: ParsedMetric):
def add_metric(self, source_file: SchemaSourceFile, metric: Metric):
_check_duplicates(metric, self.metrics)
self.metrics[metric.unique_id] = metric
source_file.metrics.append(metric.unique_id)
@@ -1117,20 +1114,20 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
else:
self.disabled[node.unique_id] = [node]
def add_disabled(self, source_file: AnySourceFile, node: CompileResultNode, test_from=None):
def add_disabled(self, source_file: AnySourceFile, node: ResultNode, test_from=None):
self.add_disabled_nofile(node)
if isinstance(source_file, SchemaSourceFile):
if isinstance(node, ParsedGenericTestNode):
if isinstance(node, GenericTestNode):
assert test_from
source_file.add_test(node.unique_id, test_from)
if isinstance(node, ParsedMetric):
if isinstance(node, Metric):
source_file.metrics.append(node.unique_id)
if isinstance(node, ParsedExposure):
if isinstance(node, Exposure):
source_file.exposures.append(node.unique_id)
else:
source_file.nodes.append(node.unique_id)
def add_doc(self, source_file: SourceFile, doc: ParsedDocumentation):
def add_doc(self, source_file: SourceFile, doc: Documentation):
_check_duplicates(doc, self.docs)
self.docs[doc.unique_id] = doc
source_file.docs.append(doc.unique_id)
@@ -1183,32 +1180,32 @@ AnyManifest = Union[Manifest, MacroManifest]
@dataclass
@schema_version("manifest", 7)
@schema_version("manifest", 8)
class WritableManifest(ArtifactMixin):
nodes: Mapping[UniqueID, ManifestNode] = field(
metadata=dict(description=("The nodes defined in the dbt project and its dependencies"))
)
sources: Mapping[UniqueID, ParsedSourceDefinition] = field(
sources: Mapping[UniqueID, SourceDefinition] = field(
metadata=dict(description=("The sources defined in the dbt project and its dependencies"))
)
macros: Mapping[UniqueID, ParsedMacro] = field(
macros: Mapping[UniqueID, Macro] = field(
metadata=dict(description=("The macros defined in the dbt project and its dependencies"))
)
docs: Mapping[UniqueID, ParsedDocumentation] = field(
docs: Mapping[UniqueID, Documentation] = field(
metadata=dict(description=("The docs defined in the dbt project and its dependencies"))
)
exposures: Mapping[UniqueID, ParsedExposure] = field(
exposures: Mapping[UniqueID, Exposure] = field(
metadata=dict(
description=("The exposures defined in the dbt project and its dependencies")
)
)
metrics: Mapping[UniqueID, ParsedMetric] = field(
metrics: Mapping[UniqueID, Metric] = field(
metadata=dict(description=("The metrics defined in the dbt project and its dependencies"))
)
selectors: Mapping[UniqueID, Any] = field(
metadata=dict(description=("The selectors defined in selectors.yml"))
)
disabled: Optional[Mapping[UniqueID, List[CompileResultNode]]] = field(
disabled: Optional[Mapping[UniqueID, List[ResultNode]]] = field(
metadata=dict(description="A mapping of the disabled nodes in the target")
)
parent_map: Optional[NodeEdgeMap] = field(
@@ -1229,7 +1226,7 @@ class WritableManifest(ArtifactMixin):
@classmethod
def compatible_previous_versions(self):
return [("manifest", 4), ("manifest", 5), ("manifest", 6)]
return [("manifest", 4), ("manifest", 5), ("manifest", 6), ("manifest", 7)]
def __post_serialize__(self, dct):
for unique_id, node in dct["nodes"].items():
@@ -1238,7 +1235,7 @@ class WritableManifest(ArtifactMixin):
return dct
def _check_duplicates(value: HasUniqueID, src: Mapping[str, HasUniqueID]):
def _check_duplicates(value: BaseNode, src: Mapping[str, BaseNode]):
if value.unique_id in src:
raise_duplicate_resource_name(value, src[value.unique_id])

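The manifest schema bumps from v7 to v8 above, and v7 joins the list of versions a v8 reader will accept, so state comparisons against artifacts written by dbt 1.3 keep working. A sketch of the compatibility check:

CURRENT = ("manifest", 8)
COMPATIBLE_PREVIOUS = [("manifest", v) for v in (4, 5, 6, 7)]

def can_read(schema_version):
    return schema_version == CURRENT or schema_version in COMPATIBLE_PREVIOUS

print(can_read(("manifest", 7)))  # True: 1.3 artifacts still usable for --state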

@@ -12,7 +12,7 @@ class MetricReference(object):
class ResolvedMetricReference(MetricReference):
"""
Simple proxy over a ParsedMetric which delegates property
Simple proxy over a Metric which delegates property
lookups to the underlying node. Also adds helper functions
for working with metrics (ie. __str__ and templating functions)
"""


@@ -495,6 +495,12 @@ class SeedConfig(NodeConfig):
materialized: str = "seed"
quote_columns: Optional[bool] = None
@classmethod
def validate(cls, data):
super().validate(data)
if data.get("materialized") and data.get("materialized") != "seed":
raise ValidationError("A seed must have a materialized value of 'seed'")
@dataclass
class TestConfig(NodeAndTestConfig):
@@ -534,6 +540,12 @@ class TestConfig(NodeAndTestConfig):
return False
return True
@classmethod
def validate(cls, data):
super().validate(data)
if data.get("materialized") and data.get("materialized") != "test":
raise ValidationError("A test must have a materialized value of 'test'")
@dataclass
class EmptySnapshotConfig(NodeConfig):
@@ -570,7 +582,6 @@ class SnapshotConfig(EmptySnapshotConfig):
f"Invalid value for 'check_cols': {data['check_cols']}. "
"Expected 'all' or a list of strings."
)
elif data.get("strategy") == "timestamp":
if not data.get("updated_at"):
raise ValidationError(
@@ -582,6 +593,9 @@ class SnapshotConfig(EmptySnapshotConfig):
# If the strategy is not 'check' or 'timestamp' it's a custom strategy,
# formerly supported with GenericSnapshotConfig
if data.get("materialized") and data.get("materialized") != "snapshot":
raise ValidationError("A snapshot must have a materialized value of 'snapshot'")
def finalize_and_validate(self):
data = self.to_dict(omit_none=True)
self.validate(data)

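Seeds, tests, and snapshots now validate that any explicit materialized value matches their resource type. The guard, extracted into a standalone sketch (ValidationError stands in for dbt's dataclass-schema exception):

class ValidationError(Exception):
    pass

def check_materialized(data, expected):
    if data.get("materialized") and data.get("materialized") != expected:
        raise ValidationError(
            f"A {expected} must have a materialized value of '{expected}'"
        )

check_materialized({"materialized": "seed"}, "seed")     # passes
check_materialized({}, "snapshot")                       # passes: unset is fine
# check_materialized({"materialized": "table"}, "test")  # raises ValidationError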

@@ -24,7 +24,6 @@ from typing import Optional, List, Union, Dict, Any, Sequence
@dataclass
class UnparsedBaseNode(dbtClassMixin, Replaceable):
package_name: str
root_path: str
path: str
original_file_path: str
@@ -232,7 +231,7 @@ class ExternalTable(AdditionalPropertiesAllowed, Mergeable):
file_format: Optional[str] = None
row_format: Optional[str] = None
tbl_properties: Optional[str] = None
partitions: Optional[List[ExternalPartition]] = None
partitions: Optional[Union[List[str], List[ExternalPartition]]] = None
def __bool__(self):
return self.location is not None
@@ -364,7 +363,6 @@ class SourcePatch(dbtClassMixin, Replaceable):
@dataclass
class UnparsedDocumentation(dbtClassMixin, Replaceable):
package_name: str
root_path: str
path: str
original_file_path: str


@@ -12,9 +12,7 @@ from dataclasses import dataclass, field
from typing import Optional, List, Dict, Union, Any
from mashumaro.types import SerializableType
PIN_PACKAGE_URL = (
"https://docs.getdbt.com/docs/package-management#section-specifying-package-versions" # noqa
)
DEFAULT_SEND_ANONYMOUS_USAGE_STATS = True
@@ -57,6 +55,12 @@ class LocalPackage(Package):
RawVersion = Union[str, float]
@dataclass
class TarballPackage(Package):
tarball: str
name: str
@dataclass
class GitPackage(Package):
git: str
@@ -84,7 +88,7 @@ class RegistryPackage(Package):
return [str(self.version)]
PackageSpec = Union[LocalPackage, GitPackage, RegistryPackage]
PackageSpec = Union[LocalPackage, TarballPackage, GitPackage, RegistryPackage]
@dataclass
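TarballPackage is a new package spec alongside local, git, and registry packages: a URL to a tarball plus a required name. A sketch of the dataclass as added above (construction values hypothetical):

from dataclasses import dataclass

@dataclass
class TarballPackage:
    tarball: str  # URL of the package tarball
    name: str     # required alongside the URL

pkg = TarballPackage(
    tarball="https://example.com/dbt_utils-1.0.0.tar.gz",
    name="dbt_utils",
)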
@@ -218,7 +222,7 @@ class Project(HyphenatedDbtClassMixin, Replaceable):
),
)
packages: List[PackageSpec] = field(default_factory=list)
query_comment: Optional[Union[QueryComment, NoValue, str]] = NoValue()
query_comment: Optional[Union[QueryComment, NoValue, str]] = field(default_factory=NoValue)
@classmethod
def validate(cls, data):
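The query_comment default above moves from a shared NoValue() instance to field(default_factory=NoValue), likely for the Python 3.11 support elsewhere in this diff: 3.11's dataclasses reject unhashable class instances as class-level defaults. A toy reproduction of the difference (simplified NoValue, not the real class):

from dataclasses import dataclass, field

@dataclass
class NoValue:
    # eq=True (the dataclass default) sets __hash__ = None, so instances
    # are unhashable and rejected as class-level defaults on Python 3.11
    novalue: str = "novalue"

@dataclass
class Project:
    # query_comment: NoValue = NoValue()   # ValueError on Python 3.11
    query_comment: NoValue = field(default_factory=NoValue)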
@@ -253,7 +257,6 @@ class UserConfig(ExtensibleDbtClassMixin, Replaceable, UserConfigContract):
static_parser: Optional[bool] = None
indirect_selection: Optional[str] = None
cache_selected_only: Optional[bool] = None
event_buffer_size: Optional[int] = None
@dataclass


@@ -1,6 +1,5 @@
from dbt.contracts.graph.manifest import CompileResultNode
from dbt.contracts.graph.unparsed import FreshnessThreshold
from dbt.contracts.graph.parsed import ParsedSourceDefinition
from dbt.contracts.graph.nodes import SourceDefinition, ResultNode
from dbt.contracts.util import (
BaseArtifactMetadata,
ArtifactMixin,
@@ -11,11 +10,10 @@ from dbt.contracts.util import (
from dbt.exceptions import InternalException
from dbt.events.functions import fire_event
from dbt.events.types import TimingInfoCollected
from dbt.logger import (
TimingProcessor,
JsonOnly,
)
from dbt.utils import lowercase
from dbt.events.proto_types import RunResultMsg, TimingInfoMsg
from dbt.events.contextvars import get_node_info
from dbt.logger import TimingProcessor
from dbt.utils import lowercase, cast_to_str, cast_to_int
from dbt.dataclass_schema import dbtClassMixin, StrEnum
import agate
@@ -47,7 +45,14 @@ class TimingInfo(dbtClassMixin):
def end(self):
self.completed_at = datetime.utcnow()
def to_msg(self):
timsg = TimingInfoMsg(
name=self.name, started_at=self.started_at, completed_at=self.completed_at
)
return timsg
# This is a context manager
class collect_timing_info:
def __init__(self, name: str):
self.timing_info = TimingInfo(name=name)
@@ -58,8 +63,13 @@ class collect_timing_info:
def __exit__(self, exc_type, exc_value, traceback):
self.timing_info.end()
with JsonOnly(), TimingProcessor(self.timing_info):
fire_event(TimingInfoCollected())
# Note: when legacy logger is removed, we can remove the following line
with TimingProcessor(self.timing_info):
fire_event(
TimingInfoCollected(
timing_info=self.timing_info.to_msg(), node_info=get_node_info()
)
)
class RunningStatus(StrEnum):
@@ -119,10 +129,22 @@ class BaseResult(dbtClassMixin):
data["failures"] = None
return data
def to_msg(self):
# TODO: add more fields
msg = RunResultMsg()
msg.status = str(self.status)
msg.message = cast_to_str(self.message)
msg.thread = self.thread_id
msg.execution_time = self.execution_time
msg.num_failures = cast_to_int(self.failures)
msg.timing_info = [ti.to_msg() for ti in self.timing]
# adapter_response
return msg
@dataclass
class NodeResult(BaseResult):
node: CompileResultNode
node: ResultNode
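to_msg above converts a result into the new RunResultMsg proto message; proto fields cannot hold None, hence the cast helpers. Hypothetical stand-ins for cast_to_str and cast_to_int (the real ones live in dbt.utils and may differ):

def cast_to_str(value):
    # proto string fields default to "", so None becomes the empty string
    return "" if value is None else str(value)

def cast_to_int(value):
    # proto int fields default to 0
    return 0 if value is None else int(value)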
@dataclass
@@ -208,7 +230,9 @@ class RunResultsArtifact(ExecutionResult, ArtifactMixin):
generated_at: datetime,
args: Dict,
):
processed_results = [process_run_result(result) for result in results]
processed_results = [
process_run_result(result) for result in results if isinstance(result, RunResult)
]
meta = RunResultsMetadata(
dbt_schema_version=str(cls.dbt_schema_version),
generated_at=generated_at,
@@ -259,7 +283,7 @@ class RunOperationResultsArtifact(RunOperationResult, ArtifactMixin):
@dataclass
class SourceFreshnessResult(NodeResult):
node: ParsedSourceDefinition
node: SourceDefinition
status: FreshnessStatus
max_loaded_at: datetime
snapshotted_at: datetime
@@ -327,7 +351,7 @@ def process_freshness_result(result: FreshnessNodeResult) -> FreshnessNodeOutput
criteria = result.node.freshness
if criteria is None:
raise InternalException(
"Somehow evaluated a freshness result for a source " "that has no freshness criteria!"
"Somehow evaluated a freshness result for a source that has no freshness criteria!"
)
return SourceFreshnessOutput(
unique_id=unique_id,


@@ -5,7 +5,7 @@ from typing import Optional, List, Any, Dict, Sequence
from dbt.dataclass_schema import dbtClassMixin
from dbt.contracts.graph.compiled import CompileResultNode
from dbt.contracts.graph.nodes import ResultNode
from dbt.contracts.results import (
RunResult,
RunResultsArtifact,
@@ -32,7 +32,7 @@ class RemoteResult(VersionedSchema):
class RemoteCompileResultMixin(RemoteResult):
raw_code: str
compiled_code: str
node: CompileResultNode
node: ResultNode
timing: List[TimingInfo]

Some files were not shown because too many files have changed in this diff.