Mirror of https://github.com/dbt-labs/dbt-core, synced 2025-12-20 15:01:28 +00:00

Compare commits: adding-sem...hackoween- (34 commits)
Commits:
203a39dd09, c643dd49fa, c009485de2, 9cdc451bc8, 4718dd3a1e, 49796d6e13,
ece3d2c105, 52bedbad23, eb079dd818, 2a99431c8d, df5953a71d, 641b0fa365,
8876afdb14, a97a9c9942, 0070cd99de, 78a1bbe3c7, cde82fa2b1, 34cec7c7b0,
db5caf97ae, 847046171e, 5dd37a9fb8, a2bdd08d88, 1807526d0a, 362770f5bd,
af38f51041, efc8ece12e, 7471f07431, 6fa30d10ea, 35150f914f, b477be9eff,
b67e877cc1, 1c066cd680, ec97b46caf, b5bb354929
@@ -1,5 +1,5 @@
 [bumpversion]
-current_version = 0.21.0b2
+current_version = 0.21.0
 parse = (?P<major>\d+)
 	\.(?P<minor>\d+)
 	\.(?P<patch>\d+)

CHANGELOG.md | 25
@@ -1,10 +1,21 @@
-## dbt 0.21.0 (Release TBD)
+## dbt 0.21.0 (October 04, 2021)
 
-## dbt 0.21.0b2 (August 19, 2021)
+## dbt 0.21.0rc2 (September 27, 2021)
 
+### Fixes
+- Fix batching for large seeds on Snowflake ([#3941](https://github.com/dbt-labs/dbt/issues/3941), [#3942](https://github.com/dbt-labs/dbt/pull/3942))
+- Avoid infinite recursion in `state:modified.macros` check ([#3904](https://github.com/dbt-labs/dbt/issues/3904), [#3957](https://github.com/dbt-labs/dbt/pull/3957))
+- Cast log messages to strings before scrubbing of prefixed env vars ([#3971](https://github.com/dbt-labs/dbt/issues/3971), [#3972](https://github.com/dbt-labs/dbt/pull/3972))
+
+### Under the hood
+- Bump artifact schema versions for 0.21.0 ([#3945](https://github.com/dbt-labs/dbt/pull/3945))
+
+## dbt 0.21.0rc1 (September 20, 2021)
+
 ### Features
 
+- Make `--models` and `--select` synonyms, except for `ls` (to preserve existing behavior) ([#3210](https://github.com/dbt-labs/dbt/pull/3210), [#3791](https://github.com/dbt-labs/dbt/pull/3791))
 - Experimental parser now detects macro overrides of ref, source, and config builtins. ([#3581](https://github.com/dbt-labs/dbt/issues/3866), [#3582](https://github.com/dbt-labs/dbt/pull/3877))
 - Add connect_timeout profile configuration for Postgres and Redshift adapters. ([#3581](https://github.com/dbt-labs/dbt/issues/3581), [#3582](https://github.com/dbt-labs/dbt/pull/3582))
 - Enhance BigQuery copy materialization ([#3570](https://github.com/dbt-labs/dbt/issues/3570), [#3606](https://github.com/dbt-labs/dbt/pull/3606)):
@@ -16,6 +27,7 @@
 - Added default field in the `selectors.yml` to allow user to define default selector ([#3448](https://github.com/dbt-labs/dbt/issues/3448), [#3875](https://github.com/dbt-labs/dbt/issues/3875), [#3892](https://github.com/dbt-labs/dbt/issues/3892))
 - Added timing and thread information to sources.json artifact ([#3804](https://github.com/dbt-labs/dbt/issues/3804), [#3894](https://github.com/dbt-labs/dbt/pull/3894))
 - Update cli and rpc flags for the `build` task to align with other commands (`--resource-type`, `--store-failures`) ([#3596](https://github.com/dbt-labs/dbt/issues/3596), [#3884](https://github.com/dbt-labs/dbt/pull/3884))
+- Log tests that are not indirectly selected. Add `--greedy` flag to `test`, `list`, `build` and `greedy` property in yaml selectors ([#3723](https://github.com/dbt-labs/dbt/pull/3723), [#3833](https://github.com/dbt-labs/dbt/pull/3833))
 
 ### Fixes
 
@@ -29,13 +41,13 @@
 
 ### Under the hood
 
-- Use GitHub Actions for CI ([#3688](https://github.com/dbt-labs/dbt/issues/3688), [#3669](https://github.com/dbt-labs/dbt/pull/3669))
 - Better dbt hub registry packages version logging that prompts the user for upgrades to relevant packages ([#3560](https://github.com/dbt-labs/dbt/issues/3560), [#3763](https://github.com/dbt-labs/dbt/issues/3763), [#3759](https://github.com/dbt-labs/dbt/pull/3759))
 - Allow the default seed macro's SQL parameter, `%s`, to be replaced by dispatching a new macro, `get_binding_char()`. This enables adapters with parameter marker characters such as `?` to not have to override `basic_load_csv_rows`. ([#3622](https://github.com/dbt-labs/dbt/issues/3622), [#3623](https://github.com/dbt-labs/dbt/pull/3623))
 - Alert users on package rename ([hub.getdbt.com#180](https://github.com/dbt-labs/hub.getdbt.com/issues/810), [#3825](https://github.com/dbt-labs/dbt/pull/3825))
 - Add `adapter_unique_id` to invocation context in anonymous usage tracking, to better understand dbt adoption ([#3713](https://github.com/dbt-labs/dbt/issues/3713), [#3796](https://github.com/dbt-labs/dbt/issues/3796))
 - Specify `macro_namespace = 'dbt'` for all dispatched macros in the global project, making it possible to dispatch to macro implementations defined in packages. Dispatch `generate_schema_name` and `generate_alias_name` ([#3456](https://github.com/dbt-labs/dbt/issues/3456), [#3851](https://github.com/dbt-labs/dbt/issues/3851))
-- Retry transient GitHub failures during download ([#3729](https://github.com/dbt-labs/dbt/pull/3729))
+- Retry transient GitHub failures during download ([#3546](https://github.com/dbt-labs/dbt/pull/3546), [#3729](https://github.com/dbt-labs/dbt/pull/3729))
+- Don't reload and validate schema files if they haven't changed ([#3563](https://github.com/dbt-labs/dbt/issues/3563), [#3888](https://github.com/dbt-labs/dbt/issues/3888))
 
 Contributors:
 
@@ -44,7 +56,7 @@ Contributors:
 - [@dbrtly](https://github.com/dbrtly) ([#3834](https://github.com/dbt-labs/dbt/pull/3834))
 - [@swanderz](https://github.com/swanderz) [#3623](https://github.com/dbt-labs/dbt/pull/3623)
 - [@JasonGluck](https://github.com/JasonGluck) ([#3582](https://github.com/dbt-labs/dbt/pull/3582))
-- [@joellabes](https://github.com/joellabes) ([#3669](https://github.com/dbt-labs/dbt/pull/3669))
+- [@joellabes](https://github.com/joellabes) ([#3669](https://github.com/dbt-labs/dbt/pull/3669), [#3833](https://github.com/dbt-labs/dbt/pull/3833))
 - [@juma-adoreme](https://github.com/juma-adoreme) ([#3838](https://github.com/dbt-labs/dbt/pull/3838))
 - [@annafil](https://github.com/annafil) ([#3825](https://github.com/dbt-labs/dbt/pull/3825))
 - [@AndreasTA-AW](https://github.com/AndreasTA-AW) ([#3691](https://github.com/dbt-labs/dbt/pull/3691))
@@ -52,6 +64,7 @@ Contributors:
 - [@TeddyCr](https://github.com/TeddyCr) ([#3448](https://github.com/dbt-labs/dbt/pull/3865))
 - [@sdebruyn](https://github.com/sdebruyn) ([#3906](https://github.com/dbt-labs/dbt/pull/3906))
 
+
 ## dbt 0.21.0b2 (August 19, 2021)
 
 ### Features
@@ -67,7 +80,6 @@ Contributors:
 ### Under the hood
 
 - Add `build` RPC method, and a subset of flags for `build` task ([#3595](https://github.com/dbt-labs/dbt/issues/3595), [#3674](https://github.com/dbt-labs/dbt/pull/3674))
-- Get more information on partial parsing version mismatches ([#3757](https://github.com/dbt-labs/dbt/issues/3757), [#3758](https://github.com/dbt-labs/dbt/pull/3758))
 
 ## dbt 0.21.0b1 (August 03, 2021)
 
@@ -118,6 +130,7 @@ Contributors:
 - Better error handling for BigQuery job labels that are too long. ([#3612](https://github.com/dbt-labs/dbt/pull/3612), [#3703](https://github.com/dbt-labs/dbt/pull/3703))
 - Get more information on partial parsing version mismatches ([#3757](https://github.com/dbt-labs/dbt/issues/3757), [#3758](https://github.com/dbt-labs/dbt/pull/3758))
 - Switch to full reparse on partial parsing exceptions. Log and report exception information. ([#3725](https://github.com/dbt-labs/dbt/issues/3725), [#3733](https://github.com/dbt-labs/dbt/pull/3733))
+- Use GitHub Actions for CI ([#3688](https://github.com/dbt-labs/dbt/issues/3688), [#3669](https://github.com/dbt-labs/dbt/pull/3669))
 
 ### Fixes
 
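
The `--greedy` entry above changes test selection semantics: by default dbt only selects a test indirectly when every one of its parents is selected, and greedy mode relaxes that to any selected parent. A minimal Python sketch of that rule (illustrative names only, not dbt's internal API):

```python
# illustrative only: how indirect test selection differs with/without greedy
def test_indirectly_selected(test_parents, selected, greedy=False):
    if greedy:
        # greedy: the test touches at least one selected resource
        return any(p in selected for p in test_parents)
    # default: every parent of the test must already be selected
    return all(p in selected for p in test_parents)

selected = {"model.a"}
print(test_indirectly_selected({"model.a", "model.b"}, selected))               # False
print(test_indirectly_selected({"model.a", "model.b"}, selected, greedy=True))  # True
```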

@@ -1071,7 +1071,7 @@ AnyManifest = Union[Manifest, MacroManifest]
 
 
 @dataclass
-@schema_version('manifest', 2)
+@schema_version('manifest', 3)
 class WritableManifest(ArtifactMixin):
     nodes: Mapping[UniqueID, ManifestNode] = field(
         metadata=dict(description=(

@@ -1,3 +1,5 @@
+import json
+import io
 from dbt.contracts.graph.manifest import CompileResultNode
 from dbt.contracts.graph.unparsed import (
     FreshnessThreshold
@@ -185,7 +187,7 @@ class RunExecutionResult(
 
 
 @dataclass
-@schema_version('run-results', 2)
+@schema_version('run-results', 3)
 class RunResultsArtifact(ExecutionResult, ArtifactMixin):
     results: Sequence[RunResultOutput]
     args: Dict[str, Any] = field(default_factory=dict)
@@ -369,7 +371,7 @@ class FreshnessResult(ExecutionResult):
 
 
 @dataclass
-@schema_version('sources', 1)
+@schema_version('sources', 2)
 class FreshnessExecutionResultArtifact(
     ArtifactMixin,
     VersionedSchema,
@@ -489,3 +491,31 @@ class CatalogArtifact(CatalogResults, ArtifactMixin):
             errors=errors,
             _compile_results=compile_results,
         )
+
+
+@dataclass
+class MetaMetadata(BaseArtifactMetadata):  # lol
+    dbt_schema_version: str = field(
+        default_factory=lambda: str(MetaArtifact.dbt_schema_version)
+    )
+
+
+@dataclass
+@schema_version('meta', 1)
+class MetaArtifact(ArtifactMixin):
+    metadata: MetaMetadata
+    queries: Any
+
+    @classmethod
+    def from_results(
+        cls,
+        generated_at: datetime,
+        table: agate.Table
+    ) -> 'MetaArtifact':
+        meta = MetaMetadata(generated_at=generated_at)
+        f = io.StringIO()
+        table.to_json(f)
+        return cls(
+            metadata=meta,
+            queries=json.loads(f.getvalue())
+        )
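
`MetaArtifact.from_results` above stores the macro's result table by writing it to JSON and parsing it back. That round-trip can be exercised in isolation; a small sketch assuming only the `agate` package (the sample rows are made up):

```python
import io
import json

import agate

# same to_json(file-like) / json.loads round-trip that from_results() performs
table = agate.Table.from_object([{'query_id': 1, 'query_text': 'select 1'}])
buf = io.StringIO()
table.to_json(buf)
queries = json.loads(buf.getvalue())  # list of row dicts, as stored on the artifact
print(queries[0]['query_text'])       # select 1
```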

@@ -18,6 +18,7 @@ WRITE_JSON = None
 PARTIAL_PARSE = None
 USE_COLORS = None
 STORE_FAILURES = None
+GREEDY = None
 
 
 def env_set_truthy(key: str) -> Optional[str]:
@@ -56,7 +57,7 @@ MP_CONTEXT = _get_context()
 def reset():
     global STRICT_MODE, FULL_REFRESH, USE_CACHE, WARN_ERROR, TEST_NEW_PARSER, \
         USE_EXPERIMENTAL_PARSER, WRITE_JSON, PARTIAL_PARSE, MP_CONTEXT, USE_COLORS, \
-        STORE_FAILURES
+        STORE_FAILURES, GREEDY
 
     STRICT_MODE = False
     FULL_REFRESH = False
@@ -69,12 +70,13 @@ def reset():
     MP_CONTEXT = _get_context()
     USE_COLORS = True
     STORE_FAILURES = False
+    GREEDY = False
 
 
 def set_from_args(args):
     global STRICT_MODE, FULL_REFRESH, USE_CACHE, WARN_ERROR, TEST_NEW_PARSER, \
         USE_EXPERIMENTAL_PARSER, WRITE_JSON, PARTIAL_PARSE, MP_CONTEXT, USE_COLORS, \
-        STORE_FAILURES
+        STORE_FAILURES, GREEDY
 
     USE_CACHE = getattr(args, 'use_cache', USE_CACHE)
 
@@ -99,6 +101,7 @@ def set_from_args(args):
     USE_COLORS = use_colors_override
 
     STORE_FAILURES = getattr(args, 'store_failures', STORE_FAILURES)
+    GREEDY = getattr(args, 'greedy', GREEDY)
 
 
 # initialize everything to the defaults on module load
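
The new flag follows the existing pattern in this module: a module-level default, a reset in `reset()`, and a `getattr` against the parsed CLI args. A sketch of that last step, with an argparse-style namespace standing in for dbt's parsed args:

```python
from types import SimpleNamespace

GREEDY = False  # module default, as in reset()

# mirrors set_from_args(): commands that don't define --greedy simply leave
# the default in place, because getattr falls back to the current value
args = SimpleNamespace(greedy=True)  # e.g. parsed from `dbt test --greedy`
GREEDY = getattr(args, 'greedy', GREEDY)
print(GREEDY)  # True
```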

@@ -1,4 +1,5 @@
 # special support for CLI argument parsing.
+from dbt import flags
 import itertools
 from dbt.clients.yaml_helper import yaml, Loader, Dumper  # noqa: F401
 
@@ -66,7 +67,7 @@ def parse_union_from_default(
 def parse_difference(
     include: Optional[List[str]], exclude: Optional[List[str]]
 ) -> SelectionDifference:
-    included = parse_union_from_default(include, DEFAULT_INCLUDES)
+    included = parse_union_from_default(include, DEFAULT_INCLUDES, greedy=bool(flags.GREEDY))
     excluded = parse_union_from_default(exclude, DEFAULT_EXCLUDES, greedy=True)
     return SelectionDifference(components=[included, excluded])
 
@@ -180,7 +181,7 @@ def parse_union_definition(definition: Dict[str, Any]) -> SelectionSpec:
     union_def_parts = _get_list_dicts(definition, 'union')
     include, exclude = _parse_include_exclude_subdefs(union_def_parts)
 
-    union = SelectionUnion(components=include)
+    union = SelectionUnion(components=include, greedy_warning=False)
 
     if exclude is None:
         union.raw = definition
@@ -188,7 +189,8 @@ def parse_union_definition(definition: Dict[str, Any]) -> SelectionSpec:
     else:
         return SelectionDifference(
             components=[union, exclude],
-            raw=definition
+            raw=definition,
+            greedy_warning=False
         )
 
 
@@ -197,7 +199,7 @@ def parse_intersection_definition(
 ) -> SelectionSpec:
     intersection_def_parts = _get_list_dicts(definition, 'intersection')
     include, exclude = _parse_include_exclude_subdefs(intersection_def_parts)
-    intersection = SelectionIntersection(components=include)
+    intersection = SelectionIntersection(components=include, greedy_warning=False)
 
     if exclude is None:
         intersection.raw = definition
@@ -205,7 +207,8 @@ def parse_intersection_definition(
     else:
         return SelectionDifference(
             components=[intersection, exclude],
-            raw=definition
+            raw=definition,
+            greedy_warning=False
         )
 
 
@@ -239,7 +242,7 @@ def parse_dict_definition(definition: Dict[str, Any]) -> SelectionSpec:
     if diff_arg is None:
         return base
     else:
-        return SelectionDifference(components=[base, diff_arg])
+        return SelectionDifference(components=[base, diff_arg], greedy_warning=False)
 
 
 def parse_from_definition(

@@ -1,4 +1,3 @@
-
 from typing import Set, List, Optional, Tuple
 
 from .graph import Graph, UniqueId
@@ -30,6 +29,24 @@ def alert_non_existence(raw_spec, nodes):
     )
 
 
+def alert_unused_nodes(raw_spec, node_names):
+    summary_nodes_str = ("\n  - ").join(node_names[:3])
+    debug_nodes_str = ("\n  - ").join(node_names)
+    and_more_str = f"\n  - and {len(node_names) - 3} more" if len(node_names) > 4 else ""
+    summary_msg = (
+        f"\nSome tests were excluded because at least one parent is not selected. "
+        f"Use the --greedy flag to include them."
+        f"\n  - {summary_nodes_str}{and_more_str}"
+    )
+    logger.info(summary_msg)
+    if len(node_names) > 4:
+        debug_msg = (
+            f"Full list of tests that were excluded:"
+            f"\n  - {debug_nodes_str}"
+        )
+        logger.debug(debug_msg)
+
+
 def can_select_indirectly(node):
     """If a node is not selected itself, but its parent(s) are, it may qualify
     for indirect selection.
@@ -151,16 +168,16 @@ class NodeSelector(MethodManager):
 
         return direct_nodes, indirect_nodes
 
-    def select_nodes(self, spec: SelectionSpec) -> Set[UniqueId]:
+    def select_nodes(self, spec: SelectionSpec) -> Tuple[Set[UniqueId], Set[UniqueId]]:
         """Select the nodes in the graph according to the spec.
 
         This is the main point of entry for turning a spec into a set of nodes:
            - Recurse through spec, select by criteria, combine by set operation
            - Return final (unfiltered) selection set
         """
 
         direct_nodes, indirect_nodes = self.select_nodes_recursively(spec)
-        return direct_nodes
+        indirect_only = indirect_nodes.difference(direct_nodes)
+        return direct_nodes, indirect_only
 
     def _is_graph_member(self, unique_id: UniqueId) -> bool:
         if unique_id in self.manifest.sources:
@@ -213,6 +230,8 @@ class NodeSelector(MethodManager):
         # - If ANY parent is missing, return it separately. We'll keep it around
         #   for later and see if its other parents show up.
         # We use this for INCLUSION.
+        # Users can also opt in to inclusive GREEDY mode by passing --greedy flag,
+        # or by specifying `greedy: true` in a yaml selector
 
         direct_nodes = set(selected)
         indirect_nodes = set()
@@ -251,15 +270,24 @@ class NodeSelector(MethodManager):
 
         - node selection. Based on the include/exclude sets, the set
           of matched unique IDs is returned
-        - expand the graph at each leaf node, before combination
-            - selectors might override this. for example, this is where
-              tests are added
+        - includes direct + indirect selection (for tests)
         - filtering:
           - selectors can filter the nodes after all of them have been
             selected
        """
-        selected_nodes = self.select_nodes(spec)
+        selected_nodes, indirect_only = self.select_nodes(spec)
         filtered_nodes = self.filter_selection(selected_nodes)
 
+        if indirect_only:
+            filtered_unused_nodes = self.filter_selection(indirect_only)
+            if filtered_unused_nodes and spec.greedy_warning:
+                # log anything that didn't make the cut
+                unused_node_names = []
+                for unique_id in filtered_unused_nodes:
+                    name = self.manifest.nodes[unique_id].name
+                    unused_node_names.append(name)
+                alert_unused_nodes(spec, unused_node_names)
+
         return filtered_nodes
 
     def get_graph_queue(self, spec: SelectionSpec) -> GraphQueue:

@@ -405,27 +405,38 @@ class StateSelectorMethod(SelectorMethod):
 
         return modified
 
-    def recursively_check_macros_modified(self, node):
-        # check if there are any changes in macros the first time
-        if self.modified_macros is None:
-            self.modified_macros = self._macros_modified()
+    def recursively_check_macros_modified(self, node, previous_macros):
 
         # loop through all macros that this node depends on
         for macro_uid in node.depends_on.macros:
+            # avoid infinite recursion if we've already seen this macro
+            if macro_uid in previous_macros:
+                continue
+            previous_macros.append(macro_uid)
             # is this macro one of the modified macros?
             if macro_uid in self.modified_macros:
                 return True
             # if not, and this macro depends on other macros, keep looping
-            macro = self.manifest.macros[macro_uid]
-            if len(macro.depends_on.macros) > 0:
-                return self.recursively_check_macros_modified(macro)
+            macro_node = self.manifest.macros[macro_uid]
+            if len(macro_node.depends_on.macros) > 0:
+                return self.recursively_check_macros_modified(macro_node, previous_macros)
             else:
                 return False
 
+    def check_macros_modified(self, node):
+        # check if there are any changes in macros the first time
+        if self.modified_macros is None:
+            self.modified_macros = self._macros_modified()
+        # no macros have been modified, skip looping entirely
+        if not self.modified_macros:
             return False
+        # recursively loop through upstream macros to see if any is modified
+        else:
+            previous_macros = []
+            return self.recursively_check_macros_modified(node, previous_macros)
 
     def check_modified(self, old: Optional[SelectorTarget], new: SelectorTarget) -> bool:
         different_contents = not new.same_contents(old)  # type: ignore
-        upstream_macro_change = self.recursively_check_macros_modified(new)
+        upstream_macro_change = self.check_macros_modified(new)
         return different_contents or upstream_macro_change
 
     def check_modified_body(self, old: Optional[SelectorTarget], new: SelectorTarget) -> bool:
@@ -457,7 +468,7 @@ class StateSelectorMethod(SelectorMethod):
         return False
 
     def check_modified_macros(self, _, new: SelectorTarget) -> bool:
-        return self.recursively_check_macros_modified(new)
+        return self.check_macros_modified(new)
 
     def check_new(self, old: Optional[SelectorTarget], new: SelectorTarget) -> bool:
         return old is None
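
The `previous_macros` visited list is what fixes the infinite recursion from the changelog (#3904, #3957): two macros that depend on each other would otherwise re-enter the walk forever. A toy reproduction of the traversal (illustrative dependency graph, not dbt's manifest):

```python
# macro.a and macro.b depend on each other; without the visited list the
# walk below would never terminate
deps = {'macro.a': ['macro.b'], 'macro.b': ['macro.a']}
modified = set()  # pretend no macros changed

def macros_modified(uid, previous_macros):
    for dep in deps[uid]:
        if dep in previous_macros:  # already walked: skip, breaking the cycle
            continue
        previous_macros.append(dep)
        if dep in modified or macros_modified(dep, previous_macros):
            return True
    return False

print(macros_modified('macro.a', []))  # False, and the call terminates
```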

@@ -67,6 +67,7 @@ class SelectionCriteria:
     children: bool
     children_depth: Optional[int]
     greedy: bool = False
+    greedy_warning: bool = False  # do not raise warning for yaml selectors
 
     def __post_init__(self):
         if self.children and self.childrens_parents:
@@ -124,11 +125,11 @@ class SelectionCriteria:
             parents_depth=parents_depth,
             children=bool(dct.get('children')),
             children_depth=children_depth,
-            greedy=greedy
+            greedy=(greedy or bool(dct.get('greedy'))),
         )
 
     @classmethod
-    def dict_from_single_spec(cls, raw: str, greedy: bool = False):
+    def dict_from_single_spec(cls, raw: str):
         result = RAW_SELECTOR_PATTERN.match(raw)
         if result is None:
             return {'error': 'Invalid selector spec'}
@@ -145,6 +146,8 @@ class SelectionCriteria:
         dct['parents'] = bool(dct.get('parents'))
         if 'children' in dct:
             dct['children'] = bool(dct.get('children'))
+        if 'greedy' in dct:
+            dct['greedy'] = bool(dct.get('greedy'))
         return dct
 
     @classmethod
@@ -162,10 +165,12 @@ class BaseSelectionGroup(Iterable[SelectionSpec], metaclass=ABCMeta):
         self,
         components: Iterable[SelectionSpec],
         expect_exists: bool = False,
+        greedy_warning: bool = True,
         raw: Any = None,
     ):
         self.components: List[SelectionSpec] = list(components)
         self.expect_exists = expect_exists
+        self.greedy_warning = greedy_warning
         self.raw = raw
 
     def __iter__(self) -> Iterator[SelectionSpec]:
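
With the `greedy` key coerced in `dict_from_single_spec` and folded into `from_dict`, a yaml selector definition can opt in per criterion. The coercion itself is a simple OR; a sketch with a hand-built dict standing in for the parsed yaml:

```python
# the caller's greedy (e.g. bool(flags.GREEDY)) is OR-ed with the
# definition's own key, so either source can turn greedy on
dct = {'method': 'fqn', 'value': 'my_model', 'greedy': True}
caller_greedy = False
greedy = caller_greedy or bool(dct.get('greedy'))
print(greedy)  # True
```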

@@ -342,3 +342,17 @@
     {% do run_query(sql) %}
 
 {% endmacro %}
+
+{% macro get_query_history(information_schema, last_queried_at) -%}
+  {{ return(adapter.dispatch('get_query_history', 'dbt')(information_schema, last_queried_at)) }}
+{%- endmacro %}
+
+{% macro default__get_query_history(information_schema, last_queried_at) -%}
+
+    {% set typename = adapter.type() %}
+    {% set msg -%}
+        get_query_history not implemented for {{ typename }}
+    {%- endset %}
+
+    {{ exceptions.raise_compiler_error(msg) }}
+{% endmacro %}

@@ -51,7 +51,7 @@
 {% endmacro %}
 
 {% macro get_batch_size() -%}
-  {{ adapter.dispatch('get_batch_size', 'dbt')() }}
+  {{ return(adapter.dispatch('get_batch_size', 'dbt')()) }}
 {%- endmacro %}
 
 {% macro default__get_batch_size() %}
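
The switch to `return(...)` matters because a bare `{{ expr }}` always renders to text; dbt's `return()` (a dbt-specific Jinja extension) hands the calling macro the native value instead, so the seed materialization can do integer arithmetic on the batch size. The stringification is visible with vanilla Jinja2:

```python
from jinja2 import Template

# a bare expression renders to a string, even for numbers
rendered = Template("{{ 10000 }}").render()
print(type(rendered), repr(rendered))  # <class 'str'> '10000'
```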

@@ -345,7 +345,7 @@ class TimestampNamed(logbook.Processor):
 class ScrubSecrets(logbook.Processor):
     def process(self, record):
         for secret in get_secret_env():
-            record.message = record.message.replace(secret, "*****")
+            record.message = str(record.message).replace(secret, "*****")
 
 
 logger = logbook.Logger('dbt')
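
The one-line logger fix above (#3971, #3972) guards against records whose message is not a `str`, since only strings have `.replace`. In brief:

```python
# logbook records may carry non-string messages, e.g. an exception object
msg = ValueError("connection string with SECRET=abc123")
# msg.replace("abc123", "*****")  # AttributeError before the fix
print(str(msg).replace("abc123", "*****"))  # casting first always works
```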

@@ -26,6 +26,7 @@ import dbt.task.seed as seed_task
 import dbt.task.serve as serve_task
 import dbt.task.snapshot as snapshot_task
 import dbt.task.test as test_task
+import dbt.task.meta as meta_task
 from dbt.profiler import profiler
 from dbt.task.rpc.server import RPCServerTask
 from dbt.adapters.factory import reset_adapters, cleanup_connections
@@ -406,6 +407,14 @@ def _build_build_subparser(subparsers, base_subparser):
         Store test results (failing rows) in the database
         '''
     )
+    sub.add_argument(
+        '--greedy',
+        action='store_true',
+        help='''
+        Select all tests that touch the selected resources,
+        even if they also depend on unselected resources
+        '''
+    )
     resource_values: List[str] = [
         str(s) for s in build_task.BuildTask.ALL_RESOURCE_VALUES
     ] + ['all']
@@ -637,7 +646,7 @@ def _add_table_mutability_arguments(*subparsers):
         '--full-refresh',
         action='store_true',
         help='''
-        If specified, DBT will drop incremental models and
+        If specified, dbt will drop incremental models and
         fully-recalculate the incremental table from the model definition.
         '''
     )
@@ -753,6 +762,14 @@ def _build_test_subparser(subparsers, base_subparser):
         Store test results (failing rows) in the database
         '''
     )
+    sub.add_argument(
+        '--greedy',
+        action='store_true',
+        help='''
+        Select all tests that touch the selected resources,
+        even if they also depend on unselected resources
+        '''
+    )
 
     sub.set_defaults(cls=test_task.TestTask, which='test', rpc_method='test')
     return sub
@@ -878,6 +895,14 @@ def _build_list_subparser(subparsers, base_subparser):
         metavar='SELECTOR',
         required=False,
     )
+    sub.add_argument(
+        '--greedy',
+        action='store_true',
+        help='''
+        Select all tests that touch the selected resources,
+        even if they also depend on unselected resources
+        '''
+    )
     _add_common_selector_arguments(sub)
 
     return sub
@@ -913,6 +938,24 @@ def _build_run_operation_subparser(subparsers, base_subparser):
     return sub
 
 
+def _build_meta_subparser(subparsers, base_subparser):
+    sub = subparsers.add_parser(
+        'meta',
+        parents=[base_subparser],
+        help='''
+        Generate metadata information artifacts.
+        '''
+    )
+    sub.add_argument(
+        'type',
+        help='''
+        Type of metadata artifact to generate.
+        ''',
+    )
+    sub.set_defaults(cls=meta_task.MetaTask,
+                     which='meta')
+    return sub
+
+
 def parse_args(args, cls=DBTArgumentParser):
     p = cls(
         prog='dbt',
@@ -1097,6 +1140,7 @@ def parse_args(args, cls=DBTArgumentParser):
     _build_docs_serve_subparser(docs_subs, base_subparser)
     _build_source_freshness_subparser(source_subs, base_subparser)
     _build_run_operation_subparser(subs, base_subparser)
+    _build_meta_subparser(subs, base_subparser)
 
     if len(args) == 0:
         p.print_help()

core/dbt/task/meta.py | 80 (new file)
@@ -0,0 +1,80 @@
+import os
+from datetime import datetime
+
+import agate
+
+from .runnable import ManifestTask
+
+import dbt.exceptions
+from dbt.adapters.factory import get_adapter
+# from dbt.config.utils import parse_cli_vars
+from dbt.contracts.results import MetaArtifact
+from dbt.exceptions import InternalException
+from dbt.logger import GLOBAL_LOGGER as logger
+
+META_FILENAME = 'meta.json'
+
+
+class MetaTask(ManifestTask):
+    def _get_macro_parts(self):
+        macro_name = self.args.type
+        if '.' in macro_name:
+            package_name, macro_name = macro_name.split(".", 1)
+        else:
+            package_name = None
+
+        return package_name, f"get_{macro_name}"  # hax
+
+    # def _get_kwargs(self) -> Dict[str, Any]:
+    #     return parse_cli_vars(self.args.args)
+
+    def compile_manifest(self) -> None:
+        if self.manifest is None:
+            raise InternalException('manifest was None in compile_manifest')
+
+    def _run_unsafe(self) -> agate.Table:
+        adapter = get_adapter(self.config)
+
+        package_name, macro_name = self._get_macro_parts()
+        # macro_kwargs = self._get_kwargs()
+
+        with adapter.connection_named(f'macro_{macro_name}'):
+            adapter.clear_transaction()
+            res = adapter.execute_macro(
+                macro_name,
+                project=package_name,
+                # kwargs=macro_kwargs,
+                manifest=self.manifest
+            )
+
+        return res
+
+    def run(self) -> MetaArtifact:
+        self._runtime_initialize()
+        try:
+            res = self._run_unsafe()
+        except dbt.exceptions.Exception as exc:
+            logger.error(
+                'Encountered an error while running operation: {}'
+                .format(exc)
+            )
+            logger.debug('', exc_info=True)
+        except Exception as exc:
+            logger.error(
+                'Encountered an uncaught exception while running operation: {}'
+                .format(exc)
+            )
+            logger.debug('', exc_info=True)
+            raise
+        else:
+            end = datetime.utcnow()
+            artifact = MetaArtifact.from_results(
+                generated_at=end,
+                table=res
+            )
+            path = os.path.join(self.config.target_path, META_FILENAME)
+            artifact.write(path)
+            return artifact
+
+    def interpret_results(self, results):
+        return True
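
The new task resolves its positional `type` argument to a dispatchable macro by prefixing `get_` (the `# hax` comment above). Extracted on its own, the mapping is:

```python
# standalone copy of MetaTask._get_macro_parts' name resolution
def get_macro_parts(macro_name):
    if '.' in macro_name:
        package_name, macro_name = macro_name.split(".", 1)
    else:
        package_name = None
    return package_name, f"get_{macro_name}"

print(get_macro_parts('query_history'))         # (None, 'get_query_history')
print(get_macro_parts('my_pkg.query_history'))  # ('my_pkg', 'get_query_history')
```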

@@ -438,7 +438,7 @@ class GraphRunnableTask(ManifestTask):
             )
 
         if len(self._flattened_nodes) == 0:
-            logger.warning("WARNING: Nothing to do. Try checking your model "
+            logger.warning("\nWARNING: Nothing to do. Try checking your model "
                            "configs and model specification args")
             result = self.get_result(
                 results=[],

@@ -96,5 +96,5 @@ def _get_dbt_plugins_info():
         yield plugin_name, mod.version
 
 
-__version__ = '0.21.0b2'
+__version__ = '0.21.0'
 installed = get_installed_version()

@@ -284,12 +284,12 @@ def parse_args(argv=None):
     parser.add_argument('adapter')
     parser.add_argument('--title-case', '-t', default=None)
     parser.add_argument('--dependency', action='append')
-    parser.add_argument('--dbt-core-version', default='0.21.0b2')
+    parser.add_argument('--dbt-core-version', default='0.21.0')
     parser.add_argument('--email')
     parser.add_argument('--author')
    parser.add_argument('--url')
     parser.add_argument('--sql', action='store_true')
-    parser.add_argument('--package-version', default='0.21.0b2')
+    parser.add_argument('--package-version', default='0.21.0')
     parser.add_argument('--project-version', default='1.0')
     parser.add_argument(
         '--no-dependency', action='store_false', dest='set_dependency'

@@ -24,7 +24,7 @@ def read(fname):
 
 
 package_name = "dbt-core"
-package_version = "0.21.0b2"
+package_version = "0.21.0"
 description = """dbt (data build tool) is a command line tool that helps \
 analysts and engineers transform data in their warehouse more effectively"""
 

docker/requirements/requirements.0.21.0.txt | 75 (new file)
@@ -0,0 +1,75 @@
+agate==1.6.1
+asn1crypto==1.4.0
+attrs==21.2.0
+azure-common==1.1.27
+azure-core==1.19.0
+azure-storage-blob==12.9.0
+Babel==2.9.1
+boto3==1.18.53
+botocore==1.21.53
+cachetools==4.2.4
+certifi==2021.5.30
+cffi==1.14.6
+chardet==4.0.0
+charset-normalizer==2.0.6
+colorama==0.4.4
+cryptography==3.4.8
+google-api-core==1.31.3
+google-auth==1.35.0
+google-cloud-bigquery==2.28.0
+google-cloud-core==1.7.2
+google-crc32c==1.2.0
+google-resumable-media==2.0.3
+googleapis-common-protos==1.53.0
+grpcio==1.41.0
+hologram==0.0.14
+idna==3.2
+importlib-metadata==4.8.1
+isodate==0.6.0
+jeepney==0.7.1
+Jinja2==2.11.3
+jmespath==0.10.0
+json-rpc==1.13.0
+jsonschema==3.1.1
+keyring==21.8.0
+leather==0.3.3
+Logbook==1.5.3
+MarkupSafe==2.0.1
+mashumaro==2.5
+minimal-snowplow-tracker==0.0.2
+msgpack==1.0.2
+msrest==0.6.21
+networkx==2.6.3
+oauthlib==3.1.1
+oscrypto==1.2.1
+packaging==20.9
+parsedatetime==2.6
+proto-plus==1.19.2
+protobuf==3.17.3
+psycopg2-binary==2.9.1
+pyasn1==0.4.8
+pyasn1-modules==0.2.8
+pycparser==2.20
+pycryptodomex==3.10.4
+PyJWT==2.1.0
+pyOpenSSL==20.0.1
+pyparsing==2.4.7
+pyrsistent==0.18.0
+python-dateutil==2.8.2
+python-slugify==5.0.2
+pytimeparse==1.1.8
+pytz==2021.3
+PyYAML==5.4.1
+requests==2.26.0
+requests-oauthlib==1.3.0
+rsa==4.7.2
+s3transfer==0.5.0
+SecretStorage==3.3.1
+six==1.16.0
+snowflake-connector-python==2.5.1
+sqlparse==0.4.2
+text-unidecode==1.3
+typing-extensions==3.10.0.2
+urllib3==1.26.7
+Werkzeug==2.0.1
+zipp==3.6.0

docker/requirements/requirements.0.21.0rc1.txt | 75 (new file)
@@ -0,0 +1,75 @@
+agate==1.6.1
+asn1crypto==1.4.0
+attrs==21.2.0
+azure-common==1.1.27
+azure-core==1.18.0
+azure-storage-blob==12.8.1
+Babel==2.9.1
+boto3==1.18.44
+botocore==1.21.44
+cachetools==4.2.2
+certifi==2021.5.30
+cffi==1.14.6
+chardet==4.0.0
+charset-normalizer==2.0.6
+colorama==0.4.4
+cryptography==3.4.8
+google-api-core==1.31.2
+google-auth==1.35.0
+google-cloud-bigquery==2.26.0
+google-cloud-core==1.7.2
+google-crc32c==1.1.2
+google-resumable-media==2.0.2
+googleapis-common-protos==1.53.0
+grpcio==1.40.0
+hologram==0.0.14
+idna==3.2
+importlib-metadata==4.8.1
+isodate==0.6.0
+jeepney==0.7.1
+Jinja2==2.11.3
+jmespath==0.10.0
+json-rpc==1.13.0
+jsonschema==3.1.1
+keyring==21.8.0
+leather==0.3.3
+Logbook==1.5.3
+MarkupSafe==2.0.1
+mashumaro==2.5
+minimal-snowplow-tracker==0.0.2
+msgpack==1.0.2
+msrest==0.6.21
+networkx==2.6.3
+oauthlib==3.1.1
+oscrypto==1.2.1
+packaging==20.9
+parsedatetime==2.6
+proto-plus==1.19.0
+protobuf==3.18.0
+psycopg2-binary==2.9.1
+pyasn1==0.4.8
+pyasn1-modules==0.2.8
+pycparser==2.20
+pycryptodomex==3.10.1
+PyJWT==2.1.0
+pyOpenSSL==20.0.1
+pyparsing==2.4.7
+pyrsistent==0.18.0
+python-dateutil==2.8.2
+python-slugify==5.0.2
+pytimeparse==1.1.8
+pytz==2021.1
+PyYAML==5.4.1
+requests==2.26.0
+requests-oauthlib==1.3.0
+rsa==4.7.2
+s3transfer==0.5.0
+SecretStorage==3.3.1
+six==1.16.0
+snowflake-connector-python==2.5.1
+sqlparse==0.4.2
+text-unidecode==1.3
+typing-extensions==3.10.0.2
+urllib3==1.26.6
+Werkzeug==2.0.1
+zipp==3.5.0

docker/requirements/requirements.0.21.0rc2.txt | 75 (new file)
@@ -0,0 +1,75 @@
+agate==1.6.1
+asn1crypto==1.4.0
+attrs==21.2.0
+azure-common==1.1.27
+azure-core==1.18.0
+azure-storage-blob==12.9.0
+Babel==2.9.1
+boto3==1.18.48
+botocore==1.21.48
+cachetools==4.2.2
+certifi==2021.5.30
+cffi==1.14.6
+chardet==4.0.0
+charset-normalizer==2.0.6
+colorama==0.4.4
+cryptography==3.4.8
+google-api-core==1.31.3
+google-auth==1.35.0
+google-cloud-bigquery==2.27.0
+google-cloud-core==1.7.2
+google-crc32c==1.2.0
+google-resumable-media==2.0.3
+googleapis-common-protos==1.53.0
+grpcio==1.40.0
+hologram==0.0.14
+idna==3.2
+importlib-metadata==4.8.1
+isodate==0.6.0
+jeepney==0.7.1
+Jinja2==2.11.3
+jmespath==0.10.0
+json-rpc==1.13.0
+jsonschema==3.1.1
+keyring==21.8.0
+leather==0.3.3
+Logbook==1.5.3
+MarkupSafe==2.0.1
+mashumaro==2.5
+minimal-snowplow-tracker==0.0.2
+msgpack==1.0.2
+msrest==0.6.21
+networkx==2.6.3
+oauthlib==3.1.1
+oscrypto==1.2.1
+packaging==20.9
+parsedatetime==2.6
+proto-plus==1.19.0
+protobuf==3.17.3
+psycopg2-binary==2.9.1
+pyasn1==0.4.8
+pyasn1-modules==0.2.8
+pycparser==2.20
+pycryptodomex==3.10.4
+PyJWT==2.1.0
+pyOpenSSL==20.0.1
+pyparsing==2.4.7
+pyrsistent==0.18.0
+python-dateutil==2.8.2
+python-slugify==5.0.2
+pytimeparse==1.1.8
+pytz==2021.1
+PyYAML==5.4.1
+requests==2.26.0
+requests-oauthlib==1.3.0
+rsa==4.7.2
+s3transfer==0.5.0
+SecretStorage==3.3.1
+six==1.16.0
+snowflake-connector-python==2.5.1
+sqlparse==0.4.2
+text-unidecode==1.3
+typing-extensions==3.10.0.2
+urllib3==1.26.7
+Werkzeug==2.0.1
+zipp==3.5.0

@@ -1 +1 @@
-version = '0.21.0b2'
+version = '0.21.0'

@@ -20,7 +20,7 @@ except ImportError:
 
 
 package_name = "dbt-bigquery"
-package_version = "0.21.0b2"
+package_version = "0.21.0"
 description = """The bigquery adapter plugin for dbt (data build tool)"""
 
 this_directory = os.path.abspath(os.path.dirname(__file__))

@@ -1 +1 @@
-version = '0.21.0b2'
+version = '0.21.0'

@@ -41,7 +41,7 @@ def _dbt_psycopg2_name():
 
 
 package_name = "dbt-postgres"
-package_version = "0.21.0b2"
+package_version = "0.21.0"
 description = """The postgres adpter plugin for dbt (data build tool)"""
 
 this_directory = os.path.abspath(os.path.dirname(__file__))

@@ -1 +1 @@
-version = '0.21.0b2'
+version = '0.21.0'

@@ -20,7 +20,7 @@ except ImportError:
 
 
 package_name = "dbt-redshift"
-package_version = "0.21.0b2"
+package_version = "0.21.0"
 description = """The redshift adapter plugin for dbt (data build tool)"""
 
 this_directory = os.path.abspath(os.path.dirname(__file__))

@@ -1 +1 @@
-version = '0.21.0b2'
+version = '0.21.0'

@@ -1,4 +1,5 @@
 {% macro snowflake__load_csv_rows(model, agate_table) %}
+    {% set batch_size = get_batch_size() %}
     {% set cols_sql = get_seed_column_quoted_csv(model, agate_table.column_names) %}
     {% set bindings = [] %}
 
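
With `batch_size` now coming from `get_batch_size()`, the Snowflake seed loader inserts large seeds in fixed-size chunks instead of one oversized statement (the fix for #3941). The chunking itself is plain slicing; a sketch with an illustrative size:

```python
# split 20,000 seed rows into fixed-size INSERT batches; 10000 is
# illustrative here, the real value comes from the dispatched macro
rows = list(range(20000))
batch_size = 10000
batches = [rows[i:i + batch_size] for i in range(0, len(rows), batch_size)]
print(len(batches), len(batches[0]))  # 2 10000
```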
@@ -0,0 +1,30 @@
+{% macro snowflake__get_query_history(information_schema, last_queried_at) -%}
+
+    use warehouse analytics;
+    {% set query %}
+        select
+
+            query_id,
+            query_text,
+            database_name,
+            schema_name,
+            session_id,
+            user_name,
+            execution_status,
+            start_time,
+            end_time
+
+        from table(
+            information_schema.query_history(
+                dateadd('hours',-1,current_timestamp()), --start_timestamp, use last_queried_at here maybe
+                current_timestamp(), --end_timestamp
+                1000) --row count
+        )
+
+        order by start_time
+
+    {%- endset -%}
+
+    {{ return(run_query(query)) }}
+
+{%- endmacro %}

@@ -20,7 +20,7 @@ except ImportError:
 
 
 package_name = "dbt-snowflake"
-package_version = "0.21.0b2"
+package_version = "0.21.0"
 description = """The snowflake adapter plugin for dbt (data build tool)"""
 
 this_directory = os.path.abspath(os.path.dirname(__file__))

setup.py | 2
@@ -24,7 +24,7 @@ with open(os.path.join(this_directory, 'README.md')) as f:
 
 
 package_name = "dbt"
-package_version = "0.21.0b2"
+package_version = "0.21.0"
 description = """With dbt, data analysts and engineers can build analytics \
 the way engineers build applications."""
 

test/integration/005_simple_seed_test/data-big/.gitignore | 1 (new file, vendored)
@@ -0,0 +1 @@
+*.csv

test/integration/005_simple_seed_test/seeds-big/my_seed.csv | 20001 (new file)
File diff suppressed because it is too large
@@ -1,5 +1,5 @@
|
|||||||
import os
|
import os
|
||||||
|
import csv
|
||||||
from test.integration.base import DBTIntegrationTest, use_profile
|
from test.integration.base import DBTIntegrationTest, use_profile
|
||||||
|
|
||||||
|
|
||||||
@@ -312,3 +312,42 @@ class TestSimpleSeedWithDots(DBTIntegrationTest):
     def test_postgres_simple_seed(self):
         results = self.run_dbt(["seed"])
         self.assertEqual(len(results), 1)
+
+class TestSimpleBigSeedBatched(DBTIntegrationTest):
+    @property
+    def schema(self):
+        return "simple_seed_005"
+
+    @property
+    def models(self):
+        return "models"
+
+    @property
+    def project_config(self):
+        return {
+            'config-version': 2,
+            "data-paths": ['data-big'],
+            'seeds': {
+                'quote_columns': False,
+            }
+        }
+
+    def test_big_batched_seed(self):
+        with open('data-big/my_seed.csv', 'w') as f:
+            writer = csv.writer(f)
+            writer.writerow(['id'])
+            for i in range(0, 20000):
+                writer.writerow([i])
+
+        results = self.run_dbt(["seed"])
+        self.assertEqual(len(results), 1)
+
+
+    @use_profile('postgres')
+    def test_postgres_big_batched_seed(self):
+        self.test_big_batched_seed()
+
+    @use_profile('snowflake')
+    def test_snowflake_big_batched_seed(self):
+        self.test_big_batched_seed()
+
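The 20,000-row seed is sized to be comfortably larger than what a single multi-row INSERT can handle on Snowflake (a VALUES list is capped at roughly 16,384 expressions), so this test only passes when batching actually kicks in. A quick sanity check, with a hypothetical batch size:

import math

rows = 20_000
values_cap = 16_384   # approximate Snowflake per-statement VALUES limit
batch_size = 10_000   # hypothetical; dbt reads the real value via get_batch_size()

assert rows > values_cap                        # a single INSERT would be rejected
print(math.ceil(rows / batch_size), "batches")  # -> 2 batches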
@@ -1093,7 +1093,7 @@ class TestDocsGenerate(DBTIntegrationTest):
         )

         return {
-            'dbt_schema_version': 'https://schemas.getdbt.com/dbt/manifest/v2.json',
+            'dbt_schema_version': 'https://schemas.getdbt.com/dbt/manifest/v3.json',
             'dbt_version': dbt.version.__version__,
             'nodes': {
                 'model.test.model': {
@@ -1680,7 +1680,7 @@ class TestDocsGenerate(DBTIntegrationTest):
         snapshot_path = self.dir('snapshot/snapshot_seed.sql')

         return {
-            'dbt_schema_version': 'https://schemas.getdbt.com/dbt/manifest/v2.json',
+            'dbt_schema_version': 'https://schemas.getdbt.com/dbt/manifest/v3.json',
             'dbt_version': dbt.version.__version__,
             'nodes': {
                 'model.test.ephemeral_copy': {
@@ -2203,7 +2203,7 @@ class TestDocsGenerate(DBTIntegrationTest):
         my_schema_name = self.unique_schema()

         return {
-            'dbt_schema_version': 'https://schemas.getdbt.com/dbt/manifest/v2.json',
+            'dbt_schema_version': 'https://schemas.getdbt.com/dbt/manifest/v3.json',
             'dbt_version': dbt.version.__version__,
             'nodes': {
                 'model.test.clustered': {
@@ -2695,7 +2695,7 @@ class TestDocsGenerate(DBTIntegrationTest):
         snapshot_path = self.dir('snapshot/snapshot_seed.sql')

         return {
-            'dbt_schema_version': 'https://schemas.getdbt.com/dbt/manifest/v2.json',
+            'dbt_schema_version': 'https://schemas.getdbt.com/dbt/manifest/v3.json',
             'dbt_version': dbt.version.__version__,
             'nodes': {
                 'model.test.model': {
@@ -2959,7 +2959,7 @@ class TestDocsGenerate(DBTIntegrationTest):
         elif key == 'metadata':
             metadata = manifest['metadata']
             self.verify_metadata(
-                metadata, 'https://schemas.getdbt.com/dbt/manifest/v2.json')
+                metadata, 'https://schemas.getdbt.com/dbt/manifest/v3.json')
             assert 'project_id' in metadata and metadata[
                 'project_id'] == '098f6bcd4621d373cade4e832627b4f6'
             assert 'send_anonymous_usage_stats' in metadata and metadata[
@@ -3100,7 +3100,7 @@ class TestDocsGenerate(DBTIntegrationTest):
         run_results = _read_json('./target/run_results.json')
         assert 'metadata' in run_results
         self.verify_metadata(
-            run_results['metadata'], 'https://schemas.getdbt.com/dbt/run-results/v2.json')
+            run_results['metadata'], 'https://schemas.getdbt.com/dbt/run-results/v3.json')
         self.assertIn('elapsed_time', run_results)
         self.assertGreater(run_results['elapsed_time'], 0)
         self.assertTrue(
@@ -248,7 +248,7 @@ class TestSourceFreshness(SuccessfulSourcesTest):
         assert isinstance(data['elapsed_time'], float)
         self.assertBetween(data['metadata']['generated_at'],
                            self.freshness_start_time)
-        assert data['metadata']['dbt_schema_version'] == 'https://schemas.getdbt.com/dbt/sources/v1.json'
+        assert data['metadata']['dbt_schema_version'] == 'https://schemas.getdbt.com/dbt/sources/v2.json'
         assert data['metadata']['dbt_version'] == dbt.version.__version__
         assert data['metadata']['invocation_id'] == dbt.tracking.active_user.invocation_id
         key = 'key'
@@ -0,0 +1,13 @@
+{# trigger infinite recursion if not handled #}
+
+{% macro my_infinitely_recursive_macro() %}
+    {{ return(adapter.dispatch('my_infinitely_recursive_macro')()) }}
+{% endmacro %}
+
+{% macro default__my_infinitely_recursive_macro() %}
+    {% if unmet_condition %}
+        {{ my_infinitely_recursive_macro() }}
+    {% else %}
+        {{ return('') }}
+    {% endif %}
+{% endmacro %}
@@ -1 +1,4 @@
 select * from {{ ref('seed') }}
+
+-- establish a macro dependency that trips infinite recursion if not handled
+-- depends on: {{ my_infinitely_recursive_macro() }}
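These two fixtures deliberately build a cycle: the model depends on my_infinitely_recursive_macro, which dispatches to default__my_infinitely_recursive_macro, which refers back to the wrapper. A minimal sketch of how a dependency walk survives such a cycle by tracking visited macros (illustrative only, not dbt's actual state:modified code):

def walk_macro_deps(macro, deps, visited=None):
    """deps maps a macro name to the macro names it calls."""
    if visited is None:
        visited = set()
    if macro in visited:  # already expanded: break the cycle here
        return visited
    visited.add(macro)
    for child in deps.get(macro, []):
        walk_macro_deps(child, deps, visited)
    return visited

deps = {
    "my_infinitely_recursive_macro": ["default__my_infinitely_recursive_macro"],
    "default__my_infinitely_recursive_macro": ["my_infinitely_recursive_macro"],
}
print(sorted(walk_macro_deps("my_infinitely_recursive_macro", deps)))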
@@ -1,4 +1,5 @@
 from test.integration.base import DBTIntegrationTest, FakeArgs, use_profile
+import yaml

 from dbt.task.test import TestTask
 from dbt.task.list import ListTask
@@ -20,12 +21,18 @@ class TestSelectionExpansion(DBTIntegrationTest):
         "test-paths": ["tests"]
     }

-    def list_tests_and_assert(self, include, exclude, expected_tests):
+    def list_tests_and_assert(self, include, exclude, expected_tests, greedy=False, selector_name=None):
         list_args = [ 'ls', '--resource-type', 'test']
         if include:
             list_args.extend(('--select', include))
         if exclude:
             list_args.extend(('--exclude', exclude))
+        if exclude:
+            list_args.extend(('--exclude', exclude))
+        if greedy:
+            list_args.append('--greedy')
+        if selector_name:
+            list_args.extend(('--selector', selector_name))

         listed = self.run_dbt(list_args)
         assert len(listed) == len(expected_tests)
@@ -34,7 +41,7 @@ class TestSelectionExpansion(DBTIntegrationTest):
         assert sorted(test_names) == sorted(expected_tests)

     def run_tests_and_assert(
-        self, include, exclude, expected_tests, schema = False, data = False
+        self, include, exclude, expected_tests, schema=False, data=False, greedy=False, selector_name=None
     ):
         results = self.run_dbt(['run'])
         self.assertEqual(len(results), 2)
@@ -48,6 +55,10 @@ class TestSelectionExpansion(DBTIntegrationTest):
             test_args.append('--schema')
         if data:
             test_args.append('--data')
+        if greedy:
+            test_args.append('--greedy')
+        if selector_name:
+            test_args.extend(('--selector', selector_name))

         results = self.run_dbt(test_args)
         tests_run = [r.node.name for r in results]
@@ -228,3 +239,80 @@ class TestSelectionExpansion(DBTIntegrationTest):

         self.list_tests_and_assert(select, exclude, expected)
         self.run_tests_and_assert(select, exclude, expected)
+
+    @use_profile('postgres')
+    def test__postgres__model_a_greedy(self):
+        select = 'model_a'
+        exclude = None
+        greedy = True
+        expected = [
+            'cf_a_b', 'cf_a_src', 'just_a',
+            'relationships_model_a_fun__fun__ref_model_b_',
+            'relationships_model_a_fun__fun__source_my_src_my_tbl_',
+            'unique_model_a_fun'
+        ]
+
+        self.list_tests_and_assert(select, exclude, expected, greedy)
+        self.run_tests_and_assert(select, exclude, expected, greedy=greedy)
+
+    @use_profile('postgres')
+    def test__postgres__model_a_greedy_exclude_unique_tests(self):
+        select = 'model_a'
+        exclude = 'test_name:unique'
+        greedy = True
+        expected = [
+            'cf_a_b', 'cf_a_src', 'just_a',
+            'relationships_model_a_fun__fun__ref_model_b_',
+            'relationships_model_a_fun__fun__source_my_src_my_tbl_',
+        ]
+
+        self.list_tests_and_assert(select, exclude, expected, greedy)
+        self.run_tests_and_assert(select, exclude, expected, greedy=greedy)
+
+
+class TestExpansionWithSelectors(TestSelectionExpansion):
+
+    @property
+    def selectors_config(self):
+        return yaml.safe_load('''
+            selectors:
+            - name: model_a_greedy_none
+              definition:
+                method: fqn
+                value: model_a
+            - name: model_a_greedy_false
+              definition:
+                method: fqn
+                value: model_a
+                greedy: false
+            - name: model_a_greedy_true
+              definition:
+                method: fqn
+                value: model_a
+                greedy: true
+        ''')

+    @use_profile('postgres')
+    def test__postgres__selector_model_a_not_greedy(self):
+        expected = ['just_a', 'unique_model_a_fun']
+
+        # greedy is not specified, so implicitly False
+        self.list_tests_and_assert(include=None, exclude=None, expected_tests=expected, selector_name='model_a_greedy_none')
+        self.run_tests_and_assert(include=None, exclude=None, expected_tests=expected, selector_name='model_a_greedy_none')
+
+        # greedy is explicitly False
+        self.list_tests_and_assert(include=None, exclude=None, expected_tests=expected, selector_name='model_a_greedy_false')
+        self.run_tests_and_assert(include=None, exclude=None, expected_tests=expected, selector_name='model_a_greedy_false')
+
+    @use_profile('postgres')
+    def test__postgres__selector_model_a_yes_greedy(self):
+        expected = [
+            'cf_a_b', 'cf_a_src', 'just_a',
+            'relationships_model_a_fun__fun__ref_model_b_',
+            'relationships_model_a_fun__fun__source_my_src_my_tbl_',
+            'unique_model_a_fun'
+        ]
+
+        # greedy is explicitly True
+        self.list_tests_and_assert(include=None, exclude=None, expected_tests=expected, selector_name='model_a_greedy_true')
+        self.run_tests_and_assert(include=None, exclude=None, expected_tests=expected, selector_name='model_a_greedy_true')
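For intuition, the greedy behavior these tests exercise can be modeled like this: by default a test is selected only when all of its parents are selected, while --greedy (or greedy: true in a selector definition) also picks up tests that touch any selected node. A toy sketch of that rule, not dbt's selector implementation:

def select_tests(tests, selected_nodes, greedy=False):
    # tests maps a test name to the set of nodes it depends on
    picked = set()
    for name, parents in tests.items():
        if greedy and parents & selected_nodes:  # ANY parent selected
            picked.add(name)
        elif parents <= selected_nodes:          # ALL parents selected
            picked.add(name)
    return picked

tests = {
    "unique_model_a_fun": {"model_a"},
    "cf_a_b": {"model_a", "model_b"},  # a test spanning two models
}
print(select_tests(tests, {"model_a"}))               # {'unique_model_a_fun'}
print(select_tests(tests, {"model_a"}, greedy=True))  # both tests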
@@ -120,7 +120,7 @@ def test_run_specs(include, exclude, expected):
     manifest = _get_manifest(graph)
     selector = graph_selector.NodeSelector(graph, manifest)
     spec = graph_cli.parse_difference(include, exclude)
-    selected = selector.select_nodes(spec)
+    selected, _ = selector.select_nodes(spec)

     assert selected == expected

@@ -273,7 +273,7 @@ class ManifestTest(unittest.TestCase):
             'child_map': {},
             'metadata': {
                 'generated_at': '2018-02-14T09:15:13Z',
-                'dbt_schema_version': 'https://schemas.getdbt.com/dbt/manifest/v2.json',
+                'dbt_schema_version': 'https://schemas.getdbt.com/dbt/manifest/v3.json',
                 'dbt_version': dbt.version.__version__,
                 'env': {ENV_KEY_NAME: 'value'},
                 # invocation_id is None, so it will not be present
@@ -419,7 +419,7 @@ class ManifestTest(unittest.TestCase):
             'docs': {},
             'metadata': {
                 'generated_at': '2018-02-14T09:15:13Z',
-                'dbt_schema_version': 'https://schemas.getdbt.com/dbt/manifest/v2.json',
+                'dbt_schema_version': 'https://schemas.getdbt.com/dbt/manifest/v3.json',
                 'dbt_version': dbt.version.__version__,
                 'project_id': '098f6bcd4621d373cade4e832627b4f6',
                 'user_id': 'cfc9500f-dc7f-4c83-9ea7-2c581c1b38cf',
@@ -662,7 +662,7 @@ class MixedManifestTest(unittest.TestCase):
             'child_map': {},
             'metadata': {
                 'generated_at': '2018-02-14T09:15:13Z',
-                'dbt_schema_version': 'https://schemas.getdbt.com/dbt/manifest/v2.json',
+                'dbt_schema_version': 'https://schemas.getdbt.com/dbt/manifest/v3.json',
                 'dbt_version': dbt.version.__version__,
                 'invocation_id': '01234567-0123-0123-0123-0123456789ab',
                 'env': {ENV_KEY_NAME: 'value'},