mirror of https://github.com/dbt-labs/dbt-core
synced 2025-12-18 23:41:28 +00:00

Comparing leahwicz/p...db-setup-w (12 commits)
| SHA1 |
|---|
| 03dfb11d2b |
| 3effade266 |
| 44e7390526 |
| c141798abc |
| df7ec3fb37 |
| 90e5507d03 |
| 332d3494b3 |
| 6393f5a5d7 |
| ce97a9ca7a |
| 9af071bfe4 |
| 45a41202f3 |
| 9768999ca1 |
CHANGELOG.md (21 changed lines)
@@ -1,6 +1,7 @@
## dbt 0.21.0 (Release TBD)

### Features

- Add connect_timeout profile configuration for Postgres and Redshift adapters. ([#3581](https://github.com/dbt-labs/dbt/issues/3581), [#3582](https://github.com/dbt-labs/dbt/pull/3582))
- Enhance BigQuery copy materialization ([#3570](https://github.com/dbt-labs/dbt/issues/3570), [#3606](https://github.com/dbt-labs/dbt/pull/3606)):
  - to simplify config (default usage of `copy_materialization='table'` if it is not found in global or local config)
@@ -8,23 +9,37 @@
- Customize ls task JSON output by adding new flag `--output-keys` ([#3778](https://github.com/dbt-labs/dbt/issues/3778), [#3395](https://github.com/dbt-labs/dbt/issues/3395))

### Fixes

- Support BigQuery-specific aliases `target_dataset` and `target_project` in snapshot configs ([#3694](https://github.com/dbt-labs/dbt/issues/3694), [#3834](https://github.com/dbt-labs/dbt/pull/3834))
-- `dbt debug` shows a summary of whether all checks passed or not ([3831](https://github.com/dbt-labs/dbt/issues/3831), [3832](https://github.com/dbt-labs/dbt/issues/3831))
+- `dbt debug` shows a summary of whether all checks passed or not ([#3831](https://github.com/dbt-labs/dbt/issues/3831), [#3832](https://github.com/dbt-labs/dbt/issues/3831))
- Fix issue when running the `deps` task after the `list` task in the RPC server ([#3846](https://github.com/dbt-labs/dbt/issues/3846), [#3848](https://github.com/dbt-labs/dbt/pull/3848), [#3850](https://github.com/dbt-labs/dbt/pull/3850))
- Fix bug with initializing a dataclass that inherits from `typing.Protocol`, specifically for `dbt.config.profile.Profile` ([#3843](https://github.com/dbt-labs/dbt/issues/3843), [#3855](https://github.com/dbt-labs/dbt/pull/3855))
- Introduce a macro, `get_where_subquery`, for tests that use `where` config. Alias filtering subquery as `dbt_subquery` instead of resource identifier ([#3857](https://github.com/dbt-labs/dbt/issues/3857), [#3859](https://github.com/dbt-labs/dbt/issues/3859))

### Fixes

- Separated table vs view configuration for BigQuery since some configuration is not possible to set for tables vs views. ([#3682](https://github.com/dbt-labs/dbt/issues/3682), [#3691](https://github.com/dbt-labs/dbt/pull/3691))

### Under the hood

- Use GitHub Actions for CI ([#3688](https://github.com/dbt-labs/dbt/issues/3688), [#3669](https://github.com/dbt-labs/dbt/pull/3669))
- Better dbt hub registry packages version logging that prompts the user for upgrades to relevant packages ([#3560](https://github.com/dbt-labs/dbt/issues/3560), [#3763](https://github.com/dbt-labs/dbt/issues/3763), [#3759](https://github.com/dbt-labs/dbt/pull/3759))
- Allow the default seed macro's SQL parameter, `%s`, to be replaced by dispatching a new macro, `get_binding_char()`. This enables adapters with parameter marker characters such as `?` to not have to override `basic_load_csv_rows`. ([#3622](https://github.com/fishtown-analytics/dbt/issues/3622), [#3623](https://github.com/fishtown-analytics/dbt/pull/3623))
- Alert users on package rename ([hub.getdbt.com#810](https://github.com/dbt-labs/hub.getdbt.com/issues/810), [#3825](https://github.com/dbt-labs/dbt/pull/3825))
- Add `adapter_unique_id` to invocation context in anonymous usage tracking, to better understand dbt adoption ([#3713](https://github.com/dbt-labs/dbt/issues/3713), [#3796](https://github.com/dbt-labs/dbt/issues/3796))
- Specify `macro_namespace = 'dbt'` for all dispatched macros in the global project, making it possible to dispatch to macro implementations defined in packages. Dispatch `generate_schema_name` and `generate_alias_name` ([#3456](https://github.com/dbt-labs/dbt/issues/3456), [#3851](https://github.com/dbt-labs/dbt/issues/3851))

Contributors:

- [@xemuliam](https://github.com/xemuliam) ([#3606](https://github.com/dbt-labs/dbt/pull/3606))
- [@sungchun12](https://github.com/sungchun12) ([#3759](https://github.com/dbt-labs/dbt/pull/3759))
- [@dbrtly](https://github.com/dbrtly) ([#3834](https://github.com/dbt-labs/dbt/pull/3834))
-- [@swanderz](https://github.com/swanderz) [#3623](https://github.com/fishtown-analytics/dbt/pull/3623)
+- [@swanderz](https://github.com/swanderz) [#3623](https://github.com/dbt-labs/dbt/pull/3623)
- [@JasonGluck](https://github.com/JasonGluck) ([#3582](https://github.com/dbt-labs/dbt/pull/3582))
- [@joellabes](https://github.com/joellabes) ([#3669](https://github.com/dbt-labs/dbt/pull/3669))
- [@juma-adoreme](https://github.com/juma-adoreme) ([#3838](https://github.com/dbt-labs/dbt/pull/3838))
- [@annafil](https://github.com/annafil) ([#3825](https://github.com/dbt-labs/dbt/pull/3825))
- [@AndreasTA-AW](https://github.com/AndreasTA-AW) ([#3691](https://github.com/dbt-labs/dbt/pull/3691))

## dbt 0.21.0b2 (August 19, 2021)
@@ -2,6 +2,7 @@ import functools
import requests
from dbt.utils import memoized, _connection_exception_retry as connection_exception_retry
from dbt.logger import GLOBAL_LOGGER as logger
+from dbt import deprecations
import os

if os.getenv('DBT_PACKAGE_HUB_URL'):
@@ -44,7 +45,29 @@ def packages(registry_base_url=None):


def package(name, registry_base_url=None):
-    return _get_with_retries('api/v1/{}.json'.format(name), registry_base_url)
+    response = _get_with_retries('api/v1/{}.json'.format(name), registry_base_url)
+
+    # Either redirectnamespace or redirectname in the JSON response indicates a redirect
+    #   redirectnamespace redirects based on package ownership
+    #   redirectname redirects based on package name
+    #   Both can be present at the same time, or neither. Fails gracefully to old name
+
+    if ('redirectnamespace' in response) or ('redirectname' in response):
+
+        if ('redirectnamespace' in response) and response['redirectnamespace'] is not None:
+            use_namespace = response['redirectnamespace']
+        else:
+            use_namespace = response['namespace']
+
+        if ('redirectname' in response) and response['redirectname'] is not None:
+            use_name = response['redirectname']
+        else:
+            use_name = response['name']
+
+        new_nwo = use_namespace + "/" + use_name
+        deprecations.warn('package-redirect', old_name=name, new_name=new_nwo)
+
+    return response


def package_version(name, version, registry_base_url=None):
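To make the redirect handling concrete, here is a sketch of a hypothetical registry payload and what the code above derives from it (the field values are illustrative, and the `or` shorthand stands in for the explicit None checks):

    # Hypothetical registry response; only the fields the code reads are shown.
    response = {
        'namespace': 'fishtown-analytics',
        'name': 'dbt_utils',
        'redirectnamespace': 'dbt-labs',  # ownership moved
        'redirectname': None,             # package name unchanged
    }

    use_namespace = response['redirectnamespace'] or response['namespace']  # 'dbt-labs'
    use_name = response['redirectname'] or response['name']                 # 'dbt_utils'
    new_nwo = use_namespace + "/" + use_name                                # 'dbt-labs/dbt_utils'
    # This feeds deprecations.warn('package-redirect', ...) while the response
    # is still returned under the old name, so existing installs keep working.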
@@ -84,7 +84,8 @@ def read_user_config(directory: str) -> UserConfig:

# The Profile class is included in RuntimeConfig, so any attribute
# additions must also be set where the RuntimeConfig class is created
-@dataclass
+# `init=False` is a workaround for https://bugs.python.org/issue45081
+@dataclass(init=False)
class Profile(HasCredentials):
    profile_name: str
    target_name: str
@@ -92,6 +93,23 @@ class Profile(HasCredentials):
    threads: int
    credentials: Credentials

+    def __init__(
+        self,
+        profile_name: str,
+        target_name: str,
+        config: UserConfig,
+        threads: int,
+        credentials: Credentials
+    ):
+        """Explicitly defining `__init__` to work around bug in Python 3.9.7
+        https://bugs.python.org/issue45081
+        """
+        self.profile_name = profile_name
+        self.target_name = target_name
+        self.config = config
+        self.threads = threads
+        self.credentials = credentials
+
    def to_profile_info(
        self, serialize_credentials: bool = False
    ) -> Dict[str, Any]:
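A minimal sketch of why this workaround exists (the class and field names here are illustrative, not dbt's): per the upstream report linked above, on Python 3.9.7 a dataclass that inherits from a `typing.Protocol` subclass can end up with a non-functional generated `__init__`, so the diff opts out of generation entirely:

    from dataclasses import dataclass
    from typing import Protocol


    class HasCredentials(Protocol):
        profile_name: str


    # On 3.9.7 the dataclass-generated __init__ here could be clobbered by
    # Protocol machinery (https://bugs.python.org/issue45081), breaking
    # instantiation. Opting out with init=False and writing __init__ by hand,
    # as the diff does for Profile, never relies on the generated method:
    @dataclass(init=False)
    class Sketch(HasCredentials):
        profile_name: str

        def __init__(self, profile_name: str):
            self.profile_name = profile_name


    print(Sketch('jaffle_shop').profile_name)  # works on every 3.9.x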
@@ -391,6 +391,10 @@ class UnsetCredentials(Credentials):
    def type(self):
        return None

+    @property
+    def unique_field(self):
+        return None
+
    def connection_info(self, *args, **kwargs):
        return {}
@@ -1,5 +1,6 @@
import abc
import itertools
+import hashlib
from dataclasses import dataclass, field
from typing import (
    Any, ClassVar, Dict, Tuple, Iterable, Optional, List, Callable,
@@ -127,6 +128,15 @@ class Credentials(
            'type not implemented for base credentials class'
        )

+    @abc.abstractproperty
+    def unique_field(self) -> str:
+        raise NotImplementedError(
+            'type not implemented for base credentials class'
+        )
+
+    def hashed_unique_field(self) -> str:
+        return hashlib.md5(self.unique_field.encode('utf-8')).hexdigest()
+
    def connection_info(
        self, *, with_aliases: bool = False
    ) -> Iterable[Tuple[str, Any]]:
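`hashed_unique_field` is what later populates the `adapter_unique_id` tracking field: the adapter-specific `unique_field` (for example the Postgres host, per the adapter hunks below) is reduced to a stable digest before it leaves the machine. A quick sketch with a made-up host:

    import hashlib

    unique_field = 'warehouse.example.internal'  # hypothetical Postgres host
    digest = hashlib.md5(unique_field.encode('utf-8')).hexdigest()
    print(digest)  # a stable 32-character hex string; only this digest, never
                   # the raw value, is attached to anonymous usage events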
@@ -131,6 +131,14 @@ class AdapterMacroDeprecation(DBTDeprecation):
    '''


+class PackageRedirectDeprecation(DBTDeprecation):
+    _name = 'package-redirect'
+    _description = '''\
+    The `{old_name}` package is deprecated in favor of `{new_name}`. Please update
+    your `packages.yml` configuration to use `{new_name}` instead.
+    '''
+
+
_adapter_renamed_description = """\
The adapter function `adapter.{old_name}` is deprecated and will be removed in
a future release of dbt. Please use `adapter.{new_name}` instead.
@@ -176,6 +184,7 @@ deprecations_list: List[DBTDeprecation] = [
    ModelsKeyNonModelDeprecation(),
    ExecuteMacrosReleaseDeprecation(),
    AdapterMacroDeprecation(),
+    PackageRedirectDeprecation()
]

deprecations: Dict[str, DBTDeprecation] = {
@@ -1,5 +1,5 @@
{% macro get_columns_in_query(select_sql) -%}
-  {{ return(adapter.dispatch('get_columns_in_query')(select_sql)) }}
+  {{ return(adapter.dispatch('get_columns_in_query', 'dbt')(select_sql)) }}
{% endmacro %}

{% macro default__get_columns_in_query(select_sql) %}
@@ -15,7 +15,7 @@
{% endmacro %}

{% macro create_schema(relation) -%}
-  {{ adapter.dispatch('create_schema')(relation) }}
+  {{ adapter.dispatch('create_schema', 'dbt')(relation) }}
{% endmacro %}

{% macro default__create_schema(relation) -%}
@@ -25,7 +25,7 @@
{% endmacro %}

{% macro drop_schema(relation) -%}
-  {{ adapter.dispatch('drop_schema')(relation) }}
+  {{ adapter.dispatch('drop_schema', 'dbt')(relation) }}
{% endmacro %}

{% macro default__drop_schema(relation) -%}
@@ -35,7 +35,7 @@
{% endmacro %}

{% macro create_table_as(temporary, relation, sql) -%}
-  {{ adapter.dispatch('create_table_as')(temporary, relation, sql) }}
+  {{ adapter.dispatch('create_table_as', 'dbt')(temporary, relation, sql) }}
{%- endmacro %}

{% macro default__create_table_as(temporary, relation, sql) -%}
@@ -52,7 +52,7 @@
{% endmacro %}

{% macro get_create_index_sql(relation, index_dict) -%}
-  {{ return(adapter.dispatch('get_create_index_sql')(relation, index_dict)) }}
+  {{ return(adapter.dispatch('get_create_index_sql', 'dbt')(relation, index_dict)) }}
{% endmacro %}

{% macro default__get_create_index_sql(relation, index_dict) -%}
@@ -60,7 +60,7 @@
{% endmacro %}

{% macro create_indexes(relation) -%}
-  {{ adapter.dispatch('create_indexes')(relation) }}
+  {{ adapter.dispatch('create_indexes', 'dbt')(relation) }}
{%- endmacro %}

{% macro default__create_indexes(relation) -%}
@@ -75,7 +75,7 @@
{% endmacro %}

{% macro create_view_as(relation, sql) -%}
-  {{ adapter.dispatch('create_view_as')(relation, sql) }}
+  {{ adapter.dispatch('create_view_as', 'dbt')(relation, sql) }}
{%- endmacro %}

{% macro default__create_view_as(relation, sql) -%}
@@ -89,7 +89,7 @@

{% macro get_catalog(information_schema, schemas) -%}
-  {{ return(adapter.dispatch('get_catalog')(information_schema, schemas)) }}
+  {{ return(adapter.dispatch('get_catalog', 'dbt')(information_schema, schemas)) }}
{%- endmacro %}

{% macro default__get_catalog(information_schema, schemas) -%}
@@ -104,7 +104,7 @@

{% macro get_columns_in_relation(relation) -%}
-  {{ return(adapter.dispatch('get_columns_in_relation')(relation)) }}
+  {{ return(adapter.dispatch('get_columns_in_relation', 'dbt')(relation)) }}
{% endmacro %}

{% macro sql_convert_columns_in_relation(table) -%}
@@ -121,13 +121,13 @@
{% endmacro %}

{% macro alter_column_type(relation, column_name, new_column_type) -%}
-  {{ return(adapter.dispatch('alter_column_type')(relation, column_name, new_column_type)) }}
+  {{ return(adapter.dispatch('alter_column_type', 'dbt')(relation, column_name, new_column_type)) }}
{% endmacro %}


{% macro alter_column_comment(relation, column_dict) -%}
-  {{ return(adapter.dispatch('alter_column_comment')(relation, column_dict)) }}
+  {{ return(adapter.dispatch('alter_column_comment', 'dbt')(relation, column_dict)) }}
{% endmacro %}

{% macro default__alter_column_comment(relation, column_dict) -%}
@@ -136,7 +136,7 @@
{% endmacro %}

{% macro alter_relation_comment(relation, relation_comment) -%}
-  {{ return(adapter.dispatch('alter_relation_comment')(relation, relation_comment)) }}
+  {{ return(adapter.dispatch('alter_relation_comment', 'dbt')(relation, relation_comment)) }}
{% endmacro %}

{% macro default__alter_relation_comment(relation, relation_comment) -%}
@@ -145,7 +145,7 @@
{% endmacro %}

{% macro persist_docs(relation, model, for_relation=true, for_columns=true) -%}
-  {{ return(adapter.dispatch('persist_docs')(relation, model, for_relation, for_columns)) }}
+  {{ return(adapter.dispatch('persist_docs', 'dbt')(relation, model, for_relation, for_columns)) }}
{% endmacro %}

{% macro default__persist_docs(relation, model, for_relation, for_columns) -%}
@@ -180,7 +180,7 @@

{% macro drop_relation(relation) -%}
-  {{ return(adapter.dispatch('drop_relation')(relation)) }}
+  {{ return(adapter.dispatch('drop_relation', 'dbt')(relation)) }}
{% endmacro %}

@@ -191,7 +191,7 @@
{% endmacro %}

{% macro truncate_relation(relation) -%}
-  {{ return(adapter.dispatch('truncate_relation')(relation)) }}
+  {{ return(adapter.dispatch('truncate_relation', 'dbt')(relation)) }}
{% endmacro %}

@@ -202,7 +202,7 @@
{% endmacro %}

{% macro rename_relation(from_relation, to_relation) -%}
-  {{ return(adapter.dispatch('rename_relation')(from_relation, to_relation)) }}
+  {{ return(adapter.dispatch('rename_relation', 'dbt')(from_relation, to_relation)) }}
{% endmacro %}

{% macro default__rename_relation(from_relation, to_relation) -%}
@@ -214,7 +214,7 @@

{% macro information_schema_name(database) %}
-  {{ return(adapter.dispatch('information_schema_name')(database)) }}
+  {{ return(adapter.dispatch('information_schema_name', 'dbt')(database)) }}
{% endmacro %}

{% macro default__information_schema_name(database) -%}
@@ -227,7 +227,7 @@

{% macro list_schemas(database) -%}
-  {{ return(adapter.dispatch('list_schemas')(database)) }}
+  {{ return(adapter.dispatch('list_schemas', 'dbt')(database)) }}
{% endmacro %}

{% macro default__list_schemas(database) -%}
@@ -241,7 +241,7 @@

{% macro check_schema_exists(information_schema, schema) -%}
-  {{ return(adapter.dispatch('check_schema_exists')(information_schema, schema)) }}
+  {{ return(adapter.dispatch('check_schema_exists', 'dbt')(information_schema, schema)) }}
{% endmacro %}

{% macro default__check_schema_exists(information_schema, schema) -%}
@@ -256,7 +256,7 @@

{% macro list_relations_without_caching(schema_relation) %}
-  {{ return(adapter.dispatch('list_relations_without_caching')(schema_relation)) }}
+  {{ return(adapter.dispatch('list_relations_without_caching', 'dbt')(schema_relation)) }}
{% endmacro %}

@@ -267,7 +267,7 @@

{% macro current_timestamp() -%}
-  {{ adapter.dispatch('current_timestamp')() }}
+  {{ adapter.dispatch('current_timestamp', 'dbt')() }}
{%- endmacro %}

@@ -278,7 +278,7 @@

{% macro collect_freshness(source, loaded_at_field, filter) %}
-  {{ return(adapter.dispatch('collect_freshness')(source, loaded_at_field, filter))}}
+  {{ return(adapter.dispatch('collect_freshness', 'dbt')(source, loaded_at_field, filter))}}
{% endmacro %}

@@ -296,7 +296,7 @@
{% endmacro %}

{% macro make_temp_relation(base_relation, suffix='__dbt_tmp') %}
-  {{ return(adapter.dispatch('make_temp_relation')(base_relation, suffix))}}
+  {{ return(adapter.dispatch('make_temp_relation', 'dbt')(base_relation, suffix))}}
{% endmacro %}

{% macro default__make_temp_relation(base_relation, suffix) %}
@@ -313,7 +313,7 @@

{% macro alter_relation_add_remove_columns(relation, add_columns = none, remove_columns = none) -%}
-  {{ return(adapter.dispatch('alter_relation_add_remove_columns')(relation, add_columns, remove_columns)) }}
+  {{ return(adapter.dispatch('alter_relation_add_remove_columns', 'dbt')(relation, add_columns, remove_columns)) }}
{% endmacro %}

{% macro default__alter_relation_add_remove_columns(relation, add_columns, remove_columns) %}
@@ -13,6 +13,10 @@

#}
{% macro generate_alias_name(custom_alias_name=none, node=none) -%}
+  {% do return(adapter.dispatch('generate_alias_name', 'dbt')(custom_alias_name, node)) %}
+{%- endmacro %}
+
+{% macro default__generate_alias_name(custom_alias_name=none, node=none) -%}

    {%- if custom_alias_name is none -%}

@@ -14,7 +14,7 @@

#}
{% macro generate_database_name(custom_database_name=none, node=none) -%}
-  {% do return(adapter.dispatch('generate_database_name')(custom_database_name, node)) %}
+  {% do return(adapter.dispatch('generate_database_name', 'dbt')(custom_database_name, node)) %}
{%- endmacro %}

{% macro default__generate_database_name(custom_database_name=none, node=none) -%}

@@ -15,6 +15,10 @@

#}
{% macro generate_schema_name(custom_schema_name, node) -%}
+  {{ return(adapter.dispatch('generate_schema_name', 'dbt')(custom_schema_name, node)) }}
+{% endmacro %}
+
+{% macro default__generate_schema_name(custom_schema_name, node) -%}

    {%- set default_schema = target.schema -%}
    {%- if custom_schema_name is none -%}
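In every hunk above, the new second argument to `adapter.dispatch` names the macro_namespace, which is what lets projects and packages reroute these built-ins. The integration tests later in this diff exercise exactly that with the following project config (quoted from those tests):

    # With macro_namespace='dbt' passed to adapter.dispatch, dbt resolves e.g.
    # get_columns_in_relation or generate_schema_name by searching these
    # package namespaces in order, so a package can override a global macro.
    project_config = {
        "config-version": 2,
        "dispatch": [
            {
                "macro_namespace": "dbt",
                "search_order": ["test", "package_macro_overrides", "dbt"],
            }
        ],
    }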
@@ -0,0 +1,15 @@
{% macro get_where_subquery(relation) -%}
    {% do return(adapter.dispatch('get_where_subquery')(relation)) %}
{%- endmacro %}

{% macro default__get_where_subquery(relation) -%}
    {% set where = config.get('where', '') %}
    {% if where %}
        {%- set filtered -%}
            (select * from {{ relation }} where {{ where }}) dbt_subquery
        {%- endset -%}
        {% do return(filtered) %}
    {%- else -%}
        {% do return(relation) %}
    {%- endif -%}
{%- endmacro %}
@@ -1,17 +1,17 @@

{% macro get_merge_sql(target, source, unique_key, dest_columns, predicates=none) -%}
-  {{ adapter.dispatch('get_merge_sql')(target, source, unique_key, dest_columns, predicates) }}
+  {{ adapter.dispatch('get_merge_sql', 'dbt')(target, source, unique_key, dest_columns, predicates) }}
{%- endmacro %}


{% macro get_delete_insert_merge_sql(target, source, unique_key, dest_columns) -%}
-  {{ adapter.dispatch('get_delete_insert_merge_sql')(target, source, unique_key, dest_columns) }}
+  {{ adapter.dispatch('get_delete_insert_merge_sql', 'dbt')(target, source, unique_key, dest_columns) }}
{%- endmacro %}


{% macro get_insert_overwrite_merge_sql(target, source, dest_columns, predicates, include_sql_header=false) -%}
-  {{ adapter.dispatch('get_insert_overwrite_merge_sql')(target, source, dest_columns, predicates, include_sql_header) }}
+  {{ adapter.dispatch('get_insert_overwrite_merge_sql', 'dbt')(target, source, dest_columns, predicates, include_sql_header) }}
{%- endmacro %}
@@ -1,6 +1,6 @@

{% macro create_csv_table(model, agate_table) -%}
-  {{ adapter.dispatch('create_csv_table')(model, agate_table) }}
+  {{ adapter.dispatch('create_csv_table', 'dbt')(model, agate_table) }}
{%- endmacro %}

{% macro default__create_csv_table(model, agate_table) %}
@@ -26,7 +26,7 @@
{% endmacro %}

{% macro reset_csv_table(model, full_refresh, old_relation, agate_table) -%}
-  {{ adapter.dispatch('reset_csv_table')(model, full_refresh, old_relation, agate_table) }}
+  {{ adapter.dispatch('reset_csv_table', 'dbt')(model, full_refresh, old_relation, agate_table) }}
{%- endmacro %}

{% macro default__reset_csv_table(model, full_refresh, old_relation, agate_table) %}
@@ -43,7 +43,7 @@
{% endmacro %}

{% macro get_binding_char() -%}
-  {{ adapter.dispatch('get_binding_char')() }}
+  {{ adapter.dispatch('get_binding_char', 'dbt')() }}
{%- endmacro %}

{% macro default__get_binding_char() %}
@@ -51,7 +51,7 @@
{% endmacro %}

{% macro get_batch_size() -%}
-  {{ adapter.dispatch('get_batch_size')() }}
+  {{ adapter.dispatch('get_batch_size', 'dbt')() }}
{%- endmacro %}

{% macro default__get_batch_size() %}
@@ -70,7 +70,7 @@
{% endmacro %}

{% macro load_csv_rows(model, agate_table) -%}
-  {{ adapter.dispatch('load_csv_rows')(model, agate_table) }}
+  {{ adapter.dispatch('load_csv_rows', 'dbt')(model, agate_table) }}
{%- endmacro %}

{% macro default__load_csv_rows(model, agate_table) %}
@@ -2,7 +2,7 @@
    Add new columns to the table if applicable
#}
{% macro create_columns(relation, columns) %}
-  {{ adapter.dispatch('create_columns')(relation, columns) }}
+  {{ adapter.dispatch('create_columns', 'dbt')(relation, columns) }}
{% endmacro %}

{% macro default__create_columns(relation, columns) %}
@@ -15,7 +15,7 @@

{% macro post_snapshot(staging_relation) %}
-  {{ adapter.dispatch('post_snapshot')(staging_relation) }}
+  {{ adapter.dispatch('post_snapshot', 'dbt')(staging_relation) }}
{% endmacro %}

{% macro default__post_snapshot(staging_relation) %}

@@ -1,6 +1,6 @@

{% macro snapshot_merge_sql(target, source, insert_cols) -%}
-  {{ adapter.dispatch('snapshot_merge_sql')(target, source, insert_cols) }}
+  {{ adapter.dispatch('snapshot_merge_sql', 'dbt')(target, source, insert_cols) }}
{%- endmacro %}


@@ -36,7 +36,7 @@
    Create SCD Hash SQL fields cross-db
#}
{% macro snapshot_hash_arguments(args) -%}
-  {{ adapter.dispatch('snapshot_hash_arguments')(args) }}
+  {{ adapter.dispatch('snapshot_hash_arguments', 'dbt')(args) }}
{%- endmacro %}

@@ -52,7 +52,7 @@
    Get the current time cross-db
#}
{% macro snapshot_get_time() -%}
-  {{ adapter.dispatch('snapshot_get_time')() }}
+  {{ adapter.dispatch('snapshot_get_time', 'dbt')() }}
{%- endmacro %}

{% macro default__snapshot_get_time() -%}
@@ -94,7 +94,7 @@

{% macro snapshot_string_as_time(timestamp) -%}
-  {{ adapter.dispatch('snapshot_string_as_time')(timestamp) }}
+  {{ adapter.dispatch('snapshot_string_as_time', 'dbt')(timestamp) }}
{%- endmacro %}
@@ -1,5 +1,5 @@
{% macro get_test_sql(main_sql, fail_calc, warn_if, error_if, limit) -%}
-  {{ adapter.dispatch('get_test_sql')(main_sql, fail_calc, warn_if, error_if, limit) }}
+  {{ adapter.dispatch('get_test_sql', 'dbt')(main_sql, fail_calc, warn_if, error_if, limit) }}
{%- endmacro %}


@@ -1,6 +1,6 @@

{% macro handle_existing_table(full_refresh, old_relation) %}
-  {{ adapter.dispatch('handle_existing_table', macro_namespace = 'dbt')(full_refresh, old_relation) }}
+  {{ adapter.dispatch('handle_existing_table', 'dbt')(full_refresh, old_relation) }}
{% endmacro %}

{% macro default__handle_existing_table(full_refresh, old_relation) %}

@@ -28,6 +28,6 @@ where value_field not in (

{% test accepted_values(model, column_name, values, quote=True) %}
-  {% set macro = adapter.dispatch('test_accepted_values') %}
+  {% set macro = adapter.dispatch('test_accepted_values', 'dbt') %}
  {{ macro(model, column_name, values, quote) }}
{% endtest %}

@@ -8,6 +8,6 @@ where {{ column_name }} is null

{% test not_null(model, column_name) %}
-  {% set macro = adapter.dispatch('test_not_null') %}
+  {% set macro = adapter.dispatch('test_not_null', 'dbt') %}
  {{ macro(model, column_name) }}
{% endtest %}

@@ -25,6 +25,6 @@ where parent.to_field is null

{% test relationships(model, column_name, to, field) %}
-  {% set macro = adapter.dispatch('test_relationships') %}
+  {% set macro = adapter.dispatch('test_relationships', 'dbt') %}
  {{ macro(model, column_name, to, field) }}
{% endtest %}

@@ -13,6 +13,6 @@ having count(*) > 1

{% test unique(model, column_name) %}
-  {% set macro = adapter.dispatch('test_unique') %}
+  {% set macro = adapter.dispatch('test_unique', 'dbt') %}
  {{ macro(model, column_name) }}
{% endtest %}
@@ -433,12 +433,8 @@ class TestBuilder(Generic[Testable]):

    def build_model_str(self):
        targ = self.target
-        cfg_where = "config.get('where')"
        if isinstance(self.target, UnparsedNodeUpdate):
-            identifier = self.target.name
-            target_str = f"{{{{ ref('{targ.name}') }}}}"
+            target_str = f"ref('{targ.name}')"
        elif isinstance(self.target, UnpatchedSourceDefinition):
-            identifier = self.target.table.name
-            target_str = f"{{{{ source('{targ.source.name}', '{targ.table.name}') }}}}"
-        filtered = f"(select * from {target_str} where {{{{{cfg_where}}}}}) {identifier}"
-        return f"{{% if {cfg_where} %}}{filtered}{{% else %}}{target_str}{{% endif %}}"
+            target_str = f"source('{targ.source.name}', '{targ.table.name}')"
+        return f"{{{{ get_where_subquery({target_str}) }}}}"
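The net effect on the rendered `model` kwarg is easiest to see in the docs-generate test expectations later in this diff; reproduced here side by side as a sketch:

    # Before: the where-filter branching was inlined into the kwarg string
    old = ("{% if config.get('where') %}"
           "(select * from {{ ref('model') }} where {{config.get('where')}}) model"
           "{% else %}{{ ref('model') }}{% endif %}")

    # After: the branching lives in the get_where_subquery macro instead
    new = "{{ get_where_subquery(ref('model')) }}"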
@@ -1,6 +1,5 @@
import inspect
from abc import abstractmethod
-from copy import deepcopy
from typing import List, Optional, Type, TypeVar, Generic, Dict, Any

from dbt.dataclass_schema import dbtClassMixin, ValidationError
@@ -21,7 +20,7 @@ class RemoteMethod(Generic[Parameters, Result]):
    METHOD_NAME: Optional[str] = None

    def __init__(self, args, config):
-        self.args = deepcopy(args)
+        self.args = args
        self.config = config

    @classmethod
@@ -1,3 +1,4 @@
+from copy import deepcopy
import threading
import uuid
from datetime import datetime
@@ -155,7 +156,7 @@ class TaskManager:
            f'Manifest should not be None if the last parse state is '
            f'{state}'
        )
-        return task(self.args, self.config, self.manifest)
+        return task(deepcopy(self.args), self.config, self.manifest)

    def rpc_task(
        self, method_name: str
@@ -167,7 +168,7 @@ class TaskManager:
        elif issubclass(task, RemoteManifestMethod):
            return self._get_manifest_callable(task)
        elif issubclass(task, RemoteMethod):
-            return task(self.args, self.config)
+            return task(deepcopy(self.args), self.config)
        else:
            raise dbt.exceptions.InternalException(
                f'Got a task with an invalid type! {task} with method '
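Taken together with the previous hunk, the copy moves from the RPC method base class up to the TaskManager, so every task type gets its own args. A minimal sketch of the bug this fixes (#3846), with one made-up flag standing in for the real parameters:

    from argparse import Namespace
    from copy import deepcopy

    shared = Namespace(single_threaded=False)  # one namespace held by the server

    # Without a per-task copy, a `list` task's set_args mutates shared state:
    list_args = shared
    list_args.single_threaded = True       # what RemoteListTask.set_args does
    assert shared.single_threaded is True  # leaks into the next task

    # With TaskManager copying per task, the mutation stays local:
    shared = Namespace(single_threaded=False)
    list_args = deepcopy(shared)
    list_args.single_threaded = True
    assert shared.single_threaded is False  # a later `deps` task sees clean args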
@@ -3,7 +3,6 @@ import re
from typing import List

from packaging import version as packaging_version
-from distutils.version import StrictVersion

from dbt.exceptions import VersionsNotCompatibleException
import dbt.utils
@@ -439,9 +438,14 @@ def filter_installable(
    install_prerelease: bool
) -> List[str]:
    installable = []
+    installable_dict = {}
    for version_string in versions:
        version = VersionSpecifier.from_version_string(version_string)
        if install_prerelease or not version.prerelease:
-            installable.append(version_string)
-    installable.sort(key=StrictVersion)
-    return installable
+            installable.append(version)
+            installable_dict[str(version)] = version_string
+    sorted_installable = sorted(installable)
+    sorted_installable_original_versions = [
+        str(installable_dict.get(str(version))) for version in sorted_installable
+    ]
+    return sorted_installable_original_versions
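Dropping `StrictVersion` sidesteps crashes on version strings it cannot parse (its grammar only allows `aN`/`bN` prerelease suffixes, not semver-style `-rc1`), and sorting the parsed `VersionSpecifier` objects keeps numeric ordering; the dict then maps each parsed version back to the caller's original string. A small illustration of why parsed, not raw, ordering matters (made-up versions):

    # Raw string sort misorders multi-digit components:
    versions = ["0.9.0", "0.10.0", "0.10.1"]
    assert sorted(versions) == ["0.10.0", "0.10.1", "0.9.0"]  # lexicographic, wrong

    # Sorting parsed (major, minor, patch) tuples restores the intended order:
    parsed = sorted(versions, key=lambda v: tuple(int(p) for p in v.split(".")))
    assert parsed == ["0.9.0", "0.10.0", "0.10.1"]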
@@ -44,7 +44,6 @@ class ListTask(GraphRunnableTask):

    def __init__(self, args, config):
        super().__init__(args, config)
-        self.args.single_threaded = True
        if self.args.models:
            if self.args.select:
                raise RuntimeException(
@@ -291,7 +291,6 @@ class RemoteListTask(
    METHOD_NAME = 'list'

    def set_args(self, params: RPCListParameters) -> None:
        self.args.output = params.output
        self.args.output_keys = params.output_keys
        self.args.resource_types = self._listify(params.resource_types)
@@ -299,6 +298,7 @@ class RemoteListTask(
        self.args.exclude = self._listify(params.exclude)
        self.args.selector_name = params.selector
        self.args.select = self._listify(params.select)
+        self.args.single_threaded = True

        if self.args.models:
            if self.args.select:
@@ -21,7 +21,7 @@ sp_logger.setLevel(100)
COLLECTOR_URL = "fishtownanalytics.sinter-collect.com"
COLLECTOR_PROTOCOL = "https"

-INVOCATION_SPEC = 'iglu:com.dbt/invocation/jsonschema/1-0-1'
+INVOCATION_SPEC = 'iglu:com.dbt/invocation/jsonschema/1-0-2'
PLATFORM_SPEC = 'iglu:com.dbt/platform/jsonschema/1-0-0'
RUN_MODEL_SPEC = 'iglu:com.dbt/run_model/jsonschema/1-0-1'
INVOCATION_ENV_SPEC = 'iglu:com.dbt/invocation_env/jsonschema/1-0-0'
@@ -166,10 +166,15 @@ def get_run_type(args):


def get_invocation_context(user, config, args):
+    # this adapter might not have implemented the type or unique_field properties
    try:
        adapter_type = config.credentials.type
    except Exception:
        adapter_type = None
+    try:
+        adapter_unique_id = config.credentials.hashed_unique_field()
+    except Exception:
+        adapter_unique_id = None

    return {
        "project_id": None if config is None else config.hashed_name(),
@@ -182,6 +187,7 @@ def get_invocation_context(user, config, args):

        "run_type": get_run_type(args),
        "adapter_type": adapter_type,
+        "adapter_unique_id": adapter_unique_id,
    }
docs/arch/README.md (new file, 11 lines)

@@ -0,0 +1,11 @@
## ADRs

For any architectural/engineering decisions we make, we will create an ADR (Architectural Design Record) to keep track of what decision we made and why. This allows us to refer back to decisions in the future and see if the reasons we made a choice still hold true. It also allows others to more easily understand the code. ADRs will follow this process:

- They will live in the repo, under a directory `docs/arch`
- They will be written in markdown
- They will follow the naming convention `adr-NNN-<decision-title>.md`
  - `NNN` is a counter starting at `001`, which keeps the records in chronological order.
- The common sections that each ADR should have are:
  - Title, Context, Decision, Status, Consequences
  - Use this article as a reference: [Documenting Architecture Decisions](https://cognitect.com/blog/2011/11/15/documenting-architecture-decisions)
docs/arch/adr-001-perf-testing.md (new file, 35 lines)

@@ -0,0 +1,35 @@
# Performance Regression Framework

## Context
We want the ability to benchmark our performance over time as new changes land.

### Options
- Static Window: Compare the develop branch to the fastest released version and ensure it doesn't exceed a static window (i.e. time parse on develop and time parse on 0.20.latest and make sure it's not more than 5% slower)
  - Pro: quick to run
  - Pro: simple to implement
  - Con: rerunning a failing test could get it to pass by chance, masking a real regression.
  - Con: several small regressions could press us up against the threshold, requiring us to do unexpected additional performance work, or lower the threshold to get a release out.
- Variance-aware Testing: Run both the develop branch and our fastest version *many times* to collect a set of timing data. We can fail on a static window based on medians, confidence interval midpoints, and even variance magnitude.
  - Pro: would catch more small performance regressions
  - Con: would take much longer to run
  - Con: need to be very careful about making sure caching doesn't wreck the curve (or if it does, it wrecks the curve equally for all tests)
- Stateful Tracking: For example, the rust compiler team does some [bananas performance tracking](https://perf.rust-lang.org/). This option could be done in tandem with the above options; however, it would require results to be stored somewhere.
  - Pro: we can graph our performance history and look really cool.
  - Pro: variance-aware testing would run in half the time, since you can just reference old runs for comparison
  - Con: state in tests sucks
  - Con: longer to build
- Performance Profiling: Running a sampling-based profiler through a series of standardized test runs (tests designed to hit as many/all of the code paths in the codebase) to determine if any particular function/class/other code has regressed in performance.
  - Pro: easy to find the cause of the perf regression
  - Pro: should be able to run on a fairly small project size without losing much test resolution (a 5% change in a function should be evident with even a single case that runs that code path)
  - Con: complex to build
  - Con: compute intensive
  - Con: requires stored results to compare against

## Decision
We decided to start with variance-aware testing, with the ability to add stateful tracking later, by leveraging `hyperfine`, which does all the variance work for us and outputs clear JSON artifacts. Since we're running performance testing on a schedule, it doesn't matter that it may take hours to run as we add more tests. The artifacts are all stored in the GitHub Actions runs today, but could easily be sent somewhere else in the action to track over time.

## Status
Completed

## Consequences
We now have the ability to more rigorously detect performance regressions, but we do not have a solid way to identify where a regression is coming from. Adding performance profiling capabilities will help with this, but for now just running nightly should help us narrow it down to specific commits. As we add more performance tests, the testing matrix may take hours to run, which consumes resources on GitHub Actions. Because performance testing is asynchronous, failures are easier to miss or ignore, and because it is non-deterministic it adds a non-trivial amount of complexity to our development process.
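For context on the decision: hyperfine's `--export-json` writes the raw per-run timings, so a scheduled job can compare two branches from the artifacts alone. A minimal sketch of consuming such an artifact (the file name and the 5% threshold are assumptions, not part of this ADR):

    import json
    import statistics

    # hyperfine --export-json results.json '...' writes one entry per
    # benchmarked command, each including the list of wall-clock times.
    with open("results.json") as f:
        results = json.load(f)["results"]

    baseline_times = results[0]["times"]  # e.g. 0.20.latest
    develop_times = results[1]["times"]   # e.g. develop

    slowdown = statistics.median(develop_times) / statistics.median(baseline_times) - 1
    print(f"median slowdown vs baseline: {slowdown:.1%}")
    assert slowdown < 0.05, "develop regressed more than the 5% window"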
@@ -112,6 +112,10 @@ class BigQueryCredentials(Credentials):
    def type(self):
        return 'bigquery'

+    @property
+    def unique_field(self):
+        return self.database
+
    def _connection_keys(self):
        return ('method', 'database', 'schema', 'location', 'priority',
                'timeout_seconds', 'maximum_bytes_billed')
@@ -769,13 +769,10 @@ class BigQueryAdapter(BaseAdapter):
        return result

    @available.parse(lambda *a, **k: {})
-    def get_table_options(
-        self, config: Dict[str, Any], node: Dict[str, Any], temporary: bool
+    def get_common_options(
+        self, config: Dict[str, Any], node: Dict[str, Any], temporary: bool = False
    ) -> Dict[str, Any]:
        opts = {}
-        if temporary:
-            expiration = 'TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 12 hour)'
-            opts['expiration_timestamp'] = expiration

        if (config.get('hours_to_expiration') is not None) and (not temporary):
            expiration = (
@@ -787,13 +784,25 @@ class BigQueryAdapter(BaseAdapter):
            description = sql_escape(node['description'])
            opts['description'] = '"""{}"""'.format(description)

-        if config.get('kms_key_name') is not None:
-            opts['kms_key_name'] = "'{}'".format(config.get('kms_key_name'))
-
        if config.get('labels'):
            labels = config.get('labels', {})
            opts['labels'] = list(labels.items())

        return opts

+    @available.parse(lambda *a, **k: {})
+    def get_table_options(
+        self, config: Dict[str, Any], node: Dict[str, Any], temporary: bool
+    ) -> Dict[str, Any]:
+        opts = self.get_common_options(config, node, temporary)
+
+        if temporary:
+            expiration = 'TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 12 hour)'
+            opts['expiration_timestamp'] = expiration
+
+        if config.get('kms_key_name') is not None:
+            opts['kms_key_name'] = "'{}'".format(config.get('kms_key_name'))
+
+        if config.get('require_partition_filter'):
+            opts['require_partition_filter'] = config.get(
+                'require_partition_filter')
@@ -804,6 +813,13 @@ class BigQueryAdapter(BaseAdapter):

        return opts

+    @available.parse(lambda *a, **k: {})
+    def get_view_options(
+        self, config: Dict[str, Any], node: Dict[str, Any]
+    ) -> Dict[str, Any]:
+        opts = self.get_common_options(config, node)
+        return opts
+
    @available.parse_none
    def grant_access_to(self, entity, entity_type, role, grant_target_dict):
        """
@@ -27,10 +27,7 @@

{%- endmacro -%}


-{% macro bigquery_table_options(config, node, temporary) %}
-  {% set opts = adapter.get_table_options(config, node, temporary) %}
-
+{% macro bigquery_options(opts) %}
  {% set options -%}
    OPTIONS({% for opt_key, opt_val in opts.items() %}
      {{ opt_key }}={{ opt_val }}{{ "," if not loop.last }}
@@ -39,6 +36,11 @@
  {%- do return(options) -%}
{%- endmacro -%}

+{% macro bigquery_table_options(config, node, temporary) %}
+  {% set opts = adapter.get_table_options(config, node, temporary) %}
+  {%- do return(bigquery_options(opts)) -%}
+{%- endmacro -%}
+
{% macro bigquery__create_table_as(temporary, relation, sql) -%}
  {%- set raw_partition_by = config.get('partition_by', none) -%}
  {%- set raw_cluster_by = config.get('cluster_by', none) -%}
@@ -58,13 +60,18 @@

{%- endmacro -%}

+{% macro bigquery_view_options(config, node) %}
+  {% set opts = adapter.get_view_options(config, node) %}
+  {%- do return(bigquery_options(opts)) -%}
+{%- endmacro -%}
+
{% macro bigquery__create_view_as(relation, sql) -%}
  {%- set sql_header = config.get('sql_header', none) -%}

  {{ sql_header if sql_header is not none }}

  create or replace view {{ relation }}
-  {{ bigquery_table_options(config, model, temporary=false) }}
+  {{ bigquery_view_options(config, model) }}
  as {{ sql }};

{% endmacro %}
@@ -38,6 +38,10 @@ class PostgresCredentials(Credentials):
    def type(self):
        return 'postgres'

+    @property
+    def unique_field(self):
+        return self.host
+
    def _connection_keys(self):
        return ('host', 'port', 'user', 'database', 'schema', 'search_path',
                'keepalives_idle', 'sslmode')

@@ -58,6 +58,10 @@ class SnowflakeCredentials(Credentials):
    def type(self):
        return 'snowflake'

+    @property
+    def unique_field(self):
+        return self.account
+
    def _connection_keys(self):
        return (
            'account', 'user', 'database', 'schema', 'warehouse', 'role',
@@ -20,3 +20,8 @@ models:
          to: ref('table_copy')  # itself
          field: id
          where: 1=1
+  - name: "table.copy.with.dots"
+    description: "A copy of the table with a gross name"
+    # passes, see https://github.com/dbt-labs/dbt/issues/3857
+    tests:
+      - where

@@ -0,0 +1,9 @@

{{
    config(
        materialized='table',
        alias='table_copy_with_dots'
    )
}}

select * from {{ this.schema }}.seed
@@ -152,7 +152,7 @@ class TestCustomConfigSchemaTests(DBTIntegrationTest):
        results = self.run_dbt()
        results = self.run_dbt(['test'], strict=False)

-        self.assertEqual(len(results), 6)
+        self.assertEqual(len(results), 7)
        for result in results:
            self.assertFalse(result.skipped)
            self.assertEqual(
@@ -544,8 +544,8 @@ class TestSchemaTestNameCollision(DBTIntegrationTest):

        # both tests have the same unique id except for the hash
        expected_unique_ids = [
-            'test.test.not_null_base_extension_id.4a9d96018d',
-            'test.test.not_null_base_extension_id.60bbea9027'
+            'test.test.not_null_base_extension_id.922d83a56c',
+            'test.test.not_null_base_extension_id.c8d18fe069'
        ]
        self.assertIn(test_results[0].node.unique_id, expected_unique_ids)
        self.assertIn(test_results[1].node.unique_id, expected_unique_ids)
@@ -187,3 +187,36 @@ class TestDispatchPackagesDeprecation(BaseTestDeprecations):
            self.run_dbt(strict=True)
        exc_str = ' '.join(str(exc.exception).split())  # flatten all whitespace
        assert 'Raised during dispatch for: string_literal' in exc_str
+
+
+class TestPackageRedirectDeprecation(BaseTestDeprecations):
+    @property
+    def models(self):
+        return self.dir('where-were-going-we-dont-need-models')
+
+    @property
+    def packages_config(self):
+        return {
+            "packages": [
+                {
+                    'package': 'fishtown-analytics/dbt_utils',
+                    'version': '0.7.0'
+                }
+            ]
+        }
+
+    @use_profile('postgres')
+    def test_postgres_package_redirect(self):
+        self.assertEqual(deprecations.active_deprecations, set())
+        self.run_dbt(['deps'], strict=False)
+        expected = {'package-redirect'}
+        self.assertEqual(expected, deprecations.active_deprecations)
+
+    @use_profile('postgres')
+    def test_postgres_package_redirect_fail(self):
+        self.assertEqual(deprecations.active_deprecations, set())
+        with self.assertRaises(dbt.exceptions.CompilationException) as exc:
+            self.run_dbt(['deps'], strict=True)
+        exc_str = ' '.join(str(exc.exception).split())  # flatten all whitespace
+        expected = "The `fishtown-analytics/dbt_utils` package is deprecated in favor of `dbt-labs/dbt_utils`"
+        assert expected in exc_str
@@ -0,0 +1,7 @@
name: 'package_macro_overrides'
version: '1.0'
config-version: 2

profile: 'default'

macro-paths: ["macros"]

@@ -0,0 +1,3 @@
{% macro get_columns_in_relation(relation) %}
  {{ return('a string') }}
{% endmacro %}
@@ -131,3 +131,36 @@ class TestMacroOverrideBuiltin(DBTIntegrationTest):
        # the first time, the model doesn't exist
        self.run_dbt()
        self.run_dbt()
+
+
+class TestDispatchMacroOverrideBuiltin(TestMacroOverrideBuiltin):
+    # test the same functionality as above, but this time,
+    # dbt.get_columns_in_relation will dispatch to a default__ macro
+    # from an installed package, per dispatch config search_order
+
+    @property
+    def project_config(self):
+        return {
+            "config-version": 2,
+            "dispatch": [
+                {
+                    "macro_namespace": "dbt",
+                    "search_order": ["test", "package_macro_overrides", "dbt"],
+                }
+            ],
+        }
+
+    @property
+    def packages_config(self):
+        return {
+            'packages': [
+                {
+                    "local": "./package_macro_overrides",
+                },
+            ]
+        }
+
+    @use_profile('postgres')
+    def test_postgres_overrides(self):
+        self.run_dbt(["deps"])
+        super().test_postgres_overrides()
@@ -0,0 +1,7 @@
name: 'package_macro_overrides'
version: '1.0'
config-version: 2

profile: 'default'

macro-paths: ["macros"]

@@ -0,0 +1,6 @@

{% macro generate_schema_name(schema_name, node) %}

    {{ schema_name }}_{{ target.schema }}_macro

{% endmacro %}
@@ -259,6 +259,43 @@ class TestCustomSchemaWithCustomMacro(DBTIntegrationTest):
        self.assertTablesEqual("agg", "view_3", schema, self.xf_schema())


+class TestCustomSchemaDispatchedPackageMacro(TestCustomSchemaWithCustomMacro):
+    # test the same functionality as above, but this time,
+    # dbt.generate_schema_name will dispatch to a default__ macro
+    # from an installed package, per dispatch config search_order
+
+    @property
+    def project_config(self):
+        return {
+            "config-version": 2,
+            "dispatch": [
+                {
+                    "macro_namespace": "dbt",
+                    "search_order": ["test", "package_macro_overrides", "dbt"],
+                }
+            ],
+            "models": {
+                "schema": "dbt_test"
+            },
+        }
+
+    @property
+    def packages_config(self):
+        return {
+            'packages': [
+                {
+                    "local": "./package_macro_overrides",
+                },
+            ]
+        }
+
+    @use_profile('postgres')
+    def test__postgres__custom_schema_from_macro(self):
+        self.run_dbt(["deps"])
+        super().test__postgres__custom_schema_from_macro()
+
+
class TestCustomSchemaWithCustomMacroConfigs(TestCustomSchemaWithCustomMacro):

    @property
@@ -1333,7 +1333,7 @@ class TestDocsGenerate(DBTIntegrationTest):
                'checksum': self._checksum_file(seed_path),
                'unrendered_config': unrendered_seed_config,
            },
-            'test.test.not_null_model_id.3cc43f158a': {
+            'test.test.not_null_model_id.d01cc630e6': {
                'alias': 'not_null_model_id',
                'compiled_path': Normalized('target/compiled/test/models/schema.yml/schema_test/not_null_model_id.sql'),
                'build_path': None,
@@ -1343,7 +1343,7 @@ class TestDocsGenerate(DBTIntegrationTest):
                'config': test_config,
                'sources': [],
                'depends_on': {
-                    'macros': ['macro.dbt.test_not_null'],
+                    'macros': ['macro.dbt.test_not_null', 'macro.dbt.get_where_subquery'],
                    'nodes': ['model.test.model'],
                },
                'deferred': False,
@@ -1363,7 +1363,7 @@ class TestDocsGenerate(DBTIntegrationTest):
                'database': self.default_database,
                'tags': ['schema'],
                'meta': {},
-                'unique_id': 'test.test.not_null_model_id.3cc43f158a',
+                'unique_id': 'test.test.not_null_model_id.d01cc630e6',
                'docs': {'show': True},
                'compiled': True,
                'compiled_sql': AnyStringWith('where id is null'),
@@ -1374,7 +1374,7 @@ class TestDocsGenerate(DBTIntegrationTest):
                    'name': 'not_null',
                    'kwargs': {
                        'column_name': 'id',
-                        'model': "{% if config.get('where') %}(select * from {{ ref('model') }} where {{config.get('where')}}) model{% else %}{{ ref('model') }}{% endif %}",
+                        'model': "{{ get_where_subquery(ref('model')) }}",
                    },
                },
                'checksum': {'name': 'none', 'checksum': ''},
@@ -1424,7 +1424,7 @@ class TestDocsGenerate(DBTIntegrationTest):
                'unique_id': 'snapshot.test.snapshot_seed',
                'unrendered_config': unrendered_snapshot_config,
            },
-            'test.test.test_nothing_model_.f26a2164bd': {
+            'test.test.test_nothing_model_.5d38568946': {
                'alias': 'test_nothing_model_',
                'compiled_path': Normalized('target/compiled/test/models/schema.yml/schema_test/test_nothing_model_.sql'),
                'build_path': None,
@@ -1434,7 +1434,7 @@ class TestDocsGenerate(DBTIntegrationTest):
                'config': test_config,
                'sources': [],
                'depends_on': {
-                    'macros': ['macro.test.test_nothing'],
+                    'macros': ['macro.test.test_nothing', 'macro.dbt.get_where_subquery'],
                    'nodes': ['model.test.model'],
                },
                'deferred': False,
@@ -1454,7 +1454,7 @@ class TestDocsGenerate(DBTIntegrationTest):
                'database': self.default_database,
                'tags': ['schema'],
                'meta': {},
-                'unique_id': 'test.test.test_nothing_model_.f26a2164bd',
+                'unique_id': 'test.test.test_nothing_model_.5d38568946',
                'docs': {'show': True},
                'compiled': True,
                'compiled_sql': AnyStringWith('select 0'),
@@ -1464,13 +1464,13 @@ class TestDocsGenerate(DBTIntegrationTest):
                    'namespace': 'test',
                    'name': 'nothing',
                    'kwargs': {
-                        'model': "{% if config.get('where') %}(select * from {{ ref('model') }} where {{config.get('where')}}) model{% else %}{{ ref('model') }}{% endif %}",
+                        'model': "{{ get_where_subquery(ref('model')) }}",
                    },
                },
                'checksum': {'name': 'none', 'checksum': ''},
                'unrendered_config': unrendered_test_config,
            },
-            'test.test.unique_model_id.d8c29ba0fc': {
+            'test.test.unique_model_id.67b76558ff': {
                'alias': 'unique_model_id',
                'compiled_path': Normalized('target/compiled/test/models/schema.yml/schema_test/unique_model_id.sql'),
                'build_path': None,
@@ -1480,7 +1480,7 @@ class TestDocsGenerate(DBTIntegrationTest):
                'config': test_config,
                'sources': [],
                'depends_on': {
-                    'macros': ['macro.dbt.test_unique'],
+                    'macros': ['macro.dbt.test_unique', 'macro.dbt.get_where_subquery'],
                    'nodes': ['model.test.model'],
                },
                'deferred': False,
@@ -1500,7 +1500,7 @@ class TestDocsGenerate(DBTIntegrationTest):
                'database': self.default_database,
                'tags': ['schema'],
                'meta': {},
-                'unique_id': 'test.test.unique_model_id.d8c29ba0fc',
+                'unique_id': 'test.test.unique_model_id.67b76558ff',
                'docs': {'show': True},
                'compiled': True,
                'compiled_sql': AnyStringWith('count(*)'),
@@ -1511,7 +1511,7 @@ class TestDocsGenerate(DBTIntegrationTest):
                    'name': 'unique',
                    'kwargs': {
                        'column_name': 'id',
-                        'model': "{% if config.get('where') %}(select * from {{ ref('model') }} where {{config.get('where')}}) model{% else %}{{ ref('model') }}{% endif %}",
+                        'model': "{{ get_where_subquery(ref('model')) }}",
                    },
                },
                'checksum': {'name': 'none', 'checksum': ''},
@@ -1636,17 +1636,17 @@ class TestDocsGenerate(DBTIntegrationTest):
            'seed.test.seed': [],
            'snapshot.test.snapshot_seed': ['seed.test.seed'],
            'source.test.my_source.my_table': [],
-            'test.test.not_null_model_id.3cc43f158a': ['model.test.model'],
-            'test.test.test_nothing_model_.f26a2164bd': ['model.test.model'],
-            'test.test.unique_model_id.d8c29ba0fc': ['model.test.model'],
+            'test.test.not_null_model_id.d01cc630e6': ['model.test.model'],
+            'test.test.test_nothing_model_.5d38568946': ['model.test.model'],
+            'test.test.unique_model_id.67b76558ff': ['model.test.model'],
        },
        'child_map': {
            'model.test.model': [
                'exposure.test.notebook_exposure',
                'exposure.test.simple_exposure',
-                'test.test.not_null_model_id.3cc43f158a',
-                'test.test.test_nothing_model_.f26a2164bd',
-                'test.test.unique_model_id.d8c29ba0fc',
+                'test.test.not_null_model_id.d01cc630e6',
+                'test.test.test_nothing_model_.5d38568946',
+                'test.test.unique_model_id.67b76558ff',
            ],
            'model.test.second_model': ['exposure.test.notebook_exposure'],
            'exposure.test.notebook_exposure': [],
@@ -1656,9 +1656,9 @@ class TestDocsGenerate(DBTIntegrationTest):
                'snapshot.test.snapshot_seed'],
            'snapshot.test.snapshot_seed': [],
            'source.test.my_source.my_table': ['exposure.test.simple_exposure'],
-            'test.test.not_null_model_id.3cc43f158a': [],
-            'test.test.test_nothing_model_.f26a2164bd': [],
-            'test.test.unique_model_id.d8c29ba0fc': [],
+            'test.test.not_null_model_id.d01cc630e6': [],
+            'test.test.test_nothing_model_.5d38568946': [],
+            'test.test.unique_model_id.67b76558ff': [],
        },
        'docs': {
            'dbt.__overview__': ANY,
@@ -3024,7 +3024,7 @@ class TestDocsGenerate(DBTIntegrationTest):
            'status': 'success',
            'message': None,
            'execution_time': AnyFloat(),
-            'unique_id': 'test.test.not_null_model_id.3cc43f158a',
+            'unique_id': 'test.test.not_null_model_id.d01cc630e6',
            'adapter_response': ANY,
            'thread_id': ANY,
            'timing': [ANY, ANY],
@@ -3034,7 +3034,7 @@ class TestDocsGenerate(DBTIntegrationTest):
            'status': 'success',
            'message': None,
            'execution_time': AnyFloat(),
-            'unique_id': 'test.test.test_nothing_model_.f26a2164bd',
+            'unique_id': 'test.test.test_nothing_model_.5d38568946',
            'adapter_response': ANY,
            'thread_id': ANY,
            'timing': [ANY, ANY],
@@ -3044,7 +3044,7 @@ class TestDocsGenerate(DBTIntegrationTest):
            'status': 'success',
            'message': None,
            'execution_time': AnyFloat(),
-            'unique_id': 'test.test.unique_model_id.d8c29ba0fc',
+            'unique_id': 'test.test.unique_model_id.67b76558ff',
            'adapter_response': ANY,
            'thread_id': ANY,
            'timing': [ANY, ANY],
@@ -374,7 +374,7 @@ class TestStrictUndefined(DBTIntegrationTest):
 'alias': None,
 'meta': {},
 },
-'unique_id': 'test.test.not_null_outer_id.e5db1d4aad',
+'unique_id': 'test.test.not_null_outer_id.a226f4fb36',
 'original_file_path': normalize('models/schema.yml'),
 'alias': 'not_null_outer_id',
 'resource_type': 'test',
@@ -426,7 +426,7 @@ class TestStrictUndefined(DBTIntegrationTest):
 'alias': None,
 'meta': {},
 },
-'unique_id': 'test.test.unique_outer_id.615b011076',
+'unique_id': 'test.test.unique_outer_id.2195e332d3',
 'original_file_path': normalize('models/schema.yml'),
 'alias': 'unique_outer_id',
 'resource_type': 'test',
@@ -74,7 +74,7 @@ class TestModels(DBTIntegrationTest):
         self.assertEqual(type(schema_file).__name__, 'SchemaSourceFile')
         self.assertEqual(len(schema_file.tests), 1)
         tests = schema_file.get_all_test_ids()
-        self.assertEqual(tests, ['test.test.unique_model_three_id.1358521a1c'])
+        self.assertEqual(tests, ['test.test.unique_model_three_id.6776ac8160'])
         unique_test_id = tests[0]
         self.assertIn(unique_test_id, manifest.nodes)

@@ -85,7 +85,7 @@ class TestModels(DBTIntegrationTest):
         schema_file_id = 'test://' + normalize('models-a/schema.yml')
         schema_file = manifest.files[schema_file_id]
         tests = schema_file.get_all_test_ids()
-        self.assertEqual(tests, ['test.test.not_null_model_three_id.8f3f13afd0'])
+        self.assertEqual(tests, ['test.test.not_null_model_three_id.3162ce0a6f'])
         not_null_test_id = tests[0]
         self.assertIn(not_null_test_id, manifest.nodes.keys())
         self.assertNotIn(unique_test_id, manifest.nodes.keys())
@@ -406,7 +406,7 @@ class TestPartialParsingDependency(DBTIntegrationTest):
         self.assertIn(source_id, manifest.sources)
         # We have 1 root model, 1 local_dep model, 1 local_dep seed, 1 local_dep source test, 2 root source tests
         self.assertEqual(len(manifest.nodes), 5)
-        test_id = 'test.local_dep.source_unique_seed_source_seed_id.c37cdbabae'
+        test_id = 'test.local_dep.source_unique_seed_source_seed_id.afa94935ed'
         test_node = manifest.nodes[test_id]

@@ -460,7 +460,7 @@ class TestMacros(DBTIntegrationTest):
         shutil.copyfile('extra-files/custom_schema_tests2.sql', 'macros-macros/custom_schema_tests.sql')
         results = self.run_dbt(["--partial-parse", "test"], expect_pass=False)
         manifest = get_manifest()
-        test_node_id = 'test.test.type_two_model_a_.05477328b9'
+        test_node_id = 'test.test.type_two_model_a_.842bc6c2a7'
         self.assertIn(test_node_id, manifest.nodes)
         results = sorted(results, key=lambda r: r.node.name)
         self.assertEqual(len(results), 2)
@@ -6,73 +6,6 @@ from .util import (
 )


-def deps_with_packages(packages, bad_packages, project_dir, profiles_dir, schema):
-    project = ProjectDefinition(
-        models={
-            'my_model.sql': 'select 1 as id',
-        },
-        packages={'packages': packages},
-    )
-    querier_ctx = get_querier(
-        project_def=project,
-        project_dir=project_dir,
-        profiles_dir=profiles_dir,
-        schema=schema,
-        test_kwargs={},
-        criteria='error',
-    )
-
-    with querier_ctx as querier:
-        # we should be able to run sql queries at startup
-        querier.is_error(querier.run_sql('select 1 as id'))
-
-        # the status should be an error as deps wil not be defined
-        querier.is_result(querier.status())
-
-        # deps should pass
-        querier.async_wait_for_result(querier.deps())
-
-        # queries should work after deps
-        tok1 = querier.is_async_result(querier.run())
-        tok2 = querier.is_async_result(querier.run_sql('select 1 as id'))
-
-        querier.is_result(querier.async_wait(tok2))
-        querier.is_result(querier.async_wait(tok1))
-
-        # now break the project
-        project.packages['packages'] = bad_packages
-        project.write_packages(project_dir, remove=True)
-
-        # queries should still work because we haven't reloaded
-        tok1 = querier.is_async_result(querier.run())
-        tok2 = querier.is_async_result(querier.run_sql('select 1 as id'))
-
-        querier.is_result(querier.async_wait(tok2))
-        querier.is_result(querier.async_wait(tok1))
-
-        # now run deps again, it should be sad
-        querier.async_wait_for_error(querier.deps())
-        # it should also not be running.
-        result = querier.is_result(querier.ps(active=True, completed=False))
-        assert result['rows'] == []
-
-        # fix packages again
-        project.packages['packages'] = packages
-        project.write_packages(project_dir, remove=True)
-        # keep queries broken, we haven't run deps yet
-        querier.is_error(querier.run())
-
-        # deps should pass now
-        querier.async_wait_for_result(querier.deps())
-        querier.is_result(querier.status())
-
-        tok1 = querier.is_async_result(querier.run())
-        tok2 = querier.is_async_result(querier.run_sql('select 1 as id'))
-
-        querier.is_result(querier.async_wait(tok2))
-        querier.is_result(querier.async_wait(tok1))
-
-
 @pytest.mark.parametrize(
     "packages, bad_packages",
     # from dbt hub
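The deleted helper exercised the RPC server's token-based async protocol: methods like `deps()` and `run()` return a request token, and `async_wait` polls that token until the task reaches a terminal state. A toy sketch of that submit/poll pattern with hypothetical client internals (the real Querier speaks JSON-RPC to a dbt RPC process; nothing here is its actual implementation):

```python
import threading
import time

class AsyncRpcClient:
    """Toy model of the token/poll flow the Querier helpers wrap."""

    def __init__(self):
        self._results = {}
        self._next = 0
        self._lock = threading.Lock()

    def submit(self, fn, *args):
        # Stand-in for methods like deps()/run(): start the work in the
        # background and hand back a request token instead of a result.
        with self._lock:
            token = self._next
            self._next += 1
        threading.Thread(
            target=lambda: self._results.__setitem__(token, fn(*args)),
            daemon=True,
        ).start()
        return token

    def poll(self, token, timeout=60.0, interval=0.1):
        # Analogous to async_wait(): poll the token until the task
        # reaches a terminal state or the timeout expires.
        deadline = time.time() + timeout
        while time.time() < deadline:
            if token in self._results:
                return self._results[token]
            time.sleep(interval)
        raise TimeoutError('request {} did not complete'.format(token))

client = AsyncRpcClient()
tok = client.submit(lambda: {'rows': []})
assert client.poll(tok) == {'rows': []}
```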
@@ -147,4 +80,110 @@ def deps_with_packages(packages, bad_packages, project_dir, profiles_dir, schema
 )
 @pytest.mark.supported('postgres')
 def test_rpc_deps_packages(project_root, profiles_root, dbt_profile, unique_schema, packages, bad_packages):
-    deps_with_packages(packages, bad_packages, project_root, profiles_root, unique_schema)
+    project = ProjectDefinition(
+        models={
+            'my_model.sql': 'select 1 as id',
+        },
+        packages={'packages': packages},
+    )
+    querier_ctx = get_querier(
+        project_def=project,
+        project_dir=project_root,
+        profiles_dir=profiles_root,
+        schema=unique_schema,
+        test_kwargs={},
+        criteria='error',
+    )
+    with querier_ctx as querier:
+        # we should be able to run sql queries at startup
+        querier.is_error(querier.run_sql('select 1 as id'))
+
+        # the status should be an error as deps wil not be defined
+        querier.is_result(querier.status())
+
+        # deps should pass
+        querier.async_wait_for_result(querier.deps())
+
+        # queries should work after deps
+        tok1 = querier.is_async_result(querier.run())
+        tok2 = querier.is_async_result(querier.run_sql('select 1 as id'))
+
+        querier.is_result(querier.async_wait(tok2))
+        querier.is_result(querier.async_wait(tok1))
+
+        # now break the project
+        project.packages['packages'] = bad_packages
+        project.write_packages(project_root, remove=True)
+
+        # queries should still work because we haven't reloaded
+        tok1 = querier.is_async_result(querier.run())
+        tok2 = querier.is_async_result(querier.run_sql('select 1 as id'))
+
+        querier.is_result(querier.async_wait(tok2))
+        querier.is_result(querier.async_wait(tok1))
+
+        # now run deps again, it should be sad
+        querier.async_wait_for_error(querier.deps())
+        # it should also not be running.
+        result = querier.is_result(querier.ps(active=True, completed=False))
+        assert result['rows'] == []
+
+        # fix packages again
+        project.packages['packages'] = packages
+        project.write_packages(project_root, remove=True)
+        # keep queries broken, we haven't run deps yet
+        querier.is_error(querier.run())
+
+        # deps should pass now
+        querier.async_wait_for_result(querier.deps())
+        querier.is_result(querier.status())
+
+        tok1 = querier.is_async_result(querier.run())
+        tok2 = querier.is_async_result(querier.run_sql('select 1 as id'))
+
+        querier.is_result(querier.async_wait(tok2))
+        querier.is_result(querier.async_wait(tok1))
+
+
+@pytest.mark.supported('postgres')
+def test_rpc_deps_after_list(project_root, profiles_root, dbt_profile, unique_schema):
+    packages = [{
+        'package': 'dbt-labs/dbt_utils',
+        'version': '0.5.0',
+    }]
+    project = ProjectDefinition(
+        models={
+            'my_model.sql': 'select 1 as id',
+        },
+        packages={'packages': packages},
+    )
+    querier_ctx = get_querier(
+        project_def=project,
+        project_dir=project_root,
+        profiles_dir=profiles_root,
+        schema=unique_schema,
+        test_kwargs={},
+        criteria='error',
+    )
+    with querier_ctx as querier:
+        # we should be able to run sql queries at startup
+        querier.is_error(querier.run_sql('select 1 as id'))
+
+        # the status should be an error as deps wil not be defined
+        querier.is_result(querier.status())
+
+        # deps should pass
+        querier.async_wait_for_result(querier.deps())
+
+        # queries should work after deps
+        tok1 = querier.is_async_result(querier.run())
+        tok2 = querier.is_async_result(querier.run_sql('select 1 as id'))
+
+        querier.is_result(querier.async_wait(tok2))
+        querier.is_result(querier.async_wait(tok1))
+
+        # list should pass
+        querier.list()
+
+        # deps should pass
+        querier.async_wait_for_result(querier.deps())
@@ -209,6 +209,9 @@ class Querier:
     def deps(self, request_id: int = 1):
        return self.request(method='deps', request_id=request_id)

+    def list(self, request_id: int = 1):
+        return self.request(method='list', request_id=request_id)
+
     def compile(
         self,
         models: Optional[Union[str, List[str]]] = None,
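The new `list` helper follows the same shape as `deps`: a thin wrapper that issues a JSON-RPC request naming the task. A hedged sketch of what such a request plausibly looks like on the wire (the endpoint URL and HTTP transport here are assumptions, not taken from the test suite):

```python
import json
import urllib.request

def rpc_request(url: str, method: str, request_id: int = 1, **params):
    # Sketch of a JSON-RPC 2.0 call like the one Querier.request
    # presumably wraps; the payload shape follows the JSON-RPC spec.
    payload = {
        'jsonrpc': '2.0',
        'method': method,
        'id': request_id,
        'params': params,
    }
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode('utf-8'),
        headers={'Content-Type': 'application/json'},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# e.g. rpc_request('http://localhost:8580/jsonrpc', 'list')  # hypothetical endpoint
```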
@@ -12,23 +12,10 @@ PGPORT="${PGPORT:-5432}"
 export PGPORT
 PGHOST="${PGHOST:-localhost}"

-function connect_circle() {
-    # try to handle circleci/docker oddness
-    let rc=1
-    while [[ $rc -eq 1 ]]; do
-        nc -z ${PGHOST} ${PGPORT}
-        let rc=$?
-    done
-    if [[ $rc -ne 0 ]]; then
-        echo "Fatal: Could not connect to $PGHOST"
-        exit 1
-    fi
-}
-
-# appveyor doesn't have 'nc', but it also doesn't have these issues
-if [[ -n $CIRCLECI ]]; then
-    connect_circle
-fi
 until psql -h "$PGHOST" -c '\q'; do
     >&2 echo "Postgres is unavailable - sleeping"
     sleep 1
 done

 createdb dbt
 psql -c "CREATE ROLE root WITH PASSWORD 'password';"
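The removed `connect_circle` function busy-waited on the Postgres port with `nc -z` before the `psql` readiness loop took over; with CI moving to GitHub Actions, the CircleCI workaround is no longer needed. For reference, a rough Python analogue of that TCP readiness check (an illustration of the technique, not part of the repo):

```python
import socket
import time

def wait_for_port(host: str = 'localhost', port: int = 5432, timeout: float = 60.0) -> None:
    # Rough analogue of the removed `nc -z` loop: retry a TCP connect
    # until Postgres accepts connections or the deadline passes.
    deadline = time.monotonic() + timeout
    while True:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return
        except OSError:
            if time.monotonic() >= deadline:
                raise TimeoutError('could not connect to {}:{}'.format(host, port))
            time.sleep(1.0)
```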
@@ -868,6 +868,31 @@ class TestBigQueryAdapter(BaseTestBigQueryAdapter):
         actual = adapter.get_table_options(mock_config, node={}, temporary=True)
         self.assertEqual(expected, actual)

+    def test_table_kms_key_name(self):
+        adapter = self.get_adapter('oauth')
+        mock_config = create_autospec(
+            RuntimeConfigObject)
+        config={'kms_key_name': 'some_key'}
+        mock_config.get.side_effect = lambda name: config.get(name)
+
+        expected = {
+            'kms_key_name': "'some_key'"
+        }
+        actual = adapter.get_table_options(mock_config, node={}, temporary=False)
+        self.assertEqual(expected, actual)
+
+
+    def test_view_kms_key_name(self):
+        adapter = self.get_adapter('oauth')
+        mock_config = create_autospec(
+            RuntimeConfigObject)
+        config={'kms_key_name': 'some_key'}
+        mock_config.get.side_effect = lambda name: config.get(name)
+
+        expected = {}
+        actual = adapter.get_view_options(mock_config, node={})
+        self.assertEqual(expected, actual)
+
+
 class TestBigQueryFilterCatalog(unittest.TestCase):
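These new unit tests pin down the table/view split noted in the changelog: `kms_key_name` is rendered (quoted for SQL) into table options but dropped from view options, since views don't accept it. A simplified sketch of that dispatch; the `node` and `temporary` parameters are kept only for signature parity, and the real adapter handles many more options:

```python
def get_table_options(config: dict, node=None, temporary: bool = False) -> dict:
    # Tables render kms_key_name, quoted for the SQL OPTIONS() clause.
    opts = {}
    if config.get('kms_key_name'):
        opts['kms_key_name'] = "'{}'".format(config['kms_key_name'])
    return opts

def get_view_options(config: dict, node=None) -> dict:
    # Views do not support kms_key_name, so the key is dropped entirely.
    return {}

assert get_table_options({'kms_key_name': 'some_key'}) == {'kms_key_name': "'some_key'"}
assert get_view_options({'kms_key_name': 'some_key'}) == {}
```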
@@ -415,14 +415,14 @@ class SchemaParserModelsTest(SchemaParserTest):
         self.assertEqual(tests[0].package_name, 'snowplow')
         self.assertTrue(tests[0].name.startswith('accepted_values_'))
         self.assertEqual(tests[0].fqn, ['snowplow', 'schema_test', tests[0].name])
-        self.assertEqual(tests[0].unique_id.split('.'), ['test', 'snowplow', tests[0].name, '0ecffad2de'])
+        self.assertEqual(tests[0].unique_id.split('.'), ['test', 'snowplow', tests[0].name, '9d4814efde'])
         self.assertEqual(tests[0].test_metadata.name, 'accepted_values')
         self.assertIsNone(tests[0].test_metadata.namespace)
         self.assertEqual(
             tests[0].test_metadata.kwargs,
             {
                 'column_name': 'color',
-                'model': "{% if config.get('where') %}(select * from {{ ref('my_model') }} where {{config.get('where')}}) my_model{% else %}{{ ref('my_model') }}{% endif %}",
+                'model': "{{ get_where_subquery(ref('my_model')) }}",
                 'values': ['red', 'blue', 'green'],
             }
         )
@@ -437,14 +437,14 @@ class SchemaParserModelsTest(SchemaParserTest):
         self.assertEqual(tests[1].fqn, ['snowplow', 'schema_test', tests[1].name])
         self.assertTrue(tests[1].name.startswith('foreign_package_test_case_'))
         self.assertEqual(tests[1].package_name, 'snowplow')
-        self.assertEqual(tests[1].unique_id.split('.'), ['test', 'snowplow', tests[1].name, '0cc2317899'])
+        self.assertEqual(tests[1].unique_id.split('.'), ['test', 'snowplow', tests[1].name, '13958f62f7'])
         self.assertEqual(tests[1].test_metadata.name, 'test_case')
         self.assertEqual(tests[1].test_metadata.namespace, 'foreign_package')
         self.assertEqual(
             tests[1].test_metadata.kwargs,
             {
                 'column_name': 'color',
-                'model': "{% if config.get('where') %}(select * from {{ ref('my_model') }} where {{config.get('where')}}) my_model{% else %}{{ ref('my_model') }}{% endif %}",
+                'model': "{{ get_where_subquery(ref('my_model')) }}",
                 'arg': 100,
             },
         )
@@ -456,14 +456,14 @@ class SchemaParserModelsTest(SchemaParserTest):
         self.assertEqual(tests[2].package_name, 'snowplow')
         self.assertTrue(tests[2].name.startswith('not_null_'))
         self.assertEqual(tests[2].fqn, ['snowplow', 'schema_test', tests[2].name])
-        self.assertEqual(tests[2].unique_id.split('.'), ['test', 'snowplow', tests[2].name, '1bac81dcfe'])
+        self.assertEqual(tests[2].unique_id.split('.'), ['test', 'snowplow', tests[2].name, '2f61818750'])
         self.assertEqual(tests[2].test_metadata.name, 'not_null')
         self.assertIsNone(tests[2].test_metadata.namespace)
         self.assertEqual(
             tests[2].test_metadata.kwargs,
             {
                 'column_name': 'color',
-                'model': "{% if config.get('where') %}(select * from {{ ref('my_model') }} where {{config.get('where')}}) my_model{% else %}{{ ref('my_model') }}{% endif %}",
+                'model': "{{ get_where_subquery(ref('my_model')) }}",
             },
         )

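All three kwargs fixtures now defer the where-filter wrapping to the new `get_where_subquery` macro, which per the changelog aliases the filtered subquery as `dbt_subquery` instead of the resource identifier. A hedged Python rendering of what the macro presumably produces (the actual macro is Jinja in dbt's global project; this is only a model of its output):

```python
from typing import Optional

def get_where_subquery(relation: str, where: Optional[str]) -> str:
    # With a `where` config, wrap the relation in a filtering subquery
    # aliased as `dbt_subquery`; otherwise pass the relation through.
    if where:
        return "(select * from {} where {}) dbt_subquery".format(relation, where)
    return relation

assert get_where_subquery("my_model", None) == "my_model"
assert get_where_subquery("my_model", "color = 'red'") == \
    "(select * from my_model where color = 'red') dbt_subquery"
```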
@@ -175,11 +175,11 @@ class TestSemver(unittest.TestCase):

     def test__filter_installable(self):
         assert filter_installable(
-            ['1.1.0', '1.2.0a1', '1.0.0'],
+            ['1.1.0', '1.2.0a1', '1.0.0','2.1.0-alpha','2.2.0asdf','2.1.0','2.2.0','2.2.0-fishtown-beta','2.2.0-2'],
             install_prerelease=True
-        ) == ['1.0.0', '1.1.0', '1.2.0a1']
+        ) == ['1.0.0', '1.1.0', '1.2.0a1','2.1.0-alpha','2.1.0','2.2.0asdf','2.2.0-fishtown-beta','2.2.0-2','2.2.0']

         assert filter_installable(
-            ['1.1.0', '1.2.0a1', '1.0.0'],
+            ['1.1.0', '1.2.0a1', '1.0.0','2.1.0-alpha','2.2.0asdf','2.1.0','2.2.0','2.2.0-fishtown-beta'],
             install_prerelease=False
-        ) == ['1.0.0', '1.1.0']
+        ) == ['1.0.0', '1.1.0','2.1.0','2.2.0']
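The expanded fixtures exercise two behaviors of `filter_installable`: prereleases are dropped when `install_prerelease=False`, and the result comes back sorted ascending with prereleases ordering just before their matching final release. A simplified re-implementation for intuition only; it does not reproduce dbt's exact ordering for non-standard tags like `2.2.0asdf`:

```python
import re
from typing import List, Tuple

def filter_installable(versions: List[str], install_prerelease: bool) -> List[str]:
    # Simplified sketch: a version is a prerelease if it carries a '-tag'
    # suffix or trailing letters after the patch number (e.g. '1.2.0a1').
    def parse(v: str) -> Tuple[int, int, int, Tuple[int, str]]:
        base, _, pre = v.partition('-')
        m = re.match(r'(\d+)\.(\d+)\.(\d+)(.*)', base)
        major, minor, patch, trailing = m.groups()
        tag = pre or trailing
        # (0, tag) sorts prereleases ahead of the matching final release
        return (int(major), int(minor), int(patch), (0, tag) if tag else (1, ''))

    if not install_prerelease:
        versions = [v for v in versions if parse(v)[3] == (1, '')]
    return sorted(versions, key=parse)

assert filter_installable(['1.1.0', '1.2.0a1', '1.0.0'], False) == ['1.0.0', '1.1.0']
assert filter_installable(['1.1.0', '1.2.0a1', '1.0.0'], True) == ['1.0.0', '1.1.0', '1.2.0a1']
```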