Compare commits

...

14 Commits

Author SHA1 Message Date
Michelle Ark
b29709b4d7 add python-dev-tools to dev-requirements 2023-07-27 13:38:42 -04:00
Michelle Ark
23b16ad6d2 Split integration tests into parallel groups / jobs (#6346) 2023-07-27 11:27:34 -04:00
Gerda Shank
fdeccfaf24 Initialize sqlparse lexer and tweak order of setting compilation fields (#8215) 2023-07-26 17:29:51 -04:00
dependabot[bot]
fecde23da5 Bump mypy from 1.3.0 to 1.4.0 (#7912)
* Bump mypy from 1.3.0 to 1.4.0

Bumps [mypy](https://github.com/python/mypy) from 1.3.0 to 1.4.0.
- [Commits](https://github.com/python/mypy/compare/v1.3.0...v1.4.0)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

* Add automated changelog yaml from template for bot PR

* add to pre-commit config

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>
Co-authored-by: Emily Rockman <emily.rockman@dbtlabs.com>
2023-07-26 15:15:40 -05:00
Emily Rockman
b1d931337e up timeout (#8218) 2023-07-26 13:50:03 -05:00
Michelle Ark
39542336b8 Update implementation-ticket.yml (#8208) 2023-07-25 18:50:13 -04:00
Peter Webb
799588cada Update Core for Latest dbt-extractor with Version Parsing (#8206)
* Update dbt-extractor requirement, and adjust ref handling accordingly

* Add changelog entry.
2023-07-25 15:47:29 -04:00
Chenyu Li
f392add4b8 add param to control maxBytes for single dbt.log file (#8200)
* add param to control maxBytes for single dbt.log file

* nits

* nits

* Update core/dbt/cli/params.py

Co-authored-by: Peter Webb <peter.webb@dbtlabs.com>

---------

Co-authored-by: Peter Webb <peter.webb@dbtlabs.com>
2023-07-25 12:30:55 -07:00
Gerda Shank
49560bf2a2 Check for existing_relation immediately prior to renaming (#8193) 2023-07-25 12:59:18 -04:00
Quigley Malcolm
44b3ed5ae9 [CT-2594] Fix serialization of warn_error_options on Contexts (#8180)
* Add test ensuring `warn_error_options` is dictified in `invocation_args_dict` of contexts

* Add dictification specific to `warn_error_options` in `args_to_dict`

* Changie doc for serialization changes of warn_error_options
2023-07-24 13:35:27 -07:00
Quigley Malcolm
6235145641 [CT-1483] Let macro names include word materialization (#8181)
* Add test asserting that a macro with the word materialization doesn't cause issues

* Let macro names include the word `materialization`

Previously we were checking if a macro included a materialization
based on whether the macro name included the word `materialization`. However,
a macro whose name includes the word `materialization` isn't guaranteed to
actually have a materialization, and a macro that doesn't have
`materialization` in the name isn't guaranteed not to have a materialization.
This change is to detect macros with materializations based on the
detected block type of the macro.

* Add changie doc materialization in macro detection
2023-07-24 13:10:42 -07:00
lllong33
ff5cb7ba51 Fixed double underline (#7944)
* Fixed double-underline
* backward compatibility postgres_get_relations
* Remove invalid comments
* compatibility adapter and get_relation
* fix generic for call
* fix adapter dispatch grammar issue
2023-07-21 17:28:27 -04:00
Quigley Malcolm
1e2b9ae962 [CT-1849] _connection_exception_retry handles EOFError exceptions (#8182)
* Add test for checking that `_connection_exception_retry` handles `EOFError`s

* Update `_connection_exception_retry` to handle `EOFError` exceptions

* Add changie docs for `_connection_exception_retry` handling `EOFError` exceptions
2023-07-21 11:34:06 -07:00
Michelle Ark
8cab58d248 Add implementation issue template (#8176) 2023-07-21 13:50:53 -04:00
35 changed files with 326 additions and 34 deletions

View File

@@ -0,0 +1,6 @@
kind: "Dependencies"
body: "Bump mypy from 1.3.0 to 1.4.0"
time: 2023-06-21T00:57:52.00000Z
custom:
  Author: dependabot[bot]
  PR: 7912

View File

@@ -0,0 +1,6 @@
kind: Fixes
body: Fixed double-underline
time: 2023-06-25T14:27:31.231253719+08:00
custom:
  Author: lllong33
  Issue: "5301"

View File

@@ -0,0 +1,6 @@
kind: Fixes
body: Ensure `warn_error_options` get serialized in `invocation_args_dict`
time: 2023-07-20T16:15:13.761813-07:00
custom:
  Author: QMalcolm
  Issue: "7694"

View File

@@ -0,0 +1,6 @@
kind: Fixes
body: Stop detecting materialization macros based on macro name
time: 2023-07-20T17:01:12.496238-07:00
custom:
  Author: QMalcolm
  Issue: "6231"

View File

@@ -0,0 +1,6 @@
kind: Fixes
body: Update `dbt deps` download retry logic to handle `EOFError` exceptions
time: 2023-07-20T17:24:22.969951-07:00
custom:
  Author: QMalcolm
  Issue: "6653"

View File

@@ -0,0 +1,6 @@
kind: Fixes
body: Improve handling of CTE injection with ephemeral models
time: 2023-07-26T10:44:48.888451-04:00
custom:
  Author: gshank
  Issue: "8213"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: A way to control maxBytes for a single dbt.log file
time: 2023-07-24T15:06:54.263822-07:00
custom:
  Author: ChenyuLInx
  Issue: "8199"

View File

@@ -0,0 +1,7 @@
kind: Under the Hood
body: Ref expressions with version can now be processed by the latest version of the
  high-performance dbt-extractor library.
time: 2023-07-25T10:26:09.902878-04:00
custom:
  Author: peterallenwebb
  Issue: "7688"

View File

@@ -0,0 +1,40 @@
name: 🛠️ Implementation
description: This is an implementation ticket intended for use by the maintainers of dbt-core
title: "[<project>] <title>"
labels: ["user_docs"]
body:
  - type: markdown
    attributes:
      value: This is an implementation ticket intended for use by the maintainers of dbt-core
  - type: checkboxes
    attributes:
      label: Housekeeping
      description: >
        A couple of friendly reminders:
        1. Remove the `user_docs` label if the scope of this work does not require changes to https://docs.getdbt.com/docs: no end-user interface (e.g. yml spec, CLI, error messages, etc) or functional changes
        2. Link any blocking issues in the "Blocked on" field under the "Core devs & maintainers" project.
      options:
        - label: I am a maintainer of dbt-core
          required: true
  - type: textarea
    attributes:
      label: Short description
      description: |
        Describe the scope of the ticket, a high-level implementation approach and any tradeoffs to consider
    validations:
      required: true
  - type: textarea
    attributes:
      label: Acceptance criteria
      description: |
        What is the definition of done for this ticket? Include any relevant edge cases and/or test cases
    validations:
      required: true
  - type: textarea
    attributes:
      label: Context
      description: |
        Provide the "why", motivation, and alternative approaches considered -- linking to previous refinement issues, spikes, Notion docs as appropriate
    validations:
      required: false

View File

@@ -33,6 +33,11 @@ defaults:
   run:
     shell: bash

+# top-level adjustments can be made here
+env:
+  # number of parallel processes to spawn for python integration testing
+  PYTHON_INTEGRATION_TEST_WORKERS: ${{ vars.PYTHON_INTEGRATION_TEST_WORKERS }}
+
 jobs:
   code-quality:
     name: code-quality

@@ -106,23 +111,55 @@ jobs:
         env:
           CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}

+  integration-metadata:
+    name: integration test metadata generation
+    runs-on: ubuntu-latest
+    outputs:
+      split-groups: ${{ steps.generate-split-groups.outputs.split-groups }}
+      include: ${{ steps.generate-include.outputs.include }}
+    steps:
+      - name: generate split-groups
+        id: generate-split-groups
+        run: |
+          MATRIX_JSON="["
+          for B in $(seq 1 ${{ env.PYTHON_INTEGRATION_TEST_WORKERS }}); do
+            MATRIX_JSON+=$(sed 's/^/"/;s/$/"/' <<< "${B}")
+          done
+          MATRIX_JSON="${MATRIX_JSON//\"\"/\", \"}"
+          MATRIX_JSON+="]"
+          echo "split-groups=${MATRIX_JSON}"
+          echo "split-groups=${MATRIX_JSON}" >> $GITHUB_OUTPUT
+      - name: generate include
+        id: generate-include
+        run: |
+          INCLUDE=('"python-version":"3.8","os":"windows-latest"' '"python-version":"3.8","os":"macos-latest"' )
+          INCLUDE_GROUPS="["
+          for include in ${INCLUDE[@]}; do
+            for group in $(seq 1 ${{ env.PYTHON_INTEGRATION_TEST_WORKERS }}); do
+              INCLUDE_GROUPS+=$(sed 's/$/, /' <<< "{\"split-group\":\"${group}\",${include}}")
+            done
+          done
+          INCLUDE_GROUPS=$(echo $INCLUDE_GROUPS | sed 's/,*$//g')
+          INCLUDE_GROUPS+="]"
+          echo "include=${INCLUDE_GROUPS}"
+          echo "include=${INCLUDE_GROUPS}" >> $GITHUB_OUTPUT
+
   integration:
-    name: integration test / python ${{ matrix.python-version }} / ${{ matrix.os }}
+    name: (${{ matrix.split-group }}) integration test / python ${{ matrix.python-version }} / ${{ matrix.os }}
     runs-on: ${{ matrix.os }}
-    timeout-minutes: 60
+    timeout-minutes: 30
+    needs:
+      - integration-metadata
     strategy:
       fail-fast: false
       matrix:
         python-version: ["3.8", "3.9", "3.10", "3.11"]
         os: [ubuntu-20.04]
-        include:
-          - python-version: 3.8
-            os: windows-latest
-          - python-version: 3.8
-            os: macos-latest
+        split-group: ${{ fromJson(needs.integration-metadata.outputs.split-groups) }}
+        include: ${{ fromJson(needs.integration-metadata.outputs.include) }}
     env:
       TOXENV: integration
       DBT_INVOCATION_ENV: github-actions

@@ -165,6 +202,8 @@ jobs:
       - name: Run tests
         run: tox -- --ddtrace
+        env:
+          PYTEST_ADDOPTS: ${{ format('--splits {0} --group {1}', env.PYTHON_INTEGRATION_TEST_WORKERS, matrix.split-group) }}

       - name: Get current date
         if: always()

@@ -185,6 +224,15 @@ jobs:
         env:
           CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}

+  integration-report:
+    name: integration test suite
+    runs-on: ubuntu-latest
+    needs: integration
+    steps:
+      - name: "[Notification] Integration test suite passes"
+        run: |
+          echo "::notice title="Integration test suite passes""
+
   build:
     name: build packages
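
For readers untangling the two generator steps above, here is a minimal Python sketch of the JSON they emit (assuming a worker count of 2; the real value comes from the PYTHON_INTEGRATION_TEST_WORKERS repository variable):

    import json

    workers = 2  # stand-in for ${{ vars.PYTHON_INTEGRATION_TEST_WORKERS }}

    # split-groups: one quoted group number per worker -> ["1", "2"]
    split_groups = json.dumps([str(g) for g in range(1, workers + 1)])
    print(f"split-groups={split_groups}")

    # include: every split-group paired with each extra OS leg of the matrix
    extra_legs = [
        {"python-version": "3.8", "os": "windows-latest"},
        {"python-version": "3.8", "os": "macos-latest"},
    ]
    include = json.dumps(
        [{"split-group": str(g), **leg} for leg in extra_legs for g in range(1, workers + 1)]
    )
    print(f"include={include}")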

View File

@@ -18,11 +18,41 @@ on:
 permissions: read-all

+# top-level adjustments can be made here
+env:
+  # number of parallel processes to spawn for python testing
+  PYTHON_INTEGRATION_TEST_WORKERS: ${{ vars.PYTHON_INTEGRATION_TEST_WORKERS }}
+
 jobs:
+  integration-metadata:
+    name: integration test metadata generation
+    runs-on: ubuntu-latest
+    outputs:
+      split-groups: ${{ steps.generate-split-groups.outputs.split-groups }}
+    steps:
+      - name: generate split-groups
+        id: generate-split-groups
+        run: |
+          MATRIX_JSON="["
+          for B in $(seq 1 ${{ env.PYTHON_INTEGRATION_TEST_WORKERS }}); do
+            MATRIX_JSON+=$(sed 's/^/"/;s/$/"/' <<< "${B}")
+          done
+          MATRIX_JSON="${MATRIX_JSON//\"\"/\", \"}"
+          MATRIX_JSON+="]"
+          echo "split-groups=${MATRIX_JSON}" >> $GITHUB_OUTPUT
+
   # run the performance measurements on the current or default branch
   test-schema:
     name: Test Log Schema
     runs-on: ubuntu-20.04
+    timeout-minutes: 30
+    needs:
+      - integration-metadata
+    strategy:
+      fail-fast: false
+      matrix:
+        split-group: ${{ fromJson(needs.integration-metadata.outputs.split-groups) }}
     env:
       # turns warnings into errors
       RUSTFLAGS: "-D warnings"

@@ -65,3 +95,14 @@ jobs:
       # we actually care if these pass, because the normal test run doesn't usually include many json log outputs
       - name: Run integration tests
         run: tox -e integration -- -nauto
+        env:
+          PYTEST_ADDOPTS: ${{ format('--splits {0} --group {1}', env.PYTHON_INTEGRATION_TEST_WORKERS, matrix.split-group) }}
+
+  test-schema-report:
+    name: Log Schema Test Suite
+    runs-on: ubuntu-latest
+    needs: test-schema
+    steps:
+      - name: "[Notification] Log test suite passes"
+        run: |
+          echo "::notice title="Log test suite passes""

View File

@@ -37,7 +37,7 @@ repos:
alias: flake8-check alias: flake8-check
stages: [manual] stages: [manual]
- repo: https://github.com/pre-commit/mirrors-mypy - repo: https://github.com/pre-commit/mirrors-mypy
rev: v1.3.0 rev: v1.4.0
hooks: hooks:
- id: mypy - id: mypy
# N.B.: Mypy is... a bit fragile. # N.B.: Mypy is... a bit fragile.

View File

@@ -132,6 +132,7 @@ class dbtRunner:
 @p.enable_legacy_logger
 @p.fail_fast
 @p.log_cache_events
+@p.log_file_max_bytes
 @p.log_format
 @p.log_format_file
 @p.log_level

View File

@@ -171,6 +171,15 @@ use_colors_file = click.option(
     default=True,
 )

+log_file_max_bytes = click.option(
+    "--log-file-max-bytes",
+    envvar="DBT_LOG_FILE_MAX_BYTES",
+    help="Configure the max file size in bytes for a single dbt.log file, before rolling over. 0 means no limit.",
+    default=10 * 1024 * 1024,  # 10mb
+    type=click.INT,
+    hidden=True,
+)
+
 log_path = click.option(
     "--log-path",
     envvar="DBT_LOG_PATH",

View File

@@ -4,7 +4,6 @@ import json
 import networkx as nx  # type: ignore
 import os
 import pickle
-import sqlparse

 from collections import defaultdict
 from typing import List, Dict, Any, Tuple, Optional

@@ -36,6 +35,7 @@ from dbt.node_types import NodeType, ModelLanguage
 from dbt.events.format import pluralize
 import dbt.tracking
 import dbt.task.list as list_task
+import sqlparse

 graph_file_name = "graph.gpickle"

@@ -378,16 +378,16 @@ class Compiler:
             _add_prepended_cte(prepended_ctes, InjectedCTE(id=cte.id, sql=sql))

-        injected_sql = inject_ctes_into_sql(
-            model.compiled_code,
-            prepended_ctes,
-        )
-
         # Check again before updating for multi-threading
         if not model.extra_ctes_injected:
+            injected_sql = inject_ctes_into_sql(
+                model.compiled_code,
+                prepended_ctes,
+            )
+            model.extra_ctes_injected = True
             model._pre_injected_sql = model.compiled_code
             model.compiled_code = injected_sql
             model.extra_ctes = prepended_ctes
-            model.extra_ctes_injected = True

         # if model.extra_ctes is not set to prepended ctes, something went wrong
         return model, model.extra_ctes

@@ -523,6 +523,12 @@ class Compiler:
         the node's raw_code into compiled_code, and then calls the
         recursive method to "prepend" the ctes.
         """
+        # Make sure Lexer for sqlparse 0.4.4 is initialized
+        from sqlparse.lexer import Lexer  # type: ignore
+
+        if hasattr(Lexer, "get_default_instance"):
+            Lexer.get_default_instance()
+
         node = self._compile_code(node, manifest, extra_context)
         node, _ = self._recursively_prepend_ctes(node, manifest, extra_context)
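
A standalone sketch of the lexer warm-up above, assuming sqlparse >= 0.4.4 is installed (older releases lack the method and simply skip the call); eagerly building the singleton once, before compilation threads start, is my reading of why the initialization order was tweaked:

    import sqlparse
    from sqlparse.lexer import Lexer  # type: ignore

    # sqlparse 0.4.4 caches its token rules on a singleton; build it eagerly
    if hasattr(Lexer, "get_default_instance"):
        Lexer.get_default_instance()

    # later parsing calls reuse the pre-built lexer
    print(sqlparse.format("select 1 as id", reindent=True))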

View File

@@ -80,6 +80,7 @@ class LoggerConfig:
     use_colors: bool = False
     output_stream: Optional[TextIO] = None
     output_file_name: Optional[str] = None
+    output_file_max_bytes: Optional[int] = 10 * 1024 * 1024  # 10 mb
     logger: Optional[Any] = None

@@ -100,7 +101,7 @@ class _Logger:
         file_handler = RotatingFileHandler(
             filename=str(config.output_file_name),
             encoding="utf8",
-            maxBytes=10 * 1024 * 1024,  # 10 mb
+            maxBytes=config.output_file_max_bytes,  # type: ignore
             backupCount=5,
         )
         self._python_logger = self._get_python_log_for_handler(file_handler)
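
For context, a self-contained demo of how Python's stdlib RotatingFileHandler behaves with these parameters (a tiny maxBytes is used here to force rollover; dbt's default stays at 10 MiB with five backups):

    import logging
    from logging.handlers import RotatingFileHandler

    handler = RotatingFileHandler(
        filename="dbt.log",
        encoding="utf8",
        maxBytes=1024,  # roll over once the file would exceed ~1 KiB
        backupCount=5,  # keep dbt.log.1 .. dbt.log.5, discard anything older
    )
    logger = logging.getLogger("rollover-demo")
    logger.addHandler(handler)

    for i in range(200):
        logger.warning("line %d: filler output to trigger rollover", i)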

View File

@@ -68,7 +68,11 @@ def setup_event_logger(flags, callbacks: List[Callable[[EventMsg], None]] = [])
     log_level_file = EventLevel.DEBUG if flags.DEBUG else EventLevel(flags.LOG_LEVEL_FILE)
     EVENT_MANAGER.add_logger(
         _get_logfile_config(
-            log_file, flags.USE_COLORS_FILE, log_file_format, log_level_file
+            log_file,
+            flags.USE_COLORS_FILE,
+            log_file_format,
+            log_level_file,
+            flags.LOG_FILE_MAX_BYTES,
         )
     )

@@ -117,7 +121,11 @@ def _stdout_filter(
 def _get_logfile_config(
-    log_path: str, use_colors: bool, line_format: LineFormat, level: EventLevel
+    log_path: str,
+    use_colors: bool,
+    line_format: LineFormat,
+    level: EventLevel,
+    log_file_max_bytes: int,
 ) -> LoggerConfig:
     return LoggerConfig(
         name="file_log",

@@ -127,6 +135,7 @@ def _get_logfile_config(
         scrubber=env_scrubber,
         filter=partial(_logfile_filter, bool(get_flags().LOG_CACHE_EVENTS), line_format),
         output_file_name=log_path,
+        output_file_max_bytes=log_file_max_bytes,
     )

View File

@@ -63,3 +63,12 @@
     {{ exceptions.raise_not_implemented(
         'list_relations_without_caching macro not implemented for adapter '+adapter.type()) }}
 {% endmacro %}
+
+{% macro get_relations() %}
+    {{ return(adapter.dispatch('get_relations', 'dbt')()) }}
+{% endmacro %}
+
+{% macro default__get_relations() %}
+    {{ exceptions.raise_not_implemented(
+        'get_relations macro not implemented for adapter '+adapter.type()) }}
+{% endmacro %}

View File

@@ -33,7 +33,12 @@
   -- cleanup
   {% if existing_relation is not none %}
-      {{ adapter.rename_relation(existing_relation, backup_relation) }}
+      /* Do the equivalent of rename_if_exists. 'existing_relation' could have been dropped
+         since the variable was first set. */
+      {% set existing_relation = load_cached_relation(existing_relation) %}
+      {% if existing_relation is not none %}
+          {{ adapter.rename_relation(existing_relation, backup_relation) }}
+      {% endif %}
   {% endif %}

   {{ adapter.rename_relation(intermediate_relation, target_relation) }}

View File

@@ -45,7 +45,12 @@
   -- cleanup
   -- move the existing view out of the way
   {% if existing_relation is not none %}
-      {{ adapter.rename_relation(existing_relation, backup_relation) }}
+      /* Do the equivalent of rename_if_exists. 'existing_relation' could have been dropped
+         since the variable was first set. */
+      {% set existing_relation = load_cached_relation(existing_relation) %}
+      {% if existing_relation is not none %}
+          {{ adapter.rename_relation(existing_relation, backup_relation) }}
+      {% endif %}
   {% endif %}

   {{ adapter.rename_relation(intermediate_relation, target_relation) }}

View File

@@ -81,7 +81,7 @@ class MacroParser(BaseParser[Macro]):
             name: str = macro.name.replace(MACRO_PREFIX, "")
             node = self.parse_macro(block, base_node, name)
             # get supported_languages for materialization macro
-            if "materialization" in name:
+            if block.block_type_name == "materialization":
                 node.supported_languages = jinja.get_supported_languages(macro)
             yield node
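
To see why the substring check was wrong, contrast the two conditions on an ordinary macro that merely contains the word in its name (values are illustrative; they mirror the test fixture added further down):

    # a plain {% macro %} block whose name happens to contain "materialization"
    name = "materialization_macro"
    block_type_name = "macro"  # a real materialization block reports "materialization"

    old_check = "materialization" in name  # True: false positive
    new_check = block_type_name == "materialization"  # False: correctly skipped
    print(old_check, new_check)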

View File

@@ -497,12 +497,10 @@ class ModelParser(SimpleSQLParser[ModelNode]):
         # set refs and sources on the node object
         refs: List[RefArgs] = []
         for ref in statically_parsed["refs"]:
-            if len(ref) == 1:
-                package, name = None, ref[0]
-            else:
-                package, name = ref
-            refs.append(RefArgs(package=package, name=name))
+            name = ref.get("name")
+            package = ref.get("package")
+            version = ref.get("version")
+            refs.append(RefArgs(name, package, version))
         node.refs += refs
         node.sources += statically_parsed["sources"]
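
The shape change this hunk absorbs: dbt-extractor 0.5.x reports each ref as a dict with an optional version rather than a 1- or 2-element list. A minimal sketch with illustrative values:

    # sample refs in the newer extractor's shape (not real extractor output)
    statically_parsed = {
        "refs": [
            {"name": "my_model"},
            {"name": "my_model", "package": "my_package"},
            {"name": "my_model", "version": "2"},
        ]
    }

    for ref in statically_parsed["refs"]:
        # .get() returns None for absent keys, so unpackaged/unversioned refs still work
        print(ref.get("name"), ref.get("package"), ref.get("version"))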

View File

@@ -502,6 +502,7 @@ def project(
         DEBUG=False,
         LOG_CACHE_EVENTS=False,
         QUIET=False,
+        LOG_FILE_MAX_BYTES=1000000,
     )
     setup_event_logger(log_flags)
     orig_cwd = os.getcwd()

View File

@@ -17,6 +17,7 @@ from pathlib import PosixPath, WindowsPath
 from contextlib import contextmanager
 from dbt.events.types import RetryExternalCall, RecordRetryException
+from dbt.helper_types import WarnErrorOptions
 from dbt import flags
 from enum import Enum
 from typing_extensions import Protocol

@@ -601,6 +602,7 @@ def _connection_exception_retry(fn, max_attempts: int, attempt: int = 0):
     except (
         requests.exceptions.RequestException,
         ReadError,
+        EOFError,
     ) as exc:
         if attempt <= max_attempts - 1:
             dbt.events.functions.fire_event(RecordRetryException(exc=str(exc)))

@@ -654,6 +656,9 @@ def args_to_dict(args):
         # this was required for a test case
         if isinstance(var_args[key], PosixPath) or isinstance(var_args[key], WindowsPath):
             var_args[key] = str(var_args[key])
+        if isinstance(var_args[key], WarnErrorOptions):
+            var_args[key] = var_args[key].to_dict()
+
         dict_args[key] = var_args[key]
     return dict_args
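
A simplified stand-in for the retry shape shown above (the real _connection_exception_retry also fires a RecordRetryException event and sleeps between attempts):

    import tarfile

    import requests

    def connection_retry_sketch(fn, max_attempts: int, attempt: int = 0):
        # EOFError joins the retryable set: dbt deps downloads can hit it on
        # truncated package archives
        try:
            return fn()
        except (requests.exceptions.RequestException, tarfile.ReadError, EOFError):
            if attempt <= max_attempts - 1:
                return connection_retry_sketch(fn, max_attempts, attempt + 1)
            raise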

View File

@@ -73,7 +73,7 @@ setup(
         "sqlparse>=0.2.3",
         # ----
         # These are major-version-0 packages also maintained by dbt-labs. Accept patches.
-        "dbt-extractor~=0.4.1",
+        "dbt-extractor~=0.5.0",
         "hologram~=0.0.16",  # includes transitive dependencies on python-dateutil and jsonschema
         "minimal-snowplow-tracker~=0.0.2",
         # DSI is under active development, so we're pinning to specific dev versions for now.

View File

@@ -6,7 +6,7 @@ flake8
 flaky
 freezegun==0.3.12
 ipdb
-mypy==1.3.0
+mypy==1.4.0
 pip-tools
 pre-commit
 protobuf>=4.0.0

@@ -16,7 +16,9 @@ pytest-csv
 pytest-dotenv
 pytest-logbook
 pytest-mock
+pytest-split
 pytest-xdist
+python-dev-tools
 sphinx
 tox>=3.13
 twine

View File

@@ -20,8 +20,7 @@ from dbt.exceptions import (
 import dbt.utils

-# note that this isn't an adapter macro, so just a single underscore
-GET_RELATIONS_MACRO_NAME = "postgres_get_relations"
+GET_RELATIONS_MACRO_NAME = "postgres__get_relations"

 @dataclass
View File

@@ -1,4 +1,4 @@
-{% macro postgres_get_relations () -%}
+{% macro postgres__get_relations() -%}
 {#
       -- in pg_depend, objid is the dependent, refobjid is the referenced object

@@ -74,3 +74,7 @@
     {{ return(load_result('relations').table) }}
 {% endmacro %}
+
+{% macro postgres_get_relations() %}
+    {{ return(postgres__get_relations()) }}
+{% endmacro %}

View File

@@ -4,6 +4,12 @@ models__dep_macro = """
 }}
 """

+models__materialization_macro = """
+{{
+  materialization_macro()
+}}
+"""
+
 models__with_undefined_macro = """
 {{ dispatch_to_nowhere() }}
 select 1 as id

@@ -75,6 +81,12 @@ macros__my_macros = """
 {% endmacro %}
 """

+macros__named_materialization = """
+{% macro materialization_macro() %}
+select 1 as foo
+{% endmacro %}
+"""
+
 macros__no_default_macros = """
 {% macro do_something2(foo2, bar2) %}
View File

@@ -20,12 +20,14 @@ from tests.functional.macros.fixtures import (
     models__override_get_columns_macros,
     models__deprecated_adapter_macro_model,
     models__incorrect_dispatch,
+    models__materialization_macro,
     macros__my_macros,
     macros__no_default_macros,
     macros__override_get_columns_macros,
     macros__package_override_get_columns_macros,
     macros__deprecated_adapter_macro,
     macros__incorrect_dispatch,
+    macros__named_materialization,
 )

@@ -78,6 +80,21 @@ class TestMacros:
         check_relations_equal(project.adapter, ["expected_local_macro", "local_macro"])

+class TestMacrosNamedMaterialization:
+    @pytest.fixture(scope="class")
+    def models(self):
+        return {
+            "models_materialization_macro.sql": models__materialization_macro,
+        }
+
+    @pytest.fixture(scope="class")
+    def macros(self):
+        return {"macros_named_materialization.sql": macros__named_materialization}
+
+    def test_macro_with_materialization_in_name_works(self, project):
+        run_dbt(expect_pass=True)
+
 class TestInvalidMacros:
     @pytest.fixture(scope="class")
     def models(self):

View File

@@ -57,6 +57,11 @@ class TestFlags:
         assert hasattr(flags, "LOG_PATH")
         assert getattr(flags, "LOG_PATH") == Path("logs")

+    def test_log_file_max_size_default(self, run_context):
+        flags = Flags(run_context)
+        assert hasattr(flags, "LOG_FILE_MAX_BYTES")
+        assert getattr(flags, "LOG_FILE_MAX_BYTES") == 10 * 1024 * 1024
+
     @pytest.mark.parametrize(
         "set_stats_param,do_not_track,expected_anonymous_usage_stats",
         [

View File

@@ -424,6 +424,9 @@ def test_invocation_args_to_dict_in_macro_runtime_context(
     # Comes from unit/utils.py config_from_parts_or_dicts method
     assert ctx["invocation_args_dict"]["profile_dir"] == "/dev/null"

+    assert isinstance(ctx["invocation_args_dict"]["warn_error_options"], Dict)
+    assert ctx["invocation_args_dict"]["warn_error_options"] == {"include": [], "exclude": []}
+
 def test_model_parse_context(config_postgres, manifest_fx, get_adapter, get_include_paths):
     ctx = providers.generate_parser_model_context(
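
A sketch of the conversion those assertions exercise, assuming WarnErrorOptions can be constructed with the include/exclude fields implied by the expected dict:

    from dbt.helper_types import WarnErrorOptions

    # an empty options object serializes to plain lists, making it JSON-safe
    options = WarnErrorOptions(include=[], exclude=[])
    assert options.to_dict() == {"include": [], "exclude": []}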

View File

@@ -28,6 +28,11 @@ class TestCoreDbtUtils(unittest.TestCase):
         connection_exception_retry(lambda: Counter._add_with_untar_exception(), 5)
         self.assertEqual(2, counter)  # 2 = original attempt returned ReadError, plus 1 retry

+    def test_connection_exception_retry_success_failed_eofexception(self):
+        Counter._reset()
+        connection_exception_retry(lambda: Counter._add_with_eof_exception(), 5)
+        self.assertEqual(2, counter)  # 2 = original attempt returned EOFError, plus 1 retry
+
 counter: int = 0

@@ -57,6 +62,12 @@ class Counter:
         if counter < 2:
             raise tarfile.ReadError

+    def _add_with_eof_exception():
+        global counter
+        counter += 1
+        if counter < 2:
+            raise EOFError
+
     def _reset():
         global counter
         counter = 0

View File

@@ -2,7 +2,7 @@ from argparse import Namespace
 import pytest

 import dbt.flags as flags
-from dbt.events.functions import msg_to_dict, warn_or_error
+from dbt.events.functions import msg_to_dict, warn_or_error, setup_event_logger
 from dbt.events.types import InfoLevel, NoNodesForSelectionCriteria
 from dbt.exceptions import EventCompilationError

@@ -59,3 +59,13 @@ def test_msg_to_dict_handles_exceptions_gracefully():
     assert (
         False
     ), f"We expect `msg_to_dict` to gracefully handle exceptions, but it raised {exc}"
+
+def test_setup_event_logger_specify_max_bytes(mocker):
+    patched_file_handler = mocker.patch("dbt.events.eventmgr.RotatingFileHandler")
+    args = Namespace(log_file_max_bytes=1234567)
+    flags.set_from_args(args, {})
+    setup_event_logger(flags.get_flags())
+    patched_file_handler.assert_called_once_with(
+        filename="logs/dbt.log", encoding="utf8", maxBytes=1234567, backupCount=5
+    )

View File

@@ -18,6 +18,7 @@ from dbt import tracking
 from dbt.contracts.files import SourceFile, FileHash, FilePath
 from dbt.contracts.graph.manifest import MacroManifest, ManifestStateCheck
 from dbt.graph import NodeSelector, parse_difference
+from dbt.events.functions import setup_event_logger

 try:
     from queue import Empty

@@ -140,6 +141,7 @@ class GraphTest(unittest.TestCase):
         config = config_from_parts_or_dicts(project=cfg, profile=self.profile)
         dbt.flags.set_from_args(Namespace(), config)
+        setup_event_logger(dbt.flags.get_flags())
         object.__setattr__(dbt.flags.get_flags(), "PARTIAL_PARSE", False)
         return config
return config return config