Compare commits


8 Commits

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Jeremy Cohen | 2c8856da3b | Add changelog entry | 2023-07-19 12:37:50 +02:00 |
| Jeremy Cohen | 029045e556 | Pin sqlparse<0.5 | 2023-07-19 12:35:31 +02:00 |
| Jeremy Cohen | 433e5c670e | Pin click<9 | 2023-07-19 12:35:17 +02:00 |
| FishtownBuildBot | cbfc6a8baf | Add new index.html and changelog yaml files from dbt-docs (#8131) | 2023-07-19 11:23:35 +02:00 |
| Michelle Ark | 9765596247 | wrap deprecation warnings in warn_or_error calls (#8129) | 2023-07-18 18:08:52 -04:00 |
| Kshitij Aranke | 1b1a291fae | Publish coverage results to codecov.io (#8127) | 2023-07-18 16:47:44 -05:00 |
| FishtownBuildBot | 867534c1f4 | Cleanup main after cutting new 1.6.latest branch (#8102) | 2023-07-17 19:49:55 -04:00 |
| Michelle Ark | 6d8b6459eb | bumpversion 1.7.0a1 (#8111) | 2023-07-17 18:10:29 -04:00 |
105 changed files with 765 additions and 2581 deletions


@@ -1,5 +1,5 @@
[bumpversion]
current_version = 1.6.2
current_version = 1.7.0a1
parse = (?P<major>[\d]+) # major version number
\.(?P<minor>[\d]+) # minor version number
\.(?P<patch>[\d]+) # patch version number
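The three capture groups above pull the numeric core out of a version string. A minimal standalone sketch of that extraction in bash (named groups are a Python/PCRE feature, so plain numbered groups stand in here; the input `1.7.0a1` matches the new `current_version`):

```shell
#!/usr/bin/env bash
# Sketch: extract major/minor/patch from a bumpversion-style version string.
version="1.7.0a1"
major=$(sed -E 's/^([0-9]+)\.([0-9]+)\.([0-9]+).*/\1/' <<< "$version")
minor=$(sed -E 's/^([0-9]+)\.([0-9]+)\.([0-9]+).*/\2/' <<< "$version")
patch=$(sed -E 's/^([0-9]+)\.([0-9]+)\.([0-9]+).*/\3/' <<< "$version")
echo "$major $minor $patch"   # 1 7 0 (the "a1" prerelease suffix is consumed by .*)
```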


@@ -3,6 +3,7 @@
For information on prior major and minor releases, see their changelogs:
* [1.6](https://github.com/dbt-labs/dbt-core/blob/1.6.latest/CHANGELOG.md)
* [1.5](https://github.com/dbt-labs/dbt-core/blob/1.5.latest/CHANGELOG.md)
* [1.4](https://github.com/dbt-labs/dbt-core/blob/1.4.latest/CHANGELOG.md)
* [1.3](https://github.com/dbt-labs/dbt-core/blob/1.3.latest/CHANGELOG.md)


@@ -1,204 +0,0 @@
## dbt-core 1.6.0 - July 31, 2023
### Breaking Changes
- Dropped support for Python 3.7 ([#7082](https://github.com/dbt-labs/dbt-core/issues/7082))
- Switch from dbt-metrics to dbt-semantic-interfaces for MetricNode definitions ([#7500](https://github.com/dbt-labs/dbt-core/issues/7500), [#7404](https://github.com/dbt-labs/dbt-core/issues/7404))
### Features
- Add merge as valid incremental strategy for postgres ([#1880](https://github.com/dbt-labs/dbt-core/issues/1880))
- Skip catalog generation ([#6980](https://github.com/dbt-labs/dbt-core/issues/6980))
- Add support for materialized views ([#6911](https://github.com/dbt-labs/dbt-core/issues/6911))
- Publication artifacts and cross-project ref ([#7227](https://github.com/dbt-labs/dbt-core/issues/7227))
- Optimize template rendering for common parse scenarios ([#7449](https://github.com/dbt-labs/dbt-core/issues/7449))
- Add graph structure summaries to target path output ([#7357](https://github.com/dbt-labs/dbt-core/issues/7357))
- Allow duplicate manifest node (models, seeds, analyses, snapshots) names across packages ([#7446](https://github.com/dbt-labs/dbt-core/issues/7446))
- Detect breaking changes to enforced constraints ([#7065](https://github.com/dbt-labs/dbt-core/issues/7065))
- Check for project dependency cycles ([#7468](https://github.com/dbt-labs/dbt-core/issues/7468))
- nodes in packages respect custom generate_alias_name, generate_schema_name, generate_database_name macro overrides defined in packages ([#7444](https://github.com/dbt-labs/dbt-core/issues/7444))
- Added warnings for model and ref deprecations ([#7433](https://github.com/dbt-labs/dbt-core/issues/7433))
- Update drop_relation macro to allow for configuration of drop statement separately from object name ([#7625](https://github.com/dbt-labs/dbt-core/issues/7625))
- accept publications in dbt.invoke ([#7372](https://github.com/dbt-labs/dbt-core/issues/7372))
- Enable state for deferral to be separate from state for selectors ([#7300](https://github.com/dbt-labs/dbt-core/issues/7300))
- add access selection syntax ([#7738](https://github.com/dbt-labs/dbt-core/issues/7738))
- add project_name to manifest metadata ([#7752](https://github.com/dbt-labs/dbt-core/issues/7752))
- dbt retry ([#7299](https://github.com/dbt-labs/dbt-core/issues/7299))
- This change adds new selector methods to the state selector. Namely, state:unmodified and state:old. ([#7564](https://github.com/dbt-labs/dbt-core/issues/7564))
- Revamp debug, add --connection flag. Prepare for future refactors/interface changes. ([#7104](https://github.com/dbt-labs/dbt-core/issues/7104))
- Validate public models are not materialized as ephemeral ([#7226](https://github.com/dbt-labs/dbt-core/issues/7226))
- Added support for parsing and serializing semantic models ([#7499](https://github.com/dbt-labs/dbt-core/issues/7499), [#7503](https://github.com/dbt-labs/dbt-core/issues/7503))
- Enable setting packages in dependencies.yml ([#7372](https://github.com/dbt-labs/dbt-core/issues/7372), [#7736](https://github.com/dbt-labs/dbt-core/issues/7736))
- Add AdapterRegistered event log message ([#7038](https://github.com/dbt-labs/dbt-core/issues/7038))
- dbt clone ([#7258](https://github.com/dbt-labs/dbt-core/issues/7258))
- Further integrate semantic models into the DAG and partial parsing module ([#7800](https://github.com/dbt-labs/dbt-core/issues/7800))
- Handle external model nodes in state:modified ([#7563](https://github.com/dbt-labs/dbt-core/issues/7563))
- Add invocation_command to flags ([#6051](https://github.com/dbt-labs/dbt-core/issues/6051))
- Add thread_id context var ([#7941](https://github.com/dbt-labs/dbt-core/issues/7941))
- Add partial parsing support for semantic models ([#7897](https://github.com/dbt-labs/dbt-core/issues/7897))
- Add restrict-access to dbt_project.yml ([#7713](https://github.com/dbt-labs/dbt-core/issues/7713))
- allow setting enabled and depends_on_nodes from ModelNodeArgs ([#7506](https://github.com/dbt-labs/dbt-core/issues/7506))
- Support '_'-delimited fqn matching for versioned models and matching on Path.stem for path selection ([#7639](https://github.com/dbt-labs/dbt-core/issues/7639))
- Store time_spine table configuration in semantic manifest ([#7938](https://github.com/dbt-labs/dbt-core/issues/7938))
- Add validate_sql method to BaseAdapter with implementation for SQLAdapter ([#7839](https://github.com/dbt-labs/dbt-core/issues/7839))
- Support validation of metrics and semantic models. ([#7969](https://github.com/dbt-labs/dbt-core/issues/7969))
- Begin populating `depends_on` of metric nodes ([#7854](https://github.com/dbt-labs/dbt-core/issues/7854))
- Enumerate supported materialized view features for dbt-postgres ([#6911](https://github.com/dbt-labs/dbt-core/issues/6911))
### Fixes
- Raise better error message when dispatching a package that is not installed ([#5801](https://github.com/dbt-labs/dbt-core/issues/5801))
- add negative part_number arg for split part macro ([#7915](https://github.com/dbt-labs/dbt-core/issues/7915))
- Persist timing info in run results for failed nodes ([#5476](https://github.com/dbt-labs/dbt-core/issues/5476))
- fix typo in unpacking statically parsed ref ([#7364](https://github.com/dbt-labs/dbt-core/issues/7364))
- safe version attribute access in _check_resource_uniqueness ([#7375](https://github.com/dbt-labs/dbt-core/issues/7375))
- Fix dbt command missing target-path param ([#7411](https://github.com/dbt-labs/dbt-core/issues/7411))
- Fix CTE insertion position when the model uses WITH RECURSIVE ([#7350](https://github.com/dbt-labs/dbt-core/issues/7350))
- Fix v0 ref resolution ([#7408](https://github.com/dbt-labs/dbt-core/issues/7408))
- Add --target-path to dbt snapshot command. ([#7418](https://github.com/dbt-labs/dbt-core/issues/7418))
- dbt build selection of tests' descendants ([#7289](https://github.com/dbt-labs/dbt-core/issues/7289))
- fix groupable node partial parsing, raise DbtReferenceError at runtime for safety ([#7437](https://github.com/dbt-labs/dbt-core/issues/7437))
- Fix partial parsing of latest_version changes for downstream references ([#7369](https://github.com/dbt-labs/dbt-core/issues/7369))
- Use "add_node" to update depends_on.nodes ([#7453](https://github.com/dbt-labs/dbt-core/issues/7453))
- Fix var precedence in configs: root vars override package vars ([#6705](https://github.com/dbt-labs/dbt-core/issues/6705))
- Constraint rendering fixes: wrap check expression in parentheses, foreign key 'references', support expression in all constraint types ([#7417](https://github.com/dbt-labs/dbt-core/issues/7417), [#7480](https://github.com/dbt-labs/dbt-core/issues/7480), [#7416](https://github.com/dbt-labs/dbt-core/issues/7416))
- Fix inverted `--print/--no-print` flag ([#7517](https://github.com/dbt-labs/dbt-core/issues/7517))
- Back-compat for previous return type of 'collect_freshness' macro ([#7489](https://github.com/dbt-labs/dbt-core/issues/7489))
- Fix warning messages for deprecated dbt_project.yml configs ([#7424](https://github.com/dbt-labs/dbt-core/issues/7424))
- Respect column 'quote' config in model contracts ([#7370](https://github.com/dbt-labs/dbt-core/issues/7370))
- print model version in dbt show if specified ([#7407](https://github.com/dbt-labs/dbt-core/issues/7407))
- enable dbt show for seeds ([#7273](https://github.com/dbt-labs/dbt-core/issues/7273))
- push down limit filtering to adapter ([#7390](https://github.com/dbt-labs/dbt-core/issues/7390))
- Allow missing `profiles.yml` for `dbt deps` and `dbt init` ([#7511](https://github.com/dbt-labs/dbt-core/issues/7511))
- `run_results.json` is now written after every node completes. ([#7302](https://github.com/dbt-labs/dbt-core/issues/7302))
- Do not rewrite manifest.json during 'docs serve' command ([#7553](https://github.com/dbt-labs/dbt-core/issues/7553))
- Pin protobuf to greater than 4.0.0 ([#7565](https://github.com/dbt-labs/dbt-core/issues/7565))
- inject sql header in query for show ([#7413](https://github.com/dbt-labs/dbt-core/issues/7413))
- Pin urllib3 to ~=1.0 ([#7573](https://github.com/dbt-labs/dbt-core/issues/7573))
- Throw error for duplicated versioned and unversioned models ([#7487](https://github.com/dbt-labs/dbt-core/issues/7487))
- Honor `--skip-profile-setup` parameter when inside an existing project ([#7594](https://github.com/dbt-labs/dbt-core/issues/7594))
- Fix: Relative project paths weren't working with deps ([#7491](https://github.com/dbt-labs/dbt-core/issues/7491))
- Exclude password fields from Jinja rendering. ([#7629](https://github.com/dbt-labs/dbt-core/issues/7629))
- Add --target-path to more CLI subcommands ([#7646](https://github.com/dbt-labs/dbt-core/issues/7646))
- Stringify flag paths for Jinja context ([#7495](https://github.com/dbt-labs/dbt-core/issues/7495))
- write run_results.json for run operation ([#7502](https://github.com/dbt-labs/dbt-core/issues/7502))
- Add `%` to adapter suite test cases for `persist_docs` ([#7698](https://github.com/dbt-labs/dbt-core/issues/7698))
- Unified to UTC ([#7664](https://github.com/dbt-labs/dbt-core/issues/7664))
- Improve warnings for constraints and materialization types ([#7335](https://github.com/dbt-labs/dbt-core/issues/7335))
- Incorrect paths used for "target" and "state" directories ([#7465](https://github.com/dbt-labs/dbt-core/issues/7465))
- fix StopIteration error when publication for project not found ([#7711](https://github.com/dbt-labs/dbt-core/issues/7711))
- Using version 0 works when resolving single model ([#7372](https://github.com/dbt-labs/dbt-core/issues/7372))
- Fix empty --warn-error-options error message ([#7730](https://github.com/dbt-labs/dbt-core/issues/7730))
- send sql header on contract enforcement ([#7714](https://github.com/dbt-labs/dbt-core/issues/7714))
- Fixed doc link in selector.py ([#7533](https://github.com/dbt-labs/dbt-core/issues/7533))
- Fix null-safe equals comparison via `equals` ([#7778](https://github.com/dbt-labs/dbt-core/issues/7778))
- Log PublicationArtifactAvailable even when partially parsing unchanged public models ([#7782](https://github.com/dbt-labs/dbt-core/issues/7782))
- fix RuntimeError when removing project dependency from dependencies.yml ([#7743](https://github.com/dbt-labs/dbt-core/issues/7743))
- Fix regression in `run-operation` to not require the name of the package to run ([#7753](https://github.com/dbt-labs/dbt-core/issues/7753))
- Fix path selector when using project-dir ([#7819](https://github.com/dbt-labs/dbt-core/issues/7819))
- Allow project dependencies to use miscellaneous keys ([#7497](https://github.com/dbt-labs/dbt-core/issues/7497))
- Allow dbt show --inline preview of private models ([#7837](https://github.com/dbt-labs/dbt-core/issues/7837))
- Updating this error message to point to the correct URL ([#7789](https://github.com/dbt-labs/dbt-core/issues/7789))
- Update SemanticModel node to properly implement the DSI 0.1.0dev3 SemanticModel protocol spec ([#7833](https://github.com/dbt-labs/dbt-core/issues/7833), [#7827](https://github.com/dbt-labs/dbt-core/issues/7827))
- Allow semantic model measure exprs to be defined with ints and bools in yaml ([#7865](https://github.com/dbt-labs/dbt-core/issues/7865))
- Update `use_discrete_percentile` and `use_approximate_percentile` to be non optional and default to `False` ([#7866](https://github.com/dbt-labs/dbt-core/issues/7866))
- Fix accidental propagation of log messages to root logger. ([#7872](https://github.com/dbt-labs/dbt-core/issues/7872))
- Fixed an issue which blocked debug logging to stdout with --log-level debug, unless --debug was also used. ([#7872](https://github.com/dbt-labs/dbt-core/issues/7872))
- Skip jinja parsing of metric filters ([#7864](https://github.com/dbt-labs/dbt-core/issues/7864))
- Fix a bad implicit string conversion regression in debug --config-dir code. ([#7774](https://github.com/dbt-labs/dbt-core/issues/7774))
- Remove limitation on use of sqlparse 0.4.4 ([#7515](https://github.com/dbt-labs/dbt-core/issues/7515))
- Fix UninstalledPackagesFoundError error message to use correct packages specified path ([#7921](https://github.com/dbt-labs/dbt-core/issues/7921))
- Fix: safe remove of external nodes from nodes.depends_on ([#7924](https://github.com/dbt-labs/dbt-core/issues/7924))
- Fix query comment tests ([#7845](https://github.com/dbt-labs/dbt-core/issues/7845))
- Move project_root contextvar into events.contextvars ([#7937](https://github.com/dbt-labs/dbt-core/issues/7937))
- add access to ModelNodeArgs for external node building ([#7890](https://github.com/dbt-labs/dbt-core/issues/7890))
- Inline query emit proper error message ([#7940](https://github.com/dbt-labs/dbt-core/issues/7940))
- Fix typo in ModelNodeArgs ([#7991](https://github.com/dbt-labs/dbt-core/issues/7991))
- Allow on_schema_change = fail for contracted incremental models ([#7975](https://github.com/dbt-labs/dbt-core/issues/7975))
- Nicer error message if model with enforced contract is missing 'columns' specification ([#7943](https://github.com/dbt-labs/dbt-core/issues/7943))
- include 'v' in ModelNodeArgs.unique_id ([#8039](https://github.com/dbt-labs/dbt-core/issues/8039))
- Fix fail-fast behavior (including retry) ([#7785](https://github.com/dbt-labs/dbt-core/issues/7785))
- Remove `create_metric` as a `SemanticModel.Measure` property because it currently doesn't do anything ([#8064](https://github.com/dbt-labs/dbt-core/issues/8064))
- Remove `VOLUME` declaration within Dockerfile ([#4784](https://github.com/dbt-labs/dbt-core/issues/4784))
- Fix Dockerfile.test ([#7352](https://github.com/dbt-labs/dbt-core/issues/7352))
- Detect breaking contract changes to versioned models ([#8030](https://github.com/dbt-labs/dbt-core/issues/8030))
- Update DryRunMethod test classes ValidateSqlMethod naming ([#7839](https://github.com/dbt-labs/dbt-core/issues/7839))
- Fix typo in `NonAdditiveDimension` implementation ([#8088](https://github.com/dbt-labs/dbt-core/issues/8088))
- Copy target_schema from config into snapshot node ([#6745](https://github.com/dbt-labs/dbt-core/issues/6745))
- Enable converting deprecation warnings to errors ([#8130](https://github.com/dbt-labs/dbt-core/issues/8130))
- Ensure `warn_error_options` get serialized in `invocation_args_dict` ([#7694](https://github.com/dbt-labs/dbt-core/issues/7694))
- Stop detecting materialization macros based on macro name ([#6231](https://github.com/dbt-labs/dbt-core/issues/6231))
- Improve handling of CTE injection with ephemeral models ([#8213](https://github.com/dbt-labs/dbt-core/issues/8213))
- Fix unbound local variable error in `checked_agg_time_dimension_for_measure` ([#8230](https://github.com/dbt-labs/dbt-core/issues/8230))
- Ensure runtime errors are raised for graph runnable tasks (compile, show, run, etc) ([#8166](https://github.com/dbt-labs/dbt-core/issues/8166))
### Docs
- add note before running integration tests ([dbt-docs/#nothing](https://github.com/dbt-labs/dbt-docs/issues/nothing))
- Fix for column tests not rendering on quoted columns ([dbt-docs/#201](https://github.com/dbt-labs/dbt-docs/issues/201))
- Fix broken links in `CONTRIBUTING.md`. ([dbt-docs/#8018](https://github.com/dbt-labs/dbt-docs/issues/8018))
- Remove static SQL codeblock for metrics ([dbt-docs/#436](https://github.com/dbt-labs/dbt-docs/issues/436))
### Under the Hood
- Update docs link in ContractBreakingChangeError message ([#7366](https://github.com/dbt-labs/dbt-core/issues/7366))
- Reduce memory footprint of cached statement results. ([#7281](https://github.com/dbt-labs/dbt-core/issues/7281))
- Remove noisy parsing events: GenericTestFileParse, MacroFileParse, Note events for static model parsing ([#6671](https://github.com/dbt-labs/dbt-core/issues/6671))
- Update --help text for cache-related parameters ([#7381](https://github.com/dbt-labs/dbt-core/issues/7381))
- Small UX improvements to model versions: Support defining latest_version in unsuffixed file by default. Notify on unpinned ref when a prerelease version is available. ([#7443](https://github.com/dbt-labs/dbt-core/issues/7443))
- Add ability to instantiate Flags class from dict ([#7607](https://github.com/dbt-labs/dbt-core/issues/7607))
- Add other relation to reffable nodes ([#7550](https://github.com/dbt-labs/dbt-core/issues/7550))
- Move node patch method to schema parser patch_node_properties and refactor schema parsing ([#7430](https://github.com/dbt-labs/dbt-core/issues/7430))
- Remove legacy file logger code ([#NA](https://github.com/dbt-labs/dbt-core/issues/NA))
- Break up integration tests as a short term fix for Windows CI runs ([#7668](https://github.com/dbt-labs/dbt-core/issues/7668))
- Include null checks in utils test base ([#7670](https://github.com/dbt-labs/dbt-core/issues/7670))
- Write pub artifact to log ([#7372](https://github.com/dbt-labs/dbt-core/issues/7372))
- Fix flaky test for --fail-fast ([#7744](https://github.com/dbt-labs/dbt-core/issues/7744))
- Create `add_from_artifact` to populate `state_relation` field of nodes ([#7551](https://github.com/dbt-labs/dbt-core/issues/7551))
- Replace space with underscore in NodeType strings ([#7841](https://github.com/dbt-labs/dbt-core/issues/7841))
- Upgrade to dbt-semantic-interfaces v0.1.0dev5 ([#7853](https://github.com/dbt-labs/dbt-core/issues/7853))
- Refactoring: consolidating public_nodes and nodes ([#7890](https://github.com/dbt-labs/dbt-core/issues/7890))
- Resolve SemanticModel refs in the same way as other refs ([#7822](https://github.com/dbt-labs/dbt-core/issues/7822))
- Move from dbt-semantic-interfaces 0.1.0dev5 to 0.1.0dev7 ([#7898](https://github.com/dbt-labs/dbt-core/issues/7898))
- Don't jinja render packages from dependencies.yml ([#7905](https://github.com/dbt-labs/dbt-core/issues/7905))
- Update mashumaro to 3.8.1 ([#7950](https://github.com/dbt-labs/dbt-core/issues/7950))
- Refactor: entry point for cross-project ref ([#7954](https://github.com/dbt-labs/dbt-core/issues/7954))
- Populate metric input measures ([#7884](https://github.com/dbt-labs/dbt-core/issues/7884))
- Add option to specify partial parse file ([#7911](https://github.com/dbt-labs/dbt-core/issues/7911))
- Add semantic_models to resource counts ([#8077](https://github.com/dbt-labs/dbt-core/issues/8077))
- A way to control maxBytes for a single dbt.log file ([#8199](https://github.com/dbt-labs/dbt-core/issues/8199))
### Dependencies
- Bump mypy from 0.981 to 1.0.1 ([#7027](https://github.com/dbt-labs/dbt-core/pull/7027))
- Bump ubuntu from 23.04 to 23.10 ([#7675](https://github.com/dbt-labs/dbt-core/pull/7675))
- ([#7681](https://github.com/dbt-labs/dbt-core/pull/7681))
- Pin click>=8.1.1,<8.1.4 ([#8050](https://github.com/dbt-labs/dbt-core/pull/8050))
- Bump `dbt-semantic-interfaces` to `~=0.1.0rc1` ([#8082](https://github.com/dbt-labs/dbt-core/pull/8082))
- Update pin for click<9 ([#8232](https://github.com/dbt-labs/dbt-core/pull/8232))
- Add upper bound to sqlparse pin of <0.5 ([#8236](https://github.com/dbt-labs/dbt-core/pull/8236))
- Support dbt-semantic-interfaces 0.2.0 ([#8250](https://github.com/dbt-labs/dbt-core/pull/8250))
### Contributors
- [@AndyBys](https://github.com/AndyBys) ([#6980](https://github.com/dbt-labs/dbt-core/issues/6980))
- [@NiallRees](https://github.com/NiallRees) ([#6051](https://github.com/dbt-labs/dbt-core/issues/6051), [#7941](https://github.com/dbt-labs/dbt-core/issues/7941))
- [@alexrosenfeld10](https://github.com/alexrosenfeld10) ([#4784](https://github.com/dbt-labs/dbt-core/issues/4784))
- [@b-luu](https://github.com/b-luu) ([#7289](https://github.com/dbt-labs/dbt-core/issues/7289))
- [@d-kaneshiro](https://github.com/d-kaneshiro) ([#7664](https://github.com/dbt-labs/dbt-core/issues/7664), [#nothing](https://github.com/dbt-labs/dbt-core/issues/nothing))
- [@damian3031](https://github.com/damian3031) ([#7845](https://github.com/dbt-labs/dbt-core/issues/7845))
- [@dave-connors-3](https://github.com/dave-connors-3) ([#7738](https://github.com/dbt-labs/dbt-core/issues/7738), [#7915](https://github.com/dbt-labs/dbt-core/issues/7915))
- [@dradnan89@hotmail.com](https://github.com/dradnan89@hotmail.com) ([#7681](https://github.com/dbt-labs/dbt-core/pull/7681))
- [@drewbanin](https://github.com/drewbanin) ([#201](https://github.com/dbt-labs/dbt-core/issues/201))
- [@dwreeves](https://github.com/dwreeves) ([#7418](https://github.com/dbt-labs/dbt-core/issues/7418), [#7646](https://github.com/dbt-labs/dbt-core/issues/7646))
- [@gem7318](https://github.com/gem7318) ([#8018](https://github.com/dbt-labs/dbt-core/issues/8018))
- [@iknox-fa](https://github.com/iknox-fa) ([#7302](https://github.com/dbt-labs/dbt-core/issues/7302), [#7491](https://github.com/dbt-labs/dbt-core/issues/7491), [#7281](https://github.com/dbt-labs/dbt-core/issues/7281), [#NA](https://github.com/dbt-labs/dbt-core/issues/NA))
- [@marcodamore](https://github.com/marcodamore) ([#436](https://github.com/dbt-labs/dbt-core/issues/436))
- [@mirnawong1](https://github.com/mirnawong1) ([#7789](https://github.com/dbt-labs/dbt-core/issues/7789))
- [@quazi-irfan](https://github.com/quazi-irfan) ([#7533](https://github.com/dbt-labs/dbt-core/issues/7533))
- [@rainermensing](https://github.com/rainermensing) ([#1880](https://github.com/dbt-labs/dbt-core/issues/1880))
- [@sdebruyn](https://github.com/sdebruyn) ([#7082](https://github.com/dbt-labs/dbt-core/issues/7082), [#7670](https://github.com/dbt-labs/dbt-core/issues/7670))
- [@stu-k](https://github.com/stu-k) ([#7299](https://github.com/dbt-labs/dbt-core/issues/7299), [#5476](https://github.com/dbt-labs/dbt-core/issues/5476), [#7607](https://github.com/dbt-labs/dbt-core/issues/7607), [#7550](https://github.com/dbt-labs/dbt-core/issues/7550), [#7551](https://github.com/dbt-labs/dbt-core/issues/7551))
- [@thomasgjerdekog](https://github.com/thomasgjerdekog) ([#7517](https://github.com/dbt-labs/dbt-core/issues/7517))
- [@tlento](https://github.com/tlento) ([#7839](https://github.com/dbt-labs/dbt-core/issues/7839), [#7839](https://github.com/dbt-labs/dbt-core/issues/7839))
- [@trouze](https://github.com/trouze) ([#7564](https://github.com/dbt-labs/dbt-core/issues/7564))
- [@willbryant](https://github.com/willbryant) ([#7350](https://github.com/dbt-labs/dbt-core/issues/7350))


@@ -1,27 +0,0 @@
## dbt-core 1.6.1 - August 23, 2023
### Fixes
- Add status to Parse Inline Error ([#8173](https://github.com/dbt-labs/dbt-core/issues/8173))
- Fix retry not working with log-file-max-bytes ([#8297](https://github.com/dbt-labs/dbt-core/issues/8297))
- Detect changes to model access, version, or latest_version in state:modified ([#8189](https://github.com/dbt-labs/dbt-core/issues/8189))
- fix fqn-selection for external versioned models ([#8374](https://github.com/dbt-labs/dbt-core/issues/8374))
- Fix: DbtInternalError after model that previously ref'd external model is deleted ([#8375](https://github.com/dbt-labs/dbt-core/issues/8375))
- Fix using list command with path selector and project-dir ([#8385](https://github.com/dbt-labs/dbt-core/issues/8385))
- Remedy performance regression by only writing run_results.json once. ([#8360](https://github.com/dbt-labs/dbt-core/issues/8360))
- Ensure parsing does not break when `window_groupings` is not specified for `non_additive_dimension` ([#8453](https://github.com/dbt-labs/dbt-core/issues/8453))
### Docs
- Display contract and column constraints on the model page ([dbt-docs/#433](https://github.com/dbt-labs/dbt-docs/issues/433))
- Display semantic model details in docs ([dbt-docs/#431](https://github.com/dbt-labs/dbt-docs/issues/431))
### Under the Hood
- Refactor flaky test pp_versioned_models ([#7781](https://github.com/dbt-labs/dbt-core/issues/7781))
- format exception from dbtPlugin.initialize ([#8152](https://github.com/dbt-labs/dbt-core/issues/8152))
- Update manifest v10 ([#8333](https://github.com/dbt-labs/dbt-core/issues/8333))
- add tracking for plugin.get_nodes calls ([#8344](https://github.com/dbt-labs/dbt-core/issues/8344))
- add internal flag: --no-partial-parse-file-diff to inform whether to compute a file diff during partial parsing ([#8363](https://github.com/dbt-labs/dbt-core/issues/8363))
- Use python version 3.10.7 in Docker image. ([#8444](https://github.com/dbt-labs/dbt-core/issues/8444))
- Check for existing_relation immediately prior to renaming ([#7781](https://github.com/dbt-labs/dbt-core/issues/7781))


@@ -1,20 +0,0 @@
## dbt-core 1.6.2 - September 07, 2023
### Breaking Changes
- Removed the FirstRunResultError and AfterFirstRunResultError event types, using the existing RunResultError in their place. ([#7963](https://github.com/dbt-labs/dbt-core/issues/7963))
### Features
- Accept a `dbt-cloud` config in dbt_project.yml ([#8438](https://github.com/dbt-labs/dbt-core/issues/8438))
### Fixes
- Copy dir during `dbt deps` if symlink fails ([#7428](https://github.com/dbt-labs/dbt-core/issues/7428), [#8223](https://github.com/dbt-labs/dbt-core/issues/8223))
- fix ambiguous reference error for tests and versions when model name is duplicated across packages ([#8327](https://github.com/dbt-labs/dbt-core/issues/8327), [#8493](https://github.com/dbt-labs/dbt-core/issues/8493))
- Fix "Internal Error: Expected node <unique-id> not found in manifest" when depends_on set on ModelNodeArgs ([#8506](https://github.com/dbt-labs/dbt-core/issues/8506))
- Fix snapshot success message ([#7583](https://github.com/dbt-labs/dbt-core/issues/7583))
- Parse the correct schema version from manifest ([#8544](https://github.com/dbt-labs/dbt-core/issues/8544))
### Contributors
- [@anjutiwari](https://github.com/anjutiwari) ([#7428](https://github.com/dbt-labs/dbt-core/issues/7428), [#8223](https://github.com/dbt-labs/dbt-core/issues/8223))


@@ -0,0 +1,6 @@
kind: Dependencies
body: Pin click<9 + sqlparse<0.5
time: 2023-07-19T12:37:43.716495+02:00
custom:
Author: jtcohen6
PR: "8146"


@@ -0,0 +1,6 @@
kind: Docs
body: Fix for column tests not rendering on quoted columns
time: 2023-05-31T11:54:19.687363-04:00
custom:
Author: drewbanin
Issue: "201"


@@ -0,0 +1,6 @@
kind: Docs
body: Remove static SQL codeblock for metrics
time: 2023-07-18T19:24:22.155323+02:00
custom:
Author: marcodamore
Issue: "436"


@@ -1,6 +0,0 @@
kind: Features
body: Add --no-inject-ephemeral-ctes flag for `compile` command, for usage by linting.
time: 2023-08-23T14:04:07.617476-04:00
custom:
Author: benmosher
Issue: "8480"


@@ -0,0 +1,6 @@
kind: Fixes
body: Enable converting deprecation warnings to errors
time: 2023-07-18T12:55:18.03914-04:00
custom:
Author: michelleark
Issue: "8130"


@@ -1,6 +0,0 @@
kind: Fixes
body: Add explicit support for integers for the show command
time: 2023-08-03T09:35:02.163968-05:00
custom:
Author: dave-connors-3
Issue: "8153"


@@ -1,6 +0,0 @@
kind: Fixes
body: make version comparison insensitive to order
time: 2023-09-06T14:22:13.114549-04:00
custom:
Author: michelleark
Issue: "8571"


@@ -1,6 +0,0 @@
kind: Under the Hood
body: Fix test_numeric_values to look for more specific strings
time: 2023-09-13T14:16:51.453247-04:00
custom:
Author: gshank
Issue: "8470"


@@ -33,11 +33,6 @@ defaults:
run:
shell: bash
# top-level adjustments can be made here
env:
# number of parallel processes to spawn for python integration testing
PYTHON_INTEGRATION_TEST_WORKERS: 5
jobs:
code-quality:
name: code-quality
@@ -78,7 +73,6 @@ jobs:
env:
TOXENV: "unit"
PYTEST_ADDOPTS: "-v --color=yes --csv unit_results.csv"
steps:
- name: Check out the repository
@@ -106,64 +100,31 @@ jobs:
CURRENT_DATE=$(date +'%Y-%m-%dT%H_%M_%S') # no colons allowed for artifacts
echo "date=$CURRENT_DATE" >> $GITHUB_OUTPUT
- uses: actions/upload-artifact@v3
if: always()
with:
name: unit_results_${{ matrix.python-version }}-${{ steps.date.outputs.date }}.csv
path: unit_results.csv
integration-metadata:
name: integration test metadata generation
runs-on: ubuntu-latest
outputs:
split-groups: ${{ steps.generate-split-groups.outputs.split-groups }}
include: ${{ steps.generate-include.outputs.include }}
steps:
- name: generate split-groups
id: generate-split-groups
run: |
MATRIX_JSON="["
for B in $(seq 1 ${{ env.PYTHON_INTEGRATION_TEST_WORKERS }}); do
MATRIX_JSON+=$(sed 's/^/"/;s/$/"/' <<< "${B}")
done
MATRIX_JSON="${MATRIX_JSON//\"\"/\", \"}"
MATRIX_JSON+="]"
echo "split-groups=${MATRIX_JSON}"
echo "split-groups=${MATRIX_JSON}" >> $GITHUB_OUTPUT
- name: generate include
id: generate-include
run: |
INCLUDE=('"python-version":"3.8","os":"windows-latest"' '"python-version":"3.8","os":"macos-latest"' )
INCLUDE_GROUPS="["
for include in ${INCLUDE[@]}; do
for group in $(seq 1 ${{ env.PYTHON_INTEGRATION_TEST_WORKERS }}); do
INCLUDE_GROUPS+=$(sed 's/$/, /' <<< "{\"split-group\":\"${group}\",${include}}")
done
done
INCLUDE_GROUPS=$(echo $INCLUDE_GROUPS | sed 's/,*$//g')
INCLUDE_GROUPS+="]"
echo "include=${INCLUDE_GROUPS}"
echo "include=${INCLUDE_GROUPS}" >> $GITHUB_OUTPUT
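The two removed steps above build the test matrix by hand: the first produces a JSON array of quoted worker numbers, the second crosses the extra OS/Python combos with every split group. A standalone sketch of both, runnable outside Actions (the worker count of 5 is taken from the workflow env):

```shell
#!/usr/bin/env bash
PYTHON_INTEGRATION_TEST_WORKERS=5

# split-groups: a JSON array of worker numbers as strings, e.g. ["1", "2", ...]
MATRIX_JSON="["
for B in $(seq 1 "${PYTHON_INTEGRATION_TEST_WORKERS}"); do
  MATRIX_JSON+=$(sed 's/^/"/;s/$/"/' <<< "${B}")   # wrap each number in quotes
done
MATRIX_JSON="${MATRIX_JSON//\"\"/\", \"}"          # turn "" boundaries into ", "
MATRIX_JSON+="]"
echo "${MATRIX_JSON}"   # ["1", "2", "3", "4", "5"]

# include: every extra OS/Python combo paired with every split group (10 entries)
INCLUDE=('"python-version":"3.8","os":"windows-latest"' '"python-version":"3.8","os":"macos-latest"')
INCLUDE_GROUPS="["
for include in "${INCLUDE[@]}"; do
  for group in $(seq 1 "${PYTHON_INTEGRATION_TEST_WORKERS}"); do
    INCLUDE_GROUPS+=$(sed 's/$/, /' <<< "{\"split-group\":\"${group}\",${include}}")
  done
done
INCLUDE_GROUPS=$(echo $INCLUDE_GROUPS | sed 's/,*$//g')  # drop the trailing ", "
INCLUDE_GROUPS+="]"
echo "${INCLUDE_GROUPS}"
```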
- name: Upload Unit Test Coverage to Codecov
if: ${{ matrix.python-version == '3.11' }}
uses: codecov/codecov-action@v3
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
integration:
name: (${{ matrix.split-group }}) integration test / python ${{ matrix.python-version }} / ${{ matrix.os }}
name: integration test / python ${{ matrix.python-version }} / ${{ matrix.os }}
runs-on: ${{ matrix.os }}
timeout-minutes: 30
needs:
- integration-metadata
timeout-minutes: 60
strategy:
fail-fast: false
matrix:
python-version: ["3.8", "3.9", "3.10", "3.11"]
os: [ubuntu-20.04]
split-group: ${{ fromJson(needs.integration-metadata.outputs.split-groups) }}
include: ${{ fromJson(needs.integration-metadata.outputs.include) }}
include:
- python-version: 3.8
os: windows-latest
- python-version: 3.8
os: macos-latest
env:
TOXENV: integration
PYTEST_ADDOPTS: "-v --color=yes -n4 --csv integration_results.csv"
DBT_INVOCATION_ENV: github-actions
DBT_TEST_USER_1: dbt_test_user_1
DBT_TEST_USER_2: dbt_test_user_2
@@ -204,8 +165,6 @@ jobs:
- name: Run tests
run: tox -- --ddtrace
env:
PYTEST_ADDOPTS: ${{ format('--splits {0} --group {1}', env.PYTHON_INTEGRATION_TEST_WORKERS, matrix.split-group) }}
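The removed `format()` expression above expands at run time into `--splits`/`--group` flags (presumably consumed by the pytest-split plugin, which shards a test suite across workers). A sketch of the expansion, with an illustrative worker count of 5 and split-group 2:

```shell
#!/usr/bin/env bash
# Sketch: what the workflow's format() expression produces for one matrix job.
PYTHON_INTEGRATION_TEST_WORKERS=5
SPLIT_GROUP=2   # illustrative value of matrix.split-group
PYTEST_ADDOPTS="$(printf -- '--splits %s --group %s' "${PYTHON_INTEGRATION_TEST_WORKERS}" "${SPLIT_GROUP}")"
echo "${PYTEST_ADDOPTS}"   # --splits 5 --group 2
```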
- name: Get current date
if: always()
@@ -220,28 +179,11 @@ jobs:
name: logs_${{ matrix.python-version }}_${{ matrix.os }}_${{ steps.date.outputs.date }}
path: ./logs
- uses: actions/upload-artifact@v3
if: always()
with:
name: integration_results_${{ matrix.python-version }}_${{ matrix.os }}_${{ steps.date.outputs.date }}.csv
path: integration_results.csv
integration-report:
if: ${{ always() }}
name: Integration Test Suite
runs-on: ubuntu-latest
needs: integration
steps:
- name: "Integration Tests Failed"
if: ${{ contains(needs.integration.result, 'failure') || contains(needs.integration.result, 'cancelled') }}
# when this is true the next step won't execute
run: |
echo "::notice title='Integration test suite failed'"
exit 1
- name: "Integration Tests Passed"
run: |
echo "::notice title='Integration test suite passed'"
- name: Upload Integration Test Coverage to Codecov
if: ${{ matrix.python-version == '3.11' }}
uses: codecov/codecov-action@v3
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
build:
name: build packages

View File


@@ -18,41 +18,11 @@ on:
permissions: read-all
# top-level adjustments can be made here
env:
# number of parallel processes to spawn for python testing
PYTHON_INTEGRATION_TEST_WORKERS: 5
jobs:
integration-metadata:
name: integration test metadata generation
runs-on: ubuntu-latest
outputs:
split-groups: ${{ steps.generate-split-groups.outputs.split-groups }}
steps:
- name: generate split-groups
id: generate-split-groups
run: |
MATRIX_JSON="["
for B in $(seq 1 ${{ env.PYTHON_INTEGRATION_TEST_WORKERS }}); do
MATRIX_JSON+=$(sed 's/^/"/;s/$/"/' <<< "${B}")
done
MATRIX_JSON="${MATRIX_JSON//\"\"/\", \"}"
MATRIX_JSON+="]"
echo "split-groups=${MATRIX_JSON}" >> $GITHUB_OUTPUT
# run the performance measurements on the current or default branch
test-schema:
name: Test Log Schema
runs-on: ubuntu-20.04
timeout-minutes: 30
needs:
- integration-metadata
strategy:
fail-fast: false
matrix:
split-group: ${{ fromJson(needs.integration-metadata.outputs.split-groups) }}
env:
# turns warnings into errors
RUSTFLAGS: "-D warnings"
@@ -95,14 +65,3 @@ jobs:
# we actually care if these pass, because the normal test run doesn't usually include many json log outputs
- name: Run integration tests
run: tox -e integration -- -nauto
env:
PYTEST_ADDOPTS: ${{ format('--splits {0} --group {1}', env.PYTHON_INTEGRATION_TEST_WORKERS, matrix.split-group) }}
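The `format('--splits {0} --group {1}', ...)` expression behaves like positional `str.format`: it tells pytest-split how many groups exist and which one this job runs. A quick illustration (values are illustrative):

```python
# GitHub Actions format() substitutes {0}, {1} positionally,
# like Python's str.format here.
addopts = "--splits {0} --group {1}".format(5, 3)
print(addopts)  # --splits 5 --group 3
```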
test-schema-report:
name: Log Schema Test Suite
runs-on: ubuntu-latest
needs: test-schema
steps:
- name: "[Notification] Log test suite passes"
run: |
echo "::notice title="Log test suite passes""

View File


@@ -5,266 +5,12 @@
- "Breaking changes" listed under a version may require action from end users or external maintainers when upgrading to that version.
- Do not edit this file directly. This file is auto-generated using [changie](https://github.com/miniscruff/changie). For details on how to document a change, see [the contributing guide](https://github.com/dbt-labs/dbt-core/blob/main/CONTRIBUTING.md#adding-changelog-entry)
## dbt-core 1.6.2 - September 07, 2023
### Breaking Changes
- Removed the FirstRunResultError and AfterFirstRunResultError event types, using the existing RunResultError in their place. ([#7963](https://github.com/dbt-labs/dbt-core/issues/7963))
### Features
- Accept a `dbt-cloud` config in dbt_project.yml ([#8438](https://github.com/dbt-labs/dbt-core/issues/8438))
### Fixes
- Copy dir during `dbt deps` if symlink fails ([#7428](https://github.com/dbt-labs/dbt-core/issues/7428), [#8223](https://github.com/dbt-labs/dbt-core/issues/8223))
- fix ambiguous reference error for tests and versions when model name is duplicated across packages ([#8327](https://github.com/dbt-labs/dbt-core/issues/8327), [#8493](https://github.com/dbt-labs/dbt-core/issues/8493))
- Fix "Internal Error: Expected node <unique-id> not found in manifest" when depends_on set on ModelNodeArgs ([#8506](https://github.com/dbt-labs/dbt-core/issues/8506))
- Fix snapshot success message ([#7583](https://github.com/dbt-labs/dbt-core/issues/7583))
- Parse the correct schema version from manifest ([#8544](https://github.com/dbt-labs/dbt-core/issues/8544))
### Contributors
- [@anjutiwari](https://github.com/anjutiwari) ([#7428](https://github.com/dbt-labs/dbt-core/issues/7428), [#8223](https://github.com/dbt-labs/dbt-core/issues/8223))
## dbt-core 1.6.1 - August 23, 2023
### Fixes
- Add status to Parse Inline Error ([#8173](https://github.com/dbt-labs/dbt-core/issues/8173))
- Fix retry not working with log-file-max-bytes ([#8297](https://github.com/dbt-labs/dbt-core/issues/8297))
- Detect changes to model access, version, or latest_version in state:modified ([#8189](https://github.com/dbt-labs/dbt-core/issues/8189))
- fix fqn-selection for external versioned models ([#8374](https://github.com/dbt-labs/dbt-core/issues/8374))
- Fix: DbtInternalError after model that previously ref'd external model is deleted ([#8375](https://github.com/dbt-labs/dbt-core/issues/8375))
- Fix using list command with path selector and project-dir ([#8385](https://github.com/dbt-labs/dbt-core/issues/8385))
- Remedy performance regression by only writing run_results.json once. ([#8360](https://github.com/dbt-labs/dbt-core/issues/8360))
- Ensure parsing does not break when `window_groupings` is not specified for `non_additive_dimension` ([#8453](https://github.com/dbt-labs/dbt-core/issues/8453))
### Docs
- Display contract and column constraints on the model page ([dbt-docs/#433](https://github.com/dbt-labs/dbt-docs/issues/433))
- Display semantic model details in docs ([dbt-docs/#431](https://github.com/dbt-labs/dbt-docs/issues/431))
### Under the Hood
- Refactor flaky test pp_versioned_models ([#7781](https://github.com/dbt-labs/dbt-core/issues/7781))
- format exception from dbtPlugin.initialize ([#8152](https://github.com/dbt-labs/dbt-core/issues/8152))
- Update manifest v10 ([#8333](https://github.com/dbt-labs/dbt-core/issues/8333))
- add tracking for plugin.get_nodes calls ([#8344](https://github.com/dbt-labs/dbt-core/issues/8344))
- add internal flag: --no-partial-parse-file-diff to inform whether to compute a file diff during partial parsing ([#8363](https://github.com/dbt-labs/dbt-core/issues/8363))
- Use python version 3.10.7 in Docker image. ([#8444](https://github.com/dbt-labs/dbt-core/issues/8444))
- Check for existing_relation immediately prior to renaming ([#7781](https://github.com/dbt-labs/dbt-core/issues/7781))
## dbt-core 1.6.0 - July 31, 2023
### Breaking Changes
- Dropped support for Python 3.7 ([#7082](https://github.com/dbt-labs/dbt-core/issues/7082))
- Switch from dbt-metrics to dbt-semantic-interfaces for MetricNode definitions ([#7500](https://github.com/dbt-labs/dbt-core/issues/7500), [#7404](https://github.com/dbt-labs/dbt-core/issues/7404))
### Features
- Add merge as valid incremental strategy for postgres ([#1880](https://github.com/dbt-labs/dbt-core/issues/1880))
- Skip catalog generation ([#6980](https://github.com/dbt-labs/dbt-core/issues/6980))
- Add support for materialized views ([#6911](https://github.com/dbt-labs/dbt-core/issues/6911))
- Publication artifacts and cross-project ref ([#7227](https://github.com/dbt-labs/dbt-core/issues/7227))
- Optimize template rendering for common parse scenarios ([#7449](https://github.com/dbt-labs/dbt-core/issues/7449))
- Add graph structure summaries to target path output ([#7357](https://github.com/dbt-labs/dbt-core/issues/7357))
- Allow duplicate manifest node (models, seeds, analyses, snapshots) names across packages ([#7446](https://github.com/dbt-labs/dbt-core/issues/7446))
- Detect breaking changes to enforced constraints ([#7065](https://github.com/dbt-labs/dbt-core/issues/7065))
- Check for project dependency cycles ([#7468](https://github.com/dbt-labs/dbt-core/issues/7468))
- nodes in packages respect custom generate_alias_name, generate_schema_name, generate_database_name macro overrides defined in packages ([#7444](https://github.com/dbt-labs/dbt-core/issues/7444))
- Added warnings for model and ref deprecations ([#7433](https://github.com/dbt-labs/dbt-core/issues/7433))
- Update drop_relation macro to allow for configuration of drop statement separately from object name ([#7625](https://github.com/dbt-labs/dbt-core/issues/7625))
- accept publications in dbt.invoke ([#7372](https://github.com/dbt-labs/dbt-core/issues/7372))
- Enable state for deferral to be separate from state for selectors ([#7300](https://github.com/dbt-labs/dbt-core/issues/7300))
- add access selection syntax ([#7738](https://github.com/dbt-labs/dbt-core/issues/7738))
- add project_name to manifest metadata ([#7752](https://github.com/dbt-labs/dbt-core/issues/7752))
- dbt retry ([#7299](https://github.com/dbt-labs/dbt-core/issues/7299))
- This change adds new selector methods to the state selector. Namely, state:unmodified and state:old. ([#7564](https://github.com/dbt-labs/dbt-core/issues/7564))
- Revamp debug, add --connection flag. Prepare for future refactors/interface changes. ([#7104](https://github.com/dbt-labs/dbt-core/issues/7104))
- Validate public models are not materialized as ephemeral ([#7226](https://github.com/dbt-labs/dbt-core/issues/7226))
- Added support for parsing and serializing semantic models ([#7499](https://github.com/dbt-labs/dbt-core/issues/7499), [#7503](https://github.com/dbt-labs/dbt-core/issues/7503))
- Enable setting packages in dependencies.yml ([#7372](https://github.com/dbt-labs/dbt-core/issues/7372), [#7736](https://github.com/dbt-labs/dbt-core/issues/7736))
- Add AdapterRegistered event log message ([#7038](https://github.com/dbt-labs/dbt-core/issues/7038))
- dbt clone ([#7258](https://github.com/dbt-labs/dbt-core/issues/7258))
- Further integrate semantic models into the DAG and partial parsing module ([#7800](https://github.com/dbt-labs/dbt-core/issues/7800))
- Handle external model nodes in state:modified ([#7563](https://github.com/dbt-labs/dbt-core/issues/7563))
- Add invocation_command to flags ([#6051](https://github.com/dbt-labs/dbt-core/issues/6051))
- Add thread_id context var ([#7941](https://github.com/dbt-labs/dbt-core/issues/7941))
- Add partial parsing support for semantic models ([#7897](https://github.com/dbt-labs/dbt-core/issues/7897))
- Add restrict-access to dbt_project.yml ([#7713](https://github.com/dbt-labs/dbt-core/issues/7713))
- allow setting enabled and depends_on_nodes from ModelNodeArgs ([#7506](https://github.com/dbt-labs/dbt-core/issues/7506))
- Support '_'-delimited fqn matching for versioned models and matching on Path.stem for path selection ([#7639](https://github.com/dbt-labs/dbt-core/issues/7639))
- Store time_spline table configuration in semantic manifest ([#7938](https://github.com/dbt-labs/dbt-core/issues/7938))
- Add validate_sql method to BaseAdapter with implementation for SQLAdapter ([#7839](https://github.com/dbt-labs/dbt-core/issues/7839))
- Support validation of metrics and semantic models. ([#7969](https://github.com/dbt-labs/dbt-core/issues/7969))
- Begin populating `depends_on` of metric nodes ([#7854](https://github.com/dbt-labs/dbt-core/issues/7854))
- Enumerate supported materialized view features for dbt-postgres ([#6911](https://github.com/dbt-labs/dbt-core/issues/6911))
### Fixes
- Raise better error message when dispatching a package that is not installed ([#5801](https://github.com/dbt-labs/dbt-core/issues/5801))
- add negative part_number arg for split part macro ([#7915](https://github.com/dbt-labs/dbt-core/issues/7915))
- Persist timing info in run results for failed nodes ([#5476](https://github.com/dbt-labs/dbt-core/issues/5476))
- fix typo in unpacking statically parsed ref ([#7364](https://github.com/dbt-labs/dbt-core/issues/7364))
- safe version attribute access in _check_resource_uniqueness ([#7375](https://github.com/dbt-labs/dbt-core/issues/7375))
- Fix dbt command missing target-path param ([#7411](https://github.com/dbt-labs/dbt-core/issues/7411))
- Fix CTE insertion position when the model uses WITH RECURSIVE ([#7350](https://github.com/dbt-labs/dbt-core/issues/7350))
- Fix v0 ref resolution ([#7408](https://github.com/dbt-labs/dbt-core/issues/7408))
- Add --target-path to dbt snapshot command. ([#7418](https://github.com/dbt-labs/dbt-core/issues/7418))
- dbt build selection of tests' descendants ([#7289](https://github.com/dbt-labs/dbt-core/issues/7289))
- fix groupable node partial parsing, raise DbtReferenceError at runtime for safety ([#7437](https://github.com/dbt-labs/dbt-core/issues/7437))
- Fix partial parsing of latest_version changes for downstream references ([#7369](https://github.com/dbt-labs/dbt-core/issues/7369))
- Use "add_node" to update depends_on.nodes ([#7453](https://github.com/dbt-labs/dbt-core/issues/7453))
- Fix var precedence in configs: root vars override package vars ([#6705](https://github.com/dbt-labs/dbt-core/issues/6705))
- Constraint rendering fixes: wrap check expression in parentheses, foreign key 'references', support expression in all constraint types ([#7417](https://github.com/dbt-labs/dbt-core/issues/7417), [#7480](https://github.com/dbt-labs/dbt-core/issues/7480), [#7416](https://github.com/dbt-labs/dbt-core/issues/7416))
- Fix inverted `--print/--no-print` flag ([#7517](https://github.com/dbt-labs/dbt-core/issues/7517))
- Back-compat for previous return type of 'collect_freshness' macro ([#7489](https://github.com/dbt-labs/dbt-core/issues/7489))
- Fix warning messages for deprecated dbt_project.yml configs ([#7424](https://github.com/dbt-labs/dbt-core/issues/7424))
- Respect column 'quote' config in model contracts ([#7370](https://github.com/dbt-labs/dbt-core/issues/7370))
- print model version in dbt show if specified ([#7407](https://github.com/dbt-labs/dbt-core/issues/7407))
- enable dbt show for seeds ([#7273](https://github.com/dbt-labs/dbt-core/issues/7273))
- push down limit filtering to adapter ([#7390](https://github.com/dbt-labs/dbt-core/issues/7390))
- Allow missing `profiles.yml` for `dbt deps` and `dbt init` ([#7511](https://github.com/dbt-labs/dbt-core/issues/7511))
- `run_results.json` is now written after every node completes. ([#7302](https://github.com/dbt-labs/dbt-core/issues/7302))
- Do not rewrite manifest.json during 'docs serve' command ([#7553](https://github.com/dbt-labs/dbt-core/issues/7553))
- Pin protobuf to greater than 4.0.0 ([#7565](https://github.com/dbt-labs/dbt-core/issues/7565))
- inject sql header in query for show ([#7413](https://github.com/dbt-labs/dbt-core/issues/7413))
- Pin urllib3 to ~=1.0 ([#7573](https://github.com/dbt-labs/dbt-core/issues/7573))
- Throw error for duplicated versioned and unversioned models ([#7487](https://github.com/dbt-labs/dbt-core/issues/7487))
- Honor `--skip-profile-setup` parameter when inside an existing project ([#7594](https://github.com/dbt-labs/dbt-core/issues/7594))
- Fix: Relative project paths weren't working with deps ([#7491](https://github.com/dbt-labs/dbt-core/issues/7491))
- Exclude password fields from Jinja rendering. ([#7629](https://github.com/dbt-labs/dbt-core/issues/7629))
- Add --target-path to more CLI subcommands ([#7646](https://github.com/dbt-labs/dbt-core/issues/7646))
- Stringify flag paths for Jinja context ([#7495](https://github.com/dbt-labs/dbt-core/issues/7495))
- write run_results.json for run operation ([#7502](https://github.com/dbt-labs/dbt-core/issues/7502))
- Add `%` to adapter suite test cases for `persist_docs` ([#7698](https://github.com/dbt-labs/dbt-core/issues/7698))
- Unified to UTC ([#7664](https://github.com/dbt-labs/dbt-core/issues/7664))
- Improve warnings for constraints and materialization types ([#7335](https://github.com/dbt-labs/dbt-core/issues/7335))
- Incorrect paths used for "target" and "state" directories ([#7465](https://github.com/dbt-labs/dbt-core/issues/7465))
- fix StopIteration error when publication for project not found ([#7711](https://github.com/dbt-labs/dbt-core/issues/7711))
- Using version 0 works when resolving single model ([#7372](https://github.com/dbt-labs/dbt-core/issues/7372))
- Fix empty --warn-error-options error message ([#7730](https://github.com/dbt-labs/dbt-core/issues/7730))
- send sql header on contract enforcement ([#7714](https://github.com/dbt-labs/dbt-core/issues/7714))
- Fixed doc link in selector.py ([#7533](https://github.com/dbt-labs/dbt-core/issues/7533))
- Fix null-safe equals comparison via `equals` ([#7778](https://github.com/dbt-labs/dbt-core/issues/7778))
- Log PublicationArtifactAvailable even when partially parsing unchanged public models ([#7782](https://github.com/dbt-labs/dbt-core/issues/7782))
- fix RuntimeError when removing project dependency from dependencies.yml ([#7743](https://github.com/dbt-labs/dbt-core/issues/7743))
- Fix regression in `run-operation` to not require the name of the package to run ([#7753](https://github.com/dbt-labs/dbt-core/issues/7753))
- Fix path selector when using project-dir ([#7819](https://github.com/dbt-labs/dbt-core/issues/7819))
- Allow project dependencies to use miscellaneous keys ([#7497](https://github.com/dbt-labs/dbt-core/issues/7497))
- Allow dbt show --inline preview of private models ([#7837](https://github.com/dbt-labs/dbt-core/issues/7837))
- Updating this error message to point to the correct URL ([#7789](https://github.com/dbt-labs/dbt-core/issues/7789))
- Update SemanticModel node to properly implement the DSI 0.1.0dev3 SemanticModel protocol spec ([#7833](https://github.com/dbt-labs/dbt-core/issues/7833), [#7827](https://github.com/dbt-labs/dbt-core/issues/7827))
- Allow semantic model measure exprs to be defined with ints and bools in yaml ([#7865](https://github.com/dbt-labs/dbt-core/issues/7865))
- Update `use_discrete_percentile` and `use_approximate_percentile` to be non optional and default to `False` ([#7866](https://github.com/dbt-labs/dbt-core/issues/7866))
- Fix accidental propagation of log messages to root logger. ([#7872](https://github.com/dbt-labs/dbt-core/issues/7872))
- Fixed an issue which blocked debug logging to stdout with --log-level debug, unless --debug was also used. ([#7872](https://github.com/dbt-labs/dbt-core/issues/7872))
- Skip jinja parsing of metric filters ([#7864](https://github.com/dbt-labs/dbt-core/issues/7864))
- Fix a bad implicit string conversion regression in debug --config-dir code. ([#7774](https://github.com/dbt-labs/dbt-core/issues/7774))
- Remove limitation on use of sqlparse 0.4.4 ([#7515](https://github.com/dbt-labs/dbt-core/issues/7515))
- Fix UninstalledPackagesFoundError error message to use correct packages specified path ([#7921](https://github.com/dbt-labs/dbt-core/issues/7921))
- Fix: safe remove of external nodes from nodes.depends_on ([#7924](https://github.com/dbt-labs/dbt-core/issues/7924))
- Fix query comment tests ([#7845](https://github.com/dbt-labs/dbt-core/issues/7845))
- Move project_root contextvar into events.contextvars ([#7937](https://github.com/dbt-labs/dbt-core/issues/7937))
- add access to ModelNodeArgs for external node building ([#7890](https://github.com/dbt-labs/dbt-core/issues/7890))
- Inline query emit proper error message ([#7940](https://github.com/dbt-labs/dbt-core/issues/7940))
- Fix typo in ModelNodeArgs ([#7991](https://github.com/dbt-labs/dbt-core/issues/7991))
- Allow on_schema_change = fail for contracted incremental models ([#7975](https://github.com/dbt-labs/dbt-core/issues/7975))
- Nicer error message if model with enforced contract is missing 'columns' specification ([#7943](https://github.com/dbt-labs/dbt-core/issues/7943))
- include 'v' in ModelNodeArgs.unique_id ([#8039](https://github.com/dbt-labs/dbt-core/issues/8039))
- Fix fail-fast behavior (including retry) ([#7785](https://github.com/dbt-labs/dbt-core/issues/7785))
- Remove `create_metric` as a `SemanticModel.Measure` property because it currently doesn't do anything ([#8064](https://github.com/dbt-labs/dbt-core/issues/8064))
- Remove `VOLUME` declaration within Dockerfile ([#4784](https://github.com/dbt-labs/dbt-core/issues/4784))
- Fix Dockerfile.test ([#7352](https://github.com/dbt-labs/dbt-core/issues/7352))
- Detect breaking contract changes to versioned models ([#8030](https://github.com/dbt-labs/dbt-core/issues/8030))
- Update DryRunMethod test classes ValidateSqlMethod naming ([#7839](https://github.com/dbt-labs/dbt-core/issues/7839))
- Fix typo in `NonAdditiveDimension` implementation ([#8088](https://github.com/dbt-labs/dbt-core/issues/8088))
- Copy target_schema from config into snapshot node ([#6745](https://github.com/dbt-labs/dbt-core/issues/6745))
- Enable converting deprecation warnings to errors ([#8130](https://github.com/dbt-labs/dbt-core/issues/8130))
- Ensure `warn_error_options` get serialized in `invocation_args_dict` ([#7694](https://github.com/dbt-labs/dbt-core/issues/7694))
- Stop detecting materialization macros based on macro name ([#6231](https://github.com/dbt-labs/dbt-core/issues/6231))
- Improve handling of CTE injection with ephemeral models ([#8213](https://github.com/dbt-labs/dbt-core/issues/8213))
- Fix unbound local variable error in `checked_agg_time_dimension_for_measure` ([#8230](https://github.com/dbt-labs/dbt-core/issues/8230))
- Ensure runtime errors are raised for graph runnable tasks (compile, show, run, etc) ([#8166](https://github.com/dbt-labs/dbt-core/issues/8166))
### Docs
- add note before running integration tests ([dbt-docs/#nothing](https://github.com/dbt-labs/dbt-docs/issues/nothing))
- Fix for column tests not rendering on quoted columns ([dbt-docs/#201](https://github.com/dbt-labs/dbt-docs/issues/201))
- Fix broken links in `CONTRIBUTING.md`. ([dbt-docs/#8018](https://github.com/dbt-labs/dbt-docs/issues/8018))
- Remove static SQL codeblock for metrics ([dbt-docs/#436](https://github.com/dbt-labs/dbt-docs/issues/436))
### Under the Hood
- Update docs link in ContractBreakingChangeError message ([#7366](https://github.com/dbt-labs/dbt-core/issues/7366))
- Reduce memory footprint of cached statement results. ([#7281](https://github.com/dbt-labs/dbt-core/issues/7281))
- Remove noisy parsing events: GenericTestFileParse, MacroFileParse, Note events for static model parsing ([#6671](https://github.com/dbt-labs/dbt-core/issues/6671))
- Update --help text for cache-related parameters ([#7381](https://github.com/dbt-labs/dbt-core/issues/7381))
- Small UX improvements to model versions: Support defining latest_version in unsuffixed file by default. Notify on unpinned ref when a prerelease version is available. ([#7443](https://github.com/dbt-labs/dbt-core/issues/7443))
- Add ability to instantiate Flags class from dict ([#7607](https://github.com/dbt-labs/dbt-core/issues/7607))
- Add other relation to reffable nodes ([#7550](https://github.com/dbt-labs/dbt-core/issues/7550))
- Move node patch method to schema parser patch_node_properties and refactor schema parsing ([#7430](https://github.com/dbt-labs/dbt-core/issues/7430))
- Remove legacy file logger code ([#NA](https://github.com/dbt-labs/dbt-core/issues/NA))
- Break up integration tests as a short term fix for Windows CI runs ([#7668](https://github.com/dbt-labs/dbt-core/issues/7668))
- Include null checks in utils test base ([#7670](https://github.com/dbt-labs/dbt-core/issues/7670))
- Write pub artifact to log ([#7372](https://github.com/dbt-labs/dbt-core/issues/7372))
- Fix flaky test for --fail-fast ([#7744](https://github.com/dbt-labs/dbt-core/issues/7744))
- Create `add_from_artifact` to populate `state_relation` field of nodes ([#7551](https://github.com/dbt-labs/dbt-core/issues/7551))
- Replace space with underscore in NodeType strings ([#7841](https://github.com/dbt-labs/dbt-core/issues/7841))
- Upgrade to dbt-semantic-interfaces v0.1.0dev5 ([#7853](https://github.com/dbt-labs/dbt-core/issues/7853))
- Refactoring: consolidating public_nodes and nodes ([#7890](https://github.com/dbt-labs/dbt-core/issues/7890))
- Resolve SemanticModel ref is the same way as other refs ([#7822](https://github.com/dbt-labs/dbt-core/issues/7822))
- Move from dbt-semantic-intefaces 0.1.0dev5 to 0.1.0dev7 ([#7898](https://github.com/dbt-labs/dbt-core/issues/7898))
- Don't jinja render packages from dependencies.yml ([#7905](https://github.com/dbt-labs/dbt-core/issues/7905))
- Update mashumaro to 3.8.1 ([#7950](https://github.com/dbt-labs/dbt-core/issues/7950))
- Refactor: entry point for cross-project ref ([#7954](https://github.com/dbt-labs/dbt-core/issues/7954))
- Populate metric input measures ([#7884](https://github.com/dbt-labs/dbt-core/issues/7884))
- Add option to specify partial parse file ([#7911](https://github.com/dbt-labs/dbt-core/issues/7911))
- Add semantic_models to resource counts ([#8077](https://github.com/dbt-labs/dbt-core/issues/8077))
- A way to control maxBytes for a single dbt.log file ([#8199](https://github.com/dbt-labs/dbt-core/issues/8199))
### Dependencies
- Bump mypy from 0.981 to 1.0.1 ([#7027](https://github.com/dbt-labs/dbt-core/pull/7027))
- Bump ubuntu from 23.04 to 23.10 ([#7675](https://github.com/dbt-labs/dbt-core/pull/7675))
- ([#7681](https://github.com/dbt-labs/dbt-core/pull/7681))
- Pin click>=8.1.1,<8.1.4 ([#8050](https://github.com/dbt-labs/dbt-core/pull/8050))
- Bump `dbt-semantic-interfaces` to `~=0.1.0rc1` ([#8082](https://github.com/dbt-labs/dbt-core/pull/8082))
- Update pin for click<9 ([#8232](https://github.com/dbt-labs/dbt-core/pull/8232))
- Add upper bound to sqlparse pin of <0.5 ([#8236](https://github.com/dbt-labs/dbt-core/pull/8236))
- Support dbt-semantic-interfaces 0.2.0 ([#8250](https://github.com/dbt-labs/dbt-core/pull/8250))
### Contributors
- [@AndyBys](https://github.com/AndyBys) ([#6980](https://github.com/dbt-labs/dbt-core/issues/6980))
- [@NiallRees](https://github.com/NiallRees) ([#6051](https://github.com/dbt-labs/dbt-core/issues/6051), [#7941](https://github.com/dbt-labs/dbt-core/issues/7941))
- [@alexrosenfeld10](https://github.com/alexrosenfeld10) ([#4784](https://github.com/dbt-labs/dbt-core/issues/4784))
- [@b-luu](https://github.com/b-luu) ([#7289](https://github.com/dbt-labs/dbt-core/issues/7289))
- [@d-kaneshiro](https://github.com/d-kaneshiro) ([#7664](https://github.com/dbt-labs/dbt-core/issues/7664), [#nothing](https://github.com/dbt-labs/dbt-core/issues/nothing))
- [@damian3031](https://github.com/damian3031) ([#7845](https://github.com/dbt-labs/dbt-core/issues/7845))
- [@dave-connors-3](https://github.com/dave-connors-3) ([#7738](https://github.com/dbt-labs/dbt-core/issues/7738), [#7915](https://github.com/dbt-labs/dbt-core/issues/7915))
- [@dradnan89@hotmail.com](https://github.com/dradnan89@hotmail.com) ([#7681](https://github.com/dbt-labs/dbt-core/pull/7681))
- [@drewbanin](https://github.com/drewbanin) ([#201](https://github.com/dbt-labs/dbt-core/issues/201))
- [@dwreeves](https://github.com/dwreeves) ([#7418](https://github.com/dbt-labs/dbt-core/issues/7418), [#7646](https://github.com/dbt-labs/dbt-core/issues/7646))
- [@gem7318](https://github.com/gem7318) ([#8018](https://github.com/dbt-labs/dbt-core/issues/8018))
- [@iknox-fa](https://github.com/iknox-fa) ([#7302](https://github.com/dbt-labs/dbt-core/issues/7302), [#7491](https://github.com/dbt-labs/dbt-core/issues/7491), [#7281](https://github.com/dbt-labs/dbt-core/issues/7281), [#NA](https://github.com/dbt-labs/dbt-core/issues/NA))
- [@marcodamore](https://github.com/marcodamore) ([#436](https://github.com/dbt-labs/dbt-core/issues/436))
- [@mirnawong1](https://github.com/mirnawong1) ([#7789](https://github.com/dbt-labs/dbt-core/issues/7789))
- [@quazi-irfan](https://github.com/quazi-irfan) ([#7533](https://github.com/dbt-labs/dbt-core/issues/7533))
- [@rainermensing](https://github.com/rainermensing) ([#1880](https://github.com/dbt-labs/dbt-core/issues/1880))
- [@sdebruyn](https://github.com/sdebruyn) ([#7082](https://github.com/dbt-labs/dbt-core/issues/7082), [#7670](https://github.com/dbt-labs/dbt-core/issues/7670))
- [@stu-k](https://github.com/stu-k) ([#7299](https://github.com/dbt-labs/dbt-core/issues/7299), [#5476](https://github.com/dbt-labs/dbt-core/issues/5476), [#7607](https://github.com/dbt-labs/dbt-core/issues/7607), [#7550](https://github.com/dbt-labs/dbt-core/issues/7550), [#7551](https://github.com/dbt-labs/dbt-core/issues/7551))
- [@thomasgjerdekog](https://github.com/thomasgjerdekog) ([#7517](https://github.com/dbt-labs/dbt-core/issues/7517))
- [@tlento](https://github.com/tlento) ([#7839](https://github.com/dbt-labs/dbt-core/issues/7839), [#7839](https://github.com/dbt-labs/dbt-core/issues/7839))
- [@trouze](https://github.com/trouze) ([#7564](https://github.com/dbt-labs/dbt-core/issues/7564))
- [@willbryant](https://github.com/willbryant) ([#7350](https://github.com/dbt-labs/dbt-core/issues/7350))
## Previous Releases
For information on prior major and minor releases, see their changelogs:
* [1.6](https://github.com/dbt-labs/dbt-core/blob/1.6.latest/CHANGELOG.md)
* [1.5](https://github.com/dbt-labs/dbt-core/blob/1.5.latest/CHANGELOG.md)
* [1.4](https://github.com/dbt-labs/dbt-core/blob/1.4.latest/CHANGELOG.md)
* [1.3](https://github.com/dbt-labs/dbt-core/blob/1.3.latest/CHANGELOG.md)

codecov.yml (new file)

@@ -61,6 +61,7 @@ def args_to_context(args: List[str]) -> Context:
if len(args) == 1 and "," in args[0]:
args = args[0].split(",")
sub_command_name, sub_command, args = cli.resolve_command(cli_ctx, args)
# Handle source and docs group.
if isinstance(sub_command, Group):
sub_command_name, sub_command, args = sub_command.resolve_command(cli_ctx, args)
@@ -318,6 +319,7 @@ def command_params(command: CliCommand, args_dict: Dict[str, Any]) -> CommandPar
for k, v in args_dict.items():
k = k.lower()
# if a "which" value exists in the args dict, it should match the command provided
if k == WHICH_KEY:
if v != command.value:
@@ -342,8 +344,7 @@ def command_params(command: CliCommand, args_dict: Dict[str, Any]) -> CommandPar
if k == "macro" and command == CliCommand.RUN_OPERATION:
add_fn(v)
# None is a Singleton, False is a Flyweight, only one instance of each.
elif v is None or v is False:
elif v in (None, False):
add_fn(f"--no-{spinal_cased}")
elif v is True:
add_fn(f"--{spinal_cased}")

View File


@@ -132,7 +132,6 @@ class dbtRunner:
@p.enable_legacy_logger
@p.fail_fast
@p.log_cache_events
@p.log_file_max_bytes
@p.log_format
@p.log_format_file
@p.log_level
@@ -141,7 +140,6 @@ class dbtRunner:
@p.macro_debugging
@p.partial_parse
@p.partial_parse_file_path
@p.partial_parse_file_diff
@p.populate_cache
@p.print
@p.printer_width
@@ -330,7 +328,6 @@ def docs_serve(ctx, **kwargs):
@p.state
@p.defer_state
@p.deprecated_state
@p.compile_inject_ephemeral_ctes
@p.target
@p.target_path
@p.threads

View File


@@ -40,14 +40,6 @@ compile_docs = click.option(
default=True,
)
compile_inject_ephemeral_ctes = click.option(
"--inject-ephemeral-ctes/--no-inject-ephemeral-ctes",
envvar=None,
help="Internal flag controlling injection of referenced ephemeral models' CTEs during `compile`.",
hidden=True,
default=True,
)
config_dir = click.option(
"--config-dir",
envvar=None,
@@ -179,15 +171,6 @@ use_colors_file = click.option(
default=True,
)
log_file_max_bytes = click.option(
"--log-file-max-bytes",
envvar="DBT_LOG_FILE_MAX_BYTES",
help="Configure the max file size in bytes for a single dbt.log file, before rolling over. 0 means no limit.",
default=10 * 1024 * 1024, # 10mb
type=click.INT,
hidden=True,
)
log_path = click.option(
"--log-path",
envvar="DBT_LOG_PATH",
@@ -265,14 +248,6 @@ partial_parse_file_path = click.option(
type=click.Path(exists=True, dir_okay=False, resolve_path=True),
)
partial_parse_file_diff = click.option(
"--partial-parse-file-diff/--no-partial-parse-file-diff",
envvar="DBT_PARTIAL_PARSE_FILE_DIFF",
help="Internal flag for whether to compute a file diff during partial parsing.",
hidden=True,
default=True,
)
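The options being removed above use click's paired `--flag/--no-flag` form with an env-var fallback for the default. As a rough stdlib analogue of that pattern (argparse with `BooleanOptionalAction`, Python ≥ 3.9; names and the env-var convention are illustrative, not click's implementation):

```python
import argparse
import os

parser = argparse.ArgumentParser()
# Paired on/off switch, like click's "--x/--no-x"; the default falls
# back to an environment variable, similar to click's `envvar=`.
parser.add_argument(
    "--partial-parse-file-diff",
    action=argparse.BooleanOptionalAction,
    default=os.environ.get("DBT_PARTIAL_PARSE_FILE_DIFF", "1") != "0",
)

args = parser.parse_args(["--no-partial-parse-file-diff"])
print(args.partial_parse_file_diff)  # False
```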
populate_cache = click.option(
"--populate-cache/--no-populate-cache",
envvar="DBT_POPULATE_CACHE",
@@ -405,9 +380,9 @@ inline = click.option(
# Most CLI arguments should use the combined `select` option that aliases `--models` to `--select`.
# However, if you need to split out these separators (like `dbt ls`), use the `models` and `raw_select` options instead.
# See https://github.com/dbt-labs/dbt-core/pull/6774#issuecomment-1408476095 for more info.
models = click.option(*model_decls, **select_attrs) # type: ignore[arg-type]
raw_select = click.option(*select_decls, **select_attrs) # type: ignore[arg-type]
select = click.option(*select_decls, *model_decls, **select_attrs) # type: ignore[arg-type]
models = click.option(*model_decls, **select_attrs)
raw_select = click.option(*select_decls, **select_attrs)
select = click.option(*select_decls, *model_decls, **select_attrs)
selector = click.option(
"--selector",

View File

@@ -9,23 +9,10 @@ from typing import Iterable, List, Dict, Union, Optional, Any
from dbt.exceptions import DbtRuntimeError
BOM = BOM_UTF8.decode("utf-8") # '\ufeff'
class Integer(agate.data_types.DataType):
def cast(self, d):
# by default agate will cast none as a Number
# but we need to cast it as an Integer to preserve
# the type when merging and unioning tables
if type(d) == int or d is None:
return d
else:
raise agate.exceptions.CastError('Can not parse value "%s" as Integer.' % d)
def jsonify(self, d):
return d
class Number(agate.data_types.Number):
# undo the change in https://github.com/wireservice/agate/pull/733
# i.e. do not cast True and False to numeric 1 and 0
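The removed `Integer.cast` above follows a narrow contract. A standalone sketch of that contract with the agate base class stripped out (`cast_integer` is an illustrative name, not dbt's):

```python
# Plain ints and None pass through unchanged; everything else is
# rejected. Because type() is compared exactly, bools and floats
# also raise, mirroring the Integer.cast behavior above.
def cast_integer(d):
    if type(d) == int or d is None:
        return d
    raise ValueError('Can not parse value "%s" as Integer.' % d)

print(cast_integer(3))     # 3
print(cast_integer(None))  # None
```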
@@ -61,7 +48,6 @@ def build_type_tester(
) -> agate.TypeTester:
types = [
Integer(null_values=("null", "")),
Number(null_values=("null", "")),
agate.data_types.Date(null_values=("null", ""), date_format="%Y-%m-%d"),
agate.data_types.DateTime(null_values=("null", ""), datetime_format="%Y-%m-%d %H:%M:%S"),
@@ -180,13 +166,6 @@ class ColumnTypeBuilder(Dict[str, NullableAgateType]):
elif isinstance(value, _NullMarker):
# use the existing value
return
# when one table column is Number while another is Integer, force the column to Number on merge
elif isinstance(value, Integer) and isinstance(existing_type, agate.data_types.Number):
# use the existing value
return
elif isinstance(existing_type, Integer) and isinstance(value, agate.data_types.Number):
# overwrite
super().__setitem__(key, value)
elif not isinstance(value, type(existing_type)):
# actual type mismatch!
raise DbtRuntimeError(
@@ -198,9 +177,8 @@ class ColumnTypeBuilder(Dict[str, NullableAgateType]):
result: Dict[str, agate.data_types.DataType] = {}
for key, value in self.items():
if isinstance(value, _NullMarker):
# agate would make it a Number but we'll make it Integer so that if this column
# gets merged with another Integer column, it won't get forced to a Number
result[key] = Integer()
# this is what agate would do.
result[key] = agate.data_types.Number()
else:
result[key] = value
return result

View File

@@ -4,6 +4,7 @@ import json
import networkx as nx # type: ignore
import os
import pickle
import sqlparse
from collections import defaultdict
from typing import List, Dict, Any, Tuple, Optional
@@ -35,7 +36,6 @@ from dbt.node_types import NodeType, ModelLanguage
from dbt.events.format import pluralize
import dbt.tracking
import dbt.task.list as list_task
import sqlparse
graph_file_name = "graph.gpickle"
@@ -320,10 +320,6 @@ class Compiler:
if model.compiled_code is None:
raise DbtRuntimeError("Cannot inject ctes into an uncompiled node", model)
# tech debt: safe flag/arg access (#6259)
if not getattr(self.config.args, "inject_ephemeral_ctes", True):
return (model, [])
# extra_ctes_injected flag says that we've already recursively injected the ctes
if model.extra_ctes_injected:
return (model, model.extra_ctes)
@@ -382,16 +378,16 @@ class Compiler:
_add_prepended_cte(prepended_ctes, InjectedCTE(id=cte.id, sql=sql))
injected_sql = inject_ctes_into_sql(
model.compiled_code,
prepended_ctes,
)
# Check again before updating for multi-threading
if not model.extra_ctes_injected:
injected_sql = inject_ctes_into_sql(
model.compiled_code,
prepended_ctes,
)
model.extra_ctes_injected = True
model._pre_injected_sql = model.compiled_code
model.compiled_code = injected_sql
model.extra_ctes = prepended_ctes
model.extra_ctes_injected = True
# if model.extra_ctes is not set to prepended ctes, something went wrong
return model, model.extra_ctes
@@ -527,12 +523,6 @@ class Compiler:
the node's raw_code into compiled_code, and then calls the
recursive method to "prepend" the ctes.
"""
# Make sure Lexer for sqlparse 0.4.4 is initialized
from sqlparse.lexer import Lexer # type: ignore
if hasattr(Lexer, "get_default_instance"):
Lexer.get_default_instance()
node = self._compile_code(node, manifest, extra_context)
node, _ = self._recursively_prepend_ctes(node, manifest, extra_context)
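The `hasattr` check above is plain feature detection: call an initializer only on library versions that expose it. A minimal sketch with stand-in classes (sqlparse itself is not imported, and both class names are made up):

```python
class OldLexer:
    """Stand-in for a sqlparse version without get_default_instance."""


class NewLexer:
    """Stand-in for sqlparse >= 0.4.4."""

    _default = None

    @classmethod
    def get_default_instance(cls):
        # Lazily build and cache a shared instance.
        if cls._default is None:
            cls._default = cls()
        return cls._default


for lexer_cls in (OldLexer, NewLexer):
    if hasattr(lexer_cls, "get_default_instance"):
        lexer_cls.get_default_instance()  # only runs for NewLexer
```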

View File

@@ -428,7 +428,6 @@ class PartialProject(RenderComponents):
metrics: Dict[str, Any]
exposures: Dict[str, Any]
vars_value: VarProvider
dbt_cloud: Dict[str, Any]
dispatch = cfg.dispatch
models = cfg.models
@@ -460,8 +459,6 @@ class PartialProject(RenderComponents):
manifest_selectors = SelectorDict.parse_from_selectors_list(
rendered.selectors_dict["selectors"]
)
dbt_cloud = cfg.dbt_cloud
project = Project(
project_name=name,
version=version,
@@ -501,7 +498,6 @@ class PartialProject(RenderComponents):
unrendered=unrendered,
project_env_vars=project_env_vars,
restrict_access=cfg.restrict_access,
dbt_cloud=dbt_cloud,
)
# sanity check - this means an internal issue
project.validate()
@@ -613,7 +609,6 @@ class Project:
unrendered: RenderComponents
project_env_vars: Dict[str, Any]
restrict_access: bool
dbt_cloud: Dict[str, Any]
@property
def all_source_paths(self) -> List[str]:
@@ -683,7 +678,6 @@ class Project:
"require-dbt-version": [v.to_version_string() for v in self.dbt_version],
"config-version": self.config_version,
"restrict-access": self.restrict_access,
"dbt-cloud": self.dbt_cloud,
}
)
if self.query_comment:

View File

@@ -182,7 +182,6 @@ class RuntimeConfig(Project, Profile, AdapterRequiredConfig):
args=args,
cli_vars=cli_vars,
dependencies=dependencies,
dbt_cloud=project.dbt_cloud,
)
# Called by 'load_projects' in this class

View File

@@ -1510,15 +1510,7 @@ def get_manifest_schema_version(dct: dict) -> int:
schema_version = dct.get("metadata", {}).get("dbt_schema_version", None)
if not schema_version:
raise ValueError("Manifest doesn't have schema version")
# schema_version is in this format: https://schemas.getdbt.com/dbt/manifest/v10.json
# What the code below is doing:
# 1. Split on "/" -> v10.json
# 2. Split on "." -> v10
# 3. Skip first character -> 10
# 4. Convert to int
# TODO: If this gets more complicated, turn into a regex
return int(schema_version.split("/")[-1].split(".")[0][1:])
return int(schema_version.split(".")[-2][-1])
def _check_duplicates(value: BaseNode, src: Mapping[str, BaseNode]):
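The two return statements above parse the version number out of the schema URL differently. A quick sketch of the difference, using the example URL from the comment (this is a demonstration, not dbt code):

```python
url = "https://schemas.getdbt.com/dbt/manifest/v10.json"

# URL-based parsing: "v10.json" -> "v10" -> "10" -> 10
assert int(url.split("/")[-1].split(".")[0][1:]) == 10

# Dot-based parsing keeps only the last character of the second-to-last
# dot-separated piece, so a two-digit version like v10 collapses to 0.
assert int(url.split(".")[-2][-1]) == 0
```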

View File

@@ -619,8 +619,6 @@ class SnapshotConfig(EmptySnapshotConfig):
@classmethod
def validate(cls, data):
super().validate(data)
# Note: currently you can't just set these keys in schema.yml because this validation
# will fail when parsing the snapshot node.
if not data.get("strategy") or not data.get("unique_key") or not data.get("target_schema"):
raise ValidationError(
"Snapshots must be configured with a 'strategy', 'unique_key', "
@@ -651,7 +649,6 @@ class SnapshotConfig(EmptySnapshotConfig):
if data.get("materialized") and data.get("materialized") != "snapshot":
raise ValidationError("A snapshot must have a materialized value of 'snapshot'")
# Called by "calculate_node_config_dict" in ContextConfigGenerator
def finalize_and_validate(self):
data = self.to_dict(omit_none=True)
self.validate(data)

View File

@@ -29,11 +29,3 @@ class ModelNodeArgs:
unique_id = f"{unique_id}.v{self.version}"
return unique_id
@property
def fqn(self) -> List[str]:
fqn = [self.package_name, self.name]
if self.version:
fqn.append(f"v{self.version}")
return fqn

View File

@@ -50,7 +50,6 @@ from dbt.flags import get_flags
from dbt.node_types import ModelLanguage, NodeType, AccessType
from dbt_semantic_interfaces.call_parameter_sets import FilterCallParameterSets
from dbt_semantic_interfaces.references import (
EntityReference,
MeasureReference,
LinkableElementReference,
SemanticModelReference,
@@ -590,7 +589,7 @@ class ModelNode(CompiledNode):
name=args.name,
package_name=args.package_name,
unique_id=unique_id,
fqn=args.fqn,
fqn=[args.package_name, args.name],
version=args.version,
latest_version=args.latest_version,
relation_name=args.relation_name,
@@ -626,18 +625,6 @@ class ModelNode(CompiledNode):
def materialization_enforces_constraints(self) -> bool:
return self.config.materialized in ["table", "incremental"]
def same_contents(self, old, adapter_type) -> bool:
return super().same_contents(old, adapter_type) and self.same_ref_representation(old)
def same_ref_representation(self, old) -> bool:
return (
# Changing the latest_version may break downstream unpinned refs
self.latest_version == old.latest_version
# Changes to access or deprecation_date may lead to ref-related parsing errors
and self.access == old.access
and self.deprecation_date == old.deprecation_date
)
def build_contract_checksum(self):
# We don't need to construct the checksum if the model does not
# have contract enforced, because it won't be used.
@@ -1511,7 +1498,6 @@ class SemanticModel(GraphNode):
refs: List[RefArgs] = field(default_factory=list)
created_at: float = field(default_factory=lambda: time.time())
config: SemanticModelConfig = field(default_factory=SemanticModelConfig)
primary_entity: Optional[str] = None
@property
def entity_references(self) -> List[LinkableElementReference]:
@@ -1582,26 +1568,17 @@ class SemanticModel(GraphNode):
measure is not None
), f"No measure with name ({measure_reference.element_name}) in semantic_model with name ({self.name})"
default_agg_time_dimension = (
self.defaults.agg_time_dimension if self.defaults is not None else None
)
if self.defaults is not None:
default_agg_time_dimesion = self.defaults.agg_time_dimension
agg_time_dimension_name = measure.agg_time_dimension or default_agg_time_dimension
agg_time_dimension_name = measure.agg_time_dimension or default_agg_time_dimesion
assert agg_time_dimension_name is not None, (
f"Aggregation time dimension for measure {measure.name} on semantic model {self.name} is not set! "
"To fix this either specify a default `agg_time_dimension` for the semantic model or define an "
"`agg_time_dimension` on the measure directly."
f"Aggregation time dimension for measure {measure.name} is not set! This should either be set directly on "
f"the measure specification in the model, or else defaulted to the primary time dimension in the data "
f"source containing the measure."
)
return TimeDimensionReference(element_name=agg_time_dimension_name)
@property
def primary_entity_reference(self) -> Optional[EntityReference]:
return (
EntityReference(element_name=self.primary_entity)
if self.primary_entity is not None
else None
)
# ====================================
# Patches

View File

@@ -163,9 +163,14 @@ class UnparsedVersion(dbtClassMixin):
def __lt__(self, other):
try:
return float(self.v) < float(other.v)
v = type(other.v)(self.v)
return v < other.v
except ValueError:
return str(self.v) < str(other.v)
try:
other_v = type(self.v)(other.v)
return self.v < other_v
except ValueError:
return str(self.v) < str(other.v)
@property
def include_exclude(self) -> dbt.helper_types.IncludeExclude:
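The `__lt__` variants above both try a numeric comparison before falling back. The float-based ordering can be sketched as a standalone function (`version_lt` is a made-up name):

```python
def version_lt(a, b) -> bool:
    # Compare numerically when both versions parse as floats...
    try:
        return float(a) < float(b)
    except ValueError:
        # ...otherwise fall back to lexicographic string ordering.
        return str(a) < str(b)


print(version_lt("2", "10"))    # True: 2.0 < 10.0 numerically
print(version_lt("2", "beta"))  # True: "2" < "beta" as strings
```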
@@ -684,7 +689,7 @@ class UnparsedEntity(dbtClassMixin):
class UnparsedNonAdditiveDimension(dbtClassMixin):
name: str
window_choice: str # AggregationType enum
window_groupings: List[str] = field(default_factory=list)
window_groupings: List[str]
@dataclass
@@ -723,7 +728,6 @@ class UnparsedSemanticModel(dbtClassMixin):
entities: List[UnparsedEntity] = field(default_factory=list)
measures: List[UnparsedMeasure] = field(default_factory=list)
dimensions: List[UnparsedDimension] = field(default_factory=list)
primary_entity: Optional[str] = None
def normalize_date(d: Optional[datetime.date]) -> Optional[datetime.datetime]:

View File

@@ -224,7 +224,6 @@ class Project(HyphenatedDbtClassMixin, Replaceable):
packages: List[PackageSpec] = field(default_factory=list)
query_comment: Optional[Union[QueryComment, NoValue, str]] = field(default_factory=NoValue)
restrict_access: bool = False
dbt_cloud: Optional[Dict[str, Any]] = None
@classmethod
def validate(cls, data):
@@ -241,10 +240,6 @@ class Project(HyphenatedDbtClassMixin, Replaceable):
or not isinstance(entry["search_order"], list)
):
raise ValidationError(f"Invalid project dispatch config: {entry}")
if "dbt_cloud" in data and not isinstance(data["dbt_cloud"], dict):
raise ValidationError(
f"Invalid dbt_cloud config. Expected a 'dict' but got '{type(data['dbt_cloud'])}'"
)
@dataclass

View File

@@ -51,15 +51,19 @@ class LocalPinnedPackage(LocalPackageMixin, PinnedPackage):
src_path = self.resolve_path(project)
dest_path = self.get_installation_path(project, renderer)
can_create_symlink = system.supports_symlinks()
if system.path_exists(dest_path):
if not system.path_is_symlink(dest_path):
system.rmdir(dest_path)
else:
system.remove_file(dest_path)
try:
if can_create_symlink:
fire_event(DepsCreatingLocalSymlink())
system.make_symlink(src_path, dest_path)
except OSError:
else:
fire_event(DepsSymlinkNotAvailable())
shutil.copytree(src_path, dest_path)
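The install path above prefers a symlink and copies only when symlinks fail. A self-contained sketch of that fallback using temporary directories (the paths and file name are examples, not dbt's install layout):

```python
import os
import shutil
import tempfile

src = tempfile.mkdtemp()
open(os.path.join(src, "dbt_project.yml"), "w").close()
dest = os.path.join(tempfile.mkdtemp(), "pkg")

try:
    # Prefer a symlink so edits to the local package show up immediately.
    os.symlink(src, dest)
except OSError:
    # e.g. Windows without symlink privileges: fall back to a real copy.
    shutil.copytree(src, dest)

print(os.path.exists(os.path.join(dest, "dbt_project.yml")))  # True
```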

View File

@@ -8,12 +8,12 @@ import logging
from logging.handlers import RotatingFileHandler
import threading
import traceback
from typing import Any, Callable, List, Optional, TextIO, Protocol
from typing import Any, Callable, List, Optional, TextIO
from uuid import uuid4
from dbt.events.format import timestamp_to_datetime_string
from dbt.events.base_types import BaseEvent, EventLevel, msg_from_base_event, EventMsg
import dbt.utils
# A Filter is a function which takes a BaseEvent and returns True if the event
# should be logged, False otherwise.
@@ -80,7 +80,6 @@ class LoggerConfig:
use_colors: bool = False
output_stream: Optional[TextIO] = None
output_file_name: Optional[str] = None
output_file_max_bytes: Optional[int] = 10 * 1024 * 1024 # 10 mb
logger: Optional[Any] = None
@@ -101,7 +100,7 @@ class _Logger:
file_handler = RotatingFileHandler(
filename=str(config.output_file_name),
encoding="utf8",
maxBytes=config.output_file_max_bytes, # type: ignore
maxBytes=10 * 1024 * 1024, # 10 mb
backupCount=5,
)
self._python_logger = self._get_python_log_for_handler(file_handler)
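The handler above rotates the debug log at a size cap. A minimal configuration sketch with the same `maxBytes`/`backupCount` shape (log path and logger name are illustrative):

```python
import logging
import os
import tempfile
from logging.handlers import RotatingFileHandler

log_path = os.path.join(tempfile.mkdtemp(), "dbt.log")
handler = RotatingFileHandler(
    filename=log_path,
    encoding="utf8",
    maxBytes=10 * 1024 * 1024,  # roll over after ~10 MB
    backupCount=5,              # keep dbt.log.1 through dbt.log.5
)
logger = logging.getLogger("rotating_sketch")
logger.addHandler(handler)
logger.warning("hello from the sketch")  # WARNING passes the default level
handler.flush()
```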
@@ -176,7 +175,7 @@ class _JsonLogger(_Logger):
from dbt.events.functions import msg_to_dict
msg_dict = msg_to_dict(msg)
raw_log_line = json.dumps(msg_dict, sort_keys=True, cls=dbt.utils.ForgivingJSONEncoder)
raw_log_line = json.dumps(msg_dict, sort_keys=True)
line = self.scrubber(raw_log_line) # type: ignore
return line
@@ -206,7 +205,7 @@ class EventManager:
for callback in self.callbacks:
callback(msg)
def add_logger(self, config: LoggerConfig) -> None:
def add_logger(self, config: LoggerConfig):
logger = (
_JsonLogger(self, config)
if config.line_format == LineFormat.Json
@@ -218,25 +217,3 @@ class EventManager:
def flush(self):
for logger in self.loggers:
logger.flush()
class IEventManager(Protocol):
callbacks: List[Callable[[EventMsg], None]]
invocation_id: str
def fire_event(self, e: BaseEvent, level: Optional[EventLevel] = None) -> None:
...
def add_logger(self, config: LoggerConfig) -> None:
...
class TestEventManager(IEventManager):
def __init__(self):
self.event_history = []
def fire_event(self, e: BaseEvent, level: Optional[EventLevel] = None) -> None:
self.event_history.append((e, level))
def add_logger(self, config: LoggerConfig) -> None:
raise NotImplementedError()
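`IEventManager` above is a `typing.Protocol`, so `TestEventManager` satisfies it by shape alone rather than by subclassing `EventManager`. A reduced sketch of that structural-typing pattern (class names here are illustrative):

```python
from typing import Any, List, Optional, Protocol, Tuple


class IEventManager(Protocol):
    def fire_event(self, e: Any, level: Optional[str] = None) -> None:
        ...


class RecordingEventManager:
    """Satisfies IEventManager by shape alone; no inheritance needed."""

    def __init__(self) -> None:
        self.event_history: List[Tuple[Any, Optional[str]]] = []

    def fire_event(self, e: Any, level: Optional[str] = None) -> None:
        self.event_history.append((e, level))


mgr: IEventManager = RecordingEventManager()
mgr.fire_event("NodeStart")
print(mgr.event_history)  # [('NodeStart', None)]
```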

View File

@@ -1,6 +1,6 @@
from dbt.constants import METADATA_ENV_PREFIX
from dbt.events.base_types import BaseEvent, EventLevel, EventMsg
from dbt.events.eventmgr import EventManager, LoggerConfig, LineFormat, NoFilter, IEventManager
from dbt.events.eventmgr import EventManager, LoggerConfig, LineFormat, NoFilter
from dbt.events.helpers import env_secrets, scrub_secrets
from dbt.events.types import Formatting, Note
from dbt.flags import get_flags, ENABLE_LEGACY_LOGGER
@@ -13,7 +13,6 @@ from typing import Callable, Dict, List, Optional, TextIO
import uuid
from google.protobuf.json_format import MessageToDict
import dbt.utils
LOG_VERSION = 3
metadata_vars: Optional[Dict[str, str]] = None
@@ -68,11 +67,7 @@ def setup_event_logger(flags, callbacks: List[Callable[[EventMsg], None]] = [])
log_level_file = EventLevel.DEBUG if flags.DEBUG else EventLevel(flags.LOG_LEVEL_FILE)
EVENT_MANAGER.add_logger(
_get_logfile_config(
log_file,
flags.USE_COLORS_FILE,
log_file_format,
log_level_file,
flags.LOG_FILE_MAX_BYTES,
log_file, flags.USE_COLORS_FILE, log_file_format, log_level_file
)
)
@@ -121,11 +116,7 @@ def _stdout_filter(
def _get_logfile_config(
log_path: str,
use_colors: bool,
line_format: LineFormat,
level: EventLevel,
log_file_max_bytes: int,
log_path: str, use_colors: bool, line_format: LineFormat, level: EventLevel
) -> LoggerConfig:
return LoggerConfig(
name="file_log",
@@ -135,7 +126,6 @@ def _get_logfile_config(
scrubber=env_scrubber,
filter=partial(_logfile_filter, bool(get_flags().LOG_CACHE_EVENTS), line_format),
output_file_name=log_path,
output_file_max_bytes=log_file_max_bytes,
)
@@ -182,7 +172,7 @@ def cleanup_event_logger():
# Since dbt-rpc does not do its own log setup, and since some events can
# currently fire before logs can be configured by setup_event_logger(), we
# create a default configuration with default settings and no file output.
EVENT_MANAGER: IEventManager = EventManager()
EVENT_MANAGER: EventManager = EventManager()
EVENT_MANAGER.add_logger(
_get_logbook_log_config(False, True, False, False) # type: ignore
if ENABLE_LEGACY_LOGGER
@@ -210,7 +200,7 @@ def stop_capture_stdout_logs():
# the message may contain secrets which must be scrubbed at the usage site.
def msg_to_json(msg: EventMsg) -> str:
msg_dict = msg_to_dict(msg)
raw_log_line = json.dumps(msg_dict, sort_keys=True, cls=dbt.utils.ForgivingJSONEncoder)
raw_log_line = json.dumps(msg_dict, sort_keys=True)
return raw_log_line
@@ -295,8 +285,3 @@ def set_invocation_id() -> None:
# This is primarily for setting the invocation_id for separate
# commands in the dbt servers. It shouldn't be necessary for the CLI.
EVENT_MANAGER.invocation_id = str(uuid.uuid4())
def ctx_set_event_manager(event_manager: IEventManager):
global EVENT_MANAGER
EVENT_MANAGER = event_manager

View File

@@ -1650,7 +1650,6 @@ message LogSnapshotResult {
int32 total = 5;
float execution_time = 6;
map<string, string> cfg = 7;
string result_message = 8;
}
message LogSnapshotResultMsg {
@@ -2246,7 +2245,25 @@ message CheckNodeTestFailureMsg {
CheckNodeTestFailure data = 2;
}
// Skipped Z028, Z029
// Z028
message FirstRunResultError {
string msg = 1;
}
message FirstRunResultErrorMsg {
EventInfo info = 1;
FirstRunResultError data = 2;
}
// Z029
message AfterFirstRunResultError {
string msg = 1;
}
message AfterFirstRunResultErrorMsg {
EventInfo info = 1;
AfterFirstRunResultError data = 2;
}
// Z030
message EndOfRunSummary {

View File

@@ -1614,7 +1614,7 @@ class LogSnapshotResult(DynamicLevel):
status = red(self.status.upper())
else:
info = "OK snapshotted"
status = green(self.result_message)
status = green(self.status)
msg = "{info} {description}".format(info=info, description=self.description, **self.cfg)
return format_fancy_output_line(
@@ -2171,7 +2171,25 @@ class CheckNodeTestFailure(InfoLevel):
return f" See test failures:\n {border}\n {msg}\n {border}"
# Skipped Z028, Z029
# FirstRunResultError and AfterFirstRunResultError are just splitting the message from the result
# object into multiple log lines
# TODO: is this really needed? See printer.py
class FirstRunResultError(ErrorLevel):
def code(self):
return "Z028"
def message(self) -> str:
return yellow(self.msg)
class AfterFirstRunResultError(ErrorLevel):
def code(self):
return "Z029"
def message(self) -> str:
return self.msg
class EndOfRunSummary(InfoLevel):

File diff suppressed because one or more lines are too long

View File

@@ -33,12 +33,7 @@
-- cleanup
{% if existing_relation is not none %}
/* Do the equivalent of rename_if_exists. 'existing_relation' could have been dropped
since the variable was first set. */
{% set existing_relation = load_cached_relation(existing_relation) %}
{% if existing_relation is not none %}
{{ adapter.rename_relation(existing_relation, backup_relation) }}
{% endif %}
{{ adapter.rename_relation(existing_relation, backup_relation) }}
{% endif %}
{{ adapter.rename_relation(intermediate_relation, target_relation) }}

View File

@@ -45,12 +45,7 @@
-- cleanup
-- move the existing view out of the way
{% if existing_relation is not none %}
/* Do the equivalent of rename_if_exists. 'existing_relation' could have been dropped
since the variable was first set. */
{% set existing_relation = load_cached_relation(existing_relation) %}
{% if existing_relation is not none %}
{{ adapter.rename_relation(existing_relation, backup_relation) }}
{% endif %}
{{ adapter.rename_relation(existing_relation, backup_relation) }}
{% endif %}
{{ adapter.rename_relation(intermediate_relation, target_relation) }}

File diff suppressed because one or more lines are too long

View File

@@ -4,8 +4,8 @@ from dbt.dataclass_schema import StrEnum
class AccessType(StrEnum):
Private = "private"
Protected = "protected"
Private = "private"
Public = "public"
@classmethod

View File

@@ -102,7 +102,8 @@ class RelationUpdate:
self.package_updaters = package_updaters
self.component = component
def __call__(self, parsed_node: Any, override: Optional[str]) -> None:
def __call__(self, parsed_node: Any, config_dict: Dict[str, Any]) -> None:
override = config_dict.get(self.component)
if parsed_node.package_name in self.package_updaters:
new_value = self.package_updaters[parsed_node.package_name](override, parsed_node)
else:
@@ -279,19 +280,9 @@ class ConfiguredParser(
def update_parsed_node_relation_names(
self, parsed_node: IntermediateNode, config_dict: Dict[str, Any]
) -> None:
# These call the RelationUpdate callable to go through generate_name macros
self._update_node_database(parsed_node, config_dict.get("database"))
self._update_node_schema(parsed_node, config_dict.get("schema"))
self._update_node_alias(parsed_node, config_dict.get("alias"))
# Snapshot nodes use special "target_database" and "target_schema" fields for some reason
if parsed_node.resource_type == NodeType.Snapshot:
if "target_database" in config_dict and config_dict["target_database"]:
parsed_node.database = config_dict["target_database"]
if "target_schema" in config_dict and config_dict["target_schema"]:
parsed_node.schema = config_dict["target_schema"]
self._update_node_database(parsed_node, config_dict)
self._update_node_schema(parsed_node, config_dict)
self._update_node_alias(parsed_node, config_dict)
self._update_node_relation_name(parsed_node)
def update_parsed_node_config(
@@ -358,7 +349,7 @@ class ConfiguredParser(
# do this once before we parse the node database/schema/alias, so
# parsed_node.config is what it would be if they did nothing
self.update_parsed_node_config_dict(parsed_node, config_dict)
# This updates the node database/schema/alias/relation_name
# This updates the node database/schema/alias
self.update_parsed_node_relation_names(parsed_node, config_dict)
# tests don't have hooks

View File

@@ -81,7 +81,7 @@ class MacroParser(BaseParser[Macro]):
name: str = macro.name.replace(MACRO_PREFIX, "")
node = self.parse_macro(block, base_node, name)
# get supported_languages for materialization macro
if block.block_type_name == "materialization":
if "materialization" in name:
node.supported_languages = jinja.get_supported_languages(macro)
yield node

View File

@@ -122,7 +122,7 @@ from dbt.parser.sources import SourcePatcher
from dbt.version import __version__
from dbt.dataclass_schema import StrEnum, dbtClassMixin
from dbt import plugins
from dbt.plugins import get_plugin_manager
from dbt_semantic_interfaces.enum_extension import assert_values_exhausted
from dbt_semantic_interfaces.type_enums import MetricType
@@ -284,17 +284,8 @@ class ManifestLoader:
adapter.clear_macro_manifest()
macro_hook = adapter.connections.set_query_header
flags = get_flags()
if not flags.PARTIAL_PARSE_FILE_DIFF:
file_diff = FileDiff.from_dict(
{
"deleted": [],
"changed": [],
"added": [],
}
)
# Hack to test file_diffs
elif os.environ.get("DBT_PP_FILE_DIFF_TEST"):
if os.environ.get("DBT_PP_FILE_DIFF_TEST"):
file_diff_path = "file_diff.json"
if path_exists(file_diff_path):
file_diff_dct = read_json(file_diff_path)
@@ -512,7 +503,6 @@ class ManifestLoader:
self.manifest.selectors = self.root_project.manifest_selectors
# inject any available external nodes
self.manifest.build_parent_and_child_maps()
external_nodes_modified = self.inject_external_nodes()
if external_nodes_modified:
self.manifest.rebuild_ref_lookup()
@@ -557,7 +547,7 @@ class ManifestLoader:
)
# parent and child maps will be rebuilt by write_manifest
if not skip_parsing or external_nodes_modified:
if not skip_parsing:
# write out the fully parsed manifest
self.write_manifest_for_partial_parse()
@@ -755,16 +745,13 @@ class ManifestLoader:
def inject_external_nodes(self) -> bool:
# Remove previously existing external nodes since we are regenerating them
manifest_nodes_modified = False
# Remove all dependent nodes before removing referencing nodes
for unique_id in self.manifest.external_node_unique_ids:
self.manifest.nodes.pop(unique_id)
remove_dependent_project_references(self.manifest, unique_id)
manifest_nodes_modified = True
for unique_id in self.manifest.external_node_unique_ids:
# remove external nodes from manifest only after dependent project references safely removed
self.manifest.nodes.pop(unique_id)
# Inject any newly-available external nodes
pm = plugins.get_plugin_manager(self.root_project.project_name)
pm = get_plugin_manager(self.root_project.project_name)
plugin_model_nodes = pm.get_nodes().models
for node_arg in plugin_model_nodes.values():
node = ModelNode.from_args(node_arg)

View File

@@ -233,7 +233,7 @@ class SchemaGenericTestParser(SimpleParser):
attached_node = None # type: Optional[Union[ManifestNode, GraphMemberNode]]
if not isinstance(target, UnpatchedSourceDefinition):
attached_node_unique_id = self.manifest.ref_lookup.get_unique_id(
target.name, target.package_name, version
target.name, None, version
)
if attached_node_unique_id:
attached_node = self.manifest.nodes[attached_node_unique_id]

View File

@@ -532,7 +532,6 @@ class SemanticModelParser(YamlReader):
measures=self._get_measures(unparsed.measures),
dimensions=self._get_dimensions(unparsed.dimensions),
defaults=unparsed.defaults,
primary_entity=unparsed.primary_entity,
)
ctx = generate_parse_semantic_models(

View File

@@ -693,7 +693,7 @@ class ModelPatchParser(NodePatchParser[UnparsedModelUpdate]):
)
# ref lookup without version - version is not set yet
versioned_model_unique_id = self.manifest.ref_lookup.get_unique_id(
versioned_model_name, target.package_name, None
versioned_model_name, None, None
)
versioned_model_node = None
@@ -702,7 +702,7 @@ class ModelPatchParser(NodePatchParser[UnparsedModelUpdate]):
# If this is the latest version, it's allowed to define itself in a model file name that doesn't have a suffix
if versioned_model_unique_id is None and unparsed_version.v == latest_version:
versioned_model_unique_id = self.manifest.ref_lookup.get_unique_id(
block.name, target.package_name, None
block.name, None, None
)
if versioned_model_unique_id is None:

View File

@@ -6,7 +6,6 @@ from dbt.contracts.graph.manifest import Manifest
from dbt.exceptions import DbtRuntimeError
from dbt.plugins.contracts import PluginArtifacts
from dbt.plugins.manifest import PluginNodes
import dbt.tracking
def dbt_hook(func):
@@ -30,11 +29,8 @@ class dbtPlugin:
self.project_name = project_name
try:
self.initialize()
except DbtRuntimeError as e:
# Remove the first line of DbtRuntimeError to avoid redundant "Runtime Error" line
raise DbtRuntimeError("\n".join(str(e).split("\n")[1:]))
except Exception as e:
raise DbtRuntimeError(str(e))
raise DbtRuntimeError(f"initialize: {e}")
@property
def name(self) -> str:
@@ -120,14 +116,5 @@ class PluginManager:
all_plugin_nodes = PluginNodes()
for hook_method in self.hooks.get("get_nodes", []):
plugin_nodes = hook_method()
dbt.tracking.track_plugin_get_nodes(
{
"plugin_name": hook_method.__self__.name, # type: ignore
"num_model_nodes": len(plugin_nodes.models),
"num_model_packages": len(
{model.package_name for model in plugin_nodes.models.values()}
),
}
)
all_plugin_nodes.update(plugin_nodes)
return all_plugin_nodes

View File

@@ -139,7 +139,6 @@ class CompileTask(GraphRunnableTask):
"node_path": "sql/inline_query",
"node_name": "inline_query",
"unique_id": "sqloperation.test.inline_query",
"node_status": "failed",
},
)
)

View File

@@ -15,7 +15,6 @@ from dbt.events.types import (
ListCmdOut,
)
from dbt.exceptions import DbtRuntimeError, DbtInternalError
from dbt.events.contextvars import task_contextvars
class ListTask(GraphRunnableTask):
@@ -124,23 +123,20 @@ class ListTask(GraphRunnableTask):
yield node.original_file_path
def run(self):
# We set up a context manager here with "task_contextvars" because we
# need the project_root in compile_manifest.
with task_contextvars(project_root=self.config.project_root):
self.compile_manifest()
output = self.args.output
if output == "selector":
generator = self.generate_selectors
elif output == "name":
generator = self.generate_names
elif output == "json":
generator = self.generate_json
elif output == "path":
generator = self.generate_paths
else:
raise DbtInternalError("Invalid output {}".format(output))
self.compile_manifest()
output = self.args.output
if output == "selector":
generator = self.generate_selectors
elif output == "name":
generator = self.generate_names
elif output == "json":
generator = self.generate_json
elif output == "path":
generator = self.generate_paths
else:
raise DbtInternalError("Invalid output {}".format(output))
return self.output_results(generator())
return self.output_results(generator())
def output_results(self, results):
"""Log, or output a plain, newline-delimited, and ready-to-pipe list of nodes found."""

View File

@@ -14,6 +14,8 @@ from dbt.events.types import (
RunResultErrorNoMessage,
SQLCompiledPath,
CheckNodeTestFailure,
FirstRunResultError,
AfterFirstRunResultError,
EndOfRunSummary,
)
@@ -116,7 +118,15 @@ def print_run_result_error(result, newline: bool = True, is_warning: bool = Fals
fire_event(CheckNodeTestFailure(relation_name=result.node.relation_name))
elif result.message is not None:
fire_event(RunResultError(msg=result.message))
first = True
for line in result.message.split("\n"):
# TODO: why do we format like this? Is there a reason this needs to
# be split instead of sending it as a single log line?
if first:
fire_event(FirstRunResultError(msg=line))
first = False
else:
fire_event(AfterFirstRunResultError(msg=line))
def print_run_end_messages(results, keyboard_interrupt: bool = False) -> None:

View File

@@ -313,6 +313,15 @@ class GraphRunnableTask(ConfiguredTask):
cause = None
self._mark_dependent_errors(node.unique_id, result, cause)
interim_run_result = self.get_result(
results=self.node_results,
elapsed_time=time.time() - self.started_at,
generated_at=datetime.utcnow(),
)
if self.args.write_json and hasattr(interim_run_result, "write"):
interim_run_result.write(self.result_path())
def _cancel_connections(self, pool):
"""Given a pool, cancel all adapter connections and wait until all
runners gentle terminates.
@@ -366,27 +375,15 @@ class GraphRunnableTask(ConfiguredTask):
)
print_run_result_error(failure.result)
# ensure information about all nodes is propagated to run results when failing fast
return self.node_results
raise
except KeyboardInterrupt:
run_result = self.get_result(
results=self.node_results,
elapsed_time=time.time() - self.started_at,
generated_at=datetime.utcnow(),
)
if self.args.write_json and hasattr(run_result, "write"):
run_result.write(self.result_path())
self._cancel_connections(pool)
print_run_end_messages(self.node_results, keyboard_interrupt=True)
raise
pool.close()
pool.join()
return self.node_results
finally:
pool.close()
pool.join()
return self.node_results
def _mark_dependent_errors(self, node_id, result, cause):
if self.graph is None:
@@ -444,7 +441,7 @@ class GraphRunnableTask(ConfiguredTask):
Run dbt for the query, based on the graph.
"""
# We set up a context manager here with "task_contextvars" because we
# need the project_root in runtime_initialize.
# we need the project_root in runtime_initialize.
with task_contextvars(project_root=self.config.project_root):
self._runtime_initialize()
@@ -585,7 +582,7 @@ class GraphRunnableTask(ConfiguredTask):
create_futures.append(fut)
for create_future in as_completed(create_futures):
# trigger/re-raise any exceptions while creating schemas
# trigger/re-raise any excceptions while creating schemas
create_future.result()
def get_result(self, results, elapsed_time, generated_at):


@@ -27,7 +27,6 @@ class SnapshotRunner(ModelRunner):
total=self.num_nodes,
execution_time=result.execution_time,
node_info=model.node_info,
result_message=result.message,
),
level=level,
)


@@ -502,7 +502,6 @@ def project(
DEBUG=False,
LOG_CACHE_EVENTS=False,
QUIET=False,
LOG_FILE_MAX_BYTES=1000000,
)
setup_event_logger(log_flags)
orig_cwd = os.getcwd()


@@ -46,7 +46,6 @@ RESOURCE_COUNTS = "iglu:com.dbt/resource_counts/jsonschema/1-0-1"
RPC_REQUEST_SPEC = "iglu:com.dbt/rpc_request/jsonschema/1-0-1"
RUNNABLE_TIMING = "iglu:com.dbt/runnable/jsonschema/1-0-0"
RUN_MODEL_SPEC = "iglu:com.dbt/run_model/jsonschema/1-0-3"
PLUGIN_GET_NODES = "iglu:com.dbt/plugin_get_nodes/jsonschema/1-0-0"
class TimeoutEmitter(Emitter):
@@ -410,19 +409,6 @@ def track_partial_parser(options):
)
def track_plugin_get_nodes(options):
context = [SelfDescribingJson(PLUGIN_GET_NODES, options)]
assert active_user is not None, "Cannot track plugin node info when active user is None"
track(
active_user,
category="dbt",
action="plugin_get_nodes",
label=get_invocation_id(),
context=context,
)
def track_runnable_timing(options):
context = [SelfDescribingJson(RUNNABLE_TIMING, options)]
assert active_user is not None, "Cannot track runnable info when active user is None"


@@ -16,8 +16,9 @@ import time
from pathlib import PosixPath, WindowsPath
from contextlib import contextmanager
from dbt.exceptions import ConnectionError, DuplicateAliasError
from dbt.events.functions import fire_event
from dbt.events.types import RetryExternalCall, RecordRetryException
from dbt.helper_types import WarnErrorOptions
from dbt import flags
from enum import Enum
from typing_extensions import Protocol
@@ -39,7 +40,6 @@ from typing import (
Sequence,
)
import dbt.events.functions
import dbt.exceptions
DECIMALS: Tuple[Type[Any], ...]
@@ -337,18 +337,15 @@ class JSONEncoder(json.JSONEncoder):
def default(self, obj):
if isinstance(obj, DECIMALS):
return float(obj)
elif isinstance(obj, (datetime.datetime, datetime.date, datetime.time)):
if isinstance(obj, (datetime.datetime, datetime.date, datetime.time)):
return obj.isoformat()
elif isinstance(obj, jinja2.Undefined):
if isinstance(obj, jinja2.Undefined):
return ""
elif isinstance(obj, Exception):
return repr(obj)
elif hasattr(obj, "to_dict"):
if hasattr(obj, "to_dict"):
# if we have a to_dict we should try to serialize the result of
# that!
return obj.to_dict(omit_none=True)
else:
return super().default(obj)
return super().default(obj)
class ForgivingJSONEncoder(JSONEncoder):
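The hunk above flattens an `elif` chain into independent `if` statements; since every branch returns, behavior is unchanged. A minimal standalone sketch of the same encoder pattern (stdlib only; `CustomJSONEncoder` is an illustrative name, not dbt's class):

```python
import datetime
import decimal
import json


class CustomJSONEncoder(json.JSONEncoder):
    # Each branch returns, so plain `if` statements behave exactly like
    # the original elif chain.
    def default(self, obj):
        if isinstance(obj, decimal.Decimal):
            return float(obj)
        if isinstance(obj, (datetime.datetime, datetime.date, datetime.time)):
            return obj.isoformat()
        if hasattr(obj, "to_dict"):
            # Objects that know how to serialize themselves.
            return obj.to_dict()
        return super().default(obj)
```

Usage: pass the class via `json.dumps(obj, cls=CustomJSONEncoder)`; `default` is only consulted for types the base encoder cannot handle.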
@@ -372,7 +369,7 @@ class Translator:
for key, value in kwargs.items():
canonical_key = self.aliases.get(key, key)
if canonical_key in result:
raise dbt.exceptions.DuplicateAliasError(kwargs, self.aliases, canonical_key)
raise DuplicateAliasError(kwargs, self.aliases, canonical_key)
result[canonical_key] = self.translate_value(value)
return result
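The `Translator` hunk above only changes how `DuplicateAliasError` is referenced; the underlying alias-canonicalization logic is worth seeing on its own. A simplified sketch (hypothetical helper name; dbt raises its own exception type rather than `ValueError`):

```python
def translate_aliases(kwargs, aliases):
    """Map aliased keys to their canonical names, rejecting collisions.

    If two input keys resolve to the same canonical key (e.g. both
    "pass" and "password" are given), that is ambiguous and an error.
    """
    result = {}
    for key, value in kwargs.items():
        canonical_key = aliases.get(key, key)
        if canonical_key in result:
            raise ValueError(f"duplicate value for canonical key {canonical_key!r}")
        result[canonical_key] = value
    return result
```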
@@ -392,7 +389,9 @@ class Translator:
return self.translate_mapping(value)
except RuntimeError as exc:
if "maximum recursion depth exceeded" in str(exc):
raise RecursionError("Cycle detected in a value passed to translate!")
raise dbt.exceptions.RecursionError(
"Cycle detected in a value passed to translate!"
)
raise
@@ -604,14 +603,12 @@ def _connection_exception_retry(fn, max_attempts: int, attempt: int = 0):
ReadError,
) as exc:
if attempt <= max_attempts - 1:
dbt.events.functions.fire_event(RecordRetryException(exc=str(exc)))
dbt.events.functions.fire_event(RetryExternalCall(attempt=attempt, max=max_attempts))
fire_event(RecordRetryException(exc=str(exc)))
fire_event(RetryExternalCall(attempt=attempt, max=max_attempts))
time.sleep(1)
return _connection_exception_retry(fn, max_attempts, attempt + 1)
else:
raise dbt.exceptions.ConnectionError(
"External connection exception occurred: " + str(exc)
)
raise ConnectionError("External connection exception occurred: " + str(exc))
# This is used to serialize the args in the run_results and in the logs.
@@ -655,9 +652,6 @@ def args_to_dict(args):
# this was required for a test case
if isinstance(var_args[key], PosixPath) or isinstance(var_args[key], WindowsPath):
var_args[key] = str(var_args[key])
if isinstance(var_args[key], WarnErrorOptions):
var_args[key] = var_args[key].to_dict()
dict_args[key] = var_args[key]
return dict_args
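The retry hunk above swaps module-qualified event calls for direct `fire_event` calls; the surrounding logic is a bounded recursive retry. A simplified standalone sketch (illustrative only: dbt's real helper also catches several requests/HTTP exception types and fires retry events before sleeping):

```python
import time


def connection_exception_retry(fn, max_attempts, attempt=0):
    """Call fn, retrying on ConnectionError up to max_attempts times.

    Uses a fixed 1-second backoff between attempts, as in the hunk
    above; re-raises as RuntimeError once attempts are exhausted.
    """
    try:
        return fn()
    except ConnectionError as exc:
        if attempt <= max_attempts - 1:
            time.sleep(1)  # fixed backoff between attempts
            return connection_exception_retry(fn, max_attempts, attempt + 1)
        raise RuntimeError("External connection exception occurred: " + str(exc))
```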


@@ -232,5 +232,5 @@ def _get_adapter_plugin_names() -> Iterator[str]:
yield plugin_name
__version__ = "1.6.2"
__version__ = "1.7.0a1"
installed = get_installed_version()


@@ -25,7 +25,7 @@ with open(os.path.join(this_directory, "README.md")) as f:
package_name = "dbt-core"
package_version = "1.6.2"
package_version = "1.7.0a1"
description = """With dbt, data analysts and engineers can build analytics \
the way engineers build applications."""
@@ -59,7 +59,7 @@ setup(
# ----
# dbt-core uses these packages in standard ways. Pin to the major version, and check compatibility
# with major versions in each new minor version of dbt-core.
"click<9",
"click>=8.1.1,<9",
"networkx>=2.3,<4",
# ----
# These packages are major-version-0. Keep upper bounds on upcoming minor versions (which could have breaking changes)
@@ -68,6 +68,7 @@ setup(
"pathspec>=0.9,<0.12",
"isodate>=0.6,<0.7",
# ----
# There was a pin to below 0.4.4 for a while due to a bug in Ubuntu/sqlparse 0.4.4
"sqlparse>=0.2.3,<0.5",
# ----
# These are major-version-0 packages also maintained by dbt-labs. Accept patches.
@@ -75,7 +76,8 @@ setup(
"hologram~=0.0.16", # includes transitive dependencies on python-dateutil and jsonschema
"minimal-snowplow-tracker~=0.0.2",
# DSI is under active development, so we're pinning to specific dev versions for now.
"dbt-semantic-interfaces~=0.2.0",
# TODO: Before RC/final release, update to use ~= pinning.
"dbt-semantic-interfaces~=0.1.0rc1",
# ----
# Expect compatibility with all new versions of these packages, so lower bounds only.
"packaging>20.9",


@@ -16,7 +16,6 @@ pytest-csv
pytest-dotenv
pytest-logbook
pytest-mock
pytest-split
pytest-xdist
sphinx
tox>=3.13


@@ -9,19 +9,17 @@ ARG build_for=linux/amd64
##
# base image (abstract)
##
# Please do not upgrade beyond python3.10.7 currently, as dbt-spark does not support
# Python 3.11 and images do not get built properly
FROM --platform=$build_for python:3.10.7-slim-bullseye as base
FROM --platform=$build_for python:3.11.2-slim-bullseye as base
# N.B. These refs are updated automagically every release via bumpversion
# N.B. dbt-postgres is currently found in the core codebase so a value of dbt-core@<some_version> is correct
ARG dbt_core_ref=dbt-core@v1.6.2
ARG dbt_postgres_ref=dbt-core@v1.6.2
ARG dbt_redshift_ref=dbt-redshift@v1.6.2
ARG dbt_bigquery_ref=dbt-bigquery@v1.6.2
ARG dbt_snowflake_ref=dbt-snowflake@v1.6.2
ARG dbt_spark_ref=dbt-spark@v1.6.2
ARG dbt_core_ref=dbt-core@v1.7.0a1
ARG dbt_postgres_ref=dbt-core@v1.7.0a1
ARG dbt_redshift_ref=dbt-redshift@v1.7.0a1
ARG dbt_bigquery_ref=dbt-bigquery@v1.7.0a1
ARG dbt_snowflake_ref=dbt-snowflake@v1.7.0a1
ARG dbt_spark_ref=dbt-spark@v1.7.0a1
# special case args
ARG dbt_spark_version=all
ARG dbt_third_party


@@ -1 +1 @@
version = "1.6.2"
version = "1.7.0a1"


@@ -32,10 +32,7 @@ class PostgresCredentials(Credentials):
sslkey: Optional[str] = None
sslrootcert: Optional[str] = None
application_name: Optional[str] = "dbt"
endpoint: Optional[str] = None
retries: int = 1
options: Optional[str] = None
# options: Dict[str, Any] = field(default_factory=dict)
_ALIASES = {"dbname": "database", "pass": "password"}
@@ -133,12 +130,6 @@ class PostgresConnectionManager(SQLConnectionManager):
if credentials.application_name:
kwargs["application_name"] = credentials.application_name
if credentials.options:
kwargs["options"] = credentials.options
if credentials.endpoint:
kwargs["endpoint"] = credentials.endpoint
def connect():
handle = psycopg2.connect(
dbname=credentials.database,
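The credentials hunk above removes the `options` and `endpoint` kwargs from the connection setup. The pattern it uses, adding optional settings to the `connect()` call only when they are set, can be sketched standalone (a plain dict stands in for dbt's credentials dataclass; `build_connect_kwargs` is a hypothetical name):

```python
def build_connect_kwargs(credentials):
    """Include optional connection settings only when they are truthy.

    Mirrors the conditional-kwargs pattern above: unset fields are
    omitted entirely rather than passed as None.
    """
    kwargs = {}
    if credentials.get("application_name"):
        kwargs["application_name"] = credentials["application_name"]
    if credentials.get("options"):
        kwargs["options"] = credentials["options"]
    return kwargs
```

Omitting unset keys matters because the driver treats an explicit `None` differently from an absent parameter.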


@@ -41,7 +41,7 @@ def _dbt_psycopg2_name():
package_name = "dbt-postgres"
package_version = "1.6.2"
package_version = "1.7.0a1"
description = """The postgres adapter plugin for dbt (data build tool)"""
this_directory = os.path.abspath(os.path.dirname(__file__))


@@ -141,9 +141,6 @@
},
{
"$ref": "#/definitions/Metric"
},
{
"$ref": "#/definitions/SemanticModel"
}
]
}
@@ -215,7 +212,7 @@
}
},
"additionalProperties": false,
"description": "WritableManifest(metadata: dbt.contracts.graph.manifest.ManifestMetadata, nodes: Mapping[str, Union[dbt.contracts.graph.nodes.AnalysisNode, dbt.contracts.graph.nodes.SingularTestNode, dbt.contracts.graph.nodes.HookNode, dbt.contracts.graph.nodes.ModelNode, dbt.contracts.graph.nodes.RPCNode, dbt.contracts.graph.nodes.SqlNode, dbt.contracts.graph.nodes.GenericTestNode, dbt.contracts.graph.nodes.SnapshotNode, dbt.contracts.graph.nodes.SeedNode]], sources: Mapping[str, dbt.contracts.graph.nodes.SourceDefinition], macros: Mapping[str, dbt.contracts.graph.nodes.Macro], docs: Mapping[str, dbt.contracts.graph.nodes.Documentation], exposures: Mapping[str, dbt.contracts.graph.nodes.Exposure], metrics: Mapping[str, dbt.contracts.graph.nodes.Metric], groups: Mapping[str, dbt.contracts.graph.nodes.Group], selectors: Mapping[str, Any], disabled: Union[Mapping[str, List[Union[dbt.contracts.graph.nodes.AnalysisNode, dbt.contracts.graph.nodes.SingularTestNode, dbt.contracts.graph.nodes.HookNode, dbt.contracts.graph.nodes.ModelNode, dbt.contracts.graph.nodes.RPCNode, dbt.contracts.graph.nodes.SqlNode, dbt.contracts.graph.nodes.GenericTestNode, dbt.contracts.graph.nodes.SnapshotNode, dbt.contracts.graph.nodes.SeedNode, dbt.contracts.graph.nodes.SourceDefinition, dbt.contracts.graph.nodes.Exposure, dbt.contracts.graph.nodes.Metric, dbt.contracts.graph.nodes.SemanticModel]]], NoneType], parent_map: Union[Dict[str, List[str]], NoneType], child_map: Union[Dict[str, List[str]], NoneType], group_map: Union[Dict[str, List[str]], NoneType], semantic_models: Mapping[str, dbt.contracts.graph.nodes.SemanticModel])",
"description": "WritableManifest(metadata: dbt.contracts.graph.manifest.ManifestMetadata, nodes: Mapping[str, Union[dbt.contracts.graph.nodes.AnalysisNode, dbt.contracts.graph.nodes.SingularTestNode, dbt.contracts.graph.nodes.HookNode, dbt.contracts.graph.nodes.ModelNode, dbt.contracts.graph.nodes.RPCNode, dbt.contracts.graph.nodes.SqlNode, dbt.contracts.graph.nodes.GenericTestNode, dbt.contracts.graph.nodes.SnapshotNode, dbt.contracts.graph.nodes.SeedNode]], sources: Mapping[str, dbt.contracts.graph.nodes.SourceDefinition], macros: Mapping[str, dbt.contracts.graph.nodes.Macro], docs: Mapping[str, dbt.contracts.graph.nodes.Documentation], exposures: Mapping[str, dbt.contracts.graph.nodes.Exposure], metrics: Mapping[str, dbt.contracts.graph.nodes.Metric], groups: Mapping[str, dbt.contracts.graph.nodes.Group], selectors: Mapping[str, Any], disabled: Union[Mapping[str, List[Union[dbt.contracts.graph.nodes.AnalysisNode, dbt.contracts.graph.nodes.SingularTestNode, dbt.contracts.graph.nodes.HookNode, dbt.contracts.graph.nodes.ModelNode, dbt.contracts.graph.nodes.RPCNode, dbt.contracts.graph.nodes.SqlNode, dbt.contracts.graph.nodes.GenericTestNode, dbt.contracts.graph.nodes.SnapshotNode, dbt.contracts.graph.nodes.SeedNode, dbt.contracts.graph.nodes.SourceDefinition, dbt.contracts.graph.nodes.Exposure, dbt.contracts.graph.nodes.Metric]]], NoneType], parent_map: Union[Dict[str, List[str]], NoneType], child_map: Union[Dict[str, List[str]], NoneType], group_map: Union[Dict[str, List[str]], NoneType], semantic_models: Mapping[str, dbt.contracts.graph.nodes.SemanticModel])",
"definitions": {
"ManifestMetadata": {
"type": "object",
@@ -227,12 +224,12 @@
},
"dbt_version": {
"type": "string",
"default": "1.6.0"
"default": "1.6.0b4"
},
"generated_at": {
"type": "string",
"format": "date-time",
"default": "2023-08-07T20:10:03.381822Z"
"default": "2023-06-15T20:32:38.802488Z"
},
"invocation_id": {
"oneOf": [
@@ -243,7 +240,7 @@
"type": "null"
}
],
"default": "03dee192-ff77-43cc-bc3f-5eeaf6d36344"
"default": "fe95e4d0-61ff-487d-8293-092f543fcab2"
},
"env": {
"type": "object",
@@ -474,7 +471,7 @@
},
"created_at": {
"type": "number",
"default": 1691439003.386713
"default": 1686861158.804467
},
"config_call_dict": {
"type": "object",
@@ -1187,7 +1184,7 @@
},
"created_at": {
"type": "number",
"default": 1691439003.389955
"default": 1686861158.805745
},
"config_call_dict": {
"type": "object",
@@ -1575,7 +1572,7 @@
},
"created_at": {
"type": "number",
"default": 1691439003.3916101
"default": 1686861158.806452
},
"config_call_dict": {
"type": "object",
@@ -1851,7 +1848,7 @@
},
"created_at": {
"type": "number",
"default": 1691439003.393298
"default": 1686861158.807143
},
"config_call_dict": {
"type": "object",
@@ -2004,10 +2001,10 @@
}
]
},
"defer_relation": {
"state_relation": {
"oneOf": [
{
"$ref": "#/definitions/DeferRelation"
"$ref": "#/definitions/StateRelation"
},
{
"type": "null"
@@ -2016,7 +2013,7 @@
}
},
"additionalProperties": false,
"description": "ModelNode(database: Union[str, NoneType], schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.NodeConfig = <factory>, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, group: Union[str, NoneType] = None, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Union[str, NoneType] = None, build_path: Union[str, NoneType] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Union[str, NoneType] = None, raw_code: str = '', language: str = 'sql', refs: List[dbt.contracts.graph.nodes.RefArgs] = <factory>, sources: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, compiled_path: Union[str, NoneType] = None, compiled: bool = False, compiled_code: Union[str, NoneType] = None, extra_ctes_injected: bool = False, extra_ctes: List[dbt.contracts.graph.nodes.InjectedCTE] = <factory>, _pre_injected_sql: Union[str, NoneType] = None, contract: dbt.contracts.graph.nodes.Contract = <factory>, access: dbt.node_types.AccessType = <AccessType.Protected: 'protected'>, constraints: List[dbt.contracts.graph.nodes.ModelLevelConstraint] = <factory>, version: Union[str, float, NoneType] = None, latest_version: Union[str, float, NoneType] = None, deprecation_date: Union[datetime.datetime, NoneType] = None, defer_relation: Union[dbt.contracts.graph.nodes.DeferRelation, NoneType] = None)"
"description": "ModelNode(database: Union[str, NoneType], schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.NodeConfig = <factory>, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, group: Union[str, NoneType] = None, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Union[str, NoneType] = None, build_path: Union[str, NoneType] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Union[str, NoneType] = None, raw_code: str = '', language: str = 'sql', refs: List[dbt.contracts.graph.nodes.RefArgs] = <factory>, sources: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, compiled_path: Union[str, NoneType] = None, compiled: bool = False, compiled_code: Union[str, NoneType] = None, extra_ctes_injected: bool = False, extra_ctes: List[dbt.contracts.graph.nodes.InjectedCTE] = <factory>, _pre_injected_sql: Union[str, NoneType] = None, contract: dbt.contracts.graph.nodes.Contract = <factory>, access: dbt.node_types.AccessType = <AccessType.Protected: 'protected'>, constraints: List[dbt.contracts.graph.nodes.ModelLevelConstraint] = <factory>, version: Union[str, float, NoneType] = None, latest_version: Union[str, float, NoneType] = None, deprecation_date: Union[datetime.datetime, NoneType] = None, state_relation: Union[dbt.contracts.graph.nodes.StateRelation, NoneType] = None)"
},
"ModelLevelConstraint": {
"type": "object",
@@ -2074,13 +2071,16 @@
"additionalProperties": false,
"description": "ModelLevelConstraint(type: dbt.contracts.graph.nodes.ConstraintType, name: Union[str, NoneType] = None, expression: Union[str, NoneType] = None, warn_unenforced: bool = True, warn_unsupported: bool = True, columns: List[str] = <factory>)"
},
"DeferRelation": {
"StateRelation": {
"type": "object",
"required": [
"schema",
"alias"
"alias",
"schema"
],
"properties": {
"alias": {
"type": "string"
},
"database": {
"oneOf": [
{
@@ -2093,23 +2093,10 @@
},
"schema": {
"type": "string"
},
"alias": {
"type": "string"
},
"relation_name": {
"oneOf": [
{
"type": "string"
},
{
"type": "null"
}
]
}
},
"additionalProperties": false,
"description": "DeferRelation(database: Union[str, NoneType], schema: str, alias: str, relation_name: Union[str, NoneType])"
"description": "StateRelation(alias: str, database: Union[str, NoneType], schema: str)"
},
"RPCNode": {
"type": "object",
@@ -2273,7 +2260,7 @@
},
"created_at": {
"type": "number",
"default": 1691439003.39583
"default": 1686861158.808148
},
"config_call_dict": {
"type": "object",
@@ -2411,7 +2398,7 @@
"resource_type": {
"type": "string",
"enum": [
"sql_operation"
"sqloperation"
]
},
"package_name": {
@@ -2539,7 +2526,7 @@
},
"created_at": {
"type": "number",
"default": 1691439003.3974268
"default": 1686861158.8088078
},
"config_call_dict": {
"type": "object",
@@ -2797,7 +2784,7 @@
},
"created_at": {
"type": "number",
"default": 1691439003.399393
"default": 1686861158.8095539
},
"config_call_dict": {
"type": "object",
@@ -3092,7 +3079,7 @@
},
"created_at": {
"type": "number",
"default": 1691439003.4026701
"default": 1686861158.810841
},
"config_call_dict": {
"type": "object",
@@ -3192,10 +3179,10 @@
"checksum": null
}
},
"defer_relation": {
"state_relation": {
"oneOf": [
{
"$ref": "#/definitions/DeferRelation"
"$ref": "#/definitions/StateRelation"
},
{
"type": "null"
@@ -3204,7 +3191,7 @@
}
},
"additionalProperties": false,
"description": "SnapshotNode(database: Union[str, NoneType], schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.SnapshotConfig, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, group: Union[str, NoneType] = None, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Union[str, NoneType] = None, build_path: Union[str, NoneType] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Union[str, NoneType] = None, raw_code: str = '', language: str = 'sql', refs: List[dbt.contracts.graph.nodes.RefArgs] = <factory>, sources: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, compiled_path: Union[str, NoneType] = None, compiled: bool = False, compiled_code: Union[str, NoneType] = None, extra_ctes_injected: bool = False, extra_ctes: List[dbt.contracts.graph.nodes.InjectedCTE] = <factory>, _pre_injected_sql: Union[str, NoneType] = None, contract: dbt.contracts.graph.nodes.Contract = <factory>, defer_relation: Union[dbt.contracts.graph.nodes.DeferRelation, NoneType] = None)"
"description": "SnapshotNode(database: Union[str, NoneType], schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.SnapshotConfig, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, group: Union[str, NoneType] = None, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Union[str, NoneType] = None, build_path: Union[str, NoneType] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Union[str, NoneType] = None, raw_code: str = '', language: str = 'sql', refs: List[dbt.contracts.graph.nodes.RefArgs] = <factory>, sources: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, compiled_path: Union[str, NoneType] = None, compiled: bool = False, compiled_code: Union[str, NoneType] = None, extra_ctes_injected: bool = False, extra_ctes: List[dbt.contracts.graph.nodes.InjectedCTE] = <factory>, _pre_injected_sql: Union[str, NoneType] = None, contract: dbt.contracts.graph.nodes.Contract = <factory>, state_relation: Union[dbt.contracts.graph.nodes.StateRelation, NoneType] = None)"
},
"SnapshotConfig": {
"type": "object",
@@ -3599,7 +3586,7 @@
},
"created_at": {
"type": "number",
"default": 1691439003.4056058
"default": 1686861158.812035
},
"config_call_dict": {
"type": "object",
@@ -3635,10 +3622,10 @@
"macros": []
}
},
"defer_relation": {
"state_relation": {
"oneOf": [
{
"$ref": "#/definitions/DeferRelation"
"$ref": "#/definitions/StateRelation"
},
{
"type": "null"
@@ -3647,7 +3634,7 @@
}
},
"additionalProperties": false,
"description": "SeedNode(database: Union[str, NoneType], schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.SeedConfig = <factory>, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, group: Union[str, NoneType] = None, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Union[str, NoneType] = None, build_path: Union[str, NoneType] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Union[str, NoneType] = None, raw_code: str = '', root_path: Union[str, NoneType] = None, depends_on: dbt.contracts.graph.nodes.MacroDependsOn = <factory>, defer_relation: Union[dbt.contracts.graph.nodes.DeferRelation, NoneType] = None)"
"description": "SeedNode(database: Union[str, NoneType], schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.SeedConfig = <factory>, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, group: Union[str, NoneType] = None, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Union[str, NoneType] = None, build_path: Union[str, NoneType] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Union[str, NoneType] = None, raw_code: str = '', root_path: Union[str, NoneType] = None, depends_on: dbt.contracts.graph.nodes.MacroDependsOn = <factory>, state_relation: Union[dbt.contracts.graph.nodes.StateRelation, NoneType] = None)"
},
"SeedConfig": {
"type": "object",
@@ -4020,7 +4007,7 @@
},
"created_at": {
"type": "number",
"default": 1691439003.408927
"default": 1686861158.8133152
}
},
"additionalProperties": false,
@@ -4332,7 +4319,7 @@
},
"created_at": {
"type": "number",
"default": 1691439003.409885
"default": 1686861158.8135822
},
"supported_languages": {
"oneOf": [
@@ -4572,7 +4559,7 @@
},
"created_at": {
"type": "number",
"default": 1691439003.411563
"default": 1686861158.814228
}
},
"additionalProperties": false,
@@ -4672,6 +4659,7 @@
"enum": [
"simple",
"ratio",
"expr",
"cumulative",
"derived"
]
@@ -4757,7 +4745,7 @@
},
"created_at": {
"type": "number",
"default": 1691439003.41419
"default": 1686861158.815338
},
"group": {
"oneOf": [
@@ -4787,17 +4775,23 @@
}
]
},
"input_measures": {
"type": "array",
"items": {
"$ref": "#/definitions/MetricInputMeasure"
},
"default": []
"measures": {
"oneOf": [
{
"type": "array",
"items": {
"$ref": "#/definitions/MetricInputMeasure"
}
},
{
"type": "null"
}
]
},
"numerator": {
"oneOf": [
{
"$ref": "#/definitions/MetricInput"
"$ref": "#/definitions/MetricInputMeasure"
},
{
"type": "null"
@@ -4807,7 +4801,7 @@
"denominator": {
"oneOf": [
{
"$ref": "#/definitions/MetricInput"
"$ref": "#/definitions/MetricInputMeasure"
},
{
"type": "null"
@@ -4866,7 +4860,7 @@
}
},
"additionalProperties": false,
"description": "MetricTypeParams(measure: Union[dbt.contracts.graph.nodes.MetricInputMeasure, NoneType] = None, input_measures: List[dbt.contracts.graph.nodes.MetricInputMeasure] = <factory>, numerator: Union[dbt.contracts.graph.nodes.MetricInput, NoneType] = None, denominator: Union[dbt.contracts.graph.nodes.MetricInput, NoneType] = None, expr: Union[str, NoneType] = None, window: Union[dbt.contracts.graph.nodes.MetricTimeWindow, NoneType] = None, grain_to_date: Union[dbt_semantic_interfaces.type_enums.time_granularity.TimeGranularity, NoneType] = None, metrics: Union[List[dbt.contracts.graph.nodes.MetricInput], NoneType] = None)"
"description": "MetricTypeParams(measure: Union[dbt.contracts.graph.nodes.MetricInputMeasure, NoneType] = None, measures: Union[List[dbt.contracts.graph.nodes.MetricInputMeasure], NoneType] = None, numerator: Union[dbt.contracts.graph.nodes.MetricInputMeasure, NoneType] = None, denominator: Union[dbt.contracts.graph.nodes.MetricInputMeasure, NoneType] = None, expr: Union[str, NoneType] = None, window: Union[dbt.contracts.graph.nodes.MetricTimeWindow, NoneType] = None, grain_to_date: Union[dbt_semantic_interfaces.type_enums.time_granularity.TimeGranularity, NoneType] = None, metrics: Union[List[dbt.contracts.graph.nodes.MetricInput], NoneType] = None)"
},
"MetricInputMeasure": {
"type": "object",
@@ -4914,6 +4908,30 @@
"additionalProperties": false,
"description": "WhereFilter(where_sql_template: str)"
},
"MetricTimeWindow": {
"type": "object",
"required": [
"count",
"granularity"
],
"properties": {
"count": {
"type": "integer"
},
"granularity": {
"type": "string",
"enum": [
"day",
"week",
"month",
"quarter",
"year"
]
}
},
"additionalProperties": false,
"description": "MetricTimeWindow(count: int, granularity: dbt_semantic_interfaces.type_enums.time_granularity.TimeGranularity)"
},
"MetricInput": {
"type": "object",
"required": [
@@ -4974,30 +4992,6 @@
"additionalProperties": false,
"description": "MetricInput(name: str, filter: Union[dbt.contracts.graph.nodes.WhereFilter, NoneType] = None, alias: Union[str, NoneType] = None, offset_window: Union[dbt.contracts.graph.nodes.MetricTimeWindow, NoneType] = None, offset_to_grain: Union[dbt_semantic_interfaces.type_enums.time_granularity.TimeGranularity, NoneType] = None)"
},
"MetricTimeWindow": {
"type": "object",
"required": [
"count",
"granularity"
],
"properties": {
"count": {
"type": "integer"
},
"granularity": {
"type": "string",
"enum": [
"day",
"week",
"month",
"quarter",
"year"
]
}
},
"additionalProperties": false,
"description": "MetricTimeWindow(count: int, granularity: dbt_semantic_interfaces.type_enums.time_granularity.TimeGranularity)"
},
"SourceFileMetadata": {
"type": "object",
"required": [
@@ -5128,14 +5122,14 @@
"operation",
"seed",
"rpc",
"sql_operation",
"sqloperation",
"doc",
"source",
"macro",
"exposure",
"metric",
"group",
"semantic_model"
"semanticmodel"
]
},
"package_name": {
@@ -5219,44 +5213,10 @@
"type": "null"
}
]
},
"depends_on": {
"$ref": "#/definitions/DependsOn",
"default": {
"macros": [],
"nodes": []
}
},
"refs": {
"type": "array",
"items": {
"$ref": "#/definitions/RefArgs"
},
"default": []
},
"created_at": {
"type": "number",
"default": 1691439003.4182558
},
"config": {
"$ref": "#/definitions/SemanticModelConfig",
"default": {
"enabled": true
}
},
"primary_entity": {
"oneOf": [
{
"type": "string"
},
{
"type": "null"
}
]
}
},
"additionalProperties": false,
"description": "SemanticModel(name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], model: str, node_relation: Union[dbt.contracts.graph.nodes.NodeRelation, NoneType], description: Union[str, NoneType] = None, defaults: Union[dbt.contracts.graph.semantic_models.Defaults, NoneType] = None, entities: Sequence[dbt.contracts.graph.semantic_models.Entity] = <factory>, measures: Sequence[dbt.contracts.graph.semantic_models.Measure] = <factory>, dimensions: Sequence[dbt.contracts.graph.semantic_models.Dimension] = <factory>, metadata: Union[dbt.contracts.graph.semantic_models.SourceFileMetadata, NoneType] = None, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, refs: List[dbt.contracts.graph.nodes.RefArgs] = <factory>, created_at: float = <factory>, config: dbt.contracts.graph.model_config.SemanticModelConfig = <factory>, primary_entity: Union[str, NoneType] = None)"
"description": "SemanticModel(name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], model: str, node_relation: Union[dbt.contracts.graph.nodes.NodeRelation, NoneType], description: Union[str, NoneType] = None, defaults: Union[dbt.contracts.graph.semantic_models.Defaults, NoneType] = None, entities: Sequence[dbt.contracts.graph.semantic_models.Entity] = <factory>, measures: Sequence[dbt.contracts.graph.semantic_models.Measure] = <factory>, dimensions: Sequence[dbt.contracts.graph.semantic_models.Dimension] = <factory>, metadata: Union[dbt.contracts.graph.semantic_models.SourceFileMetadata, NoneType] = None)"
},
"NodeRelation": {
"type": "object",
@@ -5280,20 +5240,10 @@
"type": "null"
}
]
},
"relation_name": {
"oneOf": [
{
"type": "string"
},
{
"type": "null"
}
]
}
},
"additionalProperties": false,
"description": "NodeRelation(alias: str, schema_name: str, database: Union[str, NoneType] = None, relation_name: Union[str, NoneType] = None)"
"description": "NodeRelation(alias: str, schema_name: str, database: Union[str, NoneType] = None)"
},
"Defaults": {
"type": "object",
@@ -5463,23 +5413,35 @@
]
},
"use_discrete_percentile": {
"type": "boolean",
"default": false
"oneOf": [
{
"type": "boolean"
},
{
"type": "null"
}
]
},
"use_approximate_percentile": {
"type": "boolean",
"default": false
"oneOf": [
{
"type": "boolean"
},
{
"type": "null"
}
]
}
},
"additionalProperties": false,
"description": "MeasureAggregationParameters(percentile: Union[float, NoneType] = None, use_discrete_percentile: bool = False, use_approximate_percentile: bool = False)"
"description": "MeasureAggregationParameters(percentile: Union[float, NoneType] = None, use_discrete_percentile: Union[bool, NoneType] = None, use_approximate_percentile: Union[bool, NoneType] = None)"
},
"NonAdditiveDimension": {
"type": "object",
"required": [
"name",
"window_choice",
"window_groupings"
"window_grouples"
],
"properties": {
"name": {
@@ -5499,7 +5461,7 @@
"count"
]
},
"window_groupings": {
"window_grouples": {
"type": "array",
"items": {
"type": "string"
@@ -5507,7 +5469,7 @@
}
},
"additionalProperties": false,
"description": "NonAdditiveDimension(name: str, window_choice: dbt_semantic_interfaces.type_enums.aggregation_type.AggregationType, window_groupings: List[str])"
"description": "NonAdditiveDimension(name: str, window_choice: dbt_semantic_interfaces.type_enums.aggregation_type.AggregationType, window_grouples: List[str])"
},
"Dimension": {
"type": "object",
@@ -5619,18 +5581,6 @@
},
"additionalProperties": false,
"description": "DimensionValidityParams(is_start: bool = False, is_end: bool = False)"
},
"SemanticModelConfig": {
"type": "object",
"required": [],
"properties": {
"enabled": {
"type": "boolean",
"default": true
}
},
"additionalProperties": true,
"description": "SemanticModelConfig(_extra: Dict[str, Any] = <factory>, enabled: bool = True)"
}
},
"$schema": "http://json-schema.org/draft-07/schema#",

View File

@@ -1 +1 @@
version = "1.6.2"
version = "1.7.0a1"

View File

@@ -20,7 +20,7 @@ except ImportError:
package_name = "dbt-tests-adapter"
package_version = "1.6.2"
package_version = "1.7.0a1"
description = """The dbt adapter tests for adapter plugins"""
this_directory = os.path.abspath(os.path.dirname(__file__))

File diff suppressed because one or more lines are too long

View File

@@ -1,12 +1,9 @@
import json
import pytest
import os
import shutil
import pytest
from dbt.contracts.graph.manifest import WritableManifest, get_manifest_schema_version
from dbt.exceptions import IncompatibleSchemaError
from dbt.tests.util import run_dbt, get_manifest
from dbt.exceptions import IncompatibleSchemaError
from dbt.contracts.graph.manifest import WritableManifest
# This project must have one of each kind of node type, plus disabled versions, for
# test coverage to be complete.
@@ -354,13 +351,3 @@ class TestPreviousVersionState:
# schema versions 1, 2, 3 are all not forward compatible
for schema_version in range(1, 4):
self.compare_previous_state(project, schema_version, False)
def test_get_manifest_schema_version(self, project):
for schema_version in range(1, self.CURRENT_EXPECTED_MANIFEST_VERSION):
manifest_path = os.path.join(
project.test_data_dir, f"state/v{schema_version}/manifest.json"
)
manifest = json.load(open(manifest_path))
manifest_version = get_manifest_schema_version(manifest)
assert manifest_version == schema_version

View File

@@ -2,6 +2,7 @@ from multiprocessing import Process
from pathlib import Path
import json
import pytest
import platform
from dbt.tests.util import run_dbt
good_model_sql = """
@@ -40,7 +41,7 @@ class TestRunResultsTimingFailure:
assert len(results.results[0].timing) > 0
@pytest.mark.skip()
@pytest.mark.skipif(platform.system() != "Darwin", reason="Fails on linux in github actions")
class TestRunResultsWritesFileOnSignal:
@pytest.fixture(scope="class")
def models(self):

View File

@@ -1,8 +1,5 @@
import os
import pytest
import yaml
from pathlib import Path
from dbt.tests.util import run_dbt, update_config_file, write_config_file
from dbt.tests.util import run_dbt, update_config_file
from dbt.exceptions import ProjectContractError
@@ -65,50 +62,3 @@ class TestProjectYamlVersionInvalid:
assert "at path ['version']: 'invalid' is not valid under any of the given schemas" in str(
excinfo.value
)
class TestProjectDbtCloudConfig:
@pytest.fixture(scope="class")
def models(self):
return {"simple_model.sql": simple_model_sql, "simple_model.yml": simple_model_yml}
def test_dbt_cloud(self, project):
run_dbt(["parse"], expect_pass=True)
conf = yaml.safe_load(
Path(os.path.join(project.project_root, "dbt_project.yml")).read_text()
)
assert conf == {"name": "test", "profile": "test"}
config = {
"name": "test",
"profile": "test",
"dbt-cloud": {
"account_id": "123",
"application": "test",
"environment": "test",
"api_key": "test",
},
}
write_config_file(config, project.project_root, "dbt_project.yml")
run_dbt(["parse"], expect_pass=True)
conf = yaml.safe_load(
Path(os.path.join(project.project_root, "dbt_project.yml")).read_text()
)
assert conf == config
class TestProjectDbtCloudConfigString:
@pytest.fixture(scope="class")
def models(self):
return {"simple_model.sql": simple_model_sql, "simple_model.yml": simple_model_yml}
def test_dbt_cloud_invalid(self, project):
run_dbt()
config = {"name": "test", "profile": "test", "dbt-cloud": "Some string"}
update_config_file(config, "dbt_project.yml")
expected_err = (
"at path ['dbt-cloud']: 'Some string' is not valid under any of the given schemas"
)
with pytest.raises(ProjectContractError) as excinfo:
run_dbt()
assert expected_err in str(excinfo.value)

View File

@@ -164,10 +164,6 @@ class TestCompile:
with pytest.raises(DbtException, match="Error parsing inline query"):
run_dbt(["compile", "--inline", "select * from {{ ref('third_model') }}"])
def test_inline_fail_database_error(self, project):
with pytest.raises(DbtRuntimeError, match="Database Error"):
run_dbt(["show", "--inline", "slect asdlkjfsld;j"])
def test_multiline_jinja(self, project):
(results, log_output) = run_dbt_and_capture(["compile", "--inline", model_multiline_jinja])
assert len(results) == 1

View File

@@ -781,108 +781,3 @@ class TestModifiedBodyAndContract:
# The model's contract has changed, even if non-breaking, so it should be selected by 'state:modified.contract'
results = run_dbt(["list", "-s", "state:modified.contract", "--state", "./state"])
assert results == ["test.my_model"]
modified_table_model_access_yml = """
version: 2
models:
- name: table_model
access: public
"""
class TestModifiedAccess(BaseModifiedState):
def test_changed_access(self, project):
self.run_and_save_state()
# No access change
assert not run_dbt(["list", "-s", "state:modified", "--state", "./state"])
# Modify access (protected -> public)
write_file(modified_table_model_access_yml, "models", "schema.yml")
assert run_dbt(["list", "-s", "state:modified", "--state", "./state"])
results = run_dbt(["list", "-s", "state:modified", "--state", "./state"])
assert results == ["test.table_model"]
modified_table_model_access_yml = """
version: 2
models:
- name: table_model
deprecation_date: 2020-01-01
"""
class TestModifiedDeprecationDate(BaseModifiedState):
def test_changed_access(self, project):
self.run_and_save_state()
# No access change
assert not run_dbt(["list", "-s", "state:modified", "--state", "./state"])
# Modify deprecation_date (None -> 2020-01-01)
write_file(modified_table_model_access_yml, "models", "schema.yml")
assert run_dbt(["list", "-s", "state:modified", "--state", "./state"])
results = run_dbt(["list", "-s", "state:modified", "--state", "./state"])
assert results == ["test.table_model"]
modified_table_model_version_yml = """
version: 2
models:
- name: table_model
versions:
- v: 1
defined_in: table_model
"""
class TestModifiedVersion(BaseModifiedState):
def test_changed_access(self, project):
self.run_and_save_state()
# Change version (null -> v1)
write_file(modified_table_model_version_yml, "models", "schema.yml")
results = run_dbt(["list", "-s", "state:modified", "--state", "./state"])
assert results == ["test.table_model.v1"]
table_model_latest_version_yml = """
version: 2
models:
- name: table_model
latest_version: 1
versions:
- v: 1
defined_in: table_model
"""
modified_table_model_latest_version_yml = """
version: 2
models:
- name: table_model
latest_version: 2
versions:
- v: 1
defined_in: table_model
- v: 2
"""
class TestModifiedLatestVersion(BaseModifiedState):
def test_changed_access(self, project):
# Setup initial latest_version: 1
write_file(table_model_latest_version_yml, "models", "schema.yml")
self.run_and_save_state()
# Bump latest version
write_file(table_model_sql, "models", "table_model_v2.sql")
write_file(modified_table_model_latest_version_yml, "models", "schema.yml")
results = run_dbt(["list", "-s", "state:modified", "--state", "./state"])
assert results == ["test.table_model.v1", "test.table_model.v2"]

View File

@@ -13,6 +13,7 @@ from contextlib import contextmanager
import dbt.semver
import dbt.config
import dbt.exceptions
from dbt.contracts.results import RunStatus
from dbt.tests.util import check_relations_equal, run_dbt, run_dbt_and_capture
@@ -207,8 +208,9 @@ class TestMissingDependency(object):
def test_missing_dependency(self, project):
# dbt should raise a runtime exception
with pytest.raises(dbt.exceptions.DbtRuntimeError):
run_dbt(["compile"])
res = run_dbt(["compile"], expect_pass=False)
assert len(res) == 1
assert res[0].status == RunStatus.Error
class TestSimpleDependencyWithSchema(BaseDependencyTest):
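The hunk above replaces an expected exception with a status check: a missing dependency now surfaces as a single `RunStatus.Error` result rather than a raised `DbtRuntimeError`. A stdlib-only sketch of that "collect the error instead of raising" pattern (all names invented for illustration):

```python
from enum import Enum

class RunStatus(Enum):
    Success = "success"
    Error = "error"

def compile_nodes(nodes):
    # Instead of letting the first bad node abort the run, record an
    # Error status per node and return all results to the caller.
    results = []
    for node in nodes:
        try:
            node()
            results.append(RunStatus.Success)
        except RuntimeError:
            results.append(RunStatus.Error)
    return results

def bad_node():
    raise RuntimeError("missing dependency")

res = compile_nodes([bad_node])
assert len(res) == 1 and res[0] == RunStatus.Error
```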

View File

@@ -25,7 +25,7 @@ metrics:
type_params:
measure:
name: "years_tenure"
filter: "{{ Dimension('people_entity__loves_dbt') }} is true"
filter: "{{dimension('loves_dbt')}} is true"
"""

View File

@@ -43,26 +43,6 @@ seeds:
"""
local_dep_schema_yml = """
models:
- name: table_model
config:
alias: table_model_local_dep
columns:
- name: id
tests:
- unique
"""
local_dep_versions_schema_yml = """
models:
- name: table_model
config:
alias: table_model_local_dep
versions:
- v: 1
"""
class TestDuplicateModelEnabled:
@pytest.fixture(scope="class")
@@ -162,72 +142,6 @@ class TestDuplicateModelDisabledAcrossPackages:
assert model_id in manifest.disabled
class TestDuplicateModelNameWithTestAcrossPackages:
@pytest.fixture(scope="class", autouse=True)
def setUp(self, project_root):
local_dependency_files = {
"dbt_project.yml": dbt_project_yml,
"models": {"table_model.sql": enabled_model_sql, "schema.yml": local_dep_schema_yml},
}
write_project_files(project_root, "local_dependency", local_dependency_files)
@pytest.fixture(scope="class")
def models(self):
return {"table_model.sql": enabled_model_sql}
@pytest.fixture(scope="class")
def packages(self):
return {"packages": [{"local": "local_dependency"}]}
def test_duplicate_model_name_with_test_across_packages(self, project):
run_dbt(["deps"])
manifest = run_dbt(["parse"])
assert len(manifest.nodes) == 3
# model nodes with duplicate names exist
local_dep_model_node_id = "model.local_dep.table_model"
root_model_node_id = "model.test.table_model"
assert local_dep_model_node_id in manifest.nodes
assert root_model_node_id in manifest.nodes
# test node exists and is attached to correct node
test_node_id = "test.local_dep.unique_table_model_id.1da9e464d9"
assert test_node_id in manifest.nodes
assert manifest.nodes[test_node_id].attached_node == local_dep_model_node_id
class TestDuplicateModelNameWithVersionAcrossPackages:
@pytest.fixture(scope="class", autouse=True)
def setUp(self, project_root):
local_dependency_files = {
"dbt_project.yml": dbt_project_yml,
"models": {
"table_model.sql": enabled_model_sql,
"schema.yml": local_dep_versions_schema_yml,
},
}
write_project_files(project_root, "local_dependency", local_dependency_files)
@pytest.fixture(scope="class")
def models(self):
return {"table_model.sql": enabled_model_sql}
@pytest.fixture(scope="class")
def packages(self):
return {"packages": [{"local": "local_dependency"}]}
def test_duplicate_model_name_with_test_across_packages(self, project):
run_dbt(["deps"])
manifest = run_dbt(["parse"])
assert len(manifest.nodes) == 2
# model nodes with duplicate names exist
local_dep_model_node_id = "model.local_dep.table_model.v1"
root_model_node_id = "model.test.table_model"
assert local_dep_model_node_id in manifest.nodes
assert root_model_node_id in manifest.nodes
class TestModelTestOverlap:
@pytest.fixture(scope="class")
def models(self):

View File

@@ -142,24 +142,6 @@ class TestGraphSelection(SelectionFixtures):
check_result_nodes_by_name(results, ["subdir"])
assert_correct_schemas(project)
# Check that list command works
os.chdir(
project.profiles_dir
) # Change to random directory to test that Path selector works with project-dir
results = run_dbt(
[
"-q",
"ls",
"-s",
"path:models/test/subdir.sql",
"--project-dir",
str(project.project_root),
]
# ["list", "--project-dir", str(project.project_root), "--select", "models/test/subdir*"]
)
print(f"--- results: {results}")
assert len(results) == 1
def test_locally_qualified_name_model_with_dots(self, project):
results = run_dbt(["run", "--select", "alternative.users"], expect_pass=False)
check_result_nodes_by_name(results, ["alternative.users"])
@@ -286,22 +268,3 @@ class TestGraphSelection(SelectionFixtures):
"users",
],
)
class TestListPathGraphSelection(SelectionFixtures):
def test_list_select_with_project_dir(self, project):
# Check that list command works
os.chdir(
project.profiles_dir
) # Change to random directory to test that Path selector works with project-dir
results = run_dbt(
[
"-q",
"ls",
"-s",
"path:models/test/subdir.sql",
"--project-dir",
str(project.project_root),
]
)
assert results == ["test.test.subdir"]

View File

@@ -4,12 +4,6 @@ models__dep_macro = """
}}
"""
models__materialization_macro = """
{{
materialization_macro()
}}
"""
models__with_undefined_macro = """
{{ dispatch_to_nowhere() }}
select 1 as id
@@ -81,12 +75,6 @@ macros__my_macros = """
{% endmacro %}
"""
macros__named_materialization = """
{% macro materialization_macro() %}
select 1 as foo
{% endmacro %}
"""
macros__no_default_macros = """
{% macro do_something2(foo2, bar2) %}

View File

@@ -20,14 +20,12 @@ from tests.functional.macros.fixtures import (
models__override_get_columns_macros,
models__deprecated_adapter_macro_model,
models__incorrect_dispatch,
models__materialization_macro,
macros__my_macros,
macros__no_default_macros,
macros__override_get_columns_macros,
macros__package_override_get_columns_macros,
macros__deprecated_adapter_macro,
macros__incorrect_dispatch,
macros__named_materialization,
)
@@ -80,21 +78,6 @@ class TestMacros:
check_relations_equal(project.adapter, ["expected_local_macro", "local_macro"])
class TestMacrosNamedMaterialization:
@pytest.fixture(scope="class")
def models(self):
return {
"models_materialization_macro.sql": models__materialization_macro,
}
@pytest.fixture(scope="class")
def macros(self):
return {"macros_named_materialization.sql": macros__named_materialization}
def test_macro_with_materialization_in_name_works(self, project):
run_dbt(expect_pass=True)
class TestInvalidMacros:
@pytest.fixture(scope="class")
def models(self):

View File

@@ -1,5 +1,3 @@
from dbt.contracts.graph.nodes import ModelNode
from dbt.contracts.results import RunExecutionResult, RunResult
import pytest
from dbt.tests.util import run_dbt
@@ -55,16 +53,6 @@ models:
"""
SUPPRESSED_CTE_EXPECTED_OUTPUT = """-- fct_eph_first.sql
with int_eph_first as(
select * from __dbt__cte__int_eph_first
)
select * from int_eph_first"""
class TestEphemeralCompilation:
@pytest.fixture(scope="class")
def models(self):
@@ -79,13 +67,5 @@ class TestEphemeralCompilation:
results = run_dbt(["run"])
assert len(results) == 0
def test__suppress_injected_ctes(self, project):
compile_output = run_dbt(
["compile", "--no-inject-ephemeral-ctes", "--select", "fct_eph_first"]
)
assert isinstance(compile_output, RunExecutionResult)
node_result = compile_output.results[0]
assert isinstance(node_result, RunResult)
node = node_result.node
assert isinstance(node, ModelNode)
assert node.compiled_code == SUPPRESSED_CTE_EXPECTED_OUTPUT
results = run_dbt(["test"])
len(results) == 4

View File

@@ -70,7 +70,7 @@ metrics:
type_params:
measure:
name: "years_tenure"
filter: "{{ Dimension('id__loves_dbt') }} is true"
filter: "{{dimension('loves_dbt')}} is true"
- name: average_tenure
label: "Average tenure"
@@ -115,7 +115,7 @@ metrics:
type_params:
measure:
name: years_tenure
filter: "{{ Dimension('id__loves_dbt') }} is true"
filter: "{{dimension('loves_dbt')}} is true"
- name: collective_window
label: "Collective window"
@@ -124,7 +124,7 @@ metrics:
type_params:
measure:
name: years_tenure
filter: "{{ Dimension('id__loves_dbt') }} is true"
filter: "{{dimension('loves_dbt')}} is true"
window: 14 days
- name: average_tenure
@@ -452,7 +452,7 @@ metrics:
type_params:
measure:
name: years_tenure
filter: "{{ Dimension('id__loves_dbt') }} is true"
filter: "{{dimension('loves_dbt')}} is true"
"""
@@ -479,7 +479,7 @@ metrics:
type_params:
measure:
name: years_tenure
filter: "{{ Dimension('id__loves_dbt') }} is true"
filter: "{{dimension('loves_dbt')}} is true"
"""

View File

@@ -353,7 +353,7 @@ metrics:
type_params:
measure:
name: customers
filter: "{{ Dimension('id__loves_dbt') }} is true"
filter: "{{dimension('loves_dbt')}} is true"
+meta:
is_okr: True
tags:
@@ -472,7 +472,7 @@ metrics:
type_params:
measure:
name: years_tenure
filter: "{{ Dimension('id__loves_dbt') }} is true"
filter: "{{dimension('loves_dbt')}} is true"
"""
@@ -619,7 +619,7 @@ metrics:
type_params:
measure:
name: years_tenure
filter: "{{ Dimension('id__loves_dbt') }} is true"
filter: "{{dimension('loves_dbt')}} is true"
"""
@@ -1008,7 +1008,7 @@ metrics:
type_params:
measure:
name: years_tenure
filter: "{{ Dimension('id__loves_dbt') }} is true"
filter: "{{dimension('loves_dbt')}} is true"
"""

View File

@@ -1,8 +1,6 @@
import os
import pytest
from dbt.tests.util import run_dbt, write_artifact, write_file
from tests.functional.partial_parsing.fixtures import model_one_sql, model_two_sql
from dbt.tests.util import run_dbt, write_artifact
first_file_diff = {
@@ -19,7 +17,7 @@ second_file_diff = {
}
class TestFileDiffPaths:
class TestFileDiffs:
def test_file_diffs(self, project):
os.environ["DBT_PP_FILE_DIFF_TEST"] = "true"
@@ -37,27 +35,3 @@ class TestFileDiffPaths:
write_artifact(second_file_diff, "file_diff.json")
results = run_dbt()
assert len(results) == 2
class TestFileDiffs:
@pytest.fixture(scope="class")
def models(self):
return {
"model_one.sql": model_one_sql,
}
def test_no_file_diffs(self, project):
# We start with a project with one model
manifest = run_dbt(["parse"])
assert len(manifest.nodes) == 1
# add a model file
write_file(model_two_sql, project.project_root, "models", "model_two.sql")
# parse without computing a file diff
manifest = run_dbt(["--partial-parse", "--no-partial-parse-file-diff", "parse"])
assert len(manifest.nodes) == 1
# default behaviour - parse with computing a file diff
manifest = run_dbt(["--partial-parse", "parse"])
assert len(manifest.nodes) == 2

View File

@@ -1,5 +1,4 @@
import pytest
from unittest import mock
from dbt.tests.util import run_dbt, get_manifest, write_file, rm_file, run_dbt_and_capture
from dbt.tests.fixtures.project import write_project_files
@@ -9,6 +8,9 @@ from tests.functional.partial_parsing.fixtures import (
models_schema1_yml,
models_schema2_yml,
models_schema2b_yml,
models_versions_schema_yml,
models_versions_defined_in_schema_yml,
models_versions_updated_schema_yml,
model_three_sql,
model_three_modified_sql,
model_four1_sql,
@@ -69,10 +71,9 @@ from tests.functional.partial_parsing.fixtures import (
groups_schema_yml_two_groups_private_orders_invalid_access,
)
from dbt.exceptions import CompilationError, ParsingError
from dbt.exceptions import CompilationError, ParsingError, DuplicateVersionedUnversionedError
from dbt.contracts.files import ParseFileType
from dbt.contracts.results import TestStatus
from dbt.plugins.manifest import PluginNodes, ModelNodeArgs
import re
import os
@@ -302,6 +303,72 @@ class TestModels:
assert model_id not in manifest.disabled
class TestVersionedModels:
@pytest.fixture(scope="class")
def models(self):
return {
"model_one_v1.sql": model_one_sql,
"model_one.sql": model_one_sql,
"model_one_downstream.sql": model_four2_sql,
"schema.yml": models_versions_schema_yml,
}
def test_pp_versioned_models(self, project):
results = run_dbt(["run"])
assert len(results) == 3
manifest = get_manifest(project.project_root)
model_one_node = manifest.nodes["model.test.model_one.v1"]
assert not model_one_node.is_latest_version
model_two_node = manifest.nodes["model.test.model_one.v2"]
assert model_two_node.is_latest_version
# assert unpinned ref points to latest version
model_one_downstream_node = manifest.nodes["model.test.model_one_downstream"]
assert model_one_downstream_node.depends_on.nodes == ["model.test.model_one.v2"]
# update schema.yml block - model_one is now 'defined_in: model_one_different'
rm_file(project.project_root, "models", "model_one.sql")
write_file(model_one_sql, project.project_root, "models", "model_one_different.sql")
write_file(
models_versions_defined_in_schema_yml, project.project_root, "models", "schema.yml"
)
results = run_dbt(["--partial-parse", "run"])
assert len(results) == 3
# update versions schema.yml block - latest_version from 2 to 1
write_file(
models_versions_updated_schema_yml, project.project_root, "models", "schema.yml"
)
results, log_output = run_dbt_and_capture(
["--partial-parse", "--log-format", "json", "run"]
)
assert len(results) == 3
manifest = get_manifest(project.project_root)
model_one_node = manifest.nodes["model.test.model_one.v1"]
assert model_one_node.is_latest_version
model_two_node = manifest.nodes["model.test.model_one.v2"]
assert not model_two_node.is_latest_version
# assert unpinned ref points to latest version
model_one_downstream_node = manifest.nodes["model.test.model_one_downstream"]
assert model_one_downstream_node.depends_on.nodes == ["model.test.model_one.v1"]
# assert unpinned ref to latest-not-max version yields an "FYI" info-level log
assert "UnpinnedRefNewVersionAvailable" in log_output
# update versioned model
write_file(model_two_sql, project.project_root, "models", "model_one_different.sql")
results = run_dbt(["--partial-parse", "run"])
assert len(results) == 3
manifest = get_manifest(project.project_root)
assert len(manifest.nodes) == 3
print(f"--- nodes: {manifest.nodes.keys()}")
# create a new model_one in model_one.sql and re-parse
write_file(model_one_sql, project.project_root, "models", "model_one.sql")
with pytest.raises(DuplicateVersionedUnversionedError):
run_dbt(["parse"])
class TestSources:
@pytest.fixture(scope="class")
def models(self):
@@ -738,111 +805,3 @@ class TestGroups:
)
with pytest.raises(ParsingError):
results = run_dbt(["--partial-parse", "run"])
class TestExternalModels:
@pytest.fixture(scope="class")
def external_model_node(self):
return ModelNodeArgs(
name="external_model",
package_name="external",
identifier="test_identifier",
schema="test_schema",
)
@pytest.fixture(scope="class")
def external_model_node_versioned(self):
return ModelNodeArgs(
name="external_model_versioned",
package_name="external",
identifier="test_identifier_v1",
schema="test_schema",
version=1,
)
@pytest.fixture(scope="class")
def external_model_node_depends_on(self):
return ModelNodeArgs(
name="external_model_depends_on",
package_name="external",
identifier="test_identifier_depends_on",
schema="test_schema",
depends_on_nodes=["model.external.external_model_depends_on_parent"],
)
@pytest.fixture(scope="class")
def external_model_node_depends_on_parent(self):
return ModelNodeArgs(
name="external_model_depends_on_parent",
package_name="external",
identifier="test_identifier_depends_on_parent",
schema="test_schema",
)
@pytest.fixture(scope="class")
def models(self):
return {"model_one.sql": model_one_sql}
@mock.patch("dbt.plugins.get_plugin_manager")
def test_pp_external_models(
self,
get_plugin_manager,
project,
external_model_node,
external_model_node_versioned,
external_model_node_depends_on,
external_model_node_depends_on_parent,
):
# initial plugin - one external model
external_nodes = PluginNodes()
external_nodes.add_model(external_model_node)
get_plugin_manager.return_value.get_nodes.return_value = external_nodes
# initial parse
manifest = run_dbt(["parse"])
assert len(manifest.nodes) == 2
assert set(manifest.nodes.keys()) == {
"model.external.external_model",
"model.test.model_one",
}
assert len(manifest.external_node_unique_ids) == 1
assert manifest.external_node_unique_ids == ["model.external.external_model"]
# add a model file
write_file(model_two_sql, project.project_root, "models", "model_two.sql")
manifest = run_dbt(["--partial-parse", "parse"])
assert len(manifest.nodes) == 3
# add an external model
external_nodes.add_model(external_model_node_versioned)
manifest = run_dbt(["--partial-parse", "parse"])
assert len(manifest.nodes) == 4
assert len(manifest.external_node_unique_ids) == 2
# add a model file that depends on external model
write_file(
"SELECT * FROM {{ref('external', 'external_model')}}",
project.project_root,
"models",
"model_depends_on_external.sql",
)
manifest = run_dbt(["--partial-parse", "parse"])
assert len(manifest.nodes) == 5
assert len(manifest.external_node_unique_ids) == 2
# remove a model file that depends on external model
rm_file(project.project_root, "models", "model_depends_on_external.sql")
manifest = run_dbt(["--partial-parse", "parse"])
assert len(manifest.nodes) == 4
# add an external node with depends on
external_nodes.add_model(external_model_node_depends_on)
external_nodes.add_model(external_model_node_depends_on_parent)
manifest = run_dbt(["--partial-parse", "parse"])
assert len(manifest.nodes) == 6
assert len(manifest.external_node_unique_ids) == 4
# skip files parsing - ensure no issues
run_dbt(["--partial-parse", "parse"])
assert len(manifest.nodes) == 6
assert len(manifest.external_node_unique_ids) == 4

View File

@@ -1,126 +0,0 @@
import pytest
import pathlib
from dbt.tests.util import (
run_dbt,
get_manifest,
write_file,
rm_file,
read_file,
)
from dbt.exceptions import DuplicateVersionedUnversionedError
model_one_sql = """
select 1 as fun
"""
model_one_downstream_sql = """
select fun from {{ ref('model_one') }}
"""
models_versions_schema_yml = """
models:
- name: model_one
description: "The first model"
versions:
- v: 1
- v: 2
"""
models_versions_defined_in_schema_yml = """
models:
- name: model_one
description: "The first model"
versions:
- v: 1
- v: 2
defined_in: model_one_different
"""
models_versions_updated_schema_yml = """
models:
- name: model_one
latest_version: 1
description: "The first model"
versions:
- v: 1
- v: 2
defined_in: model_one_different
"""
model_two_sql = """
select 1 as notfun
"""
class TestVersionedModels:
@pytest.fixture(scope="class")
def models(self):
return {
"model_one_v1.sql": model_one_sql,
"model_one.sql": model_one_sql,
"model_one_downstream.sql": model_one_downstream_sql,
"schema.yml": models_versions_schema_yml,
}
def test_pp_versioned_models(self, project):
results = run_dbt(["run"])
assert len(results) == 3
manifest = get_manifest(project.project_root)
model_one_node = manifest.nodes["model.test.model_one.v1"]
assert not model_one_node.is_latest_version
model_two_node = manifest.nodes["model.test.model_one.v2"]
assert model_two_node.is_latest_version
# assert unpinned ref points to latest version
model_one_downstream_node = manifest.nodes["model.test.model_one_downstream"]
assert model_one_downstream_node.depends_on.nodes == ["model.test.model_one.v2"]
# update schema.yml block - model_one is now 'defined_in: model_one_different'
rm_file(project.project_root, "models", "model_one.sql")
write_file(model_one_sql, project.project_root, "models", "model_one_different.sql")
write_file(
models_versions_defined_in_schema_yml, project.project_root, "models", "schema.yml"
)
results = run_dbt(["--partial-parse", "run"])
assert len(results) == 3
# update versions schema.yml block - latest_version from 2 to 1
write_file(
models_versions_updated_schema_yml, project.project_root, "models", "schema.yml"
)
# This is where the test was failing in a CI run with:
# relation \"test..._test_partial_parsing.model_one_downstream\" does not exist
# because in core/dbt/include/global_project/macros/materializations/models/view/view.sql
# "existing_relation" didn't actually exist by the time it gets to the rename of the
# existing relation.
(pathlib.Path(project.project_root) / "log_output").mkdir(parents=True, exist_ok=True)
results = run_dbt(
["--partial-parse", "--log-format-file", "json", "--log-path", "log_output", "run"]
)
assert len(results) == 3
manifest = get_manifest(project.project_root)
model_one_node = manifest.nodes["model.test.model_one.v1"]
assert model_one_node.is_latest_version
model_two_node = manifest.nodes["model.test.model_one.v2"]
assert not model_two_node.is_latest_version
# assert unpinned ref points to latest version
model_one_downstream_node = manifest.nodes["model.test.model_one_downstream"]
assert model_one_downstream_node.depends_on.nodes == ["model.test.model_one.v1"]
# assert unpinned ref to latest-not-max version yields an "FYI" info-level log
log_output = read_file("log_output", "dbt.log").replace("\n", " ").replace("\\n", " ")
assert "UnpinnedRefNewVersionAvailable" in log_output
# update versioned model
write_file(model_two_sql, project.project_root, "models", "model_one_different.sql")
results = run_dbt(["--partial-parse", "run"])
assert len(results) == 3
manifest = get_manifest(project.project_root)
assert len(manifest.nodes) == 3
# create a new model_one in model_one.sql and re-parse
write_file(model_one_sql, project.project_root, "models", "model_one.sql")
with pytest.raises(DuplicateVersionedUnversionedError):
run_dbt(["parse"])
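The `UnpinnedRefNewVersionAvailable` assertion above first flattens the log file so a multi-line JSON log can be searched with plain substring matching. The same normalization in a stdlib-only sketch (log text invented for illustration):

```python
def flatten_log(text: str) -> str:
    # Collapse both real newlines and backslash-escaped newlines to spaces
    # so a substring check works across line breaks.
    return text.replace("\n", " ").replace("\\n", " ")

sample = '{"name": "UnpinnedRefNewVersionAvailable",\n "level": "info"}'
assert "UnpinnedRefNewVersionAvailable" in flatten_log(sample)
```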

View File

@@ -49,12 +49,6 @@ semantic_models:
agg_time_dimension: ds
agg_params:
percentile: 0.99
- name: test_non_additive
expr: txn_revenue
agg: sum
non_additive_dimension:
name: ds
window_choice: max
dimensions:
- name: ds
@@ -67,8 +61,6 @@ semantic_models:
- name: user
type: foreign
expr: user_id
- name: id
type: primary
metrics:
- name: records_with_revenue
@@ -131,7 +123,7 @@ class TestSemanticModelParsing:
semantic_model.node_relation.relation_name
== f'"dbt"."{project.test_schema}"."fct_revenue"'
)
assert len(semantic_model.measures) == 6
assert len(semantic_model.measures) == 5
def test_semantic_model_error(self, project):
# Next, modify the default schema.yml to remove the semantic model.

View File

@@ -2,31 +2,6 @@ models__sample_model = """
select * from {{ ref('sample_seed') }}
"""
models__sample_number_model = """
select
cast(1.0 as int) as float_to_int_field,
3.0 as float_field,
4.3 as float_with_dec_field,
5 as int_field
"""
models__sample_number_model_with_nulls = """
select
cast(1.0 as int) as float_to_int_field,
3.0 as float_field,
4.3 as float_with_dec_field,
5 as int_field
union all
select
cast(null as int) as float_to_int_field,
cast(null as float) as float_field,
cast(null as float) as float_with_dec_field,
cast(null as int) as int_field
"""
models__second_model = """
select
sample_num as col_one,

View File

@@ -6,8 +6,6 @@ from tests.functional.show.fixtures import (
models__second_ephemeral_model,
seeds__sample_seed,
models__sample_model,
models__sample_number_model,
models__sample_number_model_with_nulls,
models__second_model,
models__ephemeral_model,
schema_yml,
@@ -16,13 +14,11 @@ from tests.functional.show.fixtures import (
)
class ShowBase:
class TestShow:
@pytest.fixture(scope="class")
def models(self):
return {
"sample_model.sql": models__sample_model,
"sample_number_model.sql": models__sample_number_model,
"sample_number_model_with_nulls.sql": models__sample_number_model_with_nulls,
"second_model.sql": models__second_model,
"ephemeral_model.sql": models__ephemeral_model,
"sql_header.sql": models__sql_header,
@@ -32,122 +28,66 @@ class ShowBase:
def seeds(self):
return {"sample_seed.csv": seeds__sample_seed}
@pytest.fixture(scope="class", autouse=True)
def setup(self, project):
run_dbt(["seed"])
class TestShowNone(ShowBase):
def test_none(self, project):
with pytest.raises(
DbtRuntimeError, match="Either --select or --inline must be passed to show"
):
run_dbt(["seed"])
run_dbt(["show"])
class TestShowSelectText(ShowBase):
def test_select_model_text(self, project):
run_dbt(["build"])
(_, log_output) = run_dbt_and_capture(["show", "--select", "second_model"])
(results, log_output) = run_dbt_and_capture(["show", "--select", "second_model"])
assert "Previewing node 'sample_model'" not in log_output
assert "Previewing node 'second_model'" in log_output
assert "col_one" in log_output
assert "col_two" in log_output
assert "answer" in log_output
class TestShowMultiple(ShowBase):
def test_select_multiple_model_text(self, project):
run_dbt(["build"])
(_, log_output) = run_dbt_and_capture(["show", "--select", "sample_model second_model"])
(results, log_output) = run_dbt_and_capture(
["show", "--select", "sample_model second_model"]
)
assert "Previewing node 'sample_model'" in log_output
assert "sample_num" in log_output
assert "sample_bool" in log_output
class TestShowSingle(ShowBase):
def test_select_single_model_json(self, project):
run_dbt(["build"])
(_, log_output) = run_dbt_and_capture(
(results, log_output) = run_dbt_and_capture(
["show", "--select", "sample_model", "--output", "json"]
)
assert "Previewing node 'sample_model'" not in log_output
assert "sample_num" in log_output
assert "sample_bool" in log_output
class TestShowNumeric(ShowBase):
def test_numeric_values(self, project):
run_dbt(["build"])
(_, log_output) = run_dbt_and_capture(
["show", "--select", "sample_number_model", "--output", "json"]
)
# json log output needs the escapes removed for string matching
log_output = log_output.replace("\\", "")
assert "Previewing node 'sample_number_model'" not in log_output
assert '"float_to_int_field": 1.0' not in log_output
assert '"float_to_int_field": 1' in log_output
assert '"float_field": 3.0' in log_output
assert '"float_with_dec_field": 4.3' in log_output
assert '"int_field": 5' in log_output
assert '"int_field": 5.0' not in log_output
class TestShowNumericNulls(ShowBase):
def test_numeric_values_with_nulls(self, project):
run_dbt(["build"])
(_, log_output) = run_dbt_and_capture(
["show", "--select", "sample_number_model_with_nulls", "--output", "json"]
)
# json log output needs the escapes removed for string matching
log_output = log_output.replace("\\", "")
assert "Previewing node 'sample_number_model_with_nulls'" not in log_output
assert '"float_to_int_field": 1.0' not in log_output
assert '"float_to_int_field": 1' in log_output
assert '"float_field": 3.0' in log_output
assert '"float_with_dec_field": 4.3' in log_output
assert '"int_field": 5' in log_output
assert '"int_field": 5.0' not in log_output
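The numeric assertions above hinge on JSON rendering: a column cast to integer must serialize as `1`, never `1.0`, while true floats keep their decimal. Plain `json.dumps` already behaves this way for Python ints and floats, which is what these string matches rely on:

```python
import json

row = {
    "float_to_int_field": int(1.0),  # cast to int, as in the fixture SQL
    "float_field": 3.0,
    "float_with_dec_field": 4.3,
    "int_field": 5,
}
rendered = json.dumps(row)
assert '"float_to_int_field": 1' in rendered   # not 1.0
assert '"float_field": 3.0' in rendered        # decimal preserved
assert '"int_field": 5' in rendered
assert '"int_field": 5.0' not in rendered
```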
class TestShowInline(ShowBase):
def test_inline_pass(self, project):
run_dbt(["build"])
(_, log_output) = run_dbt_and_capture(
(results, log_output) = run_dbt_and_capture(
["show", "--inline", "select * from {{ ref('sample_model') }}"]
)
assert "Previewing inline node" in log_output
assert "sample_num" in log_output
assert "sample_bool" in log_output
class TestShowInlineFail(ShowBase):
def test_inline_fail(self, project):
run_dbt(["build"])
with pytest.raises(DbtException, match="Error parsing inline query"):
run_dbt(["show", "--inline", "select * from {{ ref('third_model') }}"])
class TestShowInlineFailDB(ShowBase):
def test_inline_fail_database_error(self, project):
with pytest.raises(DbtRuntimeError, match="Database Error"):
run_dbt(["show", "--inline", "slect asdlkjfsld;j"])
class TestShowEphemeral(ShowBase):
def test_ephemeral_model(self, project):
run_dbt(["build"])
(_, log_output) = run_dbt_and_capture(["show", "--select", "ephemeral_model"])
(results, log_output) = run_dbt_and_capture(["show", "--select", "ephemeral_model"])
assert "col_deci" in log_output
class TestShowSecondEphemeral(ShowBase):
def test_second_ephemeral_model(self, project):
run_dbt(["build"])
(_, log_output) = run_dbt_and_capture(["show", "--inline", models__second_ephemeral_model])
(results, log_output) = run_dbt_and_capture(
["show", "--inline", models__second_ephemeral_model]
)
assert "col_hundo" in log_output
class TestShowLimit(ShowBase):
@pytest.mark.parametrize(
"args,expected",
[
@@ -159,20 +99,16 @@ class TestShowLimit(ShowBase):
def test_limit(self, project, args, expected):
run_dbt(["build"])
dbt_args = ["show", "--inline", models__second_ephemeral_model, *args]
results = run_dbt(dbt_args)
results, log_output = run_dbt_and_capture(dbt_args)
assert len(results.results[0].agate_table) == expected
class TestShowSeed(ShowBase):
def test_seed(self, project):
(_, log_output) = run_dbt_and_capture(["show", "--select", "sample_seed"])
(results, log_output) = run_dbt_and_capture(["show", "--select", "sample_seed"])
assert "Previewing node 'sample_seed'" in log_output
class TestShowSqlHeader(ShowBase):
def test_sql_header(self, project):
run_dbt(["build"])
(_, log_output) = run_dbt_and_capture(["show", "--select", "sql_header"])
(results, log_output) = run_dbt_and_capture(["show", "--select", "sql_header"])
assert "Asia/Kolkata" in log_output
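Several of the changes in this file swap `(_, log_output)` for `(results, log_output)` so the return value of `run_dbt_and_capture` can be asserted on as well. A minimal stdlib analogue of that capture pattern (the helper name and shape here are illustrative, not dbt's implementation):

```python
import contextlib
import io

def run_and_capture(fn, *args):
    # Capture anything the callable writes to stdout and return the result
    # alongside the captured text -- the same (results, log_output) shape
    # the tests above destructure from run_dbt_and_capture.
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        result = fn(*args)
    return result, buf.getvalue()

_, log_output = run_and_capture(print, "Previewing node 'second_model'")
assert "Previewing node 'second_model'" in log_output
```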


@@ -96,18 +96,6 @@ snapshots:
owner: 'a_owner'
"""
models__schema_with_target_schema_yml = """
version: 2
snapshots:
- name: snapshot_actual
tests:
- mutually_exclusive_ranges
config:
meta:
owner: 'a_owner'
target_schema: schema_from_schema_yml
"""
models__ref_snapshot_sql = """
select * from {{ ref('snapshot_actual') }}
"""
@@ -293,26 +281,6 @@ snapshots_pg__snapshot_sql = """
{% endsnapshot %}
"""
snapshots_pg__snapshot_no_target_schema_sql = """
{% snapshot snapshot_actual %}
{{
config(
target_database=var('target_database', database),
unique_key='id || ' ~ "'-'" ~ ' || first_name',
strategy='timestamp',
updated_at='updated_at',
)
}}
{% if var('invalidate_hard_deletes', 'false') | as_bool %}
{{ config(invalidate_hard_deletes=True) }}
{% endif %}
select * from {{target.database}}.{{target.schema}}.seed
{% endsnapshot %}
"""
models_slow__gen_sql = """


@@ -2,15 +2,13 @@ import os
from datetime import datetime
import pytz
import pytest
from dbt.tests.util import run_dbt, check_relations_equal, relation_from_name, write_file
from dbt.tests.util import run_dbt, check_relations_equal, relation_from_name
from tests.functional.simple_snapshot.fixtures import (
models__schema_yml,
models__schema_with_target_schema_yml,
models__ref_snapshot_sql,
seeds__seed_newcol_csv,
seeds__seed_csv,
snapshots_pg__snapshot_sql,
snapshots_pg__snapshot_no_target_schema_sql,
macros__test_no_overlaps_sql,
macros_custom_snapshot__custom_sql,
snapshots_pg_custom_namespaced__snapshot_sql,
@@ -125,41 +123,6 @@ class TestBasicRef(Basic):
ref_setup(project, num_snapshot_models=1)
class TestBasicTargetSchemaConfig(Basic):
@pytest.fixture(scope="class")
def snapshots(self):
return {"snapshot.sql": snapshots_pg__snapshot_no_target_schema_sql}
@pytest.fixture(scope="class")
def project_config_update(self, unique_schema):
return {
"snapshots": {
"test": {
"target_schema": unique_schema + "_alt",
}
}
}
def test_target_schema(self, project):
manifest = run_dbt(["parse"])
assert len(manifest.nodes) == 5
# ensure that the schema in the snapshot node is the same as target_schema
snapshot_id = "snapshot.test.snapshot_actual"
snapshot_node = manifest.nodes[snapshot_id]
assert snapshot_node.schema == f"{project.test_schema}_alt"
assert (
snapshot_node.relation_name
== f'"{project.database}"."{project.test_schema}_alt"."snapshot_actual"'
)
assert snapshot_node.meta == {"owner": "a_owner"}
# write out schema.yml file and check again
write_file(models__schema_with_target_schema_yml, "models", "schema.yml")
manifest = run_dbt(["parse"])
snapshot_node = manifest.nodes[snapshot_id]
assert snapshot_node.schema == "schema_from_schema_yml"
class CustomNamespace:
@pytest.fixture(scope="class")
def snapshots(self):


@@ -121,64 +121,39 @@ class TestAgateHelper(unittest.TestCase):
self.assertEqual(tbl[0][0], expected)
def test_merge_allnull(self):
t1 = agate_helper.table_from_rows([(1, "a", None), (2, "b", None)], ("a", "b", "c"))
t2 = agate_helper.table_from_rows([(3, "c", None), (4, "d", None)], ("a", "b", "c"))
t1 = agate.Table([(1, "a", None), (2, "b", None)], ("a", "b", "c"))
t2 = agate.Table([(3, "c", None), (4, "d", None)], ("a", "b", "c"))
result = agate_helper.merge_tables([t1, t2])
self.assertEqual(result.column_names, ("a", "b", "c"))
assert isinstance(result.column_types[0], agate_helper.Integer)
assert isinstance(result.column_types[0], agate.data_types.Number)
assert isinstance(result.column_types[1], agate.data_types.Text)
assert isinstance(result.column_types[2], agate_helper.Integer)
assert isinstance(result.column_types[2], agate.data_types.Number)
self.assertEqual(len(result), 4)
def test_merge_mixed(self):
t1 = agate_helper.table_from_rows(
[(1, "a", None, None), (2, "b", None, None)], ("a", "b", "c", "d")
)
t2 = agate_helper.table_from_rows(
[(3, "c", "dog", 1), (4, "d", "cat", 5)], ("a", "b", "c", "d")
)
t3 = agate_helper.table_from_rows(
[(3, "c", None, 1.5), (4, "d", None, 3.5)], ("a", "b", "c", "d")
)
t1 = agate.Table([(1, "a", None), (2, "b", None)], ("a", "b", "c"))
t2 = agate.Table([(3, "c", "dog"), (4, "d", "cat")], ("a", "b", "c"))
t3 = agate.Table([(3, "c", None), (4, "d", None)], ("a", "b", "c"))
result = agate_helper.merge_tables([t1, t2])
self.assertEqual(result.column_names, ("a", "b", "c", "d"))
assert isinstance(result.column_types[0], agate_helper.Integer)
self.assertEqual(result.column_names, ("a", "b", "c"))
assert isinstance(result.column_types[0], agate.data_types.Number)
assert isinstance(result.column_types[1], agate.data_types.Text)
assert isinstance(result.column_types[2], agate.data_types.Text)
assert isinstance(result.column_types[3], agate_helper.Integer)
self.assertEqual(len(result), 4)
result = agate_helper.merge_tables([t1, t3])
self.assertEqual(result.column_names, ("a", "b", "c", "d"))
assert isinstance(result.column_types[0], agate_helper.Integer)
assert isinstance(result.column_types[1], agate.data_types.Text)
assert isinstance(result.column_types[2], agate_helper.Integer)
assert isinstance(result.column_types[3], agate.data_types.Number)
self.assertEqual(len(result), 4)
result = agate_helper.merge_tables([t2, t3])
self.assertEqual(result.column_names, ("a", "b", "c", "d"))
assert isinstance(result.column_types[0], agate_helper.Integer)
self.assertEqual(result.column_names, ("a", "b", "c"))
assert isinstance(result.column_types[0], agate.data_types.Number)
assert isinstance(result.column_types[1], agate.data_types.Text)
assert isinstance(result.column_types[2], agate.data_types.Text)
assert isinstance(result.column_types[3], agate.data_types.Number)
self.assertEqual(len(result), 4)
result = agate_helper.merge_tables([t3, t2])
self.assertEqual(result.column_names, ("a", "b", "c", "d"))
assert isinstance(result.column_types[0], agate_helper.Integer)
assert isinstance(result.column_types[1], agate.data_types.Text)
assert isinstance(result.column_types[2], agate.data_types.Text)
assert isinstance(result.column_types[3], agate.data_types.Number)
self.assertEqual(len(result), 4)
result = agate_helper.merge_tables([t1, t2, t3])
self.assertEqual(result.column_names, ("a", "b", "c", "d"))
assert isinstance(result.column_types[0], agate_helper.Integer)
self.assertEqual(result.column_names, ("a", "b", "c"))
assert isinstance(result.column_types[0], agate.data_types.Number)
assert isinstance(result.column_types[1], agate.data_types.Text)
assert isinstance(result.column_types[2], agate.data_types.Text)
assert isinstance(result.column_types[3], agate.data_types.Number)
self.assertEqual(len(result), 6)
def test_nocast_string_types(self):
@@ -216,7 +191,7 @@ class TestAgateHelper(unittest.TestCase):
self.assertEqual(len(tbl), len(result_set))
assert isinstance(tbl.column_types[0], agate.data_types.Boolean)
assert isinstance(tbl.column_types[1], agate_helper.Integer)
assert isinstance(tbl.column_types[1], agate.data_types.Number)
expected = [
[True, Decimal(1)],
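The merge tests above check that a column which is entirely null in one table takes its type from the tables where it has values. A simplified, pure-Python sketch of that null-aware type inference (agate's real implementation differs):

```python
def merge_column_type(values):
    # Infer one column's type across merged tables, ignoring nulls: a column
    # that is all-null in one table takes its type from the others. A toy
    # model of the behaviour the removed agate_helper tests exercised.
    non_null = [v for v in values if v is not None]
    if not non_null:
        return type(None)
    types = {type(v) for v in non_null}
    return types.pop() if len(types) == 1 else object

t1 = [(1, "a", None), (2, "b", None)]    # column c is all null here...
t2 = [(3, "c", "dog"), (4, "d", "cat")]  # ...but text here
columns = list(zip(*(t1 + t2)))
assert merge_column_type(columns[0]) is int
assert merge_column_type(columns[2]) is str
```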


@@ -57,11 +57,6 @@ class TestFlags:
assert hasattr(flags, "LOG_PATH")
assert getattr(flags, "LOG_PATH") == Path("logs")
def test_log_file_max_size_default(self, run_context):
flags = Flags(run_context)
assert hasattr(flags, "LOG_FILE_MAX_BYTES")
assert getattr(flags, "LOG_FILE_MAX_BYTES") == 10 * 1024 * 1024
@pytest.mark.parametrize(
"set_stats_param,do_not_track,expected_anonymous_usage_stats",
[
@@ -391,8 +386,3 @@ class TestFlags:
args_dict = {"which": "some bad command"}
with pytest.raises(DbtInternalError, match=r"does not match value of which"):
self._create_flags_from_dict(Command.RUN, args_dict)
def test_from_dict_0_value(self):
args_dict = {"log_file_max_bytes": 0}
flags = Flags.from_dict(Command.RUN, args_dict)
assert flags.LOG_FILE_MAX_BYTES == 0


@@ -424,9 +424,6 @@ def test_invocation_args_to_dict_in_macro_runtime_context(
# Comes from unit/utils.py config_from_parts_or_dicts method
assert ctx["invocation_args_dict"]["profile_dir"] == "/dev/null"
assert isinstance(ctx["invocation_args_dict"]["warn_error_options"], Dict)
assert ctx["invocation_args_dict"]["warn_error_options"] == {"include": [], "exclude": []}
def test_model_parse_context(config_postgres, manifest_fx, get_adapter, get_include_paths):
ctx = providers.generate_parser_model_context(


@@ -17,22 +17,3 @@ class TestModelNodeArgs:
version="1",
)
assert model_node_args.unique_id == "model.package.name.v1"
def test_model_node_args_fqn(self) -> None:
model_node_args = ModelNodeArgs(
name="name",
package_name="package",
identifier="identifier",
schema="schema",
)
assert model_node_args.fqn == ["package", "name"]
def test_model_node_args_fqn_with_version(self) -> None:
model_node_args = ModelNodeArgs(
name="name",
package_name="package",
identifier="identifier",
schema="schema",
version="1",
)
assert model_node_args.fqn == ["package", "name", "v1"]


@@ -452,8 +452,6 @@ unchanged_nodes = [
lambda u: (u, u.replace(alias="other")),
lambda u: (u, u.replace(schema="other")),
lambda u: (u, u.replace(database="other")),
# unchanged ref representations - protected is default
lambda u: (u, u.replace(access=AccessType.Protected)),
]
@@ -487,10 +485,6 @@ changed_nodes = [
lambda u: (u, replace_config(u, alias="other")),
lambda u: (u, replace_config(u, schema="other")),
lambda u: (u, replace_config(u, database="other")),
# changed ref representations
lambda u: (u, replace_config(u, access=AccessType.Public)),
lambda u: (u, replace_config(u, latest_version=2)),
lambda u: (u, replace_config(u, version=2)),
]


@@ -1,6 +1,5 @@
from datetime import timedelta
import pickle
import pytest
from datetime import timedelta
from dbt.contracts.graph.unparsed import (
UnparsedNode,
@@ -941,25 +940,3 @@ class TestUnparsedVersion(ContractTestCase):
version = self.get_ok_dict()
del version["v"]
self.assert_fails_validation(version)
@pytest.mark.parametrize(
"left,right,expected_lt",
[
# same types
(2, 12, True),
(12, 2, False),
("a", "b", True),
("b", "a", False),
# mismatched types - numeric
(2, 12.0, True),
(12.0, 2, False),
(2, "12", True),
("12", 2, False),
# mismatched types
(1, "test", True),
("test", 1, False),
],
)
def test_unparsed_version_lt(left, right, expected_lt):
assert (UnparsedVersion(left) < UnparsedVersion(right)) == expected_lt
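The removed parametrized cases imply an ordering in which numeric-looking values compare numerically and any number sorts before a non-numeric string. One sort key consistent with every case above (an assumption about `UnparsedVersion`'s intended semantics, not its actual code):

```python
def version_sort_key(v):
    # Values that parse as numbers compare numerically; any numeric value
    # sorts before any non-numeric string; strings compare lexically.
    # Matches the removed parametrized expectations, but is a sketch.
    try:
        return (0, float(v), "")
    except (TypeError, ValueError):
        return (1, 0.0, str(v))

assert version_sort_key(2) < version_sort_key(12)
assert version_sort_key("a") < version_sort_key("b")
assert version_sort_key(2) < version_sort_key("12")    # "12" parses as 12.0
assert version_sort_key(1) < version_sort_key("test")  # numbers sort first
```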


@@ -6,7 +6,7 @@ from unittest import mock
import dbt.deps
import dbt.exceptions
from dbt.deps.git import GitUnpinnedPackage
from dbt.deps.local import LocalUnpinnedPackage, LocalPinnedPackage
from dbt.deps.local import LocalUnpinnedPackage
from dbt.deps.tarball import TarballUnpinnedPackage
from dbt.deps.registry import RegistryUnpinnedPackage
from dbt.clients.registry import is_compatible_version
@@ -92,21 +92,6 @@ class TestGitPackage(unittest.TestCase):
self.assertEqual(a_pinned.source_type(), "git")
self.assertIs(a_pinned.warn_unpinned, True)
@mock.patch("shutil.copytree")
@mock.patch("dbt.deps.local.system.make_symlink")
@mock.patch("dbt.deps.local.LocalPinnedPackage.get_installation_path")
@mock.patch("dbt.deps.local.LocalPinnedPackage.resolve_path")
def test_deps_install(
self, mock_resolve_path, mock_get_installation_path, mock_symlink, mock_shutil
):
mock_resolve_path.return_value = "/tmp/source"
mock_get_installation_path.return_value = "/tmp/dest"
mock_symlink.side_effect = OSError("Install deps symlink error")
LocalPinnedPackage("local").install("dummy", "dummy")
self.assertEqual(mock_shutil.call_count, 1)
mock_shutil.assert_called_once_with("/tmp/source", "/tmp/dest")
def test_invalid(self):
with self.assertRaises(ValidationError):
GitPackage.validate(
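The removed `test_deps_install` mocked `make_symlink` to raise `OSError` and asserted that `copytree` ran exactly once instead. The underlying symlink-with-copy-fallback pattern, sketched with the stdlib (paths and helper name are illustrative):

```python
import os
import shutil
import tempfile

def install(src, dest):
    # Prefer a symlink; fall back to copying the whole tree when symlinking
    # fails (the fallback the mocked test asserted via copytree's call count).
    try:
        os.symlink(src, dest)
    except OSError:
        shutil.copytree(src, dest)

src = tempfile.mkdtemp()
open(os.path.join(src, "model.sql"), "w").close()
dest = os.path.join(tempfile.mkdtemp(), "package")
install(src, dest)
assert os.path.exists(os.path.join(dest, "model.sql"))
```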


@@ -2,7 +2,7 @@ import pytest
import re
from typing import TypeVar
from dbt.contracts.results import TimingInfo, RunResult, RunStatus
from dbt.contracts.results import TimingInfo
from dbt.events import AdapterLogger, types
from dbt.events.base_types import (
BaseEvent,
@@ -14,15 +14,11 @@ from dbt.events.base_types import (
WarnLevel,
msg_from_base_event,
)
from dbt.events.eventmgr import TestEventManager, EventManager
from dbt.events.functions import msg_to_dict, msg_to_json, ctx_set_event_manager
from dbt.events.functions import msg_to_dict, msg_to_json
from dbt.events.helpers import get_json_string_utcnow
from dbt.events.types import RunResultError
from dbt.flags import set_from_args
from argparse import Namespace
from dbt.task.printer import print_run_result_error
set_from_args(Namespace(WARN_ERROR=False), None)
@@ -392,6 +388,8 @@ sample_values = [
types.RunResultErrorNoMessage(status=""),
types.SQLCompiledPath(path=""),
types.CheckNodeTestFailure(relation_name=""),
types.FirstRunResultError(msg=""),
types.AfterFirstRunResultError(msg=""),
types.EndOfRunSummary(num_errors=0, num_warnings=0, keyboard_interrupt=False),
types.LogSkipBecauseError(schema="", relation="", index=0, total=0),
types.EnsureGitInstalled(),
@@ -487,34 +485,3 @@ def test_bad_serialization():
str(excinfo.value)
== "[Note]: Unable to parse dict {'param_event_doesnt_have': 'This should break'}"
)
def test_single_run_error():
try:
# Add a recording event manager to the context, so we can test events.
event_mgr = TestEventManager()
ctx_set_event_manager(event_mgr)
error_result = RunResult(
status=RunStatus.Error,
timing=[],
thread_id="",
execution_time=0.0,
node=None,
adapter_response=dict(),
message="oh no!",
failures=[],
)
print_run_result_error(error_result)
events = [e for e in event_mgr.event_history if isinstance(e[0], RunResultError)]
assert len(events) == 1
assert events[0][0].msg == "oh no!"
finally:
# Set an empty event manager unconditionally on exit. This is an early
# attempt at unit testing events, and we need to think about how it
# could be done in a thread safe way in the long run.
ctx_set_event_manager(EventManager())


@@ -2,7 +2,7 @@ from argparse import Namespace
import pytest
import dbt.flags as flags
from dbt.events.functions import msg_to_dict, warn_or_error, setup_event_logger
from dbt.events.functions import msg_to_dict, warn_or_error
from dbt.events.types import InfoLevel, NoNodesForSelectionCriteria
from dbt.exceptions import EventCompilationError
@@ -59,13 +59,3 @@ def test_msg_to_dict_handles_exceptions_gracefully():
assert (
False
), f"We expect `msg_to_dict` to gracefully handle exceptions, but it raised {exc}"
def test_setup_event_logger_specify_max_bytes(mocker):
patched_file_handler = mocker.patch("dbt.events.eventmgr.RotatingFileHandler")
args = Namespace(log_file_max_bytes=1234567)
flags.set_from_args(args, {})
setup_event_logger(flags.get_flags())
patched_file_handler.assert_called_once_with(
filename="logs/dbt.log", encoding="utf8", maxBytes=1234567, backupCount=5
)
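The removed `test_setup_event_logger_specify_max_bytes` asserted that the file logger is built on the stdlib `RotatingFileHandler` with a configurable `maxBytes`. The plain logging pattern it was checking, using the test's values (a temp path stands in for `logs/dbt.log`):

```python
import logging
import os
import tempfile
from logging.handlers import RotatingFileHandler

log_path = os.path.join(tempfile.mkdtemp(), "dbt.log")
handler = RotatingFileHandler(
    filename=log_path,
    encoding="utf8",
    maxBytes=1234567,  # rotate once the file passes ~1.2 MB
    backupCount=5,     # keep at most five rotated files
)
logger = logging.getLogger("rotating-example")
logger.addHandler(handler)
logger.warning("hello")
handler.close()
assert os.path.exists(log_path)
```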


@@ -18,7 +18,6 @@ from dbt import tracking
from dbt.contracts.files import SourceFile, FileHash, FilePath
from dbt.contracts.graph.manifest import MacroManifest, ManifestStateCheck
from dbt.graph import NodeSelector, parse_difference
from dbt.events.functions import setup_event_logger
try:
from queue import Empty
@@ -141,7 +140,6 @@ class GraphTest(unittest.TestCase):
config = config_from_parts_or_dicts(project=cfg, profile=self.profile)
dbt.flags.set_from_args(Namespace(), config)
setup_event_logger(dbt.flags.get_flags())
object.__setattr__(dbt.flags.get_flags(), "PARTIAL_PARSE", False)
return config


@@ -24,7 +24,6 @@ from dbt.contracts.graph.nodes import (
TestConfig,
TestMetadata,
ColumnInfo,
AccessType,
)
from dbt.contracts.graph.manifest import Manifest, ManifestMetadata
from dbt.contracts.graph.unparsed import ExposureType, Owner
@@ -126,7 +125,7 @@ def make_model(
checksum=FileHash.from_contents(""),
version=version,
latest_version=latest_version,
access=access or AccessType.Protected,
access=access,
)
@@ -638,21 +637,6 @@ def versioned_model_v3(seed):
)
@pytest.fixture
def versioned_model_v12_string(seed):
return make_model(
"pkg",
"versioned_model",
'select * from {{ ref("seed") }}',
config_kwargs={"materialized": "table"},
refs=[seed],
sources=[],
path="subdirectory/versioned_model_v12.sql",
version="12",
latest_version=2,
)
@pytest.fixture
def versioned_model_v4_nested_dir(seed):
return make_model(
@@ -747,7 +731,6 @@ def manifest(
versioned_model_v2,
versioned_model_v3,
versioned_model_v4_nested_dir,
versioned_model_v12_string,
ext_source_2,
ext_source_other,
ext_source_other_2,
@@ -776,7 +759,6 @@ def manifest(
versioned_model_v2,
versioned_model_v3,
versioned_model_v4_nested_dir,
versioned_model_v12_string,
ext_model,
table_id_unique,
table_id_not_null,
@@ -840,7 +822,6 @@ def test_select_fqn(manifest):
"versioned_model.v2",
"versioned_model.v3",
"versioned_model.v4",
"versioned_model.v12",
"table_model",
"table_model_py",
"table_model_csv",
@@ -858,7 +839,6 @@ def test_select_fqn(manifest):
"versioned_model.v2",
"versioned_model.v3",
"versioned_model.v4",
"versioned_model.v12",
}
assert search_manifest_using_method(manifest, method, "versioned_model.v1") == {
"versioned_model.v1"
@@ -1070,7 +1050,6 @@ def test_select_package(manifest):
"versioned_model.v2",
"versioned_model.v3",
"versioned_model.v4",
"versioned_model.v12",
"table_model",
"table_model_py",
"table_model_csv",
@@ -1123,7 +1102,6 @@ def test_select_config_materialized(manifest):
"versioned_model.v2",
"versioned_model.v3",
"versioned_model.v4",
"versioned_model.v12",
"mynamespace.union_model",
}
@@ -1210,7 +1188,6 @@ def test_select_version(manifest):
assert search_manifest_using_method(manifest, method, "prerelease") == {
"versioned_model.v3",
"versioned_model.v4",
"versioned_model.v12",
}
assert search_manifest_using_method(manifest, method, "none") == {
"table_model_py",

View File

@@ -612,7 +612,7 @@ class SchemaParserVersionedModels(SchemaParserTest):
def setUp(self):
super().setUp()
my_model_v1_node = MockNode(
package="snowplow",
package="root",
name="arbitrary_file_name",
config=mock.MagicMock(enabled=True),
refs=[],
@@ -621,7 +621,7 @@ class SchemaParserVersionedModels(SchemaParserTest):
file_id="snowplow://models/arbitrary_file_name.sql",
)
my_model_v2_node = MockNode(
package="snowplow",
package="root",
name="my_model_v2",
config=mock.MagicMock(enabled=True),
refs=[],

Some files were not shown because too many files have changed in this diff.