Compare commits

..

3 Commits

Author | SHA1 | Message | Date
Callum McCann | fb8b161351 | adding entity inheritence | 2022-12-06 16:20:12 -06:00
Callum McCann | 7ecb431278 | making dimensions possible! | 2022-12-06 13:54:28 -06:00
Callum McCann | 792150ff6a | let there be entity | 2022-12-05 16:02:14 -06:00
1407 changed files with 69524 additions and 141385 deletions


@@ -1,19 +1,13 @@
[bumpversion]
current_version = 1.9.6
parse = (?P<major>[\d]+) # major version number
\.(?P<minor>[\d]+) # minor version number
\.(?P<patch>[\d]+) # patch version number
(?P<prerelease> # optional pre-release - ex: a1, b2, rc25
(?P<prekind>a|b|rc) # pre-release type
(?P<num>[\d]+) # pre-release version number
current_version = 1.4.0a1
parse = (?P<major>\d+)
\.(?P<minor>\d+)
\.(?P<patch>\d+)
((?P<prekind>a|b|rc)
(?P<pre>\d+) # pre-release version num
)?
( # optional nightly release indicator
\.(?P<nightly>dev[0-9]+) # ex: .dev02142023
)? # expected matches: `1.15.0`, `1.5.0a11`, `1.5.0a1.dev123`, `1.5.0.dev123457`, expected failures: `1`, `1.5`, `1.5.2-a1`, `text1.5.0`
serialize =
{major}.{minor}.{patch}{prekind}{num}.{nightly}
{major}.{minor}.{patch}.{nightly}
{major}.{minor}.{patch}{prekind}{num}
{major}.{minor}.{patch}{prekind}{pre}
{major}.{minor}.{patch}
commit = False
tag = False
@@ -27,11 +21,19 @@ values =
rc
final
[bumpversion:part:num]
[bumpversion:part:pre]
first_value = 1
[bumpversion:part:nightly]
[bumpversion:file:core/setup.py]
[bumpversion:file:core/dbt/version.py]
[bumpversion:file:plugins/postgres/setup.py]
[bumpversion:file:plugins/postgres/dbt/adapters/postgres/__version__.py]
[bumpversion:file:docker/Dockerfile]
[bumpversion:file:tests/adapter/setup.py]
[bumpversion:file:tests/adapter/dbt/tests/adapter/__version__.py]
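The comment on the nightly-aware `parse` pattern above lists which version strings it is expected to accept and reject. As a rough sanity check only (bumpversion compiles the multi-line, commented pattern itself, so re-assembling it on one line and using `re.fullmatch` here is an assumption made for illustration), the same pattern can be exercised directly with Python's `re`:

```python
import re

# Illustrative re-assembly of the nightly-aware parse pattern from the config above.
# bumpversion handles the multi-line pattern (with inline comments) itself; this
# single-line version and the use of fullmatch are assumptions for demonstration.
VERSION_PATTERN = re.compile(
    r"(?P<major>\d+)"
    r"\.(?P<minor>\d+)"
    r"\.(?P<patch>\d+)"
    r"((?P<prekind>a|b|rc)(?P<pre>\d+))?"
    r"(\.(?P<nightly>dev[0-9]+))?"
)

# Expected matches and failures are taken from the comment in the config.
should_match = ["1.15.0", "1.5.0a11", "1.5.0a1.dev123", "1.5.0.dev123457"]
should_fail = ["1", "1.5", "1.5.2-a1", "text1.5.0"]

for version in should_match:
    assert VERSION_PATTERN.fullmatch(version) is not None, version
for version in should_fail:
    assert VERSION_PATTERN.fullmatch(version) is None, version

print(VERSION_PATTERN.fullmatch("1.5.0a1.dev123").groupdict())
# {'major': '1', 'minor': '5', 'patch': '0', 'prekind': 'a', 'pre': '1', 'nightly': 'dev123'}
```

Under those assumptions, all four expected matches parse and all four expected failures are rejected.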


@@ -3,10 +3,6 @@
For information on prior major and minor releases, see their changelogs:
* [1.7](https://github.com/dbt-labs/dbt-core/blob/1.7.latest/CHANGELOG.md)
* [1.6](https://github.com/dbt-labs/dbt-core/blob/1.6.latest/CHANGELOG.md)
* [1.5](https://github.com/dbt-labs/dbt-core/blob/1.5.latest/CHANGELOG.md)
* [1.4](https://github.com/dbt-labs/dbt-core/blob/1.4.latest/CHANGELOG.md)
* [1.3](https://github.com/dbt-labs/dbt-core/blob/1.3.latest/CHANGELOG.md)
* [1.2](https://github.com/dbt-labs/dbt-core/blob/1.2.latest/CHANGELOG.md)
* [1.1](https://github.com/dbt-labs/dbt-core/blob/1.1.latest/CHANGELOG.md)


@@ -1,190 +0,0 @@
## dbt-core 1.9.0 - December 09, 2024
### Breaking Changes
- Fix changing the current working directory when using dbt deps, clean and init. ([#8997](https://github.com/dbt-labs/dbt-core/issues/8997))
### Features
- Parseable JSON and text output in quiet mode for `dbt show` and `dbt compile` ([#9840](https://github.com/dbt-labs/dbt-core/issues/9840))
- serialize inferred primary key ([#9824](https://github.com/dbt-labs/dbt-core/issues/9824))
- Add unit_test: selection method ([#10053](https://github.com/dbt-labs/dbt-core/issues/10053))
- Maximally parallelize dbt clone in clone command ([#7914](https://github.com/dbt-labs/dbt-core/issues/7914))
- Add --host flag to dbt docs serve, defaulting to '127.0.0.1' ([#10229](https://github.com/dbt-labs/dbt-core/issues/10229))
- Update data_test to accept arbitrary config options ([#10197](https://github.com/dbt-labs/dbt-core/issues/10197))
- add pre_model and post_model hook calls to data and unit tests to be able to provide extra config options ([#10198](https://github.com/dbt-labs/dbt-core/issues/10198))
- add --empty value to jinja context as flags.EMPTY ([#10317](https://github.com/dbt-labs/dbt-core/issues/10317))
- Warning message for snapshot timestamp data types ([#10234](https://github.com/dbt-labs/dbt-core/issues/10234))
- Support cumulative_type_params & sub-daily granularities in semantic manifest. ([#10360](https://github.com/dbt-labs/dbt-core/issues/10360))
- Add time_granularity to metric spec. ([#10376](https://github.com/dbt-labs/dbt-core/issues/10376))
- Support standard schema/database fields for snapshots ([#10301](https://github.com/dbt-labs/dbt-core/issues/10301))
- Support ref and source in foreign key constraint expressions, bump dbt-common minimum to 1.6 ([#8062](https://github.com/dbt-labs/dbt-core/issues/8062))
- Support new semantic layer time spine configs to enable sub-daily granularity. ([#10475](https://github.com/dbt-labs/dbt-core/issues/10475))
- Add `order_by` and `limit` fields to saved queries. ([#10531](https://github.com/dbt-labs/dbt-core/issues/10531))
- Add support for behavior flags ([#10618](https://github.com/dbt-labs/dbt-core/issues/10618))
- Enable `--resource-type` and `--exclude-resource-type` CLI flags and environment variables for `dbt test` ([#10656](https://github.com/dbt-labs/dbt-core/issues/10656))
- Allow configuring snapshot column names ([#10185](https://github.com/dbt-labs/dbt-core/issues/10185))
- Add custom_granularities to YAML spec for time spines. ([#9265](https://github.com/dbt-labs/dbt-core/issues/9265))
- Add basic functionality for creating microbatch incremental models ([#9490](https://github.com/dbt-labs/dbt-core/issues/9490), [#10635](https://github.com/dbt-labs/dbt-core/issues/10635), [#10637](https://github.com/dbt-labs/dbt-core/issues/10637), [#10638](https://github.com/dbt-labs/dbt-core/issues/10638), [#10636](https://github.com/dbt-labs/dbt-core/issues/10636), [#10662](https://github.com/dbt-labs/dbt-core/issues/10662), [#10639](https://github.com/dbt-labs/dbt-core/issues/10639))
- Execute microbatch models in batches ([#10700](https://github.com/dbt-labs/dbt-core/issues/10700))
- Create 'skip_nodes_if_on_run_start_fails' behavior change flag ([#7387](https://github.com/dbt-labs/dbt-core/issues/7387))
- Allow snapshots to be defined in YAML. ([#10246](https://github.com/dbt-labs/dbt-core/issues/10246))
- Write microbatch compiled/run targets to separate files, one per batch ([#10714](https://github.com/dbt-labs/dbt-core/issues/10714))
- Track incremental_strategy as part of model_run tracking event ([#10761](https://github.com/dbt-labs/dbt-core/issues/10761))
- Support required 'begin' config for microbatch models ([#10701](https://github.com/dbt-labs/dbt-core/issues/10701))
- Parse-time validation of microbatch configs: require event_time, batch_size, lookback and validate input event_time ([#10709](https://github.com/dbt-labs/dbt-core/issues/10709))
- Added the --inline-direct parameter to 'dbt show' ([#10770](https://github.com/dbt-labs/dbt-core/issues/10770))
- Enable specification of dbt_valid_to for current records ([#10187](https://github.com/dbt-labs/dbt-core/issues/10187))
- Enable `retry` support for microbatch models ([#10715](https://github.com/dbt-labs/dbt-core/issues/10715), [#10729](https://github.com/dbt-labs/dbt-core/issues/10729))
- Use unrendered database and schema source properties during state:modified, behind state_modified_compare_more_unrendered_values behaviour flag ([#9573](https://github.com/dbt-labs/dbt-core/issues/9573))
- Ensure microbatch models respect `full_refresh` model config ([#10785](https://github.com/dbt-labs/dbt-core/issues/10785))
- Adds validations for custom_granularities to ensure unique naming. ([#9265](https://github.com/dbt-labs/dbt-core/issues/9265))
- Enable use of multi-column unique key in snapshots ([#9992](https://github.com/dbt-labs/dbt-core/issues/9992))
- Change gating of microbatch feature to be behind project flag / behavior flag ([#10798](https://github.com/dbt-labs/dbt-core/issues/10798))
- Ensure `--event-time-start` is before `--event-time-end` ([#10786](https://github.com/dbt-labs/dbt-core/issues/10786))
- Ensure microbatch models use same `current_time` value ([#10819](https://github.com/dbt-labs/dbt-core/issues/10819))
- Emit warning when microbatch model has no input with `event_time` config ([#10926](https://github.com/dbt-labs/dbt-core/issues/10926))
- Emit debug logging event whenever artifacts are written ([#10937](https://github.com/dbt-labs/dbt-core/issues/10937))
- Support --empty for snapshots ([#10372](https://github.com/dbt-labs/dbt-core/issues/10372))
- Add new hard_deletes="new_record" mode for snapshots. ([#10235](https://github.com/dbt-labs/dbt-core/issues/10235))
- Allow microbatch batches to run in parallel ([#10853](https://github.com/dbt-labs/dbt-core/issues/10853), [#10855](https://github.com/dbt-labs/dbt-core/issues/10855))
- Add `batch` context object to model jinja context ([#11025](https://github.com/dbt-labs/dbt-core/issues/11025))
- Ensure pre/post hooks only run on first/last batch respectively for microbatch model batches ([#11094](https://github.com/dbt-labs/dbt-core/issues/11094), [#11104](https://github.com/dbt-labs/dbt-core/issues/11104))
### Fixes
- Remove unused check_new method ([#7586](https://github.com/dbt-labs/dbt-core/issues/7586))
- Test case for `merge_exclude_columns` ([#8267](https://github.com/dbt-labs/dbt-core/issues/8267))
- Convert "Skipping model due to fail_fast" message to DEBUG level ([#8774](https://github.com/dbt-labs/dbt-core/issues/8774))
- Restore previous behavior for --favor-state: only favor defer_relation if not selected in current command ([#10107](https://github.com/dbt-labs/dbt-core/issues/10107))
- Unit test fixture (csv) returns null for empty value ([#9881](https://github.com/dbt-labs/dbt-core/issues/9881))
- Fix json format log and --quiet for ls and jinja print by converting print call to fire events ([#8756](https://github.com/dbt-labs/dbt-core/issues/8756))
- Add resource type to saved_query ([#10168](https://github.com/dbt-labs/dbt-core/issues/10168))
- Fix: Order-insensitive unit test equality assertion for expected/actual with multiple nulls ([#10167](https://github.com/dbt-labs/dbt-core/issues/10167))
- Renaming or removing a contracted model should raise a BreakingChange warning/error ([#10116](https://github.com/dbt-labs/dbt-core/issues/10116))
- prefer disabled project nodes to external node ([#10224](https://github.com/dbt-labs/dbt-core/issues/10224))
- Fix issues with selectors and inline nodes ([#8943](https://github.com/dbt-labs/dbt-core/issues/8943), [#9269](https://github.com/dbt-labs/dbt-core/issues/9269))
- Fix snapshot config to work in yaml files ([#4000](https://github.com/dbt-labs/dbt-core/issues/4000))
- Improve handling of error when loading schema file list ([#10284](https://github.com/dbt-labs/dbt-core/issues/10284))
- Use model alias for the CTE identifier generated during ephemeral materialization ([#5273](https://github.com/dbt-labs/dbt-core/issues/5273))
- Implement state:modified for saved queries ([#10294](https://github.com/dbt-labs/dbt-core/issues/10294))
- Saved Query node fail during skip ([#10029](https://github.com/dbt-labs/dbt-core/issues/10029))
- Don't warn on `unit_test` config paths that are properly used ([#10311](https://github.com/dbt-labs/dbt-core/issues/10311))
- Fix setting `silence` of `warn_error_options` via `dbt_project.yaml` flags ([#10160](https://github.com/dbt-labs/dbt-core/issues/10160))
- Attempt to provide test fixture tables with all values to set types correctly for comparison with source tables ([#10365](https://github.com/dbt-labs/dbt-core/issues/10365))
- Limit data_tests deprecation to root_project ([#9835](https://github.com/dbt-labs/dbt-core/issues/9835))
- CLI flags should take precedence over env var flags ([#10304](https://github.com/dbt-labs/dbt-core/issues/10304))
- Fix typing for artifact schemas ([#10442](https://github.com/dbt-labs/dbt-core/issues/10442))
- Fix over deletion of generated_metrics in partial parsing ([#10450](https://github.com/dbt-labs/dbt-core/issues/10450))
- Fix error constructing warn_error_options ([#10452](https://github.com/dbt-labs/dbt-core/issues/10452))
- Do not update varchar column definitions if a contract exists ([#10362](https://github.com/dbt-labs/dbt-core/issues/10362))
- fix all_constraints access, disabled node parsing of non-uniquely named resources ([#10509](https://github.com/dbt-labs/dbt-core/issues/10509))
- respect --quiet and --warn-error-options for flag deprecations ([#10105](https://github.com/dbt-labs/dbt-core/issues/10105))
- Propagate measure label when using create_metrics ([#10536](https://github.com/dbt-labs/dbt-core/issues/10536))
- Fix state:modified check for exports ([#10138](https://github.com/dbt-labs/dbt-core/issues/10138))
- Filter out empty nodes after graph selection to support consistent selection of nodes that depend on upstream public models ([#8987](https://github.com/dbt-labs/dbt-core/issues/8987))
- Late render pre- and post-hooks configs in properties / schema YAML files ([#10603](https://github.com/dbt-labs/dbt-core/issues/10603))
- Allow the use of env_var function in certain macros in which it was previously unavailable. ([#10609](https://github.com/dbt-labs/dbt-core/issues/10609))
- Remove deprecation for tests: to data_tests: change ([#10564](https://github.com/dbt-labs/dbt-core/issues/10564))
- Fix `--resource-type test` for `dbt list` and `dbt build` ([#10730](https://github.com/dbt-labs/dbt-core/issues/10730))
- Fix unit tests for incremental model with alias ([#10754](https://github.com/dbt-labs/dbt-core/issues/10754))
- Allow singular tests to be documented in properties.yml ([#9005](https://github.com/dbt-labs/dbt-core/issues/9005))
- Ignore --empty in unit test ref/source rendering ([#10516](https://github.com/dbt-labs/dbt-core/issues/10516))
- Ignore rendered jinja in configs for state:modified, behind state_modified_compare_more_unrendered_values behaviour flag ([#9564](https://github.com/dbt-labs/dbt-core/issues/9564))
- Improve performance of infer primary key ([#10781](https://github.com/dbt-labs/dbt-core/issues/10781))
- Pass test user config to adapter pre_hook by explicitly adding test builder config to node ([#10484](https://github.com/dbt-labs/dbt-core/issues/10484))
- Attempt to skip saved query processing when no semantic manifest changes ([#10563](https://github.com/dbt-labs/dbt-core/issues/10563))
- Ensure dbt retry of microbatch models doesn't lose prior successful state ([#10800](https://github.com/dbt-labs/dbt-core/issues/10800))
- override materialization python models ([#8520](https://github.com/dbt-labs/dbt-core/issues/8520))
- Handle edge cases when a specified `--event-time-end` is equivalent to the batch size truncated batch start time ([#10824](https://github.com/dbt-labs/dbt-core/issues/10824))
- Begin tracking execution time of microbatch model batches ([#10825](https://github.com/dbt-labs/dbt-core/issues/10825))
- Support disabling unit tests via config. ([#9109](https://github.com/dbt-labs/dbt-core/issues/9109), [#10540](https://github.com/dbt-labs/dbt-core/issues/10540))
- Allow instances of generic data tests to be documented ([#2578](https://github.com/dbt-labs/dbt-core/issues/2578))
- Fix warnings for models referring to a deprecated model ([#10833](https://github.com/dbt-labs/dbt-core/issues/10833))
- Change `lookback` default from `0` to `1` to ensure better data completeness ([#10867](https://github.com/dbt-labs/dbt-core/issues/10867))
- Make `--event-time-start` and `--event-time-end` mutually required ([#10874](https://github.com/dbt-labs/dbt-core/issues/10874))
- Ensure KeyboardInterrupt/SystemExit halts microbatch model execution ([#10862](https://github.com/dbt-labs/dbt-core/issues/10862))
- Exclude hook result from results in on-run-end context ([#7387](https://github.com/dbt-labs/dbt-core/issues/7387))
- unit tests with versioned refs ([#10880](https://github.com/dbt-labs/dbt-core/issues/10880), [#10528](https://github.com/dbt-labs/dbt-core/issues/10528), [#10623](https://github.com/dbt-labs/dbt-core/issues/10623))
- Implement partial parsing for all-yaml snapshots ([#10903](https://github.com/dbt-labs/dbt-core/issues/10903))
- Restore source quoting behaviour when quoting config provided in dbt_project.yml ([#10892](https://github.com/dbt-labs/dbt-core/issues/10892))
- Fix bug when referencing deprecated models ([#10915](https://github.com/dbt-labs/dbt-core/issues/10915))
- Fix 'model' jinja context variable type to dict ([#10927](https://github.com/dbt-labs/dbt-core/issues/10927))
- Take `end_time` for batches to the ceiling to handle edge case where `event_time` column is a date ([#10868](https://github.com/dbt-labs/dbt-core/issues/10868))
- Handle exceptions in `get_execution_status` more broadly to better ensure `run_results.json` gets written ([#10934](https://github.com/dbt-labs/dbt-core/issues/10934))
- Fix 'no attribute .config' error when ref-ing a microbatch model from non-Model context ([#10928](https://github.com/dbt-labs/dbt-core/issues/10928))
- Ensure inferred primary_key is a List[str] with no null values ([#10983](https://github.com/dbt-labs/dbt-core/issues/10983))
- Correct when custom microbatch macro deprecation warning is fired ([#10994](https://github.com/dbt-labs/dbt-core/issues/10994))
- Validate manifest has group_map during group_lookup init ([#10988](https://github.com/dbt-labs/dbt-core/issues/10988))
- Fix plural of 'partial success' in log message ([#10999](https://github.com/dbt-labs/dbt-core/issues/10999))
- Emit batch-level exception with node_info on microbatch batch run failure ([#10840](https://github.com/dbt-labs/dbt-core/issues/10840))
- Fix restrict-access to not apply within a package ([#10134](https://github.com/dbt-labs/dbt-core/issues/10134))
- Make microbatch models skippable ([#11021](https://github.com/dbt-labs/dbt-core/issues/11021))
- Catch DbtRuntimeError for hooks ([#11012](https://github.com/dbt-labs/dbt-core/issues/11012))
- Access DEBUG flag more consistently with the rest of the codebase in ManifestLoader ([#11068](https://github.com/dbt-labs/dbt-core/issues/11068))
- Implement partial parsing for singular data test configs in yaml files ([#10801](https://github.com/dbt-labs/dbt-core/issues/10801))
### Docs
- Enable display of unit tests ([dbt-docs/#501](https://github.com/dbt-labs/dbt-docs/issues/501))
- Unit tests not rendering ([dbt-docs/#506](https://github.com/dbt-labs/dbt-docs/issues/506))
- Add support for Saved Query node ([dbt-docs/#486](https://github.com/dbt-labs/dbt-docs/issues/486))
- Fix npm security vulnerabilities as of June 2024 ([dbt-docs/#513](https://github.com/dbt-labs/dbt-docs/issues/513))
### Under the Hood
- Clear error message for Private package in dbt-core ([#10083](https://github.com/dbt-labs/dbt-core/issues/10083))
- Enable use of context in serialization ([#10093](https://github.com/dbt-labs/dbt-core/issues/10093))
- Make RSS high water mark measurement more accurate on Linux ([#10177](https://github.com/dbt-labs/dbt-core/issues/10177))
- Enable record filtering by type. ([#10240](https://github.com/dbt-labs/dbt-core/issues/10240))
- Remove IntermediateSnapshotNode ([#10326](https://github.com/dbt-labs/dbt-core/issues/10326))
- Additional logging for skipped ephemeral models ([#10389](https://github.com/dbt-labs/dbt-core/issues/10389))
- bump black to 24.3.0 ([#10454](https://github.com/dbt-labs/dbt-core/issues/10454))
- generate protos with protoc version 5.26.1 ([#10457](https://github.com/dbt-labs/dbt-core/issues/10457))
- Move from minimal-snowplow-tracker fork back to snowplow-tracker ([#8409](https://github.com/dbt-labs/dbt-core/issues/8409))
- Add group info to RunResultError, RunResultFailure, RunResultWarning log lines ([#](https://github.com/dbt-labs/dbt-core/issues/))
- Improve speed of tree traversal when finding children, increasing build speed for some selectors ([#10434](https://github.com/dbt-labs/dbt-core/issues/10434))
- Add test for sources tables with quotes ([#10582](https://github.com/dbt-labs/dbt-core/issues/10582))
- Additional type hints for `core/dbt/version.py` ([#10612](https://github.com/dbt-labs/dbt-core/issues/10612))
- Fix typing issues in core/dbt/contracts/sql.py ([#10614](https://github.com/dbt-labs/dbt-core/issues/10614))
- Fix type errors in `dbt/core/task/clean.py` ([#10616](https://github.com/dbt-labs/dbt-core/issues/10616))
- Add Snowplow tracking for behavior flag deprecations ([#10552](https://github.com/dbt-labs/dbt-core/issues/10552))
- Add test utility patch_microbatch_end_time for adapters testing ([#10713](https://github.com/dbt-labs/dbt-core/issues/10713))
- Replace `TestSelector` with `ResourceTypeSelector` ([#10718](https://github.com/dbt-labs/dbt-core/issues/10718))
- Standardize returning `ResourceTypeSelector` instances in `dbt list` and `dbt build` ([#10739](https://github.com/dbt-labs/dbt-core/issues/10739))
- Add group metadata info to LogModelResult and LogTestResult ([#10775](https://github.com/dbt-labs/dbt-core/issues/10775))
- Remove support and testing for Python 3.8, which is now EOL. ([#10861](https://github.com/dbt-labs/dbt-core/issues/10861))
- Behavior change for mf timespine without yaml configuration ([#10959](https://github.com/dbt-labs/dbt-core/issues/10959))
- Behavior change for cumulative metric type param ([#10960](https://github.com/dbt-labs/dbt-core/issues/10960))
- Upgrade protobuf ([#10658](https://github.com/dbt-labs/dbt-core/issues/10658))
### Dependencies
- Remove logbook dependency ([#8027](https://github.com/dbt-labs/dbt-core/issues/8027))
- Increase supported version range for dbt-semantic-interfaces. Needed to support custom calendar features. ([#9265](https://github.com/dbt-labs/dbt-core/issues/9265))
- Bump minimum allowed dbt-adapters version to 1.8.0 ([#N/A](https://github.com/dbt-labs/dbt-core/issues/N/A))
- Bump minimum dbt-adapters version to 1.9.0 ([#10996](https://github.com/dbt-labs/dbt-core/issues/10996))
### Security
- Explicitly bind to localhost in docs serve ([#10209](https://github.com/dbt-labs/dbt-core/issues/10209))
### Contributors
- [@DevonFulcher](https://github.com/DevonFulcher) ([#10959](https://github.com/dbt-labs/dbt-core/issues/10959), [#10960](https://github.com/dbt-labs/dbt-core/issues/10960))
- [@McKnight-42](https://github.com/McKnight-42) ([#10197](https://github.com/dbt-labs/dbt-core/issues/10197), [#10198](https://github.com/dbt-labs/dbt-core/issues/10198))
- [@Threynaud](https://github.com/Threynaud) ([#11068](https://github.com/dbt-labs/dbt-core/issues/11068))
- [@TowardOliver](https://github.com/TowardOliver) ([#10656](https://github.com/dbt-labs/dbt-core/issues/10656))
- [@aliceliu](https://github.com/aliceliu) ([#10536](https://github.com/dbt-labs/dbt-core/issues/10536), [#10138](https://github.com/dbt-labs/dbt-core/issues/10138))
- [@courtneyholcomb](https://github.com/courtneyholcomb) ([#10360](https://github.com/dbt-labs/dbt-core/issues/10360), [#10376](https://github.com/dbt-labs/dbt-core/issues/10376), [#10475](https://github.com/dbt-labs/dbt-core/issues/10475), [#9265](https://github.com/dbt-labs/dbt-core/issues/9265), [#9265](https://github.com/dbt-labs/dbt-core/issues/9265), [#9265](https://github.com/dbt-labs/dbt-core/issues/9265))
- [@danlsn](https://github.com/danlsn) ([#10915](https://github.com/dbt-labs/dbt-core/issues/10915))
- [@dave-connors-3](https://github.com/dave-connors-3) ([#9824](https://github.com/dbt-labs/dbt-core/issues/9824))
- [@devmessias](https://github.com/devmessias) ([#8520](https://github.com/dbt-labs/dbt-core/issues/8520), [#10880](https://github.com/dbt-labs/dbt-core/issues/10880), [#10528](https://github.com/dbt-labs/dbt-core/issues/10528), [#10623](https://github.com/dbt-labs/dbt-core/issues/10623))
- [@jeancochrane](https://github.com/jeancochrane) ([#5273](https://github.com/dbt-labs/dbt-core/issues/5273))
- [@katsugeneration](https://github.com/katsugeneration) ([#10754](https://github.com/dbt-labs/dbt-core/issues/10754))
- [@kevinneville](https://github.com/kevinneville) ([#7586](https://github.com/dbt-labs/dbt-core/issues/7586))
- [@nakamichiworks](https://github.com/nakamichiworks) ([#10442](https://github.com/dbt-labs/dbt-core/issues/10442))
- [@plypaul](https://github.com/plypaul) ([#10531](https://github.com/dbt-labs/dbt-core/issues/10531))
- [@rariyama](https://github.com/rariyama) ([#8997](https://github.com/dbt-labs/dbt-core/issues/8997))
- [@scottgigante,nevdelap](https://github.com/scottgigante,nevdelap) ([#8774](https://github.com/dbt-labs/dbt-core/issues/8774))
- [@tsturge](https://github.com/tsturge) ([#9109](https://github.com/dbt-labs/dbt-core/issues/9109), [#10540](https://github.com/dbt-labs/dbt-core/issues/10540))
- [@ttusing](https://github.com/ttusing) ([#10434](https://github.com/dbt-labs/dbt-core/issues/10434))


@@ -1,12 +0,0 @@
## dbt-core 1.9.1 - December 16, 2024
### Fixes
- update adapter version messages ([#10230](https://github.com/dbt-labs/dbt-core/issues/10230))
- Fix debug log messages for microbatch batch execution information ([#11111](https://github.com/dbt-labs/dbt-core/issues/11111))
- Fix running of extra "last" batch when there is only one batch ([#11112](https://github.com/dbt-labs/dbt-core/issues/11112))
- Fix interpretation of `PartialSuccess` to result in non-zero exit code ([#11114](https://github.com/dbt-labs/dbt-core/issues/11114))
- Warn about invalid usages of `concurrent_batches` config ([#11122](https://github.com/dbt-labs/dbt-core/issues/11122))
### Contributors
- [@dave-connors-3](https://github.com/dave-connors-3) ([#10230](https://github.com/dbt-labs/dbt-core/issues/10230))


@@ -1,12 +0,0 @@
## dbt-core 1.9.2 - January 29, 2025
### Fixes
- Error writing generic test at run time ([#11110](https://github.com/dbt-labs/dbt-core/issues/11110))
- Run check_modified_contract for state:modified ([#11034](https://github.com/dbt-labs/dbt-core/issues/11034))
- Fix unrendered_config for tests from dbt_project.yml ([#11146](https://github.com/dbt-labs/dbt-core/issues/11146))
- Ensure warning about microbatch lacking filter inputs is always fired ([#11159](https://github.com/dbt-labs/dbt-core/issues/11159))
- Fix microbatch dbt list --output json ([#10556](https://github.com/dbt-labs/dbt-core/issues/10556), [#11098](https://github.com/dbt-labs/dbt-core/issues/11098))
### Contributors
- [@internetcoffeephone](https://github.com/internetcoffeephone) ([#10556](https://github.com/dbt-labs/dbt-core/issues/10556), [#11098](https://github.com/dbt-labs/dbt-core/issues/11098))


@@ -1,5 +0,0 @@
## dbt-core 1.9.3 - March 07, 2025
### Fixes
- Fix microbatch execution to not block main thread nor hang ([#11243](https://github.com/dbt-labs/dbt-core/issues/11243), [#11306](https://github.com/dbt-labs/dbt-core/issues/11306))


@@ -1,8 +0,0 @@
## dbt-core 1.9.4 - April 02, 2025
### Fixes
- dbt retry does not respect --threads ([#10584](https://github.com/dbt-labs/dbt-core/issues/10584))
### Contributors
- [@donjin-master](https://github.com/donjin-master) ([#10584](https://github.com/dbt-labs/dbt-core/issues/10584))


@@ -1,5 +0,0 @@
## dbt-core 1.9.5 - May 28, 2025
### Fixes
- Add freshness config to sources ([#11506](https://github.com/dbt-labs/dbt-core/issues/11506))


@@ -1,9 +0,0 @@
## dbt-core 1.9.6 - May 30, 2025
### Features
- Support config on columns ([#11651](https://github.com/dbt-labs/dbt-core/issues/11651))
### Fixes
- Fix source freshness set via config to handle explicit nulls ([#11685](https://github.com/dbt-labs/dbt-core/issues/11685))


@@ -1,6 +1,6 @@
# dbt Core Changelog
- This file provides a full account of all changes to `dbt-core`
- This file provides a full account of all changes to `dbt-core` and `dbt-postgres`
- Changes are listed under the (pre)release in which they first appear. Subsequent releases include changes from previous releases.
- "Breaking changes" listed under a version may require action from end users or external maintainers when upgrading to that version.
- Do not edit this file directly. This file is auto-generated using [changie](https://github.com/miniscruff/changie). For details on how to document a change, see [the contributing guide](https://github.com/dbt-labs/dbt-core/blob/main/CONTRIBUTING.md#adding-changelog-entry)


@@ -0,0 +1,7 @@
kind: "Dependency"
body: "Update pathspec requirement from ~=0.9.0 to >=0.9,<0.11 in /core"
time: 2022-09-23T00:06:46.00000Z
custom:
Author: dependabot[bot]
Issue: 4904
PR: 5917


@@ -0,0 +1,7 @@
kind: "Dependency"
body: "Bump black from 22.8.0 to 22.10.0"
time: 2022-10-07T00:08:48.00000Z
custom:
Author: dependabot[bot]
Issue: 4904
PR: 6019


@@ -0,0 +1,7 @@
kind: "Dependency"
body: "Bump mashumaro[msgpack] from 3.0.4 to 3.1.1 in /core"
time: 2022-10-20T00:07:53.00000Z
custom:
Author: dependabot[bot]
Issue: 4904
PR: 6108


@@ -0,0 +1,7 @@
kind: "Dependency"
body: "Update colorama requirement from <0.4.6,>=0.3.9 to >=0.3.9,<0.4.7 in /core"
time: 2022-10-26T00:09:10.00000Z
custom:
Author: dependabot[bot]
Issue: 4904
PR: 6144


@@ -0,0 +1,7 @@
kind: Docs
body: minor doc correction
time: 2022-09-08T15:41:57.689162-04:00
custom:
Author: andy-clapson
Issue: "5791"
PR: "5684"


@@ -0,0 +1,7 @@
kind: Docs
body: Generate API docs for new CLI interface
time: 2022-10-07T09:06:56.446078-05:00
custom:
Author: stu-k
Issue: "5528"
PR: "6022"


@@ -0,0 +1,6 @@
kind: Docs
time: 2022-10-17T17:14:11.715348-05:00
custom:
Author: paulbenschmidt
Issue: "5880"
PR: "324"


@@ -0,0 +1,7 @@
kind: Docs
body: Fix rendering of sample code for metrics
time: 2022-11-16T15:57:43.204201+01:00
custom:
Author: jtcohen6
Issue: "323"
PR: "346"


@@ -0,0 +1,8 @@
kind: Features
body: Added favor-state flag to optionally favor state nodes even if unselected node
exists
time: 2022-04-08T16:54:59.696564+01:00
custom:
Author: daniel-murray josephberni
Issue: "2968"
PR: "5859"


@@ -0,0 +1,7 @@
kind: Features
body: Proto logging messages
time: 2022-08-17T15:48:57.225267-04:00
custom:
Author: gshank
Issue: "5610"
PR: "5643"


@@ -0,0 +1,7 @@
kind: Features
body: Friendlier error messages when packages.yml is malformed
time: 2022-09-12T12:59:35.121188+01:00
custom:
Author: jared-rimmer
Issue: "5486"
PR: "5812"


@@ -0,0 +1,7 @@
kind: Features
body: Migrate dbt-utils current_timestamp macros into core + adapters
time: 2022-09-14T09:56:25.97818-07:00
custom:
Author: colin-rogers-dbt
Issue: "5521"
PR: "5838"


@@ -0,0 +1,7 @@
kind: Features
body: Allow partitions in external tables to be supplied as a list
time: 2022-09-25T21:16:51.051239654+02:00
custom:
Author: pgoslatara
Issue: "5929"
PR: "5930"


@@ -0,0 +1,7 @@
kind: Features
body: extend -f flag shorthand for seed command
time: 2022-10-03T11:07:05.381632-05:00
custom:
Author: dave-connors-3
Issue: "5990"
PR: "5991"


@@ -0,0 +1,8 @@
kind: Features
body: This pulls the profile name from args when constructing a RuntimeConfig in lib.py,
enabling the dbt-server to override the value that's in the dbt_project.yml
time: 2022-11-02T15:00:03.000805-05:00
custom:
Author: racheldaniel
Issue: "6201"
PR: "6202"


@@ -0,0 +1,7 @@
kind: Features
body: Added an md5 function to the base context
time: 2022-11-14T18:52:07.788593+02:00
custom:
Author: haritamar
Issue: "6246"
PR: "6247"


@@ -0,0 +1,7 @@
kind: Features
body: Exposures support metrics in lineage
time: 2022-11-30T11:29:13.256034-05:00
custom:
Author: michelleark
Issue: "6057"
PR: "6342"


@@ -0,0 +1,7 @@
kind: Fixes
body: Account for disabled flags on models in schema files more completely
time: 2022-09-16T10:48:54.162273-05:00
custom:
Author: emmyoop
Issue: "3992"
PR: "5868"


@@ -0,0 +1,7 @@
kind: Fixes
body: Add validation of enabled config for metrics, exposures and sources
time: 2022-10-10T11:32:18.752322-05:00
custom:
Author: emmyoop
Issue: "6030"
PR: "6038"


@@ -0,0 +1,7 @@
kind: Fixes
body: check length of args of python model function before accessing it
time: 2022-10-11T16:07:15.464093-04:00
custom:
Author: chamini2
Issue: "6041"
PR: "6042"


@@ -0,0 +1,8 @@
kind: Fixes
body: Add functors to ensure event types with str-type attributes are initialized
to spec, even when provided non-str type params.
time: 2022-10-16T17:37:42.846683-07:00
custom:
Author: versusfacit
Issue: "5436"
PR: "5874"


@@ -0,0 +1,7 @@
kind: Fixes
body: Allow hooks to fail without halting execution flow
time: 2022-11-07T09:53:14.340257-06:00
custom:
Author: ChenyuLInx
Issue: "5625"
PR: "6059"


@@ -0,0 +1,7 @@
kind: Fixes
body: Clarify Error Message for how many models are allowed in a Python file
time: 2022-11-15T08:10:21.527884-05:00
custom:
Author: justbldwn
Issue: "6245"
PR: "6251"


@@ -0,0 +1,7 @@
kind: Under the Hood
body: Put black config in explicit config
time: 2022-09-27T19:42:59.241433-07:00
custom:
Author: max-sixty
Issue: "5946"
PR: "5947"


@@ -0,0 +1,7 @@
kind: Under the Hood
body: Added flat_graph attribute the Manifest class's deepcopy() coverage
time: 2022-09-29T13:44:06.275941-04:00
custom:
Author: peterallenwebb
Issue: "5809"
PR: "5975"


@@ -0,0 +1,7 @@
kind: Under the Hood
body: Add mypy configs so `mypy` passes from CLI
time: 2022-10-05T12:03:10.061263-07:00
custom:
Author: max-sixty
Issue: "5983"
PR: "5983"


@@ -0,0 +1,7 @@
kind: Under the Hood
body: Exception message cleanup.
time: 2022-10-07T09:46:27.682872-05:00
custom:
Author: emmyoop
Issue: "6023"
PR: "6024"


@@ -0,0 +1,7 @@
kind: Under the Hood
body: Add dmypy cache to gitignore
time: 2022-10-07T14:00:44.227644-07:00
custom:
Author: max-sixty
Issue: "6028"
PR: "5978"


@@ -0,0 +1,7 @@
kind: Under the Hood
body: Provide useful errors when the value of 'materialized' is invalid
time: 2022-10-13T18:19:12.167548-04:00
custom:
Author: peterallenwebb
Issue: "5229"
PR: "6025"


@@ -0,0 +1,7 @@
kind: Under the Hood
body: Fixed extra whitespace in strings introduced by black.
time: 2022-10-17T15:15:11.499246-05:00
custom:
Author: luke-bassett
Issue: "1350"
PR: "6086"


@@ -0,0 +1,7 @@
kind: Under the Hood
body: Clean up string formatting
time: 2022-10-17T15:58:44.676549-04:00
custom:
Author: eve-johns
Issue: "6068"
PR: "6082"


@@ -0,0 +1,7 @@
kind: Under the Hood
body: Remove the 'root_path' field from most nodes
time: 2022-10-28T10:48:37.687886-04:00
custom:
Author: gshank
Issue: "6171"
PR: "6172"


@@ -0,0 +1,7 @@
kind: Under the Hood
body: Combine certain logging events with different levels
time: 2022-10-28T11:03:44.887836-04:00
custom:
Author: gshank
Issue: "6173"
PR: "6174"


@@ -0,0 +1,7 @@
kind: Under the Hood
body: Convert threading tests to pytest
time: 2022-11-08T07:45:50.589147-06:00
custom:
Author: stu-k
Issue: "5942"
PR: "6226"


@@ -0,0 +1,7 @@
kind: Under the Hood
body: Convert postgres index tests to pytest
time: 2022-11-08T11:56:33.743042-06:00
custom:
Author: stu-k
Issue: "5770"
PR: "6228"


@@ -0,0 +1,7 @@
kind: Under the Hood
body: Convert use color tests to pytest
time: 2022-11-08T13:31:04.788547-06:00
custom:
Author: stu-k
Issue: "5771"
PR: "6230"


@@ -0,0 +1,7 @@
kind: Under the Hood
body: Add github actions workflow to generate high level CLI API docs
time: 2022-11-16T13:00:37.916202-06:00
custom:
Author: stu-k
Issue: "5942"
PR: "6187"


@@ -4,34 +4,21 @@ headerPath: header.tpl.md
versionHeaderPath: ""
changelogPath: CHANGELOG.md
versionExt: md
envPrefix: "CHANGIE_"
versionFormat: '## dbt-core {{.Version}} - {{.Time.Format "January 02, 2006"}}'
kindFormat: '### {{.Kind}}'
changeFormat: |-
{{- $IssueList := list }}
{{- $changes := splitList " " $.Custom.Issue }}
{{- range $issueNbr := $changes }}
{{- $changeLink := "[#nbr](https://github.com/dbt-labs/dbt-core/issues/nbr)" | replace "nbr" $issueNbr }}
{{- $IssueList = append $IssueList $changeLink }}
{{- end -}}
- {{.Body}} ({{ range $index, $element := $IssueList }}{{if $index}}, {{end}}{{$element}}{{end}})
changeFormat: '- {{.Body}} ([#{{.Custom.Issue}}](https://github.com/dbt-labs/dbt-core/issues/{{.Custom.Issue}}), [#{{.Custom.PR}}](https://github.com/dbt-labs/dbt-core/pull/{{.Custom.PR}}))'
kinds:
- label: Breaking Changes
- label: Features
- label: Fixes
- label: Docs
changeFormat: |-
{{- $IssueList := list }}
{{- $changes := splitList " " $.Custom.Issue }}
{{- range $issueNbr := $changes }}
{{- $changeLink := "[dbt-docs/#nbr](https://github.com/dbt-labs/dbt-docs/issues/nbr)" | replace "nbr" $issueNbr }}
{{- $IssueList = append $IssueList $changeLink }}
{{- end -}}
- {{.Body}} ({{ range $index, $element := $IssueList }}{{if $index}}, {{end}}{{$element}}{{end}})
changeFormat: '- {{.Body}} ([dbt-docs/#{{.Custom.Issue}}](https://github.com/dbt-labs/dbt-docs/issues/{{.Custom.Issue}}), [dbt-docs/#{{.Custom.PR}}](https://github.com/dbt-labs/dbt-docs/pull/{{.Custom.PR}}))'
- label: Under the Hood
- label: Dependencies
changeFormat: '- {{.Body}} ({{if ne .Custom.Issue ""}}[#{{.Custom.Issue}}](https://github.com/dbt-labs/dbt-core/issues/{{.Custom.Issue}}), {{end}}[#{{.Custom.PR}}](https://github.com/dbt-labs/dbt-core/pull/{{.Custom.PR}}))'
- label: Security
changeFormat: '- {{.Body}} ({{if ne .Custom.Issue ""}}[#{{.Custom.Issue}}](https://github.com/dbt-labs/dbt-core/issues/{{.Custom.Issue}}), {{end}}[#{{.Custom.PR}}](https://github.com/dbt-labs/dbt-core/pull/{{.Custom.PR}}))'
newlines:
afterChangelogHeader: 1
@@ -46,45 +33,42 @@ custom:
type: string
minLength: 3
- key: Issue
label: GitHub Issue Number (separated by a single space if multiple)
type: string
minLength: 1
label: GitHub Issue Number
type: int
minInt: 1
- key: PR
label: GitHub Pull Request Number
type: int
minInt: 1
footerFormat: |
{{- $contributorDict := dict }}
{{- /* ensure all names in this list are all lowercase for later matching purposes */}}
{{- $core_team := splitList " " .Env.CORE_TEAM }}
{{- /* ensure we always skip snyk and dependabot in addition to the core team */}}
{{- $maintainers := list "dependabot[bot]" "snyk-bot"}}
{{- range $team_member := $core_team }}
{{- $team_member_lower := lower $team_member }}
{{- $maintainers = append $maintainers $team_member_lower }}
{{- end }}
{{- /* any names added to this list should be all lowercase for later matching purposes */}}
{{- $core_team := list "michelleark" "peterallenwebb" "emmyoop" "nathaniel-may" "gshank" "leahwicz" "chenyulinx" "stu-k" "iknox-fa" "versusfacit" "mcknight-42" "jtcohen6" "dependabot[bot]" "snyk-bot" "colin-rogers-dbt" }}
{{- range $change := .Changes }}
{{- $authorList := splitList " " $change.Custom.Author }}
{{- /* loop through all authors for a single changelog */}}
{{- /* loop through all authors for a PR */}}
{{- range $author := $authorList }}
{{- $authorLower := lower $author }}
{{- /* we only want to include non-core team contributors */}}
{{- if not (has $authorLower $maintainers)}}
{{- $changeList := splitList " " $change.Custom.Author }}
{{- $IssueList := list }}
{{- $changeLink := $change.Kind }}
{{- $changes := splitList " " $change.Custom.Issue }}
{{- range $issueNbr := $changes }}
{{- $changeLink := "[#nbr](https://github.com/dbt-labs/dbt-core/issues/nbr)" | replace "nbr" $issueNbr }}
{{- $IssueList = append $IssueList $changeLink }}
{{- end }}
{{- /* check if this contributor has other changes associated with them already */}}
{{- if hasKey $contributorDict $author }}
{{- $contributionList := get $contributorDict $author }}
{{- $contributionList = concat $contributionList $IssueList }}
{{- $contributorDict := set $contributorDict $author $contributionList }}
{{- else }}
{{- $contributionList := $IssueList }}
{{- $contributorDict := set $contributorDict $author $contributionList }}
{{- end }}
{{- end}}
{{- if not (has $authorLower $core_team)}}
{{- /* Docs kind link back to dbt-docs instead of dbt-core PRs */}}
{{- $prLink := $change.Kind }}
{{- if eq $change.Kind "Docs" }}
{{- $prLink = "[dbt-docs/#pr](https://github.com/dbt-labs/dbt-docs/pull/pr)" | replace "pr" $change.Custom.PR }}
{{- else }}
{{- $prLink = "[#pr](https://github.com/dbt-labs/dbt-core/pull/pr)" | replace "pr" $change.Custom.PR }}
{{- end }}
{{- /* check if this contributor has other PRs associated with them already */}}
{{- if hasKey $contributorDict $author }}
{{- $prList := get $contributorDict $author }}
{{- $prList = append $prList $prLink }}
{{- $contributorDict := set $contributorDict $author $prList }}
{{- else }}
{{- $prList := list $prLink }}
{{- $contributorDict := set $contributorDict $author $prList }}
{{- end }}
{{- end}}
{{- end}}
{{- end }}
{{- /* no indentation here for formatting so the final markdown doesn't have unneeded indentations */}}
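The two `changeFormat` variants in this hunk differ mainly in how links are built: one renders a single `Custom.Issue` / `Custom.PR` pair, while the other splits a space-separated `Custom.Issue` field and emits one issue link per number (with `Docs` entries pointing at dbt-labs/dbt-docs instead of dbt-core). A minimal Python sketch of that multi-issue rendering, using hypothetical change entries:

```python
# Rough Python equivalent of the multi-issue changeFormat template above.
# The bodies and issue numbers passed in below are hypothetical examples.
def render_change(body: str, issues: str, docs: bool = False) -> str:
    repo = "dbt-docs" if docs else "dbt-core"
    prefix = "dbt-docs/#" if docs else "#"
    links = [
        f"[{prefix}{n}](https://github.com/dbt-labs/{repo}/issues/{n})"
        for n in issues.split(" ")
    ]
    return f"- {body} ({', '.join(links)})"


print(render_change("Fix issues with selectors and inline nodes", "8943 9269"))
# - Fix issues with selectors and inline nodes ([#8943](https://github.com/dbt-labs/dbt-core/issues/8943), [#9269](https://github.com/dbt-labs/dbt-core/issues/9269))
```

The footerFormat variants above make the same choice: one collects per-contributor issue links with the same space-splitting, the other collects pull request links.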


@@ -7,9 +7,6 @@ ignore =
W503 # makes Flake8 work like black
W504
E203 # makes Flake8 work like black
E704 # makes Flake8 work like black
E741
E501 # long line checking is done in black
exclude = test/
per-file-ignores =
*/__init__.py: F401
exclude = test

.gitattributes vendored

@@ -1,6 +0,0 @@
core/dbt/task/docs/index.html binary
tests/functional/artifacts/data/state/*/manifest.json binary
core/dbt/docs/build/html/searchindex.js binary
core/dbt/docs/build/html/index.html binary
performance/runner/Cargo.lock binary
core/dbt/events/types_pb2.py binary

.github/CODEOWNERS vendored

@@ -11,8 +11,65 @@
# As a default for areas with no assignment,
# the core team as a whole will be assigned
* @dbt-labs/core-team
* @dbt-labs/core
### ARTIFACTS
# Changes to GitHub configurations including Actions
/.github/ @leahwicz
/schemas/dbt @dbt-labs/cloud-artifacts
### LANGUAGE
# Language core modules
/core/dbt/config/ @dbt-labs/core-language
/core/dbt/context/ @dbt-labs/core-language
/core/dbt/contracts/ @dbt-labs/core-language
/core/dbt/deps/ @dbt-labs/core-language
/core/dbt/events/ @dbt-labs/core-language # structured logging
/core/dbt/parser/ @dbt-labs/core-language
# Language misc files
/core/dbt/dataclass_schema.py @dbt-labs/core-language
/core/dbt/hooks.py @dbt-labs/core-language
/core/dbt/node_types.py @dbt-labs/core-language
/core/dbt/semver.py @dbt-labs/core-language
### EXECUTION
# Execution core modules
/core/dbt/graph/ @dbt-labs/core-execution
/core/dbt/task/ @dbt-labs/core-execution
# Execution misc files
/core/dbt/compilation.py @dbt-labs/core-execution
/core/dbt/flags.py @dbt-labs/core-execution
/core/dbt/lib.py @dbt-labs/core-execution
/core/dbt/main.py @dbt-labs/core-execution
/core/dbt/profiler.py @dbt-labs/core-execution
/core/dbt/selected_resources.py @dbt-labs/core-execution
/core/dbt/tracking.py @dbt-labs/core-execution
/core/dbt/version.py @dbt-labs/core-execution
### ADAPTERS
# Adapter interface ("base" + "sql" adapter defaults, cache)
/core/dbt/adapters @dbt-labs/core-adapters
# Global project (default macros + materializations), starter project
/core/dbt/include @dbt-labs/core-adapters
# Postgres plugin
/plugins/ @dbt-labs/core-adapters
# Functional tests for adapter plugins
/tests/adapter @dbt-labs/core-adapters
### TESTS
# Overlapping ownership for vast majority of unit + functional tests
# Perf regression testing framework
# This excludes the test project files itself since those aren't specific
# framework changes (excluded by not setting an owner next to it- no owner)
/performance @nathaniel-may
/performance/projects


@@ -61,7 +61,7 @@ body:
label: Environment
description: |
examples:
- **OS**: Ubuntu 24.04
- **OS**: Ubuntu 20.04
- **Python**: 3.9.12 (`python3 --version`)
- **dbt-core**: 1.1.1 (`dbt --version`)
value: |


@@ -1,18 +0,0 @@
name: 📄 Code docs
description: Report an issue for markdown files within this repo, such as README, ARCHITECTURE, etc.
title: "[Code docs] <title>"
labels: ["triage"]
body:
- type: markdown
attributes:
value: |
Thanks for taking the time to fill out this code docs issue!
- type: textarea
attributes:
label: Please describe the issue and your proposals.
description: |
Links? References? Anything that will give us more context about the issue you are encountering!
Tip: You can attach images by clicking this area to highlight it and then dragging files in.
validations:
required: false


@@ -1,8 +1,5 @@
blank_issues_enabled: false
contact_links:
- name: Documentation
url: https://github.com/dbt-labs/docs.getdbt.com/issues/new/choose
about: Problems and issues with dbt product documentation hosted on docs.getdbt.com. Issues for markdown files within this repo, such as README, should be opened using the "Code docs" template.
- name: Ask the community for help
url: https://github.com/dbt-labs/docs.getdbt.com/discussions
about: Need help troubleshooting? Check out our guide on how to ask


@@ -1,67 +0,0 @@
name: 🛠️ Implementation
description: This is an implementation ticket intended for use by the maintainers of dbt-core
title: "[<project>] <title>"
labels: ["user docs"]
body:
- type: markdown
attributes:
value: This is an implementation ticket intended for use by the maintainers of dbt-core
- type: checkboxes
attributes:
label: Housekeeping
description: >
A couple friendly reminders:
1. Remove the `user docs` label if the scope of this work does not require changes to https://docs.getdbt.com/docs: no end-user interface (e.g. yml spec, CLI, error messages, etc) or functional changes
2. Link any blocking issues in the "Blocked on" field under the "Core devs & maintainers" project.
options:
- label: I am a maintainer of dbt-core
required: true
- type: textarea
attributes:
label: Short description
description: |
Describe the scope of the ticket, a high-level implementation approach and any tradeoffs to consider
validations:
required: true
- type: textarea
attributes:
label: Acceptance criteria
description: |
What is the definition of done for this ticket? Include any relevant edge cases and/or test cases
validations:
required: true
- type: textarea
attributes:
label: Suggested Tests
description: |
Provide scenarios to test. Link to existing similar tests if appropriate.
placeholder: |
1. Test with no version specified in the schema file and use selection logic on a versioned model for a specific version. Expect pass.
2. Test with a version specified in the schema file that is not valid. Expect ParsingError.
validations:
required: true
- type: textarea
attributes:
label: Impact to Other Teams
description: |
Will this change impact other teams? Include details of the kinds of changes required (new tests, code changes, related tickets) and _add the relevant `Impact:[team]` label_.
placeholder: |
Example: This change impacts `dbt-redshift` because the tests will need to be modified. The `Impact:[Adapter]` label has been added.
validations:
required: true
- type: textarea
attributes:
label: Will backports be required?
description: |
Will this change need to be backported to previous versions? Add details, possible blockers to backporting and _add the relevant backport labels `backport 1.x.latest`_
placeholder: |
Example: Backport to 1.6.latest, 1.5.latest and 1.4.latest. Since 1.4 isn't using click, the backport may be complicated. The `backport 1.6.latest`, `backport 1.5.latest` and `backport 1.4.latest` labels have been added.
validations:
required: true
- type: textarea
attributes:
label: Context
description: |
Provide the "why", motivation, and alternative approaches considered -- linking to previous refinement issues, spikes and documentation as appropriate
validations:
required: false


@@ -55,7 +55,7 @@ body:
label: Environment
description: |
examples:
- **OS**: Ubuntu 24.04
- **OS**: Ubuntu 20.04
- **Python**: 3.9.12 (`python3 --version`)
- **dbt-core (working version)**: 1.1.1 (`dbt --version`)
- **dbt-core (regression version)**: 1.2.0 (`dbt --version`)

.github/_README.md vendored

@@ -47,8 +47,7 @@ ___
### How to re-run jobs
- From the UI you can rerun from failure
- You can retrigger the cla check by commenting on the PR with `@cla-bot check`
- Some actions cannot be rerun in the GitHub UI. Namely the snyk checks and the cla check. Snyk checks are rerun by closing and reopening the PR. You can retrigger the cla check by commenting on the PR with `@cla-bot check`
___
@@ -64,12 +63,12 @@ permissions:
contents: read
pull-requests: write
```
### Secrets
- When to use a [Personal Access Token (PAT)](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/creating-a-personal-access-token) vs the [GITHUB_TOKEN](https://docs.github.com/en/actions/security-guides/automatic-token-authentication) generated for the action?
The `GITHUB_TOKEN` is used by default. In most cases it is sufficient for what you need.
If you expect the workflow to result in a commit that should retrigger workflows, you will need to use a Personal Access Token for the bot to commit the file. When using the GITHUB_TOKEN, the resulting commit will not trigger another GitHub Actions Workflow run. This is due to limitations set by GitHub. See [the docs](https://docs.github.com/en/actions/security-guides/automatic-token-authentication#using-the-github_token-in-a-workflow) for a more detailed explanation.
For example, we must use a PAT in our workflow to commit a new changelog yaml file for bot PRs. Once the file has been committed to the branch, it should retrigger the check to validate that a changelog exists on the PR. Otherwise, it would stay in a failed state since the check would never retrigger.
@@ -106,7 +105,7 @@ Some triggers of note that we use:
```
# **what?**
# Describe what the action does.
# Describe what the action does.
# **why?**
# Why does this action exist?
@@ -139,7 +138,7 @@ Some triggers of note that we use:
id: fp
run: |
FILEPATH=.changes/unreleased/Dependencies-${{ steps.filename_time.outputs.time }}.yaml
echo "FILEPATH=$FILEPATH" >> $GITHUB_OUTPUT
echo "::set-output name=FILEPATH::$FILEPATH"
```
- Print out all variables you will reference as the first step of a job. This allows for easier debugging. The first job should log all inputs. Subsequent jobs should reference outputs of other jobs, if present.
@@ -159,14 +158,14 @@ Some triggers of note that we use:
echo "The build_script_path: ${{ inputs.build_script_path }}"
echo "The s3_bucket_name: ${{ inputs.s3_bucket_name }}"
echo "The package_test_command: ${{ inputs.package_test_command }}"
# collect all the variables that need to be used in subsequent jobs
- name: Set Variables
id: variables
run: |
echo "important_path='performance/runner/Cargo.toml'" >> $GITHUB_OUTPUT
echo "release_id=${{github.event.inputs.release_id}}" >> $GITHUB_OUTPUT
echo "open_prs=${{github.event.inputs.open_prs}}" >> $GITHUB_OUTPUT
echo "::set-output name=important_path::'performance/runner/Cargo.toml'"
echo "::set-output name=release_id::${{github.event.inputs.release_id}}"
echo "::set-output name=open_prs::${{github.event.inputs.open_prs}}"
job2:
needs: [job1]
@@ -191,14 +190,14 @@ ___
### Actions from the Marketplace
- Don't use external actions for things that can easily be accomplished manually.
- Always read through what an external action does before using it! Often an action in the GitHub Actions Marketplace can be replaced with a few lines in bash. This is much more maintainable (and won't change under us) and clear as to what's actually happening. It also prevents any
- Pin actions _we don't control_ to tags.
- Pin actions _we don't control_ to tags.
### Connecting to AWS
- Authenticate with the aws managed workflow
```yaml
- name: Configure AWS credentials from Test account
uses: aws-actions/configure-aws-credentials@v2
uses: aws-actions/configure-aws-credentials@v1
with:
aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
@@ -209,7 +208,7 @@ ___
```yaml
- name: Copy Artifacts from S3 via CLI
run: aws s3 cp ${{ env.s3_bucket }} . --recursive
run: aws s3 cp ${{ env.s3_bucket }} . --recursive
```
### Testing


@@ -35,7 +35,7 @@ jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v1
- name: Wrangle latest tag
id: is_latest
uses: ./.github/actions/latest-wrangler


@@ -1,21 +1,20 @@
name: "GitHub package `latest` tag wrangler for containers"
description: "Determines if the published image should include `latest` tags"
name: "Github package 'latest' tag wrangler for containers"
description: "Determines wether or not a given dbt container should be given a bare 'latest' tag (I.E. dbt-core:latest)"
inputs:
package_name:
description: "Package being published (i.e. `dbt-core`, `dbt-redshift`, etc.)"
description: "Package to check (I.E. dbt-core, dbt-redshift, etc)"
required: true
new_version:
description: "SemVer of the package being published (i.e. 1.7.2, 1.8.0a1, etc.)"
description: "Semver of the container being built (I.E. 1.0.4)"
required: true
github_token:
description: "Auth token for GitHub (must have view packages scope)"
gh_token:
description: "Auth token for github (must have view packages scope)"
required: true
outputs:
tags:
description: "A list of tags to associate with this version"
latest:
description: "Wether or not built container should be tagged latest (bool)"
minor_latest:
description: "Wether or not built container should be tagged minor.latest (bool)"
runs:
using: "docker"
image: "Dockerfile"


@@ -13,7 +13,7 @@ jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v1
- name: Wrangle latest tag
id: is_latest
uses: ./.github/actions/latest-wrangler


@@ -1,71 +1,95 @@
import os
from packaging.version import Version, parse
import requests
import sys
from typing import List
def main():
package_name: str = os.environ["INPUT_PACKAGE_NAME"]
new_version: Version = parse(os.environ["INPUT_NEW_VERSION"])
github_token: str = os.environ["INPUT_GITHUB_TOKEN"]
response = _package_metadata(package_name, github_token)
published_versions = _published_versions(response)
new_version_tags = _new_version_tags(new_version, published_versions)
_register_tags(new_version_tags, package_name)
def _package_metadata(package_name: str, github_token: str) -> requests.Response:
url = f"https://api.github.com/orgs/dbt-labs/packages/container/{package_name}/versions"
return requests.get(url, auth=("", github_token))
def _published_versions(response: requests.Response) -> List[Version]:
package_metadata = response.json()
return [
parse(tag)
for version in package_metadata
for tag in version["metadata"]["container"]["tags"]
if "latest" not in tag
]
def _new_version_tags(new_version: Version, published_versions: List[Version]) -> List[str]:
# the package version is always a tag
tags = [str(new_version)]
# pre-releases don't get tagged with `latest`
if new_version.is_prerelease:
return tags
if new_version > max(published_versions):
tags.append("latest")
published_patches = [
version
for version in published_versions
if version.major == new_version.major and version.minor == new_version.minor
]
if new_version > max(published_patches):
tags.append(f"{new_version.major}.{new_version.minor}.latest")
return tags
def _register_tags(tags: List[str], package_name: str) -> None:
fully_qualified_tags = ",".join([f"ghcr.io/dbt-labs/{package_name}:{tag}" for tag in tags])
github_output = os.environ.get("GITHUB_OUTPUT")
with open(github_output, "at", encoding="utf-8") as gh_output:
gh_output.write(f"fully_qualified_tags={fully_qualified_tags}")
def _validate_response(response: requests.Response) -> None:
message = response["message"]
if response.status_code != 200:
print(f"Call to GitHub API failed: {response.status_code} - {message}")
sys.exit(1)
import requests
from distutils.util import strtobool
from typing import Union
from packaging.version import parse, Version
if __name__ == "__main__":
main()
# get inputs
package = os.environ["INPUT_PACKAGE"]
new_version = parse(os.environ["INPUT_NEW_VERSION"])
gh_token = os.environ["INPUT_GH_TOKEN"]
halt_on_missing = strtobool(os.environ.get("INPUT_HALT_ON_MISSING", "False"))
# get package metadata from github
package_request = requests.get(
f"https://api.github.com/orgs/dbt-labs/packages/container/{package}/versions",
auth=("", gh_token),
)
package_meta = package_request.json()
# Log info if we don't get a 200
if package_request.status_code != 200:
print(f"Call to GH API failed: {package_request.status_code} {package_meta['message']}")
# Make an early exit if there is no matching package in github
if package_request.status_code == 404:
if halt_on_missing:
sys.exit(1)
else:
# everything is the latest if the package doesn't exist
print(f"::set-output name=latest::{True}")
print(f"::set-output name=minor_latest::{True}")
sys.exit(0)
# TODO: verify package meta is "correct"
# https://github.com/dbt-labs/dbt-core/issues/4640
# map versions and tags
version_tag_map = {
version["id"]: version["metadata"]["container"]["tags"] for version in package_meta
}
# is pre-release
pre_rel = True if any(x in str(new_version) for x in ["a", "b", "rc"]) else False
# semver of current latest
for version, tags in version_tag_map.items():
if "latest" in tags:
# N.B. This seems counterintuitive, but we expect any version tagged
# 'latest' to have exactly three associated tags:
# latest, major.minor.latest, and major.minor.patch.
# Subtracting everything that contains the string 'latest' gets us
# the major.minor.patch which is what's needed for comparison.
current_latest = parse([tag for tag in tags if "latest" not in tag][0])
else:
current_latest = False
# semver of current_minor_latest
for version, tags in version_tag_map.items():
if f"{new_version.major}.{new_version.minor}.latest" in tags:
# Similar to above, only now we expect exactly two tags:
# major.minor.patch and major.minor.latest
current_minor_latest = parse([tag for tag in tags if "latest" not in tag][0])
else:
current_minor_latest = False
def is_latest(
pre_rel: bool, new_version: Version, remote_latest: Union[bool, Version]
) -> bool:
"""Determine if a given contaier should be tagged 'latest' based on:
- it's pre-release status
- it's version
- the version of a previously identified container tagged 'latest'
:param pre_rel: Wether or not the version of the new container is a pre-release
:param new_version: The version of the new container
:param remote_latest: The version of the previously identified container that's
already tagged latest or False
"""
# is a pre-release = not latest
if pre_rel:
return False
# + no latest tag found = is latest
if not remote_latest:
return True
# + if remote version is lower than current = is latest, else not latest
return True if remote_latest <= new_version else False
latest = is_latest(pre_rel, new_version, current_latest)
minor_latest = is_latest(pre_rel, new_version, current_minor_latest)
print(f"::set-output name=latest::{latest}")
print(f"::set-output name=minor_latest::{minor_latest}")

View File

@@ -0,0 +1,10 @@
name: "Set up postgres (linux)"
description: "Set up postgres service on linux vm for dbt integration tests"
runs:
using: "composite"
steps:
- shell: bash
run: |
sudo systemctl start postgresql.service
pg_isready
sudo -u postgres bash ${{ github.action_path }}/setup_db.sh

View File

@@ -0,0 +1 @@
../../../test/setup_db.sh

View File

@@ -0,0 +1,24 @@
name: "Set up postgres (macos)"
description: "Set up postgres service on macos vm for dbt integration tests"
runs:
using: "composite"
steps:
- shell: bash
run: |
brew services start postgresql
echo "Check PostgreSQL service is running"
i=10
COMMAND='pg_isready'
while [ $i -gt -1 ]; do
if [ $i == 0 ]; then
echo "PostgreSQL service not ready, all attempts exhausted"
exit 1
fi
echo "Check PostgreSQL service status"
eval $COMMAND && break
echo "PostgreSQL service not ready, wait 10 more sec, attempts left: $i"
sleep 10
((i--))
done
createuser -s postgres
bash ${{ github.action_path }}/setup_db.sh

View File

@@ -0,0 +1 @@
../../../test/setup_db.sh

View File

@@ -5,22 +5,8 @@ runs:
steps:
- shell: pwsh
run: |
Write-Host -Object "Installing PostgreSQL 16 as windows service..."
$installerArgs = @("--install_runtimes 0", "--superpassword root", "--enable_acledit 1", "--unattendedmodeui none", "--mode unattended")
$filePath = Invoke-DownloadWithRetry -Url "https://get.enterprisedb.com/postgresql/postgresql-16.1-1-windows-x64.exe" -Path "$env:PGROOT/postgresql-16.1-1-windows-x64.exe"
Start-Process -FilePath $filePath -ArgumentList $installerArgs -Wait -PassThru
Write-Host -Object "Validating PostgreSQL 16 Install..."
Get-Service -Name postgresql*
$pgReady = Start-Process -FilePath "$env:PGBIN\pg_isready" -Wait -PassThru
$exitCode = $pgReady.ExitCode
if ($exitCode -ne 0) {
Write-Host -Object "PostgreSQL is not ready. Exitcode: $exitCode"
exit $exitCode
}
Write-Host -Object "Starting PostgreSQL 16 Service..."
$pgService = Get-Service -Name postgresql-x64-16
$pgService = Get-Service -Name postgresql*
Set-Service -InputObject $pgService -Status running -StartupType automatic
Start-Process -FilePath "$env:PGBIN\pg_isready" -Wait -PassThru
$env:Path += ";$env:PGBIN"
bash ${{ github.action_path }}/setup_db.sh

View File

@@ -11,6 +11,11 @@ updates:
schedule:
interval: "daily"
rebase-strategy: "disabled"
- package-ecosystem: "pip"
directory: "/plugins/postgres"
schedule:
interval: "daily"
rebase-strategy: "disabled"
# docker dependencies
- package-ecosystem: "docker"
@@ -23,10 +28,3 @@ updates:
schedule:
interval: "weekly"
rebase-strategy: "disabled"
# github dependencies
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "weekly"
rebase-strategy: "disabled"

View File

@@ -1,33 +1,23 @@
Resolves #
resolves #
<!---
Include the number of the issue addressed by this PR above, if applicable.
Include the number of the issue addressed by this PR above if applicable.
PRs for code changes without an associated issue *will not be merged*.
See CONTRIBUTING.md for more information.
Add the `user docs` label to this PR if it will need docs changes. An
issue will get opened in docs.getdbt.com upon successful merge of this PR.
-->
### Problem
### Description
<!---
Describe the problem this PR is solving. What is the application state
before this PR is merged?
-->
### Solution
<!---
Describe the way this PR solves the above problem. Add as much detail as you
can to help reviewers understand your changes. Include any alternatives and
tradeoffs you considered.
Describe the Pull Request here. Add any references and info to help reviewers
understand your changes. Include any tradeoffs you considered.
-->
### Checklist
- [ ] I have read [the contributing guide](https://github.com/dbt-labs/dbt-core/blob/main/CONTRIBUTING.md) and understand what's expected of me.
- [ ] I have run this code in development, and it appears to resolve the stated issue.
- [ ] This PR includes tests, or tests are not required or relevant for this PR.
- [ ] This PR has no interface changes (e.g., macros, CLI, logs, JSON artifacts, config files, adapter interface, etc.) or this PR has already received feedback and approval from Product or DX.
- [ ] This PR includes [type annotations](https://docs.python.org/3/library/typing.html) for new and modified functions.
- [ ] I have read [the contributing guide](https://github.com/dbt-labs/dbt-core/blob/main/CONTRIBUTING.md) and understand what's expected of me
- [ ] I have signed the [CLA](https://docs.getdbt.com/docs/contributor-license-agreements)
- [ ] I have run this code in development and it appears to resolve the stated issue
- [ ] This PR includes tests, or tests are not required/relevant for this PR
- [ ] I have [opened an issue to add/update docs](https://github.com/dbt-labs/docs.getdbt.com/issues/new/choose), or docs changes are not required/relevant for this PR
- [ ] I have run `changie new` to [create a changelog entry](https://github.com/dbt-labs/dbt-core/blob/main/CONTRIBUTING.md#adding-a-changelog-entry)

View File

@@ -35,6 +35,6 @@ jobs:
github.event.pull_request.merged
&& contains(github.event.label.name, 'backport')
steps:
- uses: tibdex/backport@v2.0.4
- uses: tibdex/backport@v2.0.2
with:
github_token: ${{ secrets.GITHUB_TOKEN }}

View File

@@ -40,7 +40,9 @@ jobs:
matrix:
include:
- label: "dependencies"
changie_kind: "Dependencies"
changie_kind: "Dependency"
- label: "snyk"
changie_kind: "Security"
runs-on: ubuntu-latest
steps:
@@ -48,7 +50,7 @@ jobs:
- name: Create and commit changelog on bot PR
if: ${{ contains(github.event.pull_request.labels.*.name, matrix.label) }}
id: bot_changelog
uses: emmyoop/changie_bot@v1.1.0
uses: emmyoop/changie_bot@v1.0.1
with:
GITHUB_TOKEN: ${{ secrets.FISHTOWN_BOT_PAT }}
commit_author_name: "Github Build Bot"
@@ -56,4 +58,4 @@ jobs:
commit_message: "Add automated changelog yaml from template for bot PR"
changie_kind: ${{ matrix.changie_kind }}
label: ${{ matrix.label }}
custom_changelog_string: "custom:\n Author: ${{ github.event.pull_request.user.login }}\n Issue: ${{ github.event.pull_request.number }}"
custom_changelog_string: "custom:\n Author: ${{ github.event.pull_request.user.login }}\n Issue: 4904\n PR: ${{ github.event.pull_request.number }}"

View File

@@ -2,8 +2,10 @@
# Checks that a file has been committed under the /.changes directory
# as a new CHANGELOG entry. Cannot check for a specific filename as
# it is dynamically generated by change type and timestamp.
# This workflow runs on pull_request_target because it requires
# secrets to post comments.
# This workflow should not require any secrets since it runs for PRs
# from forked repos.
# By default, secrets are not passed to workflows running from
# a forked repo.
# **why?**
# Ensure code change gets reflected in the CHANGELOG.
@@ -17,10 +19,8 @@
name: Check Changelog Entry
on:
pull_request_target:
pull_request:
types: [opened, reopened, labeled, unlabeled, synchronize]
paths-ignore: ['.changes/**', '.github/**', 'tests/**', '**.md', '**.yml']
workflow_dispatch:
defaults:

View File

@@ -1,41 +0,0 @@
name: Check Artifact Changes
on:
pull_request:
types: [ opened, reopened, labeled, unlabeled, synchronize ]
paths-ignore: [ '.changes/**', '.github/**', 'tests/**', '**.md', '**.yml' ]
workflow_dispatch:
jobs:
check-artifact-changes:
runs-on: ubuntu-latest
if: ${{ !contains(github.event.pull_request.labels.*.name, 'artifact_minor_upgrade') }}
steps:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Check for changes in core/dbt/artifacts
# https://github.com/marketplace/actions/paths-changes-filter
uses: dorny/paths-filter@v3
id: check_artifact_changes
with:
filters: |
artifacts_changed:
- 'core/dbt/artifacts/**'
list-files: shell
- name: Fail CI if artifacts have changed
if: steps.check_artifact_changes.outputs.artifacts_changed == 'true'
run: |
echo "CI failure: Artifact changes checked in core/dbt/artifacts directory."
echo "Files changed: ${{ steps.check_artifact_changes.outputs.artifacts_changed_files }}"
echo "To bypass this check, confirm that the change is not breaking (https://github.com/dbt-labs/dbt-core/blob/main/core/dbt/artifacts/README.md#breaking-changes) and add the 'artifact_minor_upgrade' label to the PR. Modifications and additions to all fields require updates to https://github.com/dbt-labs/dbt-jsonschema."
exit 1
- name: CI check passed
if: steps.check_artifact_changes.outputs.artifacts_changed == 'false'
run: |
echo "No prohibited artifact changes found in core/dbt/artifacts. CI check passed."

View File

@@ -1,39 +0,0 @@
# **what?**
# Label a PR with a `community` label when a PR is opened by a user outside core/adapters
# **why?**
# To streamline triage and ensure that community contributions are recognized and prioritized
# **when?**
# When a PR is opened, not in draft or moved from draft to ready for review
name: Label community PRs
on:
# have to use pull_request_target since community PRs come from forks
pull_request_target:
types: [opened, ready_for_review]
defaults:
run:
shell: bash
permissions:
pull-requests: write # labels PRs
contents: read # reads team membership
jobs:
open_issues:
# If this PR already has the community label, no need to relabel it
# If this PR is opened and not draft, determine if it needs to be labeled
# if the PR is converted out of draft, determine if it needs to be labeled
if: |
(!contains(github.event.pull_request.labels.*.name, 'community') &&
(github.event.action == 'opened' && github.event.pull_request.draft == false ) ||
github.event.action == 'ready_for_review' )
uses: dbt-labs/actions/.github/workflows/label-community.yml@main
with:
github_team: 'core-group'
label: 'community'
secrets: inherit

View File

@@ -1,41 +0,0 @@
# **what?**
# Cuts a new `*.latest` branch
# Also cleans up all files in `.changes/unreleased` and `.changes/previous version` on
# `main` and bumps `main` to the input version.
# **why?**
# Generally reduces the workload of engineers and reduces error. Allow automation.
# **when?**
# This will run when called manually.
name: Cut new release branch
on:
workflow_dispatch:
inputs:
version_to_bump_main:
description: 'The alpha version main should bump to (ex. 1.6.0a1)'
required: true
new_branch_name:
description: 'The full name of the new branch (ex. 1.5.latest)'
required: true
defaults:
run:
shell: bash
permissions:
contents: write
jobs:
cut_branch:
name: "Cut branch and clean up main for dbt-core"
uses: dbt-labs/actions/.github/workflows/cut-release-branch.yml@main
with:
version_to_bump_main: ${{ inputs.version_to_bump_main }}
new_branch_name: ${{ inputs.new_branch_name }}
PR_title: "Cleanup main after cutting new ${{ inputs.new_branch_name }} branch"
PR_body: "All adapter PRs will fail CI until the dbt-core PR has been merged due to release version conflicts."
secrets:
FISHTOWN_BOT_PAT: ${{ secrets.FISHTOWN_BOT_PAT }}

View File

@@ -1,41 +0,0 @@
# **what?**
# Open an issue in docs.getdbt.com when an issue is labeled `user docs` and closed as completed
# **why?**
# To reduce barriers for keeping docs up to date
# **when?**
# When an issue is labeled `user docs` and is closed as completed. Can be labeled before or after the issue is closed.
name: Open issues in docs.getdbt.com repo when an issue is labeled
run-name: "Open an issue in docs.getdbt.com for issue #${{ github.event.issue.number }}"
on:
issues:
types: [labeled, closed]
defaults:
run:
shell: bash
permissions:
issues: write # comments on issues
jobs:
open_issues:
# we only want to run this when the issue is closed as completed and the label `user docs` has been assigned.
# If this logic does not exist in this workflow, it runs the
# risk of duplicate issues being created because the merge and the label both trigger this workflow, with neither run having
# generated the comment before the other runs. This lives here instead of the shared workflow because this is where we
# decide if it should run or not.
if: |
(github.event.issue.state == 'closed' &&
github.event.issue.state_reason == 'completed' &&
contains( github.event.issue.labels.*.name, 'user docs'))
uses: dbt-labs/actions/.github/workflows/open-issue-in-repo.yml@main
with:
issue_repository: "dbt-labs/docs.getdbt.com"
issue_title: "[Core] Docs Changes Needed from ${{ github.event.repository.name }} Issue #${{ github.event.issue.number }}"
issue_body: "At a minimum, update body to include a link to the page on docs.getdbt.com requiring updates and what part(s) of the page you would like to see updated.\n Originating from this issue: https://github.com/dbt-labs/dbt-core/issues/${{ github.event.issue.number }}"
secrets: inherit

View File

@@ -0,0 +1,166 @@
# **what?**
# On push, if anything in core/dbt/docs or core/dbt/cli has been
# created or modified, regenerate the CLI API docs using sphinx.
# **why?**
# We watch for changes in core/dbt/cli because the CLI API docs rely on click
# and all supporting flags/params to be generated. We watch for changes in
# core/dbt/docs since any changes to sphinx configuration or any of the
# .rst files there could result in a differently built final index.html file.
# **when?**
# Whenever a change has been pushed to a branch, and only if there is a diff
# between the PR branch and main's core/dbt/cli and or core/dbt/docs dirs.
# TODO: add bot comment to PR informing contributor that the docs have been committed
# TODO: figure out why github action triggered pushes cause github to fail to report
# the status of jobs
name: Generate CLI API docs
on:
pull_request:
permissions:
contents: write
pull-requests: write
env:
CLI_DIR: ${{ github.workspace }}/core/dbt/cli
DOCS_DIR: ${{ github.workspace }}/core/dbt/docs
DOCS_BUILD_DIR: ${{ github.workspace }}/core/dbt/docs/build
jobs:
check_gen:
name: check if generation needed
runs-on: ubuntu-latest
outputs:
cli_dir_changed: ${{ steps.check_cli.outputs.cli_dir_changed }}
docs_dir_changed: ${{ steps.check_docs.outputs.docs_dir_changed }}
steps:
- name: "[DEBUG] print variables"
run: |
echo "env.CLI_DIR: ${{ env.CLI_DIR }}"
echo "env.DOCS_BUILD_DIR: ${{ env.DOCS_BUILD_DIR }}"
echo "env.DOCS_DIR: ${{ env.DOCS_DIR }}"
echo ">>>>> git log"
git log --pretty=oneline | head -5
- name: git checkout
uses: actions/checkout@v3
with:
fetch-depth: 0
ref: ${{ github.head_ref }}
- name: set shas
id: set_shas
run: |
THIS_SHA=$(git rev-parse @)
LAST_SHA=$(git rev-parse @~1)
echo "this sha: $THIS_SHA"
echo "last sha: $LAST_SHA"
echo "this_sha=$THIS_SHA" >> $GITHUB_OUTPUT
echo "last_sha=$LAST_SHA" >> $GITHUB_OUTPUT
- name: check for changes in core/dbt/cli
id: check_cli
run: |
CLI_DIR_CHANGES=$(git diff \
${{ steps.set_shas.outputs.last_sha }} \
${{ steps.set_shas.outputs.this_sha }} \
-- ${{ env.CLI_DIR }})
if [ -n "$CLI_DIR_CHANGES" ]; then
echo "changes found"
echo $CLI_DIR_CHANGES
echo "cli_dir_changed=true" >> $GITHUB_OUTPUT
exit 0
fi
echo "cli_dir_changed=false" >> $GITHUB_OUTPUT
echo "no changes found"
- name: check for changes in core/dbt/docs
id: check_docs
if: steps.check_cli.outputs.cli_dir_changed == 'false'
run: |
DOCS_DIR_CHANGES=$(git diff --name-only \
${{ steps.set_shas.outputs.last_sha }} \
${{ steps.set_shas.outputs.this_sha }} \
-- ${{ env.DOCS_DIR }} ':!${{ env.DOCS_BUILD_DIR }}')
DOCS_BUILD_DIR_CHANGES=$(git diff --name-only \
${{ steps.set_shas.outputs.last_sha }} \
${{ steps.set_shas.outputs.this_sha }} \
-- ${{ env.DOCS_BUILD_DIR }})
if [ -n "$DOCS_DIR_CHANGES" ] && [ -z "$DOCS_BUILD_DIR_CHANGES" ]; then
echo "changes found"
echo $DOCS_DIR_CHANGES
echo "docs_dir_changed=true" >> $GITHUB_OUTPUT
exit 0
fi
echo "docs_dir_changed=false" >> $GITHUB_OUTPUT
echo "no changes found"
gen_docs:
name: generate docs
runs-on: ubuntu-latest
needs: [check_gen]
if: |
needs.check_gen.outputs.cli_dir_changed == 'true'
|| needs.check_gen.outputs.docs_dir_changed == 'true'
steps:
- name: "[DEBUG] print variables"
run: |
echo "env.DOCS_DIR: ${{ env.DOCS_DIR }}"
echo "github head_ref: ${{ github.head_ref }}"
- name: git checkout
uses: actions/checkout@v3
with:
ref: ${{ github.head_ref }}
- name: install python
uses: actions/setup-python@v4.3.0
with:
python-version: 3.8
- name: install dev requirements
run: |
python3 -m venv env
source env/bin/activate
python -m pip install --upgrade pip
pip install -r requirements.txt -r dev-requirements.txt
- name: generate docs
run: |
source env/bin/activate
cd ${{ env.DOCS_DIR }}
echo "cleaning existing docs"
make clean
echo "creating docs"
make html
- name: debug
run: |
echo ">>>>> status"
git status
echo ">>>>> remotes"
git remote -v
echo ">>>>> branch"
git branch -v
echo ">>>>> log"
git log --pretty=oneline | head -5
- name: commit docs
run: |
git config user.name 'Github Build Bot'
git config user.email 'buildbot@fishtownanalytics.com'
git commit -am "Add generated CLI API docs"
git push -u origin ${{ github.head_ref }}

26
.github/workflows/jira-creation.yml vendored Normal file
View File

@@ -0,0 +1,26 @@
# **what?**
# Mirrors issues into Jira. Includes the information: title,
# GitHub Issue ID and URL
# **why?**
# Jira is our tool for tracking and we need to see these issues in there
# **when?**
# On issue creation or when an issue is labeled `Jira`
name: Jira Issue Creation
on:
issues:
types: [opened, labeled]
permissions:
issues: write
jobs:
call-label-action:
uses: dbt-labs/jira-actions/.github/workflows/jira-creation.yml@main
secrets:
JIRA_BASE_URL: ${{ secrets.JIRA_BASE_URL }}
JIRA_USER_EMAIL: ${{ secrets.JIRA_USER_EMAIL }}
JIRA_API_TOKEN: ${{ secrets.JIRA_API_TOKEN }}

26
.github/workflows/jira-label.yml vendored Normal file
View File

@@ -0,0 +1,26 @@
# **what?**
# Calls mirroring Jira label Action. Includes adding a new label
# to an existing issue or removing a label as well
# **why?**
# Jira is our tool for tracking and we need to see these labels in there
# **when?**
# On labels being added or removed from issues
name: Jira Label Mirroring
on:
issues:
types: [labeled, unlabeled]
permissions:
issues: read
jobs:
call-label-action:
uses: dbt-labs/jira-actions/.github/workflows/jira-label.yml@main
secrets:
JIRA_BASE_URL: ${{ secrets.JIRA_BASE_URL }}
JIRA_USER_EMAIL: ${{ secrets.JIRA_USER_EMAIL }}
JIRA_API_TOKEN: ${{ secrets.JIRA_API_TOKEN }}

27
.github/workflows/jira-transition.yml vendored Normal file
View File

@@ -0,0 +1,27 @@
# **what?**
# Transition a Jira issue to a new state
# Only supports these GitHub Issue transitions:
# closed, deleted, reopened
# **why?**
# Jira needs to be kept up-to-date
# **when?**
# On issue closing, deletion, reopened
name: Jira Issue Transition
on:
issues:
types: [closed, deleted, reopened]
# no special access is needed
permissions: read-all
jobs:
call-label-action:
uses: dbt-labs/jira-actions/.github/workflows/jira-transition.yml@main
secrets:
JIRA_BASE_URL: ${{ secrets.JIRA_BASE_URL }}
JIRA_USER_EMAIL: ${{ secrets.JIRA_USER_EMAIL }}
JIRA_API_TOKEN: ${{ secrets.JIRA_API_TOKEN }}

View File

@@ -33,11 +33,6 @@ defaults:
run:
shell: bash
# top-level adjustments can be made here
env:
# number of parallel processes to spawn for python integration testing
PYTHON_INTEGRATION_TEST_WORKERS: 5
jobs:
code-quality:
name: code-quality
@@ -47,20 +42,23 @@ jobs:
steps:
- name: Check out the repository
uses: actions/checkout@v4
uses: actions/checkout@v2
- name: Set up Python
uses: actions/setup-python@v5
uses: actions/setup-python@v4.3.0
with:
python-version: '3.9'
python-version: '3.8'
- name: Install python dependencies
run: |
python -m pip install --user --upgrade pip
python -m pip --version
make dev
make dev_req
python -m pip install pre-commit
pre-commit --version
python -m pip install mypy==0.942
mypy --version
python -m pip install -r requirements.txt
python -m pip install -r dev-requirements.txt
dbt --version
- name: Run pre-commit hooks
@@ -75,17 +73,18 @@ jobs:
strategy:
fail-fast: false
matrix:
python-version: [ "3.9", "3.10", "3.11", "3.12" ]
python-version: ["3.7", "3.8", "3.9", "3.10"]
env:
TOXENV: "unit"
PYTEST_ADDOPTS: "-v --color=yes --csv unit_results.csv"
steps:
- name: Check out the repository
uses: actions/checkout@v4
uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
uses: actions/setup-python@v4.3.0
with:
python-version: ${{ matrix.python-version }}
@@ -96,200 +95,61 @@ jobs:
python -m pip install tox
tox --version
- name: Run unit tests
uses: nick-fields/retry@v3
with:
timeout_minutes: 10
max_attempts: 3
command: tox -e unit
- name: Run tox
run: tox
- name: Get current date
if: always()
id: date
run: |
CURRENT_DATE=$(date +'%Y-%m-%dT%H_%M_%S') # no colons allowed for artifacts
echo "date=$CURRENT_DATE" >> $GITHUB_OUTPUT
run: echo "::set-output name=date::$(date +'%Y-%m-%dT%H_%M_%S')" #no colons allowed for artifacts
- name: Upload Unit Test Coverage to Codecov
if: ${{ matrix.python-version == '3.11' }}
uses: codecov/codecov-action@v4
- uses: actions/upload-artifact@v2
if: always()
with:
token: ${{ secrets.CODECOV_TOKEN }}
flags: unit
name: unit_results_${{ matrix.python-version }}-${{ steps.date.outputs.date }}.csv
path: unit_results.csv
integration-metadata:
name: integration test metadata generation
runs-on: ubuntu-latest
outputs:
split-groups: ${{ steps.generate-split-groups.outputs.split-groups }}
include: ${{ steps.generate-include.outputs.include }}
steps:
- name: generate split-groups
id: generate-split-groups
run: |
MATRIX_JSON="["
for B in $(seq 1 ${{ env.PYTHON_INTEGRATION_TEST_WORKERS }}); do
MATRIX_JSON+=$(sed 's/^/"/;s/$/"/' <<< "${B}")
done
MATRIX_JSON="${MATRIX_JSON//\"\"/\", \"}"
MATRIX_JSON+="]"
echo "split-groups=${MATRIX_JSON}"
echo "split-groups=${MATRIX_JSON}" >> $GITHUB_OUTPUT
- name: generate include
id: generate-include
run: |
INCLUDE=('"python-version":"3.9","os":"windows-latest"' '"python-version":"3.9","os":"macos-14"' )
INCLUDE_GROUPS="["
for include in ${INCLUDE[@]}; do
for group in $(seq 1 ${{ env.PYTHON_INTEGRATION_TEST_WORKERS }}); do
INCLUDE_GROUPS+=$(sed 's/$/, /' <<< "{\"split-group\":\"${group}\",${include}}")
done
done
INCLUDE_GROUPS=$(echo $INCLUDE_GROUPS | sed 's/,*$//g')
INCLUDE_GROUPS+="]"
echo "include=${INCLUDE_GROUPS}"
echo "include=${INCLUDE_GROUPS}" >> $GITHUB_OUTPUT
integration-postgres:
name: (${{ matrix.split-group }}) integration test / python ${{ matrix.python-version }} / ${{ matrix.os }}
integration:
name: integration test / python ${{ matrix.python-version }} / ${{ matrix.os }}
runs-on: ${{ matrix.os }}
timeout-minutes: 30
needs:
- integration-metadata
timeout-minutes: 45
strategy:
fail-fast: false
matrix:
python-version: [ "3.9", "3.10", "3.11", "3.12" ]
os: [ubuntu-latest]
split-group: ${{ fromJson(needs.integration-metadata.outputs.split-groups) }}
python-version: ["3.7", "3.8", "3.9", "3.10"]
os: [ubuntu-20.04]
include:
- python-version: 3.8
os: windows-latest
- python-version: 3.8
os: macos-latest
env:
TOXENV: integration
PYTEST_ADDOPTS: "-v --color=yes -n4 --csv integration_results.csv"
DBT_INVOCATION_ENV: github-actions
DBT_TEST_USER_1: dbt_test_user_1
DBT_TEST_USER_2: dbt_test_user_2
DBT_TEST_USER_3: dbt_test_user_3
DD_CIVISIBILITY_AGENTLESS_ENABLED: true
DD_API_KEY: ${{ secrets.DATADOG_API_KEY }}
DD_SITE: datadoghq.com
DD_ENV: ci
DD_SERVICE: ${{ github.event.repository.name }}
services:
# Label used to access the service container
postgres:
# Docker Hub image
image: postgres
# Provide the password for postgres
env:
POSTGRES_PASSWORD: password
POSTGRES_USER: postgres
# Set health checks to wait until postgres has started
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 5432:5432
steps:
- name: Check out the repository
uses: actions/checkout@v4
uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
uses: actions/setup-python@v4.3.0
with:
python-version: ${{ matrix.python-version }}
- name: Run postgres setup script
run: |
./test/setup_db.sh
env:
PGHOST: localhost
PGPORT: 5432
PGPASSWORD: password
- name: Install python tools
run: |
python -m pip install --user --upgrade pip
python -m pip --version
python -m pip install tox
tox --version
- name: Run integration tests
uses: nick-fields/retry@v3
with:
timeout_minutes: 30
max_attempts: 3
command: tox -- --ddtrace
env:
PYTEST_ADDOPTS: ${{ format('--splits {0} --group {1}', env.PYTHON_INTEGRATION_TEST_WORKERS, matrix.split-group) }}
- name: Get current date
if: always()
id: date
run: |
CURRENT_DATE=$(date +'%Y-%m-%dT%H_%M_%S') # no colons allowed for artifacts
echo "date=$CURRENT_DATE" >> $GITHUB_OUTPUT
- uses: actions/upload-artifact@v4
if: always()
with:
name: logs_${{ matrix.python-version }}_${{ matrix.os }}_${{ matrix.split-group }}_${{ steps.date.outputs.date }}
path: ./logs
- name: Upload Integration Test Coverage to Codecov
if: ${{ matrix.python-version == '3.11' }}
uses: codecov/codecov-action@v5
with:
token: ${{ secrets.CODECOV_TOKEN }}
flags: integration
integration-mac-windows:
name: (${{ matrix.split-group }}) integration test / python ${{ matrix.python-version }} / ${{ matrix.os }}
runs-on: ${{ matrix.os }}
timeout-minutes: 30
needs:
- integration-metadata
strategy:
fail-fast: false
matrix:
split-group: ${{ fromJson(needs.integration-metadata.outputs.split-groups) }}
# this include is where we add the mac and windows os
include: ${{ fromJson(needs.integration-metadata.outputs.include) }}
env:
TOXENV: integration
DBT_INVOCATION_ENV: github-actions
DBT_TEST_USER_1: dbt_test_user_1
DBT_TEST_USER_2: dbt_test_user_2
DBT_TEST_USER_3: dbt_test_user_3
DD_CIVISIBILITY_AGENTLESS_ENABLED: true
DD_API_KEY: ${{ secrets.DATADOG_API_KEY }}
DD_SITE: datadoghq.com
DD_ENV: ci
DD_SERVICE: ${{ github.event.repository.name }}
steps:
- name: Check out the repository
uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- name: Set up postgres (linux)
if: runner.os == 'Linux'
uses: ./.github/actions/setup-postgres-linux
- name: Set up postgres (macos)
if: runner.os == 'macOS'
uses: nick-fields/retry@v3
with:
timeout_minutes: 10
max_attempts: 3
command: ./test/setup_db.sh
uses: ./.github/actions/setup-postgres-macos
- name: Set up postgres (windows)
if: runner.os == 'Windows'
@@ -302,51 +162,25 @@ jobs:
python -m pip install tox
tox --version
- name: Run integration tests
uses: nick-fields/retry@v3
with:
timeout_minutes: 30
max_attempts: 3
command: tox -- --ddtrace
env:
PYTEST_ADDOPTS: ${{ format('--splits {0} --group {1}', env.PYTHON_INTEGRATION_TEST_WORKERS, matrix.split-group) }}
- name: Run tests
run: tox
- name: Get current date
if: always()
id: date
run: |
CURRENT_DATE=$(date +'%Y-%m-%dT%H_%M_%S') # no colons allowed for artifacts
echo "date=$CURRENT_DATE" >> $GITHUB_OUTPUT
run: echo "::set-output name=date::$(date +'%Y_%m_%dT%H_%M_%S')" #no colons allowed for artifacts
- uses: actions/upload-artifact@v4
- uses: actions/upload-artifact@v2
if: always()
with:
name: logs_${{ matrix.python-version }}_${{ matrix.os }}_${{ matrix.split-group }}_${{ steps.date.outputs.date }}
name: logs_${{ matrix.python-version }}_${{ matrix.os }}_${{ steps.date.outputs.date }}
path: ./logs
- name: Upload Integration Test Coverage to Codecov
if: ${{ matrix.python-version == '3.11' }}
uses: codecov/codecov-action@v4
- uses: actions/upload-artifact@v2
if: always()
with:
token: ${{ secrets.CODECOV_TOKEN }}
flags: integration
integration-report:
if: ${{ always() }}
name: Integration Test Suite
runs-on: ubuntu-latest
needs: [integration-mac-windows, integration-postgres]
steps:
- name: "Integration Tests Failed"
if: ${{ contains(needs.integration-mac-windows.result, 'failure') || contains(needs.integration-mac-windows.result, 'cancelled') || contains(needs.integration-postgres.result, 'failure') || contains(needs.integration-postgres.result, 'cancelled') }}
# when this is true the next step won't execute
run: |
echo "::notice title='Integration test suite failed'"
exit 1
- name: "Integration Tests Passed"
run: |
echo "::notice title='Integration test suite passed'"
name: integration_results_${{ matrix.python-version }}_${{ matrix.os }}_${{ steps.date.outputs.date }}.csv
path: integration_results.csv
build:
name: build packages
@@ -355,12 +189,12 @@ jobs:
steps:
- name: Check out the repository
uses: actions/checkout@v4
uses: actions/checkout@v2
- name: Set up Python
uses: actions/setup-python@v5
uses: actions/setup-python@v4.3.0
with:
python-version: '3.9'
python-version: '3.8'
- name: Install python dependencies
run: |
@@ -393,7 +227,7 @@ jobs:
- name: Install source distributions
# ignore dbt-1.0.0, which intentionally raises an error when installed from source
run: |
find ./dist/*.gz -maxdepth 1 -type f | xargs python -m pip install --force-reinstall --find-links=dist/
find ./dist/dbt-[a-z]*.gz -maxdepth 1 -type f | xargs python -m pip install --force-reinstall --find-links=dist/
- name: Check source distributions
run: |

View File

@@ -1,265 +0,0 @@
# **what?**
# This workflow models the performance characteristics of a point in time in dbt.
# It runs specific dbt commands on committed projects multiple times to create and
# commit information about the distribution to the current branch. For more information
# see the readme in the performance module at /performance/README.md.
#
# **why?**
# When developing new features, we can take quick performance samples and compare
# them against the committed baseline measurements produced by this workflow to detect
# some performance regressions at development time before they reach users.
#
# **when?**
# This is only run once directly after each release (for non-prereleases). If for some
# reason the results of a run are not satisfactory, it can also be triggered manually.
name: Model Performance Characteristics
on:
# runs after non-prereleases are published.
release:
types: [released]
# run manually from the actions tab
workflow_dispatch:
inputs:
release_id:
description: 'dbt version to model (must be non-prerelease in Pypi)'
type: string
required: true
env:
RUNNER_CACHE_PATH: performance/runner/target/release/runner
# both jobs need to write
permissions:
contents: write
pull-requests: write
jobs:
set-variables:
name: Setting Variables
runs-on: ubuntu-latest
outputs:
cache_key: ${{ steps.variables.outputs.cache_key }}
release_id: ${{ steps.semver.outputs.base-version }}
release_branch: ${{ steps.variables.outputs.release_branch }}
steps:
# explicitly checkout the performance runner from main regardless of which
# version we are modeling.
- name: Checkout
uses: actions/checkout@v4
with:
ref: main
- name: Parse version into parts
id: semver
uses: dbt-labs/actions/parse-semver@v1
with:
version: ${{ github.event.inputs.release_id || github.event.release.tag_name }}
# collect all the variables that need to be used in subsequent jobs
- name: Set variables
id: variables
run: |
# create a cache key that will be used in the next job. without this the
# next job would have to checkout from main and hash the files itself.
echo "cache_key=${{ runner.os }}-${{ hashFiles('performance/runner/Cargo.toml')}}-${{ hashFiles('performance/runner/src/*') }}" >> $GITHUB_OUTPUT
branch_name="${{steps.semver.outputs.major}}.${{steps.semver.outputs.minor}}.latest"
echo "release_branch=$branch_name" >> $GITHUB_OUTPUT
echo "release branch is inferred to be ${branch_name}"
latest-runner:
name: Build or Fetch Runner
runs-on: ubuntu-latest
needs: [set-variables]
env:
RUSTFLAGS: "-D warnings"
steps:
- name: '[DEBUG] print variables'
run: |
echo "all variables defined in set-variables"
echo "cache_key: ${{ needs.set-variables.outputs.cache_key }}"
echo "release_id: ${{ needs.set-variables.outputs.release_id }}"
echo "release_branch: ${{ needs.set-variables.outputs.release_branch }}"
# explicitly checkout the performance runner from main regardless of which
# version we are modeling.
- name: Checkout
uses: actions/checkout@v4
with:
ref: main
# attempts to access a previously cached runner
- uses: actions/cache@v4
id: cache
with:
path: ${{ env.RUNNER_CACHE_PATH }}
key: ${{ needs.set-variables.outputs.cache_key }}
- name: Fetch Rust Toolchain
if: steps.cache.outputs.cache-hit != 'true'
uses: actions-rs/toolchain@v1
with:
profile: minimal
toolchain: stable
override: true
- name: Add fmt
if: steps.cache.outputs.cache-hit != 'true'
run: rustup component add rustfmt
- name: Cargo fmt
if: steps.cache.outputs.cache-hit != 'true'
uses: actions-rs/cargo@v1
with:
command: fmt
args: --manifest-path performance/runner/Cargo.toml --all -- --check
- name: Test
if: steps.cache.outputs.cache-hit != 'true'
uses: actions-rs/cargo@v1
with:
command: test
args: --manifest-path performance/runner/Cargo.toml
- name: Build (optimized)
if: steps.cache.outputs.cache-hit != 'true'
uses: actions-rs/cargo@v1
with:
command: build
args: --release --manifest-path performance/runner/Cargo.toml
# the cache action automatically caches this binary at the end of the job
model:
# depends on `latest-runner` as a separate job so that failures in this job do not prevent
# a successfully tested and built binary from being cached.
needs: [set-variables, latest-runner]
name: Model a release
runs-on: ubuntu-latest
steps:
- name: '[DEBUG] print variables'
run: |
echo "all variables defined in set-variables"
echo "cache_key: ${{ needs.set-variables.outputs.cache_key }}"
echo "release_id: ${{ needs.set-variables.outputs.release_id }}"
echo "release_branch: ${{ needs.set-variables.outputs.release_branch }}"
- name: Setup Python
uses: actions/setup-python@v5
with:
python-version: "3.9"
- name: Install dbt
run: pip install dbt-postgres==${{ needs.set-variables.outputs.release_id }}
- name: Install Hyperfine
run: wget https://github.com/sharkdp/hyperfine/releases/download/v1.11.0/hyperfine_1.11.0_amd64.deb && sudo dpkg -i hyperfine_1.11.0_amd64.deb
# explicitly checkout main to get the latest project definitions
- name: Checkout
uses: actions/checkout@v4
with:
ref: main
# this was built in the previous job so it will be there.
- name: Fetch Runner
uses: actions/cache@v4
id: cache
with:
path: ${{ env.RUNNER_CACHE_PATH }}
key: ${{ needs.set-variables.outputs.cache_key }}
- name: Move Runner
run: mv performance/runner/target/release/runner performance/app
- name: Change Runner Permissions
run: chmod +x ./performance/app
- name: '[DEBUG] ls baseline directory before run'
run: ls -R performance/baselines/
# `${{ github.workspace }}` is used to pass the absolute path
- name: Create directories
run: |
mkdir ${{ github.workspace }}/performance/tmp/
mkdir -p performance/baselines/${{ needs.set-variables.outputs.release_id }}/
# Run modeling with taking 20 samples
- name: Run Measurement
run: |
performance/app model -v ${{ needs.set-variables.outputs.release_id }} -b ${{ github.workspace }}/performance/baselines/ -p ${{ github.workspace }}/performance/projects/ -t ${{ github.workspace }}/performance/tmp/ -n 20
- name: '[DEBUG] ls baseline directory after run'
run: ls -R performance/baselines/
- uses: actions/upload-artifact@v4
with:
name: baseline
path: performance/baselines/${{ needs.set-variables.outputs.release_id }}/
create-pr:
name: Open PR for ${{ matrix.base-branch }}
# depends on `model` as a separate job so that the baseline can be committed to more than one branch
# i.e. release branch and main
needs: [set-variables, latest-runner, model]
runs-on: ubuntu-latest
strategy:
matrix:
include:
- base-branch: refs/heads/main
target-branch: performance-bot/main_${{ needs.set-variables.outputs.release_id }}_${{GITHUB.RUN_ID}}
- base-branch: refs/heads/${{ needs.set-variables.outputs.release_branch }}
target-branch: performance-bot/release_${{ needs.set-variables.outputs.release_id }}_${{GITHUB.RUN_ID}}
steps:
- name: '[DEBUG] print variables'
run: |
echo "all variables defined in set-variables"
echo "cache_key: ${{ needs.set-variables.outputs.cache_key }}"
echo "release_id: ${{ needs.set-variables.outputs.release_id }}"
echo "release_branch: ${{ needs.set-variables.outputs.release_branch }}"
- name: Checkout
uses: actions/checkout@v4
with:
ref: ${{ matrix.base-branch }}
- name: Create PR branch
run: |
git checkout -b ${{ matrix.target-branch }}
git push origin ${{ matrix.target-branch }}
git branch --set-upstream-to=origin/${{ matrix.target-branch }} ${{ matrix.target-branch }}
- uses: actions/download-artifact@v4
with:
name: baseline
path: performance/baselines/${{ needs.set-variables.outputs.release_id }}
- name: '[DEBUG] ls baselines after artifact download'
run: ls -R performance/baselines/
- name: Commit baseline
uses: EndBug/add-and-commit@v9
with:
add: 'performance/baselines/*'
author_name: 'Github Build Bot'
author_email: 'buildbot@fishtownanalytics.com'
message: 'adding performance baseline for ${{ needs.set-variables.outputs.release_id }}'
push: 'origin origin/${{ matrix.target-branch }}'
- name: Create Pull Request
uses: peter-evans/create-pull-request@v6
with:
author: 'Github Build Bot <buildbot@fishtownanalytics.com>'
base: ${{ matrix.base-branch }}
branch: '${{ matrix.target-branch }}'
title: 'Adding performance modeling for ${{needs.set-variables.outputs.release_id}} to ${{ matrix.base-branch }}'
body: 'Committing perf results for tracking for the ${{needs.set-variables.outputs.release_id}}'
labels: |
Skip Changelog
Performance

View File

@@ -1,97 +0,0 @@
# **what?**
# Nightly releases to GitHub and PyPI. This workflow produces the following outcome:
# - generate and validate data for the nightly release (commit SHA, version number, release branch);
# - pass data to release workflow;
# - nightly release will be pushed to GitHub as a draft release;
# - nightly build will be pushed to test PyPI;
#
# **why?**
# Ensure an automated and tested release process for nightly builds
#
# **when?**
# This workflow runs on schedule or can be run manually on demand.
name: Nightly Test Release to GitHub and PyPI
on:
workflow_dispatch: # for manual triggering
schedule:
- cron: 0 9 * * *
permissions:
contents: write # this is the permission that allows creating a new release
packages: write # this is the permission that allows Docker release
defaults:
run:
shell: bash
env:
RELEASE_BRANCH: "main"
jobs:
aggregate-release-data:
runs-on: ubuntu-latest
outputs:
version_number: ${{ steps.nightly-release-version.outputs.number }}
release_branch: ${{ steps.release-branch.outputs.name }}
steps:
- name: "Checkout ${{ github.repository }} Branch ${{ env.RELEASE_BRANCH }}"
uses: actions/checkout@v4
with:
ref: ${{ env.RELEASE_BRANCH }}
- name: "Get Current Version Number"
id: version-number-sources
run: |
current_version=`awk -F"current_version = " '{print $2}' .bumpversion.cfg | tr '\n' ' '`
echo "current_version=$current_version" >> $GITHUB_OUTPUT
- name: "Audit Version And Parse Into Parts"
id: semver
uses: dbt-labs/actions/parse-semver@v1.1.0
with:
version: ${{ steps.version-number-sources.outputs.current_version }}
- name: "Get Current Date"
id: current-date
run: echo "date=$(date +'%m%d%Y')" >> $GITHUB_OUTPUT
- name: "Generate Nightly Release Version Number"
id: nightly-release-version
run: |
number="${{ steps.semver.outputs.version }}.dev${{ steps.current-date.outputs.date }}"
echo "number=$number" >> $GITHUB_OUTPUT
- name: "Audit Nightly Release Version And Parse Into Parts"
uses: dbt-labs/actions/parse-semver@v1.1.0
with:
version: ${{ steps.nightly-release-version.outputs.number }}
- name: "Set Release Branch"
id: release-branch
run: |
echo "name=${{ env.RELEASE_BRANCH }}" >> $GITHUB_OUTPUT
log-outputs-aggregate-release-data:
runs-on: ubuntu-latest
needs: [aggregate-release-data]
steps:
- name: "[DEBUG] Log Outputs"
run: |
echo version_number: ${{ needs.aggregate-release-data.outputs.version_number }}
echo release_branch: ${{ needs.aggregate-release-data.outputs.release_branch }}
release-github-pypi:
needs: [aggregate-release-data]
uses: ./.github/workflows/release.yml
with:
target_branch: ${{ needs.aggregate-release-data.outputs.release_branch }}
version_number: ${{ needs.aggregate-release-data.outputs.version_number }}
test_run: true
nightly_release: true
secrets: inherit

View File

@@ -1,7 +1,11 @@
# **what?**
# The purpose of this workflow is to trigger CI to run for each
# release branch and main branch on a regular cadence. If the CI workflow
# fails for a branch, it will post to #dev-core-alerts to raise awareness.
# fails for a branch, it will post to dev-core-alerts to raise awareness.
# The 'aurelien-baudet/workflow-dispatch' Action triggers the existing
# CI worklow file on the given branch to run so that even if we change the
# CI workflow file in the future, the one that is tailored for the given
# release branch will be used.
# **why?**
# Ensures release branches and main are always shippable and not broken.
@@ -24,8 +28,35 @@ on:
permissions: read-all
jobs:
run_tests:
uses: dbt-labs/actions/.github/workflows/release-branch-tests.yml@main
with:
workflows_to_run: '["main.yml"]'
secrets: inherit
kick-off-ci:
name: Kick-off CI
runs-on: ubuntu-latest
strategy:
# must run CI 1 branch at a time b/c the workflow-dispatch Action polls for
# latest run for results and it gets confused when we kick off multiple runs
# at once. There is a race condition so we will just run in sequential order.
max-parallel: 1
fail-fast: false
matrix:
branch: [1.0.latest, 1.1.latest, 1.2.latest, 1.3.latest, main]
steps:
- name: Call CI workflow for ${{ matrix.branch }} branch
id: trigger-step
uses: aurelien-baudet/workflow-dispatch@v2.1.1
with:
workflow: main.yml
ref: ${{ matrix.branch }}
token: ${{ secrets.FISHTOWN_BOT_PAT }}
- name: Post failure to Slack
uses: ravsamhq/notify-slack-action@v1
if: ${{ always() && !contains(steps.trigger-step.outputs.workflow-conclusion,'success') }}
with:
status: ${{ job.status }}
notification_title: 'dbt-core scheduled run of "${{ matrix.branch }}" branch not successful'
message_format: ':x: CI on branch "${{ matrix.branch }}" ${{ steps.trigger-step.outputs.workflow-conclusion }}'
footer: 'Linked failed CI run ${{ steps.trigger-step.outputs.workflow-url }}'
env:
SLACK_WEBHOOK_URL: ${{ secrets.SLACK_DEV_CORE_ALERTS }}

116
.github/workflows/release-docker.yml vendored Normal file
View File

@@ -0,0 +1,116 @@
# **what?**
# This workflow will generate a series of docker images for dbt and push them to the github container registry
# **why?**
# Docker images for dbt are used in a number of important places throughout the dbt ecosystem. This is how we keep those images up-to-date.
# **when?**
# This is triggered manually
# **next steps**
# - build this into the release workflow (or conversely, break out the different release methods into their own workflow files)
name: Docker release
permissions:
packages: write
on:
workflow_dispatch:
inputs:
package:
description: The package to release. _One_ of [dbt-core, dbt-redshift, dbt-bigquery, dbt-snowflake, dbt-spark, dbt-postgres]
required: true
version_number:
description: The release version number (i.e. 1.0.0b1). Do not include `latest` tags or a leading `v`!
required: true
jobs:
get_version_meta:
name: Get version meta
runs-on: ubuntu-latest
outputs:
major: ${{ steps.version.outputs.major }}
minor: ${{ steps.version.outputs.minor }}
patch: ${{ steps.version.outputs.patch }}
latest: ${{ steps.latest.outputs.latest }}
minor_latest: ${{ steps.latest.outputs.minor_latest }}
steps:
- uses: actions/checkout@v1
- name: Split version
id: version
run: |
IFS="." read -r MAJOR MINOR PATCH <<< ${{ github.event.inputs.version_number }}
echo "::set-output name=major::$MAJOR"
echo "::set-output name=minor::$MINOR"
echo "::set-output name=patch::$PATCH"
- name: Is pkg 'latest'
id: latest
uses: ./.github/actions/latest-wrangler
with:
package: ${{ github.event.inputs.package }}
new_version: ${{ github.event.inputs.version_number }}
gh_token: ${{ secrets.GITHUB_TOKEN }}
halt_on_missing: False
setup_image_builder:
name: Set up docker image builder
runs-on: ubuntu-latest
needs: [get_version_meta]
steps:
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
build_and_push:
name: Build images and push to GHCR
runs-on: ubuntu-latest
needs: [setup_image_builder, get_version_meta]
steps:
- name: Get docker build arg
id: build_arg
run: |
echo "::set-output name=build_arg_name::"$(echo ${{ github.event.inputs.package }} | sed 's/\-/_/g')
echo "::set-output name=build_arg_value::"$(echo ${{ github.event.inputs.package }} | sed 's/postgres/core/g')
- name: Log in to the GHCR
uses: docker/login-action@v1
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build and push MAJOR.MINOR.PATCH tag
uses: docker/build-push-action@v2
with:
file: docker/Dockerfile
push: True
target: ${{ github.event.inputs.package }}
build-args: |
${{ steps.build_arg.outputs.build_arg_name }}_ref=${{ steps.build_arg.outputs.build_arg_value }}@v${{ github.event.inputs.version_number }}
tags: |
ghcr.io/dbt-labs/${{ github.event.inputs.package }}:${{ github.event.inputs.version_number }}
- name: Build and push MINOR.latest tag
uses: docker/build-push-action@v2
if: ${{ needs.get_version_meta.outputs.minor_latest == 'True' }}
with:
file: docker/Dockerfile
push: True
target: ${{ github.event.inputs.package }}
build-args: |
${{ steps.build_arg.outputs.build_arg_name }}_ref=${{ steps.build_arg.outputs.build_arg_value }}@v${{ github.event.inputs.version_number }}
tags: |
ghcr.io/dbt-labs/${{ github.event.inputs.package }}:${{ needs.get_version_meta.outputs.major }}.${{ needs.get_version_meta.outputs.minor }}.latest
- name: Build and push latest tag
uses: docker/build-push-action@v2
if: ${{ needs.get_version_meta.outputs.latest == 'True' }}
with:
file: docker/Dockerfile
push: True
target: ${{ github.event.inputs.package }}
build-args: |
${{ steps.build_arg.outputs.build_arg_name }}_ref=${{ steps.build_arg.outputs.build_arg_value }}@v${{ github.event.inputs.version_number }}
tags: |
ghcr.io/dbt-labs/${{ github.event.inputs.package }}:latest

View File

@@ -1,69 +1,24 @@
# **what?**
# Release workflow provides the following steps:
# - checkout the given commit;
# - validate version in sources and changelog file for given version;
# - bump the version and generate a changelog if needed;
# - merge all changes to the target branch if needed;
# - run unit and integration tests against given commit;
# - build and package that SHA;
# - release it to GitHub and PyPI with that specific build;
# - release it to Docker
#
# Take the given commit, run unit tests specifically on that sha, build and
# package it, and then release to GitHub and PyPi with that specific build
# **why?**
# Ensure an automated and tested release process
#
# **when?**
# This workflow can be run manually on demand or can be called by other workflows
name: "Release to GitHub, PyPI & Docker"
run-name: "Release ${{ inputs.version_number }} to GitHub, PyPI & Docker"
# **when?**
# This will only run manually with a given sha and version
name: Release to GitHub and PyPi
on:
workflow_dispatch:
inputs:
target_branch:
description: "The branch to release from"
type: string
required: true
sha:
description: 'The last commit sha in the release'
required: true
version_number:
description: "The release version number (i.e. 1.0.0b1)"
type: string
required: true
test_run:
description: "Test run (Publish release as draft)"
type: boolean
default: true
required: false
nightly_release:
description: "Nightly release to dev environment"
type: boolean
default: false
required: false
only_docker:
description: "Only release Docker image, skip GitHub & PyPI"
type: boolean
default: false
required: false
workflow_call:
inputs:
target_branch:
description: "The branch to release from"
type: string
required: true
version_number:
description: "The release version number (i.e. 1.0.0b1)"
type: string
required: true
test_run:
description: "Test run (Publish release as draft)"
type: boolean
default: true
required: false
nightly_release:
description: "Nightly release to dev environment"
type: boolean
default: false
required: false
description: 'The release version number (i.e. 1.0.0b1)'
required: true
permissions:
contents: write # this is the permission that allows creating a new release
@@ -73,198 +28,175 @@ defaults:
shell: bash
jobs:
job-setup:
name: Log Inputs
runs-on: ubuntu-latest
outputs:
starting_sha: ${{ steps.set_sha.outputs.starting_sha }}
steps:
- name: "[DEBUG] Print Variables"
run: |
echo Inputs
echo The branch to release from: ${{ inputs.target_branch }}
echo The release version number: ${{ inputs.version_number }}
echo Test run: ${{ inputs.test_run }}
echo Nightly release: ${{ inputs.nightly_release }}
echo Only Docker: ${{ inputs.only_docker }}
unit:
name: Unit test
- name: "Checkout target branch"
uses: actions/checkout@v4
runs-on: ubuntu-latest
env:
TOXENV: "unit"
steps:
- name: Check out the repository
uses: actions/checkout@v2
with:
ref: ${{ inputs.target_branch }}
persist-credentials: false
ref: ${{ github.event.inputs.sha }}
# release-prep.yml really shouldn't take in the sha, but since core and all adapters
# depend on it now, this workaround lets us avoid entering it manually and risking error.
# The changes always get merged into the head, so we can't use a specific commit for
# releases anyway.
- name: "Capture sha"
id: set_sha
- name: Set up Python
uses: actions/setup-python@v2
with:
python-version: 3.8
- name: Install python dependencies
run: |
echo "starting_sha=$(git rev-parse HEAD)" >> $GITHUB_OUTPUT
pip install --user --upgrade pip
pip install tox
pip --version
tox --version
bump-version-generate-changelog:
name: Bump package version, Generate changelog
needs: [job-setup]
if: ${{ !inputs.only_docker }}
- name: Run tox
run: tox
uses: dbt-labs/dbt-release/.github/workflows/release-prep.yml@main
with:
sha: ${{ needs.job-setup.outputs.starting_sha }}
version_number: ${{ inputs.version_number }}
target_branch: ${{ inputs.target_branch }}
env_setup_script_path: "scripts/env-setup.sh"
test_run: ${{ inputs.test_run }}
nightly_release: ${{ inputs.nightly_release }}
secrets: inherit
log-outputs-bump-version-generate-changelog:
name: "[Log output] Bump package version, Generate changelog"
if: ${{ !failure() && !cancelled() && !inputs.only_docker }}
needs: [bump-version-generate-changelog]
build:
name: build packages
runs-on: ubuntu-latest
steps:
- name: Print variables
- name: Check out the repository
uses: actions/checkout@v2
with:
persist-credentials: false
ref: ${{ github.event.inputs.sha }}
- name: Set up Python
uses: actions/setup-python@v2
with:
python-version: 3.8
- name: Install python dependencies
run: |
echo Final SHA : ${{ needs.bump-version-generate-changelog.outputs.final_sha }}
echo Changelog path: ${{ needs.bump-version-generate-changelog.outputs.changelog_path }}
pip install --user --upgrade pip
pip install --upgrade setuptools wheel twine check-wheel-contents
pip --version
build-test-package:
name: Build, Test, Package
if: ${{ !failure() && !cancelled() && !inputs.only_docker }}
needs: [job-setup, bump-version-generate-changelog]
- name: Build distributions
run: ./scripts/build-dist.sh
uses: dbt-labs/dbt-release/.github/workflows/build.yml@main
- name: Show distributions
run: ls -lh dist/
with:
sha: ${{ needs.bump-version-generate-changelog.outputs.final_sha }}
version_number: ${{ inputs.version_number }}
changelog_path: ${{ needs.bump-version-generate-changelog.outputs.changelog_path }}
build_script_path: "scripts/build-dist.sh"
s3_bucket_name: "core-team-artifacts"
package_test_command: "dbt --version"
test_run: ${{ inputs.test_run }}
nightly_release: ${{ inputs.nightly_release }}
- name: Check distribution descriptions
run: |
twine check dist/*
secrets:
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
- name: Check wheel contents
run: |
check-wheel-contents dist/*.whl --ignore W007,W008
- uses: actions/upload-artifact@v2
with:
name: dist
path: |
dist/
!dist/dbt-${{github.event.inputs.version_number}}.tar.gz
test-build:
name: verify packages
needs: [build, unit]
runs-on: ubuntu-latest
steps:
- name: Set up Python
uses: actions/setup-python@v2
with:
python-version: 3.8
- name: Install python dependencies
run: |
pip install --user --upgrade pip
pip install --upgrade wheel
pip --version
- uses: actions/download-artifact@v2
with:
name: dist
path: dist/
- name: Show distributions
run: ls -lh dist/
- name: Install wheel distributions
run: |
find ./dist/*.whl -maxdepth 1 -type f | xargs pip install --force-reinstall --find-links=dist/
- name: Check wheel distributions
run: |
dbt --version
- name: Install source distributions
run: |
find ./dist/*.gz -maxdepth 1 -type f | xargs pip install --force-reinstall --find-links=dist/
- name: Check source distributions
run: |
dbt --version
github-release:
name: GitHub Release
if: ${{ !failure() && !cancelled() && !inputs.only_docker }}
needs: [bump-version-generate-changelog, build-test-package]
needs: test-build
uses: dbt-labs/dbt-release/.github/workflows/github-release.yml@main
runs-on: ubuntu-latest
with:
sha: ${{ needs.bump-version-generate-changelog.outputs.final_sha }}
version_number: ${{ inputs.version_number }}
changelog_path: ${{ needs.bump-version-generate-changelog.outputs.changelog_path }}
test_run: ${{ inputs.test_run }}
steps:
- uses: actions/download-artifact@v2
with:
name: dist
path: '.'
# Need to set an output variable because env variables can't be taken as input
# This is needed for the next step with releasing to GitHub
- name: Find release type
id: release_type
env:
IS_PRERELEASE: ${{ contains(github.event.inputs.version_number, 'rc') || contains(github.event.inputs.version_number, 'b') }}
run: |
echo ::set-output name=isPrerelease::$IS_PRERELEASE
- name: Creating GitHub Release
uses: softprops/action-gh-release@v1
with:
name: dbt-core v${{github.event.inputs.version_number}}
tag_name: v${{github.event.inputs.version_number}}
prerelease: ${{ steps.release_type.outputs.isPrerelease }}
target_commitish: ${{github.event.inputs.sha}}
body: |
[Release notes](https://github.com/dbt-labs/dbt-core/blob/main/CHANGELOG.md)
files: |
dbt_postgres-${{github.event.inputs.version_number}}-py3-none-any.whl
dbt_core-${{github.event.inputs.version_number}}-py3-none-any.whl
dbt-postgres-${{github.event.inputs.version_number}}.tar.gz
dbt-core-${{github.event.inputs.version_number}}.tar.gz
pypi-release:
name: PyPI Release
name: Pypi release
needs: [github-release]
uses: dbt-labs/dbt-release/.github/workflows/pypi-release.yml@main
with:
version_number: ${{ inputs.version_number }}
test_run: ${{ inputs.test_run }}
secrets:
PYPI_API_TOKEN: ${{ secrets.PYPI_API_TOKEN }}
TEST_PYPI_API_TOKEN: ${{ secrets.TEST_PYPI_API_TOKEN }}
determine-docker-package:
# dbt-postgres exists within dbt-core for versions 1.7 and earlier but is a separate package for 1.8 and later.
# determine if we need to release dbt-core or both dbt-core and dbt-postgres
name: Determine Docker Package
if: ${{ !failure() && !cancelled() }}
runs-on: ubuntu-latest
needs: [pypi-release]
outputs:
matrix: ${{ steps.determine-docker-package.outputs.matrix }}
needs: github-release
environment: PypiProd
steps:
- name: "Audit Version And Parse Into Parts"
id: semver
uses: dbt-labs/actions/parse-semver@v1.1.0
- uses: actions/download-artifact@v2
with:
version: ${{ inputs.version_number }}
name: dist
path: 'dist'
- name: "Determine Packages to Release"
id: determine-docker-package
run: |
if [ ${{ steps.semver.outputs.minor }} -ge 8 ]; then
json_output={\"package\":[\"dbt-core\"]}
else
json_output={\"package\":[\"dbt-core\",\"dbt-postgres\"]}
fi
echo "matrix=$json_output" >> $GITHUB_OUTPUT
docker-release:
name: "Docker Release for ${{ matrix.package }}"
needs: [determine-docker-package]
# We cannot release to docker on a test run because it uses the tag in GitHub as
# what we need to release but draft releases don't actually tag the commit so it
# finds nothing to release
if: ${{ !failure() && !cancelled() && (!inputs.test_run || inputs.only_docker) }}
strategy:
matrix: ${{fromJson(needs.determine-docker-package.outputs.matrix)}}
permissions:
packages: write
uses: dbt-labs/dbt-release/.github/workflows/release-docker.yml@main
with:
package: ${{ matrix.package }}
version_number: ${{ inputs.version_number }}
test_run: ${{ inputs.test_run }}
slack-notification:
name: Slack Notification
if: ${{ failure() && (!inputs.test_run || inputs.nightly_release) }}
needs:
[
bump-version-generate-changelog,
build-test-package,
github-release,
pypi-release,
docker-release,
]
uses: dbt-labs/dbt-release/.github/workflows/slack-post-notification.yml@main
with:
status: "failure"
secrets:
SLACK_WEBHOOK_URL: ${{ secrets.SLACK_DEV_CORE_ALERTS }}
testing-slack-notification:
# sends notifications to #slackbot-test
name: Testing - Slack Notification
if: ${{ failure() && inputs.test_run && !inputs.nightly_release }}
needs:
[
bump-version-generate-changelog,
build-test-package,
github-release,
pypi-release,
docker-release,
]
uses: dbt-labs/dbt-release/.github/workflows/slack-post-notification.yml@main
with:
status: "failure"
secrets:
SLACK_WEBHOOK_URL: ${{ secrets.SLACK_TESTING_WEBHOOK_URL }}
- name: Publish distribution to PyPI
uses: pypa/gh-action-pypi-publish@v1.4.2
with:
password: ${{ secrets.PYPI_API_TOKEN }}


@@ -1,30 +0,0 @@
# **what?**
# Clean up branches left over from automation and testing. Also clean up
# draft releases from release testing.
# **why?**
# The automations are leaving behind branches and releases that clutter
# the repository. Sometimes we need them to debug processes so we don't
# want them immediately deleted. Running on Saturday to avoid running
# at the same time as an actual release to prevent breaking a release
# mid-release.
# **when?**
# Mainly on a schedule of 12:00 Saturday.
# Manual trigger can also run on demand
name: Repository Cleanup
on:
schedule:
- cron: '0 12 * * SAT' # At 12:00 on Saturday - details in `why` above
workflow_dispatch: # for manual triggering
permissions:
contents: write
jobs:
cleanup-repo:
uses: dbt-labs/actions/.github/workflows/repository-cleanup.yml@main
secrets: inherit


@@ -13,63 +13,48 @@
name: Artifact Schema Check
on:
pull_request:
types: [ opened, reopened, labeled, unlabeled, synchronize ]
paths-ignore: [ '.changes/**', '.github/**', 'tests/**', '**.md', '**.yml' ]
workflow_dispatch:
pull_request: #TODO: remove before merging
push:
branches:
- "develop"
- "*.latest"
- "releases/*"
# no special access is needed
permissions: read-all
env:
LATEST_SCHEMA_PATH: ${{ github.workspace }}/new_schemas
SCHEMA_DIFF_ARTIFACT: ${{ github.workspace }}/schema_changes.txt
SCHEMA_DIFF_ARTIFACT: ${{ github.workspace }}//schema_schanges.txt
DBT_REPO_DIRECTORY: ${{ github.workspace }}/dbt
SCHEMA_REPO_DIRECTORY: ${{ github.workspace }}/schemas.getdbt.com
jobs:
checking-schemas:
name: "Post-merge schema changes required"
name: "Checking schemas"
runs-on: ubuntu-latest
steps:
- name: Set up Python
uses: actions/setup-python@v5
uses: actions/setup-python@v2
with:
python-version: 3.9
python-version: 3.8
- name: Checkout dbt repo
uses: actions/checkout@v4
uses: actions/checkout@v2.3.4
with:
path: ${{ env.DBT_REPO_DIRECTORY }}
- name: Check for changes in core/dbt/artifacts
# https://github.com/marketplace/actions/paths-changes-filter
uses: dorny/paths-filter@v3
id: check_artifact_changes
with:
filters: |
artifacts_changed:
- 'core/dbt/artifacts/**'
list-files: shell
working-directory: ${{ env.DBT_REPO_DIRECTORY }}
- name: Succeed if no artifacts have changed
if: steps.check_artifact_changes.outputs.artifacts_changed == 'false'
run: |
echo "No artifact changes found in core/dbt/artifacts. CI check passed."
- name: Checkout schemas.getdbt.com repo
if: steps.check_artifact_changes.outputs.artifacts_changed == 'true'
uses: actions/checkout@v4
uses: actions/checkout@v2.3.4
with:
repository: dbt-labs/schemas.getdbt.com
ref: 'main'
ssh-key: ${{ secrets.SCHEMA_SSH_PRIVATE_KEY }}
path: ${{ env.SCHEMA_REPO_DIRECTORY }}
- name: Generate current schema
if: steps.check_artifact_changes.outputs.artifacts_changed == 'true'
run: |
cd ${{ env.DBT_REPO_DIRECTORY }}
python3 -m venv env
@@ -80,17 +65,26 @@ jobs:
# Copy generated schema files into the schemas.getdbt.com repo
# Do a git diff to find any changes
# Ignore any lines with date-like (yyyy-mm-dd) or version-like (x.y.z) changes
# Ignore any date or version changes though
- name: Compare schemas
if: steps.check_artifact_changes.outputs.artifacts_changed == 'true'
run: |
cp -r ${{ env.LATEST_SCHEMA_PATH }}/dbt ${{ env.SCHEMA_REPO_DIRECTORY }}
cd ${{ env.SCHEMA_REPO_DIRECTORY }}
git diff -I='*[0-9]{4}-[0-9]{2}-[0-9]{2}' -I='*[0-9]+\.[0-9]+\.[0-9]+' --exit-code > ${{ env.SCHEMA_DIFF_ARTIFACT }}
diff_results=$(git diff -I='*[0-9]{4}-(0[1-9]|1[0-2])-(0[1-9]|[1-2][0-9]|3[0-1])T' \
-I='*[0-9]{1}.[0-9]{2}.[0-9]{1}(rc[0-9]|b[0-9]| )' --compact-summary)
if [[ $(echo diff_results) ]]; then
echo $diff_results
echo "Schema changes detected!"
git diff -I='*[0-9]{4}-(0[1-9]|1[0-2])-(0[1-9]|[1-2][0-9]|3[0-1])T' \
-I='*[0-9]{1}.[0-9]{2}.[0-9]{1}(rc[0-9]|b[0-9]| )' > ${{ env.SCHEMA_DIFF_ARTIFACT }}
exit 1
else
echo "No schema changes detected"
fi
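The -I / --ignore-matching-lines flag is what lets the comparison skip date- and version-only churn. A throwaway illustration with a metadata-only change (the repo path, file name, and contents are made up; git 2.30+ is assumed for --ignore-matching-lines):

# Scratch repo with one committed file; then change only a generated_at line
# and show that ignoring matching lines hides the change from --exit-code.
git init /tmp/schema-diff-demo && cd /tmp/schema-diff-demo
echo 'generated_at: 2022-01-01' > schema.yml
git add schema.yml
git -c user.name=demo -c user.email=demo@example.com commit -m 'initial schema'
echo 'generated_at: 2022-12-06' > schema.yml
git diff --exit-code || echo "plain diff flags the change"
git diff --ignore-matching-lines='generated_at' --exit-code && echo "metadata-only change ignored"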
- name: Upload schema diff
uses: actions/upload-artifact@v4
if: ${{ failure() && steps.check_artifact_changes.outputs.artifacts_changed == 'true' }}
uses: actions/upload-artifact@v2.2.4
if: ${{ failure() }}
with:
name: 'schema_changes.txt'
name: 'schema_schanges.txt'
path: '${{ env.SCHEMA_DIFF_ARTIFACT }}'


@@ -18,41 +18,11 @@ on:
permissions: read-all
# top-level adjustments can be made here
env:
# number of parallel processes to spawn for python testing
PYTHON_INTEGRATION_TEST_WORKERS: 5
jobs:
integration-metadata:
name: integration test metadata generation
runs-on: ubuntu-latest
outputs:
split-groups: ${{ steps.generate-split-groups.outputs.split-groups }}
steps:
- name: generate split-groups
id: generate-split-groups
run: |
MATRIX_JSON="["
for B in $(seq 1 ${{ env.PYTHON_INTEGRATION_TEST_WORKERS }}); do
MATRIX_JSON+=$(sed 's/^/"/;s/$/"/' <<< "${B}")
done
MATRIX_JSON="${MATRIX_JSON//\"\"/\", \"}"
MATRIX_JSON+="]"
echo "split-groups=${MATRIX_JSON}" >> $GITHUB_OUTPUT
# run the performance measurements on the current or default branch
test-schema:
name: Test Log Schema
runs-on: ubuntu-latest
timeout-minutes: 30
needs:
- integration-metadata
strategy:
fail-fast: false
matrix:
split-group: ${{ fromJson(needs.integration-metadata.outputs.split-groups) }}
runs-on: ubuntu-20.04
env:
# turns warnings into errors
RUSTFLAGS: "-D warnings"
@@ -60,41 +30,21 @@ jobs:
LOG_DIR: "/home/runner/work/dbt-core/dbt-core/logs"
# tells integration tests to output into json format
DBT_LOG_FORMAT: "json"
# tell eventmgr to convert logging events into bytes
DBT_TEST_BINARY_SERIALIZATION: "true"
# Additional test users
DBT_TEST_USER_1: dbt_test_user_1
DBT_TEST_USER_2: dbt_test_user_2
DBT_TEST_USER_3: dbt_test_user_3
services:
# Label used to access the service container
postgres:
# Docker Hub image
image: postgres
# Provide the password for postgres
env:
POSTGRES_PASSWORD: password
POSTGRES_USER: postgres
# Set health checks to wait until postgres has started
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 5432:5432
steps:
- name: checkout dev
uses: actions/checkout@v4
uses: actions/checkout@v2
with:
persist-credentials: false
- name: Setup Python
uses: actions/setup-python@v5
uses: actions/setup-python@v2.2.2
with:
python-version: "3.9"
python-version: "3.8"
- name: Install python dependencies
run: |
@@ -103,13 +53,8 @@ jobs:
pip install tox
tox --version
- name: Run postgres setup script
run: |
./test/setup_db.sh
env:
PGHOST: localhost
PGPORT: 5432
PGPASSWORD: password
- name: Set up postgres
uses: ./.github/actions/setup-postgres-linux
- name: ls
run: ls
@@ -117,19 +62,4 @@ jobs:
# integration tests generate a ton of logs in different files. the next step will find them all.
# we actually care if these pass, because the normal test run doesn't usually include many json log outputs
- name: Run integration tests
uses: nick-fields/retry@v3
with:
timeout_minutes: 30
max_attempts: 3
command: tox -e integration -- -nauto
env:
PYTEST_ADDOPTS: ${{ format('--splits {0} --group {1}', env.PYTHON_INTEGRATION_TEST_WORKERS, matrix.split-group) }}
test-schema-report:
name: Log Schema Test Suite
runs-on: ubuntu-latest
needs: test-schema
steps:
- name: "[Notification] Log test suite passes"
run: |
echo "::notice title="Log test suite passes""
run: tox -e integration -- -nauto


@@ -1,158 +0,0 @@
# **what?**
# This workflow will run the test(s) at the given input path a specified number of times to determine whether they are flaky. You can test with any supported OS/Python combination.
# This is batched in 10 to allow more test iterations faster.
# **why?**
# Testing if a test is flaky and if a previously flaky test has been fixed. This allows easy testing on supported python versions and OS combinations.
# **when?**
# This is triggered manually from dbt-core.
name: Flaky Tester
on:
workflow_dispatch:
inputs:
branch:
description: 'Branch to check out'
type: string
required: true
default: 'main'
test_path:
description: 'Path to single test to run (ex: tests/functional/retry/test_retry.py::TestRetry::test_fail_fast)'
type: string
required: true
default: 'tests/functional/...'
python_version:
description: 'Version of Python to Test Against'
type: choice
options:
- '3.9'
- '3.10'
- '3.11'
os:
description: 'OS to run test in'
type: choice
options:
- 'ubuntu-latest'
- 'macos-14'
- 'windows-latest'
num_runs_per_batch:
description: 'Max number of times to run the test per batch. We always run 10 batches.'
type: number
required: true
default: '50'
permissions: read-all
defaults:
run:
shell: bash
jobs:
debug:
runs-on: ubuntu-latest
steps:
- name: "[DEBUG] Output Inputs"
run: |
echo "Branch: ${{ inputs.branch }}"
echo "test_path: ${{ inputs.test_path }}"
echo "python_version: ${{ inputs.python_version }}"
echo "os: ${{ inputs.os }}"
echo "num_runs_per_batch: ${{ inputs.num_runs_per_batch }}"
pytest:
runs-on: ${{ inputs.os }}
strategy:
# run all batches, even if one fails. This informs how flaky the test may be.
fail-fast: false
# using a matrix to speed up the jobs since the matrix will run in parallel when runners are available
matrix:
batch: ["1", "2", "3", "4", "5", "6", "7", "8", "9", "10"]
env:
PYTEST_ADDOPTS: "-v --color=yes -n4 --csv integration_results.csv"
DBT_TEST_USER_1: dbt_test_user_1
DBT_TEST_USER_2: dbt_test_user_2
DBT_TEST_USER_3: dbt_test_user_3
DD_CIVISIBILITY_AGENTLESS_ENABLED: true
DD_API_KEY: ${{ secrets.DATADOG_API_KEY }}
DD_SITE: datadoghq.com
DD_ENV: ci
DD_SERVICE: ${{ github.event.repository.name }}
steps:
- name: "Checkout code"
uses: actions/checkout@v4
with:
ref: ${{ inputs.branch }}
- name: "Setup Python"
uses: actions/setup-python@v5
with:
python-version: "${{ inputs.python_version }}"
- name: "Setup Dev Environment"
run: make dev
- name: "Set up postgres (linux)"
if: inputs.os == 'ubuntu-latest'
run: make setup-db
# mac and windows don't use make due to limitations with docker with those runners in GitHub
- name: Set up postgres (macos)
if: runner.os == 'macOS'
uses: nick-fields/retry@v3
with:
timeout_minutes: 10
max_attempts: 3
command: ./test/setup_db.sh
- name: "Set up postgres (windows)"
if: inputs.os == 'windows-latest'
uses: ./.github/actions/setup-postgres-windows
- name: "Test Command"
id: command
run: |
test_command="python -m pytest ${{ inputs.test_path }}"
echo "test_command=$test_command" >> $GITHUB_OUTPUT
- name: "Run test ${{ inputs.num_runs_per_batch }} times"
id: pytest
run: |
set +e
for ((i=1; i<=${{ inputs.num_runs_per_batch }}; i++))
do
echo "Running pytest iteration $i..."
python -m pytest --ddtrace ${{ inputs.test_path }}
exit_code=$?
if [[ $exit_code -eq 0 ]]; then
success=$((success + 1))
echo "Iteration $i: Success"
else
failure=$((failure + 1))
echo "Iteration $i: Failure"
fi
echo
echo "==========================="
echo "Successful runs: $success"
echo "Failed runs: $failure"
echo "==========================="
echo
done
echo "failure=$failure" >> $GITHUB_OUTPUT
- name: "Success and Failure Summary: ${{ inputs.os }}/Python ${{ inputs.python_version }}"
run: |
echo "Batch: ${{ matrix.batch }}"
echo "Successful runs: ${{ steps.pytest.outputs.success }}"
echo "Failed runs: ${{ steps.pytest.outputs.failure }}"
- name: "Error for Failures"
if: ${{ steps.pytest.outputs.failure }}
run: |
echo "Batch ${{ matrix.batch }} failed ${{ steps.pytest.outputs.failure }} of ${{ inputs.num_runs_per_batch }} tests"
exit 1
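The batching loop in this (now deleted) workflow is a plain run-N-times-and-count pattern. A stand-alone sketch with a stand-in command instead of pytest (run_test and the iteration count are placeholders):

# run_test stands in for `python -m pytest <test_path>`; it passes or fails
# pseudo-randomly so both counters get exercised.
run_test() { (( RANDOM % 2 )); }
success=0; failure=0
for ((i = 1; i <= 10; i++)); do
  if run_test; then
    success=$((success + 1)); echo "Iteration $i: Success"
  else
    failure=$((failure + 1)); echo "Iteration $i: Failure"
  fi
done
echo "Successful runs: $success"
echo "Failed runs: $failure"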

.github/workflows/test/.actrc

@@ -0,0 +1 @@
-P ubuntu-latest=ghcr.io/catthehacker/ubuntu:act-latest

.github/workflows/test/.gitignore

@@ -0,0 +1 @@
.secrets


@@ -0,0 +1 @@
GITHUB_TOKEN=GH_PERSONAL_ACCESS_TOKEN_GOES_HERE


@@ -0,0 +1,6 @@
{
"inputs": {
"version_number": "1.0.1",
"package": "dbt-postgres"
}
}


@@ -24,8 +24,10 @@ permissions:
jobs:
triage_label:
if: contains(github.event.issue.labels.*.name, 'awaiting_response')
uses: dbt-labs/actions/.github/workflows/swap-labels.yml@main
with:
add_label: "triage"
remove_label: "awaiting_response"
secrets: inherit
runs-on: ubuntu-latest
steps:
- name: initial labeling
uses: andymckay/labeler@master
with:
add-labels: "triage"
remove-labels: "awaiting_response"

.github/workflows/version-bump.yml

@@ -0,0 +1,125 @@
# **what?**
# This workflow will take the new version number to bump to. With that
# it will run bumpversion to update the version number everywhere in the
# code base and then run changie to create the corresponding changelog.
# A PR will be created with the changes that can be reviewed before committing.
# **why?**
# This is to aid in releasing dbt and making sure we have updated
# the version in all places and generated the changelog.
# **when?**
# This is triggered manually
name: Version Bump
on:
workflow_dispatch:
inputs:
version_number:
description: 'The version number to bump to (ex. 1.2.0, 1.3.0b1)'
required: true
permissions:
contents: write
pull-requests: write
jobs:
bump:
runs-on: ubuntu-latest
steps:
- name: "[DEBUG] Print Variables"
run: |
echo "all variables defined as inputs"
echo The version_number: ${{ github.event.inputs.version_number }}
- name: Check out the repository
uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: "3.8"
- name: Install python dependencies
run: |
python3 -m venv env
source env/bin/activate
pip install --upgrade pip
- name: Add Homebrew to PATH
run: |
echo "/home/linuxbrew/.linuxbrew/bin:/home/linuxbrew/.linuxbrew/sbin" >> $GITHUB_PATH
- name: Install Homebrew packages
run: |
brew install pre-commit
brew tap miniscruff/changie https://github.com/miniscruff/changie
brew install changie
- name: Audit Version and Parse Into Parts
id: semver
uses: dbt-labs/actions/parse-semver@v1
with:
version: ${{ github.event.inputs.version_number }}
- name: Set branch value
id: variables
run: |
echo "::set-output name=BRANCH_NAME::prep-release/${{ github.event.inputs.version_number }}_$GITHUB_RUN_ID"
- name: Create PR branch
run: |
git checkout -b ${{ steps.variables.outputs.BRANCH_NAME }}
git push origin ${{ steps.variables.outputs.BRANCH_NAME }}
git branch --set-upstream-to=origin/${{ steps.variables.outputs.BRANCH_NAME }} ${{ steps.variables.outputs.BRANCH_NAME }}
- name: Bump version
run: |
source env/bin/activate
pip install -r dev-requirements.txt
env/bin/bumpversion --allow-dirty --new-version ${{ github.event.inputs.version_number }} major
git status
- name: Run changie
run: |
if [[ ${{ steps.semver.outputs.is-pre-release }} -eq 1 ]]
then
changie batch ${{ steps.semver.outputs.base-version }} --move-dir '${{ steps.semver.outputs.base-version }}' --prerelease '${{ steps.semver.outputs.pre-release }}'
else
changie batch ${{ steps.semver.outputs.base-version }} --include '${{ steps.semver.outputs.base-version }}' --remove-prereleases
fi
changie merge
git status
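A compact sketch of the branch above, runnable locally with changie installed (it assumes parse-semver exposes is-pre-release as 1/0 and base-version/pre-release as plain strings, as the workflow implies; the values below are hypothetical):

# Hypothetical inputs standing in for the parse-semver step outputs.
is_pre_release=1
base_version="1.4.0"
pre_release="b1"
if [[ $is_pre_release -eq 1 ]]; then
  changie batch "$base_version" --move-dir "$base_version" --prerelease "$pre_release"
else
  changie batch "$base_version" --include "$base_version" --remove-prereleases
fi
changie merge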
# this step will fail on whitespace errors but also correct them
- name: Remove trailing whitespace
continue-on-error: true
run: |
pre-commit run trailing-whitespace --files .bumpversion.cfg CHANGELOG.md .changes/*
git status
# this step will fail on newline errors but also correct them
- name: Removing extra newlines
continue-on-error: true
run: |
pre-commit run end-of-file-fixer --files .bumpversion.cfg CHANGELOG.md .changes/*
git status
- name: Commit version bump to branch
uses: EndBug/add-and-commit@v7
with:
author_name: 'Github Build Bot'
author_email: 'buildbot@fishtownanalytics.com'
message: 'Bumping version to ${{ github.event.inputs.version_number }} and generate CHANGELOG'
branch: '${{ steps.variables.outputs.BRANCH_NAME }}'
push: 'origin origin/${{ steps.variables.outputs.BRANCH_NAME }}'
- name: Create Pull Request
uses: peter-evans/create-pull-request@v3
with:
author: 'Github Build Bot <buildbot@fishtownanalytics.com>'
base: ${{github.ref}}
title: 'Bumping version to ${{ github.event.inputs.version_number }} and generate changelog'
branch: '${{ steps.variables.outputs.BRANCH_NAME }}'
labels: |
Skip Changelog

.gitignore

@@ -11,7 +11,6 @@ __pycache__/
env*/
dbt_env/
build/
!tests/functional/build
!core/dbt/docs/build
develop-eggs/
dist/
@@ -29,8 +28,6 @@ var/
.mypy_cache/
.dmypy.json
logs/
.user.yml
profiles.yml
# PyInstaller
# Usually these files are written by a python script from a template
@@ -54,12 +51,8 @@ coverage.xml
*,cover
.hypothesis/
test.env
makefile.test.env
*.pytest_cache/
# Unit test artifacts
index.html
# Translations
*.mo
@@ -108,6 +101,3 @@ venv/
# poetry
poetry.lock
# asdf
.tool-versions


@@ -1,4 +0,0 @@
[settings]
profile=black
extend_skip_glob=.github/*,third-party-stubs/*,scripts/*
known_first_party=dbt,dbt_adapters,dbt_common,dbt_extractor,dbt_semantic_interfaces


@@ -1,9 +1,10 @@
# Configuration for pre-commit hooks (see https://pre-commit.com/).
# Eventually the hooks described here will be run as tests before merging each PR.
exclude: ^(core/dbt/docs/build/|core/dbt/common/events/types_pb2.py|core/dbt/events/core_types_pb2.py|core/dbt/adapters/events/adapter_types_pb2.py)
# TODO: remove global exclusion of tests when testing overhaul is complete
exclude: ^(test/|core/dbt/docs/build/)
# Force all unspecified python hooks to run python 3.9
# Force all unspecified python hooks to run python 3.8
default_language_version:
python: python3
@@ -15,19 +16,12 @@ repos:
args: [--unsafe]
- id: check-json
- id: end-of-file-fixer
exclude: schemas/dbt/manifest/
- id: trailing-whitespace
exclude_types:
- "markdown"
- id: check-case-conflict
- repo: https://github.com/pycqa/isort
# rev must match what's in dev-requirements.txt
rev: 5.13.2
hooks:
- id: isort
- repo: https://github.com/psf/black
# rev must match what's in dev-requirements.txt
rev: 24.3.0
rev: 22.3.0
hooks:
- id: black
- id: black
@@ -37,7 +31,6 @@ repos:
- "--check"
- "--diff"
- repo: https://github.com/pycqa/flake8
# rev must match what's in dev-requirements.txt
rev: 4.0.1
hooks:
- id: flake8
@@ -45,8 +38,7 @@ repos:
alias: flake8-check
stages: [manual]
- repo: https://github.com/pre-commit/mirrors-mypy
# rev must match what's in dev-requirements.txt
rev: v1.4.1
rev: v0.942
hooks:
- id: mypy
# N.B.: Mypy is... a bit fragile.


@@ -26,13 +26,12 @@ Legacy tests are found in the 'test' directory:
The "tasks" map to top-level dbt commands. So `dbt run` => task.run.RunTask, etc. Some are more like abstract base classes (GraphRunnableTask, for example) but all the concrete types outside of task should map to tasks. Currently one executes at a time. The tasks kick off their “Runners” and those do execute in parallel. The parallelism is managed via a thread pool, in GraphRunnableTask.
core/dbt/task/docs/index.html
core/dbt/include/index.html
This is the docs website code. It comes from the dbt-docs repository, and is generated when a release is packaged.
## Adapters
dbt uses an adapter-plugin pattern to extend support to different databases, warehouses, query engines, etc.
Note: dbt-postgres used to exist in dbt-core but is now in [its own repo](https://github.com/dbt-labs/dbt-postgres)
dbt uses an adapter-plugin pattern to extend support to different databases, warehouses, query engines, etc. For testing and development purposes, the dbt-postgres plugin lives alongside the dbt-core codebase, in the [`plugins`](plugins) subdirectory. Like other adapter plugins, it is a self-contained codebase and package that builds on top of dbt-core.
Each adapter is a mix of python, Jinja2, and SQL. The adapter code also makes heavy use of Jinja2 to wrap modular chunks of SQL functionality, define default implementations, and allow plugins to override it.


@@ -1,269 +1,16 @@
# dbt Core Changelog
- This file provides a full account of all changes to `dbt-core`
- This file provides a full account of all changes to `dbt-core` and `dbt-postgres`
- Changes are listed under the (pre)release in which they first appear. Subsequent releases include changes from previous releases.
- "Breaking changes" listed under a version may require action from end users or external maintainers when upgrading to that version.
- Do not edit this file directly. This file is auto-generated using [changie](https://github.com/miniscruff/changie). For details on how to document a change, see [the contributing guide](https://github.com/dbt-labs/dbt-core/blob/main/CONTRIBUTING.md#adding-changelog-entry)
## dbt-core 1.9.6 - May 30, 2025
### Features
- Support config on columns ([#11651](https://github.com/dbt-labs/dbt-core/issues/11651))
### Fixes
- Fix source freshness set via config to handle explicit nulls ([#11685](https://github.com/dbt-labs/dbt-core/issues/11685))
## dbt-core 1.9.5 - May 28, 2025
### Fixes
- Add freshness config to sources ([#11506](https://github.com/dbt-labs/dbt-core/issues/11506))
## dbt-core 1.9.4 - April 02, 2025
### Fixes
- dbt retry does not respect --threads ([#10584](https://github.com/dbt-labs/dbt-core/issues/10584))
### Contributors
- [@donjin-master](https://github.com/donjin-master) ([#10584](https://github.com/dbt-labs/dbt-core/issues/10584))
## dbt-core 1.9.3 - March 07, 2025
### Fixes
- Fix microbatch execution to not block main thread nor hang ([#11243](https://github.com/dbt-labs/dbt-core/issues/11243), [#11306](https://github.com/dbt-labs/dbt-core/issues/11306))
## dbt-core 1.9.2 - January 29, 2025
### Fixes
- Error writing generic test at run time ([#11110](https://github.com/dbt-labs/dbt-core/issues/11110))
- Run check_modified_contract for state:modified ([#11034](https://github.com/dbt-labs/dbt-core/issues/11034))
- Fix unrendered_config for tests from dbt_project.yml ([#11146](https://github.com/dbt-labs/dbt-core/issues/11146))
- Ensure warning about microbatch lacking filter inputs is always fired ([#11159](https://github.com/dbt-labs/dbt-core/issues/11159))
- Fix microbatch dbt list --output json ([#10556](https://github.com/dbt-labs/dbt-core/issues/10556), [#11098](https://github.com/dbt-labs/dbt-core/issues/11098))
### Contributors
- [@internetcoffeephone](https://github.com/internetcoffeephone) ([#10556](https://github.com/dbt-labs/dbt-core/issues/10556), [#11098](https://github.com/dbt-labs/dbt-core/issues/11098))
## dbt-core 1.9.1 - December 16, 2024
### Fixes
- update adapter version messages ([#10230](https://github.com/dbt-labs/dbt-core/issues/10230))
- Fix debug log messages for microbatch batch execution information ([#11111](https://github.com/dbt-labs/dbt-core/issues/11111))
- Fix running of extra "last" batch when there is only one batch ([#11112](https://github.com/dbt-labs/dbt-core/issues/11112))
- Fix interpretation of `PartialSuccess` to result in non-zero exit code ([#11114](https://github.com/dbt-labs/dbt-core/issues/11114))
- Warn about invalid usages of `concurrent_batches` config ([#11122](https://github.com/dbt-labs/dbt-core/issues/11122))
### Contributors
- [@dave-connors-3](https://github.com/dave-connors-3) ([#10230](https://github.com/dbt-labs/dbt-core/issues/10230))
## dbt-core 1.9.0 - December 09, 2024
### Breaking Changes
- Fix changing the current working directory when using dpt deps, clean and init. ([#8997](https://github.com/dbt-labs/dbt-core/issues/8997))
### Features
- Parseable JSON and text output in quiet mode for `dbt show` and `dbt compile` ([#9840](https://github.com/dbt-labs/dbt-core/issues/9840))
- serialize inferred primary key ([#9824](https://github.com/dbt-labs/dbt-core/issues/9824))
- Add unit_test: selection method ([#10053](https://github.com/dbt-labs/dbt-core/issues/10053))
- Maximally parallelize dbt clone in clone command" ([#7914](https://github.com/dbt-labs/dbt-core/issues/7914))
- Add --host flag to dbt docs serve, defaulting to '127.0.0.1' ([#10229](https://github.com/dbt-labs/dbt-core/issues/10229))
- Update data_test to accept arbitrary config options ([#10197](https://github.com/dbt-labs/dbt-core/issues/10197))
- add pre_model and post_model hook calls to data and unit tests to be able to provide extra config options ([#10198](https://github.com/dbt-labs/dbt-core/issues/10198))
- add --empty value to jinja context as flags.EMPTY ([#10317](https://github.com/dbt-labs/dbt-core/issues/10317))
- Warning message for snapshot timestamp data types ([#10234](https://github.com/dbt-labs/dbt-core/issues/10234))
- Support cumulative_type_params & sub-daily granularities in semantic manifest. ([#10360](https://github.com/dbt-labs/dbt-core/issues/10360))
- Add time_granularity to metric spec. ([#10376](https://github.com/dbt-labs/dbt-core/issues/10376))
- Support standard schema/database fields for snapshots ([#10301](https://github.com/dbt-labs/dbt-core/issues/10301))
- Support ref and source in foreign key constraint expressions, bump dbt-common minimum to 1.6 ([#8062](https://github.com/dbt-labs/dbt-core/issues/8062))
- Support new semantic layer time spine configs to enable sub-daily granularity. ([#10475](https://github.com/dbt-labs/dbt-core/issues/10475))
- Add `order_by` and `limit` fields to saved queries. ([#10531](https://github.com/dbt-labs/dbt-core/issues/10531))
- Add support for behavior flags ([#10618](https://github.com/dbt-labs/dbt-core/issues/10618))
- Enable `--resource-type` and `--exclude-resource-type` CLI flags and environment variables for `dbt test` ([#10656](https://github.com/dbt-labs/dbt-core/issues/10656))
- Allow configuring snapshot column names ([#10185](https://github.com/dbt-labs/dbt-core/issues/10185))
- Add custom_granularities to YAML spec for time spines. ([#9265](https://github.com/dbt-labs/dbt-core/issues/9265))
- Add basic functionality for creating microbatch incremental models ([#9490](https://github.com/dbt-labs/dbt-core/issues/9490), [#10635](https://github.com/dbt-labs/dbt-core/issues/10635), [#10637](https://github.com/dbt-labs/dbt-core/issues/10637), [#10638](https://github.com/dbt-labs/dbt-core/issues/10638), [#10636](https://github.com/dbt-labs/dbt-core/issues/10636), [#10662](https://github.com/dbt-labs/dbt-core/issues/10662), [#10639](https://github.com/dbt-labs/dbt-core/issues/10639))
- Execute microbatch models in batches ([#10700](https://github.com/dbt-labs/dbt-core/issues/10700))
- Create 'skip_nodes_if_on_run_start_fails' behavior change flag ([#7387](https://github.com/dbt-labs/dbt-core/issues/7387))
- Allow snapshots to be defined in YAML. ([#10246](https://github.com/dbt-labs/dbt-core/issues/10246))
- Write microbatch compiled/run targets to separate files, one per batch ([#10714](https://github.com/dbt-labs/dbt-core/issues/10714))
- Track incremental_strategy as part of model_run tracking event ([#10761](https://github.com/dbt-labs/dbt-core/issues/10761))
- Support required 'begin' config for microbatch models ([#10701](https://github.com/dbt-labs/dbt-core/issues/10701))
- Parse-time validation of microbatch configs: require event_time, batch_size, lookback and validate input event_time ([#10709](https://github.com/dbt-labs/dbt-core/issues/10709))
- Added the --inline-direct parameter to 'dbt show' ([#10770](https://github.com/dbt-labs/dbt-core/issues/10770))
- Enable specification of dbt_valid_to for current records ([#10187](https://github.com/dbt-labs/dbt-core/issues/10187))
- Enable `retry` support for microbatch models ([#10715](https://github.com/dbt-labs/dbt-core/issues/10715), [#10729](https://github.com/dbt-labs/dbt-core/issues/10729))
- Use unrendered database and schema source properties during state:modified, behind state_modified_compare_more_unrendered_values behaviour flag ([#9573](https://github.com/dbt-labs/dbt-core/issues/9573))
- Ensure microbatch models respect `full_refresh` model config ([#10785](https://github.com/dbt-labs/dbt-core/issues/10785))
- Adds validations for custom_granularities to ensure unique naming. ([#9265](https://github.com/dbt-labs/dbt-core/issues/9265))
- Enable use of multi-column unique key in snapshots ([#9992](https://github.com/dbt-labs/dbt-core/issues/9992))
- Change gating of microbatch feature to be behind project flag / behavior flag ([#10798](https://github.com/dbt-labs/dbt-core/issues/10798))
- Ensure `--event-time-start` is before `--event-time-end` ([#10786](https://github.com/dbt-labs/dbt-core/issues/10786))
- Ensure microbatch models use same `current_time` value ([#10819](https://github.com/dbt-labs/dbt-core/issues/10819))
- Emit warning when microbatch model has no input with `event_time` config ([#10926](https://github.com/dbt-labs/dbt-core/issues/10926))
- Emit debug logging event whenever artifacts are written ([#10937](https://github.com/dbt-labs/dbt-core/issues/10937))
- Support --empty for snapshots ([#10372](https://github.com/dbt-labs/dbt-core/issues/10372))
- Add new hard_deletes="new_record" mode for snapshots. ([#10235](https://github.com/dbt-labs/dbt-core/issues/10235))
- Allow microbatch batches to run in parallel ([#10853](https://github.com/dbt-labs/dbt-core/issues/10853), [#10855](https://github.com/dbt-labs/dbt-core/issues/10855))
- Add `batch` context object to model jinja context ([#11025](https://github.com/dbt-labs/dbt-core/issues/11025))
- Ensure pre/post hooks only run on first/last batch respectively for microbatch model batches ([#11094](https://github.com/dbt-labs/dbt-core/issues/11094), [#11104](https://github.com/dbt-labs/dbt-core/issues/11104))
### Fixes
- Remove unused check_new method ([#7586](https://github.com/dbt-labs/dbt-core/issues/7586))
- Test case for `merge_exclude_columns` ([#8267](https://github.com/dbt-labs/dbt-core/issues/8267))
- Convert "Skipping model due to fail_fast" message to DEBUG level ([#8774](https://github.com/dbt-labs/dbt-core/issues/8774))
- Restore previous behavior for --favor-state: only favor defer_relation if not selected in current command ([#10107](https://github.com/dbt-labs/dbt-core/issues/10107))
- Unit test fixture (csv) returns null for empty value ([#9881](https://github.com/dbt-labs/dbt-core/issues/9881))
- Fix json format log and --quiet for ls and jinja print by converting print call to fire events ([#8756](https://github.com/dbt-labs/dbt-core/issues/8756))
- Add resource type to saved_query ([#10168](https://github.com/dbt-labs/dbt-core/issues/10168))
- Fix: Order-insensitive unit test equality assertion for expected/actual with multiple nulls ([#10167](https://github.com/dbt-labs/dbt-core/issues/10167))
- Renaming or removing a contracted model should raise a BreakingChange warning/error ([#10116](https://github.com/dbt-labs/dbt-core/issues/10116))
- prefer disabled project nodes to external node ([#10224](https://github.com/dbt-labs/dbt-core/issues/10224))
- Fix issues with selectors and inline nodes ([#8943](https://github.com/dbt-labs/dbt-core/issues/8943), [#9269](https://github.com/dbt-labs/dbt-core/issues/9269))
- Fix snapshot config to work in yaml files ([#4000](https://github.com/dbt-labs/dbt-core/issues/4000))
- Improve handling of error when loading schema file list ([#10284](https://github.com/dbt-labs/dbt-core/issues/10284))
- Use model alias for the CTE identifier generated during ephemeral materialization ([#5273](https://github.com/dbt-labs/dbt-core/issues/5273))
- Implement state:modified for saved queries ([#10294](https://github.com/dbt-labs/dbt-core/issues/10294))
- Saved Query node fail during skip ([#10029](https://github.com/dbt-labs/dbt-core/issues/10029))
- Don't warn on `unit_test` config paths that are properly used ([#10311](https://github.com/dbt-labs/dbt-core/issues/10311))
- Fix setting `silence` of `warn_error_options` via `dbt_project.yaml` flags ([#10160](https://github.com/dbt-labs/dbt-core/issues/10160))
- Attempt to provide test fixture tables with all values to set types correctly for comparison with source tables ([#10365](https://github.com/dbt-labs/dbt-core/issues/10365))
- Limit data_tests deprecation to root_project ([#9835](https://github.com/dbt-labs/dbt-core/issues/9835))
- CLI flags should take precedence over env var flags ([#10304](https://github.com/dbt-labs/dbt-core/issues/10304))
- Fix typing for artifact schemas ([#10442](https://github.com/dbt-labs/dbt-core/issues/10442))
- Fix over deletion of generated_metrics in partial parsing ([#10450](https://github.com/dbt-labs/dbt-core/issues/10450))
- Fix error constructing warn_error_options ([#10452](https://github.com/dbt-labs/dbt-core/issues/10452))
- Do not update varchar column definitions if a contract exists ([#10362](https://github.com/dbt-labs/dbt-core/issues/10362))
- fix all_constraints access, disabled node parsing of non-uniquely named resources ([#10509](https://github.com/dbt-labs/dbt-core/issues/10509))
- respect --quiet and --warn-error-options for flag deprecations ([#10105](https://github.com/dbt-labs/dbt-core/issues/10105))
- Propagate measure label when using create_metrics ([#10536](https://github.com/dbt-labs/dbt-core/issues/10536))
- Fix state:modified check for exports ([#10138](https://github.com/dbt-labs/dbt-core/issues/10138))
- Filter out empty nodes after graph selection to support consistent selection of nodes that depend on upstream public models ([#8987](https://github.com/dbt-labs/dbt-core/issues/8987))
- Late render pre- and post-hooks configs in properties / schema YAML files ([#10603](https://github.com/dbt-labs/dbt-core/issues/10603))
- Allow the use of env_var function in certain macros in which it was previously unavailable. ([#10609](https://github.com/dbt-labs/dbt-core/issues/10609))
- Remove deprecation for tests: to data_tests: change ([#10564](https://github.com/dbt-labs/dbt-core/issues/10564))
- Fix `--resource-type test` for `dbt list` and `dbt build` ([#10730](https://github.com/dbt-labs/dbt-core/issues/10730))
- Fix unit tests for incremental model with alias ([#10754](https://github.com/dbt-labs/dbt-core/issues/10754))
- Allow singular tests to be documented in properties.yml ([#9005](https://github.com/dbt-labs/dbt-core/issues/9005))
- Ignore --empty in unit test ref/source rendering ([#10516](https://github.com/dbt-labs/dbt-core/issues/10516))
- Ignore rendered jinja in configs for state:modified, behind state_modified_compare_more_unrendered_values behaviour flag ([#9564](https://github.com/dbt-labs/dbt-core/issues/9564))
- Improve performance of infer primary key ([#10781](https://github.com/dbt-labs/dbt-core/issues/10781))
- Pass test user config to adapter pre_hook by explicitly adding test builder config to node ([#10484](https://github.com/dbt-labs/dbt-core/issues/10484))
- Attempt to skip saved query processing when no semantic manifest changes ([#10563](https://github.com/dbt-labs/dbt-core/issues/10563))
- Ensure dbt retry of microbatch models doesn't lose prior successful state ([#10800](https://github.com/dbt-labs/dbt-core/issues/10800))
- override materialization python models ([#8520](https://github.com/dbt-labs/dbt-core/issues/8520))
- Handle edge cases when a specified `--event-time-end` is equivalent to the batch size truncated batch start time ([#10824](https://github.com/dbt-labs/dbt-core/issues/10824))
- Begin tracking execution time of microbatch model batches ([#10825](https://github.com/dbt-labs/dbt-core/issues/10825))
- Support disabling unit tests via config. ([#9109](https://github.com/dbt-labs/dbt-core/issues/9109), [#10540](https://github.com/dbt-labs/dbt-core/issues/10540))
- Allow instances of generic data tests to be documented ([#2578](https://github.com/dbt-labs/dbt-core/issues/2578))
- Fix warnings for models referring to a deprecated model ([#10833](https://github.com/dbt-labs/dbt-core/issues/10833))
- Change `lookback` default from `0` to `1` to ensure better data completeness ([#10867](https://github.com/dbt-labs/dbt-core/issues/10867))
- Make `--event-time-start` and `--event-time-end` mutually required ([#10874](https://github.com/dbt-labs/dbt-core/issues/10874))
- Ensure KeyboardInterrupt/SystemExit halts microbatch model execution ([#10862](https://github.com/dbt-labs/dbt-core/issues/10862))
- Exclude hook result from results in on-run-end context ([#7387](https://github.com/dbt-labs/dbt-core/issues/7387))
- unit tests with versioned refs ([#10880](https://github.com/dbt-labs/dbt-core/issues/10880), [#10528](https://github.com/dbt-labs/dbt-core/issues/10528), [#10623](https://github.com/dbt-labs/dbt-core/issues/10623))
- Implement partial parsing for all-yaml snapshots ([#10903](https://github.com/dbt-labs/dbt-core/issues/10903))
- Restore source quoting behaviour when quoting config provided in dbt_project.yml ([#10892](https://github.com/dbt-labs/dbt-core/issues/10892))
- Fix bug when referencing deprecated models ([#10915](https://github.com/dbt-labs/dbt-core/issues/10915))
- Fix 'model' jinja context variable type to dict ([#10927](https://github.com/dbt-labs/dbt-core/issues/10927))
- Take `end_time` for batches to the ceiling to handle edge case where `event_time` column is a date ([#10868](https://github.com/dbt-labs/dbt-core/issues/10868))
- Handle exceptions in `get_execution_status` more broadly to better ensure `run_results.json` gets written ([#10934](https://github.com/dbt-labs/dbt-core/issues/10934))
- Fix 'no attribute .config' error when ref-ing a microbatch model from non-Model context ([#10928](https://github.com/dbt-labs/dbt-core/issues/10928))
- Ensure inferred primary_key is a List[str] with no null values ([#10983](https://github.com/dbt-labs/dbt-core/issues/10983))
- Correct when custom microbatch macro deprecation warning is fired ([#10994](https://github.com/dbt-labs/dbt-core/issues/10994))
- Validate manifest has group_map during group_lookup init ([#10988](https://github.com/dbt-labs/dbt-core/issues/10988))
- Fix plural of 'partial success' in log message ([#10999](https://github.com/dbt-labs/dbt-core/issues/10999))
- Emit batch-level exception with node_info on microbatch batch run failure ([#10840](https://github.com/dbt-labs/dbt-core/issues/10840))
- Fix restrict-access to not apply within a package ([#10134](https://github.com/dbt-labs/dbt-core/issues/10134))
- Make microbatch models skippable ([#11021](https://github.com/dbt-labs/dbt-core/issues/11021))
- Catch DbtRuntimeError for hooks ([#11012](https://github.com/dbt-labs/dbt-core/issues/11012))
- Access DBUG flag more consistently with the rest of the codebase in ManifestLoader ([#11068](https://github.com/dbt-labs/dbt-core/issues/11068))
- Implement partial parsing for singular data test configs in yaml files ([#10801](https://github.com/dbt-labs/dbt-core/issues/10801))
### Docs
- Enable display of unit tests ([dbt-docs/#501](https://github.com/dbt-labs/dbt-docs/issues/501))
- Unit tests not rendering ([dbt-docs/#506](https://github.com/dbt-labs/dbt-docs/issues/506))
- Add support for Saved Query node ([dbt-docs/#486](https://github.com/dbt-labs/dbt-docs/issues/486))
- Fix npm security vulnerabilities as of June 2024 ([dbt-docs/#513](https://github.com/dbt-labs/dbt-docs/issues/513))
### Under the Hood
- Clear error message for Private package in dbt-core ([#10083](https://github.com/dbt-labs/dbt-core/issues/10083))
- Enable use of context in serialization ([#10093](https://github.com/dbt-labs/dbt-core/issues/10093))
- Make RSS high water mark measurement more accurate on Linux ([#10177](https://github.com/dbt-labs/dbt-core/issues/10177))
- Enable record filtering by type. ([#10240](https://github.com/dbt-labs/dbt-core/issues/10240))
- Remove IntermediateSnapshotNode ([#10326](https://github.com/dbt-labs/dbt-core/issues/10326))
- Additional logging for skipped ephemeral models ([#10389](https://github.com/dbt-labs/dbt-core/issues/10389))
- bump black to 24.3.0 ([#10454](https://github.com/dbt-labs/dbt-core/issues/10454))
- generate protos with protoc version 5.26.1 ([#10457](https://github.com/dbt-labs/dbt-core/issues/10457))
- Move from minimal-snowplow-tracker fork back to snowplow-tracker ([#8409](https://github.com/dbt-labs/dbt-core/issues/8409))
- Add group info to RunResultError, RunResultFailure, RunResultWarning log lines ([#](https://github.com/dbt-labs/dbt-core/issues/))
- Improve speed of tree traversal when finding children, increasing build speed for some selectors ([#10434](https://github.com/dbt-labs/dbt-core/issues/10434))
- Add test for sources tables with quotes ([#10582](https://github.com/dbt-labs/dbt-core/issues/10582))
- Additional type hints for `core/dbt/version.py` ([#10612](https://github.com/dbt-labs/dbt-core/issues/10612))
- Fix typing issues in core/dbt/contracts/sql.py ([#10614](https://github.com/dbt-labs/dbt-core/issues/10614))
- Fix type errors in `dbt/core/task/clean.py` ([#10616](https://github.com/dbt-labs/dbt-core/issues/10616))
- Add Snowplow tracking for behavior flag deprecations ([#10552](https://github.com/dbt-labs/dbt-core/issues/10552))
- Add test utility patch_microbatch_end_time for adapters testing ([#10713](https://github.com/dbt-labs/dbt-core/issues/10713))
- Replace `TestSelector` with `ResourceTypeSelector` ([#10718](https://github.com/dbt-labs/dbt-core/issues/10718))
- Standardize returning `ResourceTypeSelector` instances in `dbt list` and `dbt build` ([#10739](https://github.com/dbt-labs/dbt-core/issues/10739))
- Add group metadata info to LogModelResult and LogTestResult ([#10775](https://github.com/dbt-labs/dbt-core/issues/10775))
- Remove support and testing for Python 3.8, which is now EOL. ([#10861](https://github.com/dbt-labs/dbt-core/issues/10861))
- Behavior change for mf timespine without yaml configuration ([#10959](https://github.com/dbt-labs/dbt-core/issues/10959))
- Behavior change for cumulative metric type param ([#10960](https://github.com/dbt-labs/dbt-core/issues/10960))
- Upgrade protobuf ([#10658](https://github.com/dbt-labs/dbt-core/issues/10658))
### Dependencies
- Remove logbook dependency ([#8027](https://github.com/dbt-labs/dbt-core/issues/8027))
- Increase supported version range for dbt-semantic-interfaces. Needed to support custom calendar features. ([#9265](https://github.com/dbt-labs/dbt-core/issues/9265))
- Bump minimum allowed dbt-adapters version to 1.8.0 ([#N/A](https://github.com/dbt-labs/dbt-core/issues/N/A))
- Bump minimum dbt-adapters version to 1.9.0 ([#10996](https://github.com/dbt-labs/dbt-core/issues/10996))
### Security
- Explicitly bind to localhost in docs serve ([#10209](https://github.com/dbt-labs/dbt-core/issues/10209))
### Contributors
- [@DevonFulcher](https://github.com/DevonFulcher) ([#10959](https://github.com/dbt-labs/dbt-core/issues/10959), [#10960](https://github.com/dbt-labs/dbt-core/issues/10960))
- [@McKnight-42](https://github.com/McKnight-42) ([#10197](https://github.com/dbt-labs/dbt-core/issues/10197), [#10198](https://github.com/dbt-labs/dbt-core/issues/10198))
- [@Threynaud](https://github.com/Threynaud) ([#11068](https://github.com/dbt-labs/dbt-core/issues/11068))
- [@TowardOliver](https://github.com/TowardOliver) ([#10656](https://github.com/dbt-labs/dbt-core/issues/10656))
- [@aliceliu](https://github.com/aliceliu) ([#10536](https://github.com/dbt-labs/dbt-core/issues/10536), [#10138](https://github.com/dbt-labs/dbt-core/issues/10138))
- [@courtneyholcomb](https://github.com/courtneyholcomb) ([#10360](https://github.com/dbt-labs/dbt-core/issues/10360), [#10376](https://github.com/dbt-labs/dbt-core/issues/10376), [#10475](https://github.com/dbt-labs/dbt-core/issues/10475), [#9265](https://github.com/dbt-labs/dbt-core/issues/9265), [#9265](https://github.com/dbt-labs/dbt-core/issues/9265), [#9265](https://github.com/dbt-labs/dbt-core/issues/9265))
- [@danlsn](https://github.com/danlsn) ([#10915](https://github.com/dbt-labs/dbt-core/issues/10915))
- [@dave-connors-3](https://github.com/dave-connors-3) ([#9824](https://github.com/dbt-labs/dbt-core/issues/9824))
- [@devmessias](https://github.com/devmessias) ([#8520](https://github.com/dbt-labs/dbt-core/issues/8520), [#10880](https://github.com/dbt-labs/dbt-core/issues/10880), [#10528](https://github.com/dbt-labs/dbt-core/issues/10528), [#10623](https://github.com/dbt-labs/dbt-core/issues/10623))
- [@jeancochrane](https://github.com/jeancochrane) ([#5273](https://github.com/dbt-labs/dbt-core/issues/5273))
- [@katsugeneration](https://github.com/katsugeneration) ([#10754](https://github.com/dbt-labs/dbt-core/issues/10754))
- [@kevinneville](https://github.com/kevinneville) ([#7586](https://github.com/dbt-labs/dbt-core/issues/7586))
- [@nakamichiworks](https://github.com/nakamichiworks) ([#10442](https://github.com/dbt-labs/dbt-core/issues/10442))
- [@plypaul](https://github.com/plypaul) ([#10531](https://github.com/dbt-labs/dbt-core/issues/10531))
- [@rariyama](https://github.com/rariyama) ([#8997](https://github.com/dbt-labs/dbt-core/issues/8997))
- [@scottgigante,nevdelap](https://github.com/scottgigante,nevdelap) ([#8774](https://github.com/dbt-labs/dbt-core/issues/8774))
- [@tsturge](https://github.com/tsturge) ([#9109](https://github.com/dbt-labs/dbt-core/issues/9109), [#10540](https://github.com/dbt-labs/dbt-core/issues/10540))
- [@ttusing](https://github.com/ttusing) ([#10434](https://github.com/dbt-labs/dbt-core/issues/10434))
## Previous Releases
For information on prior major and minor releases, see their changelogs:
* [1.7](https://github.com/dbt-labs/dbt-core/blob/1.7.latest/CHANGELOG.md)
* [1.6](https://github.com/dbt-labs/dbt-core/blob/1.6.latest/CHANGELOG.md)
* [1.5](https://github.com/dbt-labs/dbt-core/blob/1.5.latest/CHANGELOG.md)
* [1.4](https://github.com/dbt-labs/dbt-core/blob/1.4.latest/CHANGELOG.md)
* [1.3](https://github.com/dbt-labs/dbt-core/blob/1.3.latest/CHANGELOG.md)
* [1.2](https://github.com/dbt-labs/dbt-core/blob/1.2.latest/CHANGELOG.md)
* [1.1](https://github.com/dbt-labs/dbt-core/blob/1.1.latest/CHANGELOG.md)

Some files were not shown because too many files have changed in this diff.