Compare commits

3 Commits

Author         SHA1         Message               Date
Jeremy Cohen   2c8856da3b   Add changelog entry   2023-07-19 12:37:50 +02:00
Jeremy Cohen   029045e556   Pin sqlparse<0.5      2023-07-19 12:35:31 +02:00
Jeremy Cohen   433e5c670e   Pin click<9           2023-07-19 12:35:17 +02:00
1120 changed files with 40628 additions and 88271 deletions


@@ -1,5 +1,5 @@
[bumpversion]
current_version = 1.10.0a1
current_version = 1.7.0a1
parse = (?P<major>[\d]+) # major version number
\.(?P<minor>[\d]+) # minor version number
\.(?P<patch>[\d]+) # patch version number
@@ -35,3 +35,13 @@ first_value = 1
[bumpversion:file:core/setup.py]
[bumpversion:file:core/dbt/version.py]
[bumpversion:file:plugins/postgres/setup.py]
[bumpversion:file:plugins/postgres/dbt/adapters/postgres/__version__.py]
[bumpversion:file:docker/Dockerfile]
[bumpversion:file:tests/adapter/setup.py]
[bumpversion:file:tests/adapter/dbt/tests/adapter/__version__.py]
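
For reference, a minimal sketch of how the parse pattern in this hunk decomposes a version string. It assumes verbose-mode regex compilation (the inline comments only make sense under re.VERBOSE), and it covers only the major/minor/patch groups shown here; the full config also handles pre-release suffixes not visible in this truncated hunk.

import re

# Hypothetical stand-in for the [bumpversion] parse pattern above,
# limited to the groups visible in this hunk.
version_pattern = re.compile(
    r"""(?P<major>[\d]+)    # major version number
        \.(?P<minor>[\d]+)  # minor version number
        \.(?P<patch>[\d]+)  # patch version number
    """,
    re.VERBOSE,
)

print(version_pattern.match("1.7.0").groupdict())
# -> {'major': '1', 'minor': '7', 'patch': '0'}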


@@ -3,7 +3,6 @@
For information on prior major and minor releases, see their changelogs:
* [1.7](https://github.com/dbt-labs/dbt-core/blob/1.7.latest/CHANGELOG.md)
* [1.6](https://github.com/dbt-labs/dbt-core/blob/1.6.latest/CHANGELOG.md)
* [1.5](https://github.com/dbt-labs/dbt-core/blob/1.5.latest/CHANGELOG.md)
* [1.4](https://github.com/dbt-labs/dbt-core/blob/1.4.latest/CHANGELOG.md)


@@ -1,6 +1,6 @@
# dbt Core Changelog
- This file provides a full account of all changes to `dbt-core`
- This file provides a full account of all changes to `dbt-core` and `dbt-postgres`
- Changes are listed under the (pre)release in which they first appear. Subsequent releases include changes from previous releases.
- "Breaking changes" listed under a version may require action from end users or external maintainers when upgrading to that version.
- Do not edit this file directly. This file is auto-generated using [changie](https://github.com/miniscruff/changie). For details on how to document a change, see [the contributing guide](https://github.com/dbt-labs/dbt-core/blob/main/CONTRIBUTING.md#adding-changelog-entry)


@@ -0,0 +1,6 @@
kind: Dependencies
body: Pin click<9 + sqlparse<0.5
time: 2023-07-19T12:37:43.716495+02:00
custom:
Author: jtcohen6
PR: "8146"
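
The two pins named in the commits above would land in the package metadata roughly as follows. This is a hypothetical sketch of the install_requires list in core/setup.py, not the actual diff; only the upper bounds ("click<9", "sqlparse<0.5") come from the commits.

# Hypothetical excerpt; surrounding requirements are omitted.
install_requires = [
    "click<9",       # "Pin click<9"
    "sqlparse<0.5",  # "Pin sqlparse<0.5"
]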


@@ -1,6 +0,0 @@
kind: Dependencies
body: Upgrading dbt-semantic-interfaces to 0.8.3 for custom grain support in offset windows
time: 2024-11-12T16:38:15.351519-05:00
custom:
Author: WilliamDee
Issue: None


@@ -1,6 +0,0 @@
kind: "Dependencies"
body: "Bump codecov/codecov-action from 4 to 5"
time: 2024-11-18T00:11:13.00000Z
custom:
Author: dependabot[bot]
Issue: 11009


@@ -0,0 +1,6 @@
kind: Docs
body: Fix for column tests not rendering on quoted columns
time: 2023-05-31T11:54:19.687363-04:00
custom:
Author: drewbanin
Issue: "201"


@@ -0,0 +1,6 @@
kind: Docs
body: Remove static SQL codeblock for metrics
time: 2023-07-18T19:24:22.155323+02:00
custom:
Author: marcodamore
Issue: "436"


@@ -1,6 +0,0 @@
kind: Features
body: Add new hard_deletes="new_record" mode for snapshots.
time: 2024-11-04T12:00:53.95191-05:00
custom:
Author: peterallenwebb
Issue: "10235"


@@ -1,6 +0,0 @@
kind: Features
body: Add `batch` context object to model jinja context
time: 2024-11-21T12:56:30.715473-06:00
custom:
Author: QMalcolm
Issue: "11025"


@@ -1,7 +0,0 @@
kind: Features
body: Ensure pre/post hooks only run on first/last batch respectively for microbatch
model batches
time: 2024-12-06T19:53:08.928793-06:00
custom:
Author: MichelleArk QMalcolm
Issue: 11094 11104


@@ -1,6 +0,0 @@
kind: Features
body: Generate latest model version view
time: 2024-12-10T13:07:34.723167-05:00
custom:
Author: gshank
Issue: "7442"


@@ -1,6 +0,0 @@
kind: Features
body: Support "tags" in Saved Queries
time: 2024-12-16T09:54:35.327675-08:00
custom:
Author: theyostalservice
Issue: "11155"


@@ -1,6 +0,0 @@
kind: Features
body: Calculate source freshness via a SQL query
time: 2024-12-17T17:16:31.841076-08:00
custom:
Author: ChenyuLInx
Issue: "8797"


@@ -1,6 +0,0 @@
kind: Features
body: Add freshness definition on model for adaptive job
time: 2024-12-18T17:07:29.55754-08:00
custom:
Author: ChenyuLInx
Issue: "11123"


@@ -1,6 +0,0 @@
kind: Features
body: Meta config for dimensions measures and entities
time: 2025-01-06T13:28:29.176439-06:00
custom:
Author: DevonFulcher
Issue: None


@@ -1,6 +0,0 @@
kind: Features
body: Add doc_blocks to manifest for nodes and columns
time: 2025-01-22T17:03:28.866522Z
custom:
Author: aranke
Issue: 11000 11001


@@ -1,6 +0,0 @@
kind: Features
body: Initial implementation of sample mode
time: 2025-02-02T14:00:54.074209-06:00
custom:
Author: QMalcolm
Issue: 11227 11230 11231 11248 11252 11254 11258


@@ -0,0 +1,6 @@
kind: Fixes
body: Enable converting deprecation warnings to errors
time: 2023-07-18T12:55:18.03914-04:00
custom:
Author: michelleark
Issue: "8130"


@@ -1,6 +0,0 @@
kind: Fixes
body: dbt retry does not respect --threads
time: 2024-08-22T12:21:32.358066+05:30
custom:
Author: donjin-master
Issue: "10584"


@@ -1,6 +0,0 @@
kind: Fixes
body: update adapter version messages
time: 2024-10-25T10:43:39.274723-05:00
custom:
Author: dave-connors-3
Issue: "10230"


@@ -1,6 +0,0 @@
kind: Fixes
body: Catch DbtRuntimeError for hooks
time: 2024-11-21T18:17:39.753235Z
custom:
Author: aranke
Issue: "11012"


@@ -1,6 +0,0 @@
kind: Fixes
body: Access DBUG flag more consistently with the rest of the codebase in ManifestLoader
time: 2024-11-28T16:29:36.236729+01:00
custom:
Author: Threynaud
Issue: "11068"


@@ -1,6 +0,0 @@
kind: Fixes
body: Improve the performance characteristics of add_test_edges()
time: 2024-12-04T10:04:29.096231-05:00
custom:
Author: peterallenwebb
Issue: "10950"


@@ -1,6 +0,0 @@
kind: Fixes
body: Implement partial parsing for singular data test configs in yaml files
time: 2024-12-05T14:53:07.295536-05:00
custom:
Author: gshank
Issue: "10801"


@@ -1,6 +0,0 @@
kind: Fixes
body: Fix debug log messages for microbatch batch execution information
time: 2024-12-09T11:38:06.972743-06:00
custom:
Author: MichelleArk QMalcolm
Issue: "11111"


@@ -1,6 +0,0 @@
kind: Fixes
body: Fix running of extra "last" batch when there is only one batch
time: 2024-12-09T13:33:17.253326-06:00
custom:
Author: QMalcolm
Issue: "11112"


@@ -1,6 +0,0 @@
kind: Fixes
body: Fix interpretation of `PartialSuccess` to result in non-zero exit code
time: 2024-12-09T15:07:11.391313-06:00
custom:
Author: QMalcolm
Issue: "11114"


@@ -1,6 +0,0 @@
kind: Fixes
body: Warn about invalid usages of `concurrent_batches` config
time: 2024-12-12T11:36:11.451962-06:00
custom:
Author: QMalcolm
Issue: "11122"


@@ -1,6 +0,0 @@
kind: Fixes
body: Error writing generic test at run time
time: 2024-12-16T13:46:45.936573-05:00
custom:
Author: gshank
Issue: "11110"


@@ -1,6 +0,0 @@
kind: Fixes
body: Run check_modified_contract for state:modified
time: 2024-12-17T15:48:48.053054-05:00
custom:
Author: gshank
Issue: "11034"


@@ -1,6 +0,0 @@
kind: Fixes
body: Fix unrendered_config for tests from dbt_project.yml
time: 2024-12-18T11:26:40.270022-05:00
custom:
Author: gshank
Issue: "11146"


@@ -1,6 +0,0 @@
kind: Fixes
body: Make partial parsing reparse referencing nodes of newly versioned models.
time: 2025-01-02T14:05:43.629959-05:00
custom:
Author: d-cole
Issue: "8872"


@@ -1,6 +0,0 @@
kind: Fixes
body: Ensure warning about microbatch lacking filter inputs is always fired
time: 2025-01-07T17:37:19.373261-06:00
custom:
Author: QMalcolm
Issue: "11159"


@@ -1,6 +0,0 @@
kind: Fixes
body: Fix microbatch dbt list --output json
time: 2025-01-09T12:33:09.958795+01:00
custom:
Author: internetcoffeephone
Issue: 10556 11098


@@ -1,6 +0,0 @@
kind: Fixes
body: Fix for custom fields in generic test config for not_null and unique tests
time: 2025-01-10T15:58:24.479245-05:00
custom:
Author: gshank
Issue: "11208"


@@ -1,6 +0,0 @@
kind: Fixes
body: Loosen validation on freshness to accommodate previously wrong but harmless config.
time: 2025-01-28T13:55:09.318833-08:00
custom:
Author: ChenyuLInx peterallenwebb
Issue: "11123"


@@ -1,6 +0,0 @@
kind: Under the Hood
body: Create a no-op exposure runner
time: 2024-12-02T16:47:15.766574Z
custom:
Author: aranke
Issue: ' '


@@ -1,7 +0,0 @@
kind: Under the Hood
body: Improve selection performance by optimizing the select_children() and select_parents()
functions.
time: 2024-12-05T14:31:44.584216-05:00
custom:
Author: peterallenwebb
Issue: "11099"


@@ -1,7 +0,0 @@
kind: Under the Hood
body: Change exception type from DbtInternalException to UndefinedMacroError when
macro not found in 'run operation' command
time: 2025-01-07T12:39:55.234321-05:00
custom:
Author: michelleark
Issue: "11192"


@@ -1,6 +0,0 @@
kind: Under the Hood
body: Create LogNodeResult event
time: 2025-01-07T20:58:38.821036Z
custom:
Author: aranke
Issue: ' '


@@ -1,6 +0,0 @@
kind: Under the Hood
body: Fix error counts for exposures
time: 2025-01-10T20:20:57.01632Z
custom:
Author: aranke
Issue: ' '


@@ -1,6 +0,0 @@
kind: Under the Hood
body: Misc fixes for group info in logging
time: 2025-01-17T15:22:15.497485Z
custom:
Author: aranke
Issue: '11218'


@@ -31,7 +31,43 @@ kinds:
- {{.Body}} ({{ range $index, $element := $IssueList }}{{if $index}}, {{end}}{{$element}}{{end}})
- label: Under the Hood
- label: Dependencies
changeFormat: |-
{{- $PRList := list }}
{{- $changes := splitList " " $.Custom.PR }}
{{- range $pullrequest := $changes }}
{{- $changeLink := "[#nbr](https://github.com/dbt-labs/dbt-core/pull/nbr)" | replace "nbr" $pullrequest }}
{{- $PRList = append $PRList $changeLink }}
{{- end -}}
- {{.Body}} ({{ range $index, $element := $PRList }}{{if $index}}, {{end}}{{$element}}{{end}})
skipGlobalChoices: true
additionalChoices:
- key: Author
label: GitHub Username(s) (separated by a single space if multiple)
type: string
minLength: 3
- key: PR
label: GitHub Pull Request Number (separated by a single space if multiple)
type: string
minLength: 1
- label: Security
changeFormat: |-
{{- $PRList := list }}
{{- $changes := splitList " " $.Custom.PR }}
{{- range $pullrequest := $changes }}
{{- $changeLink := "[#nbr](https://github.com/dbt-labs/dbt-core/pull/nbr)" | replace "nbr" $pullrequest }}
{{- $PRList = append $PRList $changeLink }}
{{- end -}}
- {{.Body}} ({{ range $index, $element := $PRList }}{{if $index}}, {{end}}{{$element}}{{end}})
skipGlobalChoices: true
additionalChoices:
- key: Author
label: GitHub Username(s) (separated by a single space if multiple)
type: string
minLength: 3
- key: PR
label: GitHub Pull Request Number (separated by a single space if multiple)
type: string
minLength: 1
newlines:
afterChangelogHeader: 1
@@ -70,10 +106,18 @@ footerFormat: |
{{- $changeList := splitList " " $change.Custom.Author }}
{{- $IssueList := list }}
{{- $changeLink := $change.Kind }}
{{- $changes := splitList " " $change.Custom.Issue }}
{{- range $issueNbr := $changes }}
{{- $changeLink := "[#nbr](https://github.com/dbt-labs/dbt-core/issues/nbr)" | replace "nbr" $issueNbr }}
{{- $IssueList = append $IssueList $changeLink }}
{{- if or (eq $change.Kind "Dependencies") (eq $change.Kind "Security") }}
{{- $changes := splitList " " $change.Custom.PR }}
{{- range $issueNbr := $changes }}
{{- $changeLink := "[#nbr](https://github.com/dbt-labs/dbt-core/pull/nbr)" | replace "nbr" $issueNbr }}
{{- $IssueList = append $IssueList $changeLink }}
{{- end -}}
{{- else }}
{{- $changes := splitList " " $change.Custom.Issue }}
{{- range $issueNbr := $changes }}
{{- $changeLink := "[#nbr](https://github.com/dbt-labs/dbt-core/issues/nbr)" | replace "nbr" $issueNbr }}
{{- $IssueList = append $IssueList $changeLink }}
{{- end -}}
{{- end }}
{{- /* check if this contributor has other changes associated with them already */}}
{{- if hasKey $contributorDict $author }}
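
The footerFormat branch above links Dependencies and Security entries to pull requests (Custom.PR) and every other kind to issues (Custom.Issue). A rough Python paraphrase of that branching, purely illustrative since changie evaluates the Go template itself:

def change_links(kind: str, issue_field: str, pr_field: str) -> list:
    # Dependencies and Security entries record PR numbers; everything else records issue numbers.
    if kind in ("Dependencies", "Security"):
        numbers = pr_field.split(" ")
        url = "https://github.com/dbt-labs/dbt-core/pull/{}"
    else:
        numbers = issue_field.split(" ")
        url = "https://github.com/dbt-labs/dbt-core/issues/{}"
    return [f"[#{n}]({url.format(n)})" for n in numbers]

print(change_links("Dependencies", "", "8146"))
# ['[#8146](https://github.com/dbt-labs/dbt-core/pull/8146)']
print(change_links("Fixes", "10556 11098", ""))
# one issue link per space-separated number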


@@ -7,9 +7,6 @@ ignore =
W503 # makes Flake8 work like black
W504
E203 # makes Flake8 work like black
E704 # makes Flake8 work like black
E741
E501 # long line checking is done in black
exclude = test/
per-file-ignores =
*/__init__.py: F401

.gitattributes

@@ -1,4 +1,4 @@
core/dbt/task/docs/index.html binary
core/dbt/include/index.html binary
tests/functional/artifacts/data/state/*/manifest.json binary
core/dbt/docs/build/html/searchindex.js binary
core/dbt/docs/build/html/index.html binary

.github/CODEOWNERS

@@ -13,6 +13,48 @@
# the core team as a whole will be assigned
* @dbt-labs/core-team
### OSS Tooling Guild
/.github/ @dbt-labs/guild-oss-tooling
.bumpversion.cfg @dbt-labs/guild-oss-tooling
.changie.yaml @dbt-labs/guild-oss-tooling
pre-commit-config.yaml @dbt-labs/guild-oss-tooling
pytest.ini @dbt-labs/guild-oss-tooling
tox.ini @dbt-labs/guild-oss-tooling
pyproject.toml @dbt-labs/guild-oss-tooling
requirements.txt @dbt-labs/guild-oss-tooling
dev_requirements.txt @dbt-labs/guild-oss-tooling
/core/setup.py @dbt-labs/guild-oss-tooling
/core/MANIFEST.in @dbt-labs/guild-oss-tooling
### ADAPTERS
# Adapter interface ("base" + "sql" adapter defaults, cache)
/core/dbt/adapters @dbt-labs/core-adapters
# Global project (default macros + materializations), starter project
/core/dbt/include @dbt-labs/core-adapters
# Postgres plugin
/plugins/ @dbt-labs/core-adapters
/plugins/postgres/setup.py @dbt-labs/core-adapters @dbt-labs/guild-oss-tooling
# Functional tests for adapter plugins
/tests/adapter @dbt-labs/core-adapters
### TESTS
# Overlapping ownership for vast majority of unit + functional tests
# Perf regression testing framework
# This excludes the test project files itself since those aren't specific
# framework changes (excluded by not setting an owner next to it- no owner)
/performance @nathaniel-may
/performance/projects
### ARTIFACTS
/schemas/dbt @dbt-labs/cloud-artifacts


@@ -1,18 +0,0 @@
name: 📄 Code docs
description: Report an issue for markdown files within this repo, such as README, ARCHITECTURE, etc.
title: "[Code docs] <title>"
labels: ["triage"]
body:
- type: markdown
attributes:
value: |
Thanks for taking the time to fill out this code docs issue!
- type: textarea
attributes:
label: Please describe the issue and your proposals.
description: |
Links? References? Anything that will give us more context about the issue you are encountering!
Tip: You can attach images by clicking this area to highlight it and then dragging files in.
validations:
required: false


@@ -1,8 +1,5 @@
blank_issues_enabled: false
contact_links:
- name: Documentation
url: https://github.com/dbt-labs/docs.getdbt.com/issues/new/choose
about: Problems and issues with dbt product documentation hosted on docs.getdbt.com. Issues for markdown files within this repo, such as README, should be opened using the "Code docs" template.
- name: Ask the community for help
url: https://github.com/dbt-labs/docs.getdbt.com/discussions
about: Need help troubleshooting? Check out our guide on how to ask


@@ -1,67 +0,0 @@
name: 🛠️ Implementation
description: This is an implementation ticket intended for use by the maintainers of dbt-core
title: "[<project>] <title>"
labels: ["user docs"]
body:
- type: markdown
attributes:
value: This is an implementation ticket intended for use by the maintainers of dbt-core
- type: checkboxes
attributes:
label: Housekeeping
description: >
A couple friendly reminders:
1. Remove the `user docs` label if the scope of this work does not require changes to https://docs.getdbt.com/docs: no end-user interface (e.g. yml spec, CLI, error messages, etc) or functional changes
2. Link any blocking issues in the "Blocked on" field under the "Core devs & maintainers" project.
options:
- label: I am a maintainer of dbt-core
required: true
- type: textarea
attributes:
label: Short description
description: |
Describe the scope of the ticket, a high-level implementation approach and any tradeoffs to consider
validations:
required: true
- type: textarea
attributes:
label: Acceptance criteria
description: |
What is the definition of done for this ticket? Include any relevant edge cases and/or test cases
validations:
required: true
- type: textarea
attributes:
label: Suggested Tests
description: |
Provide scenarios to test. Link to existing similar tests if appropriate.
placeholder: |
1. Test with no version specified in the schema file and use selection logic on a versioned model for a specific version. Expect pass.
2. Test with a version specified in the schema file that is not valid. Expect ParsingError.
validations:
required: true
- type: textarea
attributes:
label: Impact to Other Teams
description: |
Will this change impact other teams? Include details of the kinds of changes required (new tests, code changes, related tickets) and _add the relevant `Impact:[team]` label_.
placeholder: |
Example: This change impacts `dbt-redshift` because the tests will need to be modified. The `Impact:[Adapter]` label has been added.
validations:
required: true
- type: textarea
attributes:
label: Will backports be required?
description: |
Will this change need to be backported to previous versions? Add details, possible blockers to backporting and _add the relevant backport labels `backport 1.x.latest`_
placeholder: |
Example: Backport to 1.6.latest, 1.5.latest and 1.4.latest. Since 1.4 isn't using click, the backport may be complicated. The `backport 1.6.latest`, `backport 1.5.latest` and `backport 1.4.latest` labels have been added.
validations:
required: true
- type: textarea
attributes:
label: Context
description: |
Provide the "why", motivation, and alternative approaches considered -- linking to previous refinement issues, spikes and documentation as appropriate
validations:
required: false

.github/_README.md

@@ -47,8 +47,7 @@ ___
### How to re-run jobs
- From the UI you can rerun from failure
- You can retrigger the cla check by commenting on the PR with `@cla-bot check`
- Some actions cannot be rerun in the GitHub UI. Namely the snyk checks and the cla check. Snyk checks are rerun by closing and reopening the PR. You can retrigger the cla check by commenting on the PR with `@cla-bot check`
___


@@ -1,21 +1,20 @@
name: "GitHub package `latest` tag wrangler for containers"
description: "Determines if the published image should include `latest` tags"
name: "Github package 'latest' tag wrangler for containers"
description: "Determines wether or not a given dbt container should be given a bare 'latest' tag (I.E. dbt-core:latest)"
inputs:
package_name:
description: "Package being published (i.e. `dbt-core`, `dbt-redshift`, etc.)"
description: "Package to check (I.E. dbt-core, dbt-redshift, etc)"
required: true
new_version:
description: "SemVer of the package being published (i.e. 1.7.2, 1.8.0a1, etc.)"
description: "Semver of the container being built (I.E. 1.0.4)"
required: true
github_token:
description: "Auth token for GitHub (must have view packages scope)"
gh_token:
description: "Auth token for github (must have view packages scope)"
required: true
outputs:
tags:
description: "A list of tags to associate with this version"
latest:
description: "Wether or not built container should be tagged latest (bool)"
minor_latest:
description: "Wether or not built container should be tagged minor.latest (bool)"
runs:
using: "docker"
image: "Dockerfile"


@@ -1,71 +1,98 @@
import os
from packaging.version import Version, parse
import requests
import sys
from typing import List
def main():
package_name: str = os.environ["INPUT_PACKAGE_NAME"]
new_version: Version = parse(os.environ["INPUT_NEW_VERSION"])
github_token: str = os.environ["INPUT_GITHUB_TOKEN"]
response = _package_metadata(package_name, github_token)
published_versions = _published_versions(response)
new_version_tags = _new_version_tags(new_version, published_versions)
_register_tags(new_version_tags, package_name)
def _package_metadata(package_name: str, github_token: str) -> requests.Response:
url = f"https://api.github.com/orgs/dbt-labs/packages/container/{package_name}/versions"
return requests.get(url, auth=("", github_token))
def _published_versions(response: requests.Response) -> List[Version]:
package_metadata = response.json()
return [
parse(tag)
for version in package_metadata
for tag in version["metadata"]["container"]["tags"]
if "latest" not in tag
]
def _new_version_tags(new_version: Version, published_versions: List[Version]) -> List[str]:
# the package version is always a tag
tags = [str(new_version)]
# pre-releases don't get tagged with `latest`
if new_version.is_prerelease:
return tags
if new_version > max(published_versions):
tags.append("latest")
published_patches = [
version
for version in published_versions
if version.major == new_version.major and version.minor == new_version.minor
]
if new_version > max(published_patches):
tags.append(f"{new_version.major}.{new_version.minor}.latest")
return tags
def _register_tags(tags: List[str], package_name: str) -> None:
fully_qualified_tags = ",".join([f"ghcr.io/dbt-labs/{package_name}:{tag}" for tag in tags])
github_output = os.environ.get("GITHUB_OUTPUT")
with open(github_output, "at", encoding="utf-8") as gh_output:
gh_output.write(f"fully_qualified_tags={fully_qualified_tags}")
def _validate_response(response: requests.Response) -> None:
message = response["message"]
if response.status_code != 200:
print(f"Call to GitHub API failed: {response.status_code} - {message}")
sys.exit(1)
import requests
from distutils.util import strtobool
from typing import Union
from packaging.version import parse, Version
if __name__ == "__main__":
main()
# get inputs
package = os.environ["INPUT_PACKAGE"]
new_version = parse(os.environ["INPUT_NEW_VERSION"])
gh_token = os.environ["INPUT_GH_TOKEN"]
halt_on_missing = strtobool(os.environ.get("INPUT_HALT_ON_MISSING", "False"))
# get package metadata from github
package_request = requests.get(
f"https://api.github.com/orgs/dbt-labs/packages/container/{package}/versions",
auth=("", gh_token),
)
package_meta = package_request.json()
# Log info if we don't get a 200
if package_request.status_code != 200:
print(f"Call to GH API failed: {package_request.status_code} {package_meta['message']}")
# Make an early exit if there is no matching package in github
if package_request.status_code == 404:
if halt_on_missing:
sys.exit(1)
# everything is the latest if the package doesn't exist
github_output = os.environ.get("GITHUB_OUTPUT")
with open(github_output, "at", encoding="utf-8") as gh_output:
gh_output.write("latest=True")
gh_output.write("minor_latest=True")
sys.exit(0)
# TODO: verify package meta is "correct"
# https://github.com/dbt-labs/dbt-core/issues/4640
# map versions and tags
version_tag_map = {
version["id"]: version["metadata"]["container"]["tags"] for version in package_meta
}
# is pre-release
pre_rel = True if any(x in str(new_version) for x in ["a", "b", "rc"]) else False
# semver of current latest
for version, tags in version_tag_map.items():
if "latest" in tags:
# N.B. This seems counterintuitive, but we expect any version tagged
# 'latest' to have exactly three associated tags:
# latest, major.minor.latest, and major.minor.patch.
# Subtracting everything that contains the string 'latest' gets us
# the major.minor.patch which is what's needed for comparison.
current_latest = parse([tag for tag in tags if "latest" not in tag][0])
else:
current_latest = False
# semver of current_minor_latest
for version, tags in version_tag_map.items():
if f"{new_version.major}.{new_version.minor}.latest" in tags:
# Similar to above, only now we expect exactly two tags:
# major.minor.patch and major.minor.latest
current_minor_latest = parse([tag for tag in tags if "latest" not in tag][0])
else:
current_minor_latest = False
def is_latest(
pre_rel: bool, new_version: Version, remote_latest: Union[bool, Version]
) -> bool:
"""Determine if a given contaier should be tagged 'latest' based on:
- its pre-release status
- its version
- the version of a previously identified container tagged 'latest'
:param pre_rel: Whether or not the version of the new container is a pre-release
:param new_version: The version of the new container
:param remote_latest: The version of the previously identified container that's
already tagged latest or False
"""
# is a pre-release = not latest
if pre_rel:
return False
# + no latest tag found = is latest
if not remote_latest:
return True
# + if remote version is lower than current = is latest, else not latest
return True if remote_latest <= new_version else False
latest = is_latest(pre_rel, new_version, current_latest)
minor_latest = is_latest(pre_rel, new_version, current_minor_latest)
github_output = os.environ.get("GITHUB_OUTPUT")
with open(github_output, "at", encoding="utf-8") as gh_output:
gh_output.write(f"latest={latest}")
gh_output.write(f"minor_latest={minor_latest}")
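
The _new_version_tags helper in the diff above encodes the tag-selection rules for published container images. A condensed, self-contained restatement for illustration only (the empty-list guards are additions for this sketch; the script itself assumes at least one published version exists):

from typing import List
from packaging.version import Version, parse

def new_version_tags(new_version: Version, published: List[Version]) -> List[str]:
    tags = [str(new_version)]          # the exact version is always a tag
    if new_version.is_prerelease:      # pre-releases never get "latest" tags
        return tags
    if published and new_version > max(published):
        tags.append("latest")          # newest overall
    patches = [v for v in published
               if (v.major, v.minor) == (new_version.major, new_version.minor)]
    if patches and new_version > max(patches):
        tags.append(f"{new_version.major}.{new_version.minor}.latest")  # newest in its minor series
    return tags

print(new_version_tags(parse("1.7.2"), [parse("1.6.0"), parse("1.7.1"), parse("1.8.0")]))
# ['1.7.2', '1.7.latest'] -- the newer 1.8.0 already holds the bare "latest" tag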


@@ -5,15 +5,6 @@ runs:
steps:
- shell: bash
run: |
sudo apt-get --purge remove postgresql postgresql-*
sudo apt update -y
sudo apt install gnupg2 wget vim -y
sudo sh -c 'echo "deb https://apt.postgresql.org/pub/repos/apt $(lsb_release -cs)-pgdg main" > /etc/apt/sources.list.d/pgdg.list'
curl -fsSL https://www.postgresql.org/media/keys/ACCC4CF8.asc|sudo gpg --dearmor -o /etc/apt/trusted.gpg.d/postgresql.gpg
sudo apt update -y
sudo apt install postgresql-16
sudo apt-get -y install postgresql postgresql-contrib
sudo systemctl start postgresql
sudo systemctl enable postgresql
sudo systemctl start postgresql.service
pg_isready
sudo -u postgres bash ${{ github.action_path }}/setup_db.sh


@@ -5,9 +5,7 @@ runs:
steps:
- shell: bash
run: |
brew install postgresql@16
brew link postgresql@16 --force
brew services start postgresql@16
brew services start postgresql
echo "Check PostgreSQL service is running"
i=10
COMMAND='pg_isready'


@@ -5,22 +5,8 @@ runs:
steps:
- shell: pwsh
run: |
Write-Host -Object "Installing PostgreSQL 16 as windows service..."
$installerArgs = @("--install_runtimes 0", "--superpassword root", "--enable_acledit 1", "--unattendedmodeui none", "--mode unattended")
$filePath = Invoke-DownloadWithRetry -Url "https://get.enterprisedb.com/postgresql/postgresql-16.1-1-windows-x64.exe" -Path "$env:PGROOT/postgresql-16.1-1-windows-x64.exe"
Start-Process -FilePath $filePath -ArgumentList $installerArgs -Wait -PassThru
Write-Host -Object "Validating PostgreSQL 16 Install..."
Get-Service -Name postgresql*
$pgReady = Start-Process -FilePath "$env:PGBIN\pg_isready" -Wait -PassThru
$exitCode = $pgReady.ExitCode
if ($exitCode -ne 0) {
Write-Host -Object "PostgreSQL is not ready. Exitcode: $exitCode"
exit $exitCode
}
Write-Host -Object "Starting PostgreSQL 16 Service..."
$pgService = Get-Service -Name postgresql-x64-16
$pgService = Get-Service -Name postgresql*
Set-Service -InputObject $pgService -Status running -StartupType automatic
Start-Process -FilePath "$env:PGBIN\pg_isready" -Wait -PassThru
$env:Path += ";$env:PGBIN"
bash ${{ github.action_path }}/setup_db.sh


@@ -11,6 +11,11 @@ updates:
schedule:
interval: "daily"
rebase-strategy: "disabled"
- package-ecosystem: "pip"
directory: "/plugins/postgres"
schedule:
interval: "daily"
rebase-strategy: "disabled"
# docker dependencies
- package-ecosystem: "docker"
@@ -23,10 +28,3 @@ updates:
schedule:
interval: "weekly"
rebase-strategy: "disabled"
# github dependencies
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "weekly"
rebase-strategy: "disabled"


@@ -1,12 +1,15 @@
Resolves #
resolves #
[docs](https://github.com/dbt-labs/docs.getdbt.com/issues/new/choose) dbt-labs/docs.getdbt.com/#
<!---
Include the number of the issue addressed by this PR above, if applicable.
Include the number of the issue addressed by this PR above if applicable.
PRs for code changes without an associated issue *will not be merged*.
See CONTRIBUTING.md for more information.
Add the `user docs` label to this PR if it will need docs changes. An
issue will get opened in docs.getdbt.com upon successful merge of this PR.
Include the number of the docs issue that was opened for this PR. If
this change has no user-facing implications, "N/A" suffices instead. New
docs tickets can be created by clicking the link above or by going to
https://github.com/dbt-labs/docs.getdbt.com/issues/new/choose.
-->
### Problem
@@ -26,8 +29,7 @@ Resolves #
### Checklist
- [ ] I have read [the contributing guide](https://github.com/dbt-labs/dbt-core/blob/main/CONTRIBUTING.md) and understand what's expected of me.
- [ ] I have run this code in development, and it appears to resolve the stated issue.
- [ ] This PR includes tests, or tests are not required or relevant for this PR.
- [ ] This PR has no interface changes (e.g., macros, CLI, logs, JSON artifacts, config files, adapter interface, etc.) or this PR has already received feedback and approval from Product or DX.
- [ ] This PR includes [type annotations](https://docs.python.org/3/library/typing.html) for new and modified functions.
- [ ] I have read [the contributing guide](https://github.com/dbt-labs/dbt-core/blob/main/CONTRIBUTING.md) and understand what's expected of me
- [ ] I have run this code in development and it appears to resolve the stated issue
- [ ] This PR includes tests, or tests are not required/relevant for this PR
- [ ] This PR has no interface changes (e.g. macros, cli, logs, json artifacts, config files, adapter interface, etc) or this PR has already received feedback and approval from Product or DX


@@ -1,50 +0,0 @@
# **what?**
# Check if an issue is opened near or during an extended holiday period.
# If so, post an automatically-generated comment about the holiday for bug reports.
# Also provide specific information to customers of dbt Cloud.
# **why?**
# Explain why responses will be delayed during our holiday period.
# **when?**
# This will run when new issues are opened.
name: Auto-Respond to Bug Reports During Holiday Period
on:
issues:
types:
- opened
permissions:
contents: read
issues: write
jobs:
auto-response:
runs-on: ubuntu-latest
steps:
- name: Check if current date is within holiday period
id: date-check
run: |
current_date=$(date -u +"%Y-%m-%d")
start_date="2024-12-23"
end_date="2025-01-05"
if [[ "$current_date" < "$start_date" || "$current_date" > "$end_date" ]]; then
echo "outside_holiday=true" >> $GITHUB_ENV
else
echo "outside_holiday=false" >> $GITHUB_ENV
fi
- name: Post comment
if: ${{ env.outside_holiday == 'false' && contains(github.event.issue.labels.*.name, 'bug') }}
run: |
gh issue comment ${{ github.event.issue.number }} --repo ${{ github.repository }} --body "Thank you for your bug report! Our team will be out of the office for [Christmas and our Global Week of Rest](https://handbook.getdbt.com/docs/time_off#2024-us-holidays), from December 25, 2024, through January 3, 2025.
We will review your issue as soon as possible after returning.
Thank you for your understanding, and happy holidays! 🎄🎉
If you are a customer of dbt Cloud, please contact our Customer Support team via the dbt Cloud web interface or email **support@dbtlabs.com**."
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}


@@ -35,6 +35,6 @@ jobs:
github.event.pull_request.merged
&& contains(github.event.label.name, 'backport')
steps:
- uses: tibdex/backport@v2.0.4
- uses: tibdex/backport@v2.0.3
with:
github_token: ${{ secrets.GITHUB_TOKEN }}


@@ -41,6 +41,8 @@ jobs:
include:
- label: "dependencies"
changie_kind: "Dependencies"
- label: "snyk"
changie_kind: "Security"
runs-on: ubuntu-latest
steps:
@@ -56,4 +58,4 @@ jobs:
commit_message: "Add automated changelog yaml from template for bot PR"
changie_kind: ${{ matrix.changie_kind }}
label: ${{ matrix.label }}
custom_changelog_string: "custom:\n Author: ${{ github.event.pull_request.user.login }}\n Issue: ${{ github.event.pull_request.number }}"
custom_changelog_string: "custom:\n Author: ${{ github.event.pull_request.user.login }}\n PR: ${{ github.event.pull_request.number }}"


@@ -2,8 +2,10 @@
# Checks that a file has been committed under the /.changes directory
# as a new CHANGELOG entry. Cannot check for a specific filename as
# it is dynamically generated by change type and timestamp.
# This workflow runs on pull_request_target because it requires
# secrets to post comments.
# This workflow should not require any secrets since it runs for PRs
# from forked repos.
# By default, secrets are not passed to workflows running from
# a forked repo.
# **why?**
# Ensure code change gets reflected in the CHANGELOG.
@@ -17,10 +19,8 @@
name: Check Changelog Entry
on:
pull_request_target:
pull_request:
types: [opened, reopened, labeled, unlabeled, synchronize]
paths-ignore: ['.changes/**', '.github/**', 'tests/**', '**.md', '**.yml']
workflow_dispatch:
defaults:


@@ -1,41 +0,0 @@
name: Check Artifact Changes
on:
pull_request:
types: [ opened, reopened, labeled, unlabeled, synchronize ]
paths-ignore: [ '.changes/**', '.github/**', 'tests/**', '**.md', '**.yml' ]
workflow_dispatch:
jobs:
check-artifact-changes:
runs-on: ubuntu-latest
if: ${{ !contains(github.event.pull_request.labels.*.name, 'artifact_minor_upgrade') }}
steps:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Check for changes in core/dbt/artifacts
# https://github.com/marketplace/actions/paths-changes-filter
uses: dorny/paths-filter@v3
id: check_artifact_changes
with:
filters: |
artifacts_changed:
- 'core/dbt/artifacts/**'
list-files: shell
- name: Fail CI if artifacts have changed
if: steps.check_artifact_changes.outputs.artifacts_changed == 'true'
run: |
echo "CI failure: Artifact changes checked in core/dbt/artifacts directory."
echo "Files changed: ${{ steps.check_artifact_changes.outputs.artifacts_changed_files }}"
echo "To bypass this check, confirm that the change is not breaking (https://github.com/dbt-labs/dbt-core/blob/main/core/dbt/artifacts/README.md#breaking-changes) and add the 'artifact_minor_upgrade' label to the PR. Modifications and additions to all fields require updates to https://github.com/dbt-labs/dbt-jsonschema."
exit 1
- name: CI check passed
if: steps.check_artifact_changes.outputs.artifacts_changed == 'false'
run: |
echo "No prohibited artifact changes found in core/dbt/artifacts. CI check passed."


@@ -1,39 +0,0 @@
# **what?**
# Label a PR with a `community` label when a PR is opened by a user outside core/adapters
# **why?**
# To streamline triage and ensure that community contributions are recognized and prioritized
# **when?**
# When a PR is opened, not in draft or moved from draft to ready for review
name: Label community PRs
on:
# have to use pull_request_target since community PRs come from forks
pull_request_target:
types: [opened, ready_for_review]
defaults:
run:
shell: bash
permissions:
pull-requests: write # labels PRs
contents: read # reads team membership
jobs:
open_issues:
# If this PR already has the community label, no need to relabel it
# If this PR is opened and not draft, determine if it needs to be labeled
# if the PR is converted out of draft, determine if it needs to be labeled
if: |
(!contains(github.event.pull_request.labels.*.name, 'community') &&
(github.event.action == 'opened' && github.event.pull_request.draft == false ) ||
github.event.action == 'ready_for_review' )
uses: dbt-labs/actions/.github/workflows/label-community.yml@main
with:
github_team: 'core-group'
label: 'community'
secrets: inherit


@@ -1,41 +0,0 @@
# **what?**
# Open an issue in docs.getdbt.com when an issue is labeled `user docs` and closed as completed
# **why?**
# To reduce barriers for keeping docs up to date
# **when?**
# When an issue is labeled `user docs` and is closed as completed. Can be labeled before or after the issue is closed.
name: Open issues in docs.getdbt.com repo when an issue is labeled
run-name: "Open an issue in docs.getdbt.com for issue #${{ github.event.issue.number }}"
on:
issues:
types: [labeled, closed]
defaults:
run:
shell: bash
permissions:
issues: write # comments on issues
jobs:
open_issues:
# we only want to run this when the issue is closed as completed and the label `user docs` has been assigned.
# If this logic does not exist in this workflow, it runs the
# risk of duplication of issues being created due to merge and label both triggering this workflow to run and neither having
# generated the comment before the other runs. This lives here instead of the shared workflow because this is where we
# decide if it should run or not.
if: |
(github.event.issue.state == 'closed' &&
github.event.issue.state_reason == 'completed' &&
contains( github.event.issue.labels.*.name, 'user docs'))
uses: dbt-labs/actions/.github/workflows/open-issue-in-repo.yml@main
with:
issue_repository: "dbt-labs/docs.getdbt.com"
issue_title: "[Core] Docs Changes Needed from ${{ github.event.repository.name }} Issue #${{ github.event.issue.number }}"
issue_body: "At a minimum, update body to include a link to the page on docs.getdbt.com requiring updates and what part(s) of the page you would like to see updated.\n Originating from this issue: https://github.com/dbt-labs/dbt-core/issues/${{ github.event.issue.number }}"
secrets: inherit

.github/workflows/jira-creation.yml (new file)

@@ -0,0 +1,26 @@
# **what?**
# Mirrors issues into Jira. Includes the information: title,
# GitHub Issue ID and URL
# **why?**
# Jira is our tool for tracking and we need to see these issues in there
# **when?**
# On issue creation or when an issue is labeled `Jira`
name: Jira Issue Creation
on:
issues:
types: [opened, labeled]
permissions:
issues: write
jobs:
call-creation-action:
uses: dbt-labs/actions/.github/workflows/jira-creation-actions.yml@main
secrets:
JIRA_BASE_URL: ${{ secrets.JIRA_BASE_URL }}
JIRA_USER_EMAIL: ${{ secrets.JIRA_USER_EMAIL }}
JIRA_API_TOKEN: ${{ secrets.JIRA_API_TOKEN }}

.github/workflows/jira-label.yml (new file)

@@ -0,0 +1,26 @@
# **what?**
# Calls mirroring Jira label Action. Includes adding a new label
# to an existing issue or removing a label as well
# **why?**
# Jira is our tool for tracking and we need to see these labels in there
# **when?**
# On labels being added or removed from issues
name: Jira Label Mirroring
on:
issues:
types: [labeled, unlabeled]
permissions:
issues: read
jobs:
call-label-action:
uses: dbt-labs/actions/.github/workflows/jira-label-actions.yml@main
secrets:
JIRA_BASE_URL: ${{ secrets.JIRA_BASE_URL }}
JIRA_USER_EMAIL: ${{ secrets.JIRA_USER_EMAIL }}
JIRA_API_TOKEN: ${{ secrets.JIRA_API_TOKEN }}

.github/workflows/jira-transition.yml (new file)

@@ -0,0 +1,27 @@
# **what?**
# Transition a Jira issue to a new state
# Only supports these GitHub Issue transitions:
# closed, deleted, reopened
# **why?**
# Jira needs to be kept up-to-date
# **when?**
# On issue closing, deletion, reopened
name: Jira Issue Transition
on:
issues:
types: [closed, deleted, reopened]
# no special access is needed
permissions: read-all
jobs:
call-transition-action:
uses: dbt-labs/actions/.github/workflows/jira-transition-actions.yml@main
secrets:
JIRA_BASE_URL: ${{ secrets.JIRA_BASE_URL }}
JIRA_USER_EMAIL: ${{ secrets.JIRA_USER_EMAIL }}
JIRA_API_TOKEN: ${{ secrets.JIRA_API_TOKEN }}


@@ -33,11 +33,6 @@ defaults:
run:
shell: bash
# top-level adjustments can be made here
env:
# number of parallel processes to spawn for python integration testing
PYTHON_INTEGRATION_TEST_WORKERS: 5
jobs:
code-quality:
name: code-quality
@@ -47,19 +42,18 @@ jobs:
steps:
- name: Check out the repository
uses: actions/checkout@v4
uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v5
uses: actions/setup-python@v4
with:
python-version: '3.9'
python-version: '3.8'
- name: Install python dependencies
run: |
python -m pip install --user --upgrade pip
python -m pip --version
make dev
make dev_req
mypy --version
dbt --version
@@ -75,17 +69,17 @@ jobs:
strategy:
fail-fast: false
matrix:
python-version: [ "3.9", "3.10", "3.11", "3.12" ]
python-version: ["3.8", "3.9", "3.10", "3.11"]
env:
TOXENV: "unit"
steps:
- name: Check out the repository
uses: actions/checkout@v4
uses: actions/checkout@v3
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
@@ -96,12 +90,8 @@ jobs:
python -m pip install tox
tox --version
- name: Run unit tests
uses: nick-fields/retry@v3
with:
timeout_minutes: 10
max_attempts: 3
command: tox -e unit
- name: Run tox
run: tox
- name: Get current date
if: always()
@@ -112,60 +102,27 @@ jobs:
- name: Upload Unit Test Coverage to Codecov
if: ${{ matrix.python-version == '3.11' }}
uses: codecov/codecov-action@v5
with:
token: ${{ secrets.CODECOV_TOKEN }}
flags: unit
integration-metadata:
name: integration test metadata generation
runs-on: ubuntu-latest
outputs:
split-groups: ${{ steps.generate-split-groups.outputs.split-groups }}
include: ${{ steps.generate-include.outputs.include }}
steps:
- name: generate split-groups
id: generate-split-groups
run: |
MATRIX_JSON="["
for B in $(seq 1 ${{ env.PYTHON_INTEGRATION_TEST_WORKERS }}); do
MATRIX_JSON+=$(sed 's/^/"/;s/$/"/' <<< "${B}")
done
MATRIX_JSON="${MATRIX_JSON//\"\"/\", \"}"
MATRIX_JSON+="]"
echo "split-groups=${MATRIX_JSON}"
echo "split-groups=${MATRIX_JSON}" >> $GITHUB_OUTPUT
- name: generate include
id: generate-include
run: |
INCLUDE=('"python-version":"3.9","os":"windows-latest"' '"python-version":"3.9","os":"macos-14"' )
INCLUDE_GROUPS="["
for include in ${INCLUDE[@]}; do
for group in $(seq 1 ${{ env.PYTHON_INTEGRATION_TEST_WORKERS }}); do
INCLUDE_GROUPS+=$(sed 's/$/, /' <<< "{\"split-group\":\"${group}\",${include}}")
done
done
INCLUDE_GROUPS=$(echo $INCLUDE_GROUPS | sed 's/,*$//g')
INCLUDE_GROUPS+="]"
echo "include=${INCLUDE_GROUPS}"
echo "include=${INCLUDE_GROUPS}" >> $GITHUB_OUTPUT
uses: codecov/codecov-action@v3
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
integration:
name: (${{ matrix.split-group }}) integration test / python ${{ matrix.python-version }} / ${{ matrix.os }}
name: integration test / python ${{ matrix.python-version }} / ${{ matrix.os }}
runs-on: ${{ matrix.os }}
timeout-minutes: 30
needs:
- integration-metadata
timeout-minutes: 60
strategy:
fail-fast: false
matrix:
python-version: [ "3.9", "3.10", "3.11", "3.12" ]
python-version: ["3.8", "3.9", "3.10", "3.11"]
os: [ubuntu-20.04]
split-group: ${{ fromJson(needs.integration-metadata.outputs.split-groups) }}
include: ${{ fromJson(needs.integration-metadata.outputs.include) }}
include:
- python-version: 3.8
os: windows-latest
- python-version: 3.8
os: macos-latest
env:
TOXENV: integration
DBT_INVOCATION_ENV: github-actions
@@ -180,10 +137,10 @@ jobs:
steps:
- name: Check out the repository
uses: actions/checkout@v4
uses: actions/checkout@v3
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
@@ -206,14 +163,8 @@ jobs:
python -m pip install tox
tox --version
- name: Run integration tests
uses: nick-fields/retry@v3
with:
timeout_minutes: 30
max_attempts: 3
command: tox -- --ddtrace
env:
PYTEST_ADDOPTS: ${{ format('--splits {0} --group {1}', env.PYTHON_INTEGRATION_TEST_WORKERS, matrix.split-group) }}
- name: Run tests
run: tox -- --ddtrace
- name: Get current date
if: always()
@@ -222,35 +173,17 @@ jobs:
CURRENT_DATE=$(date +'%Y-%m-%dT%H_%M_%S') # no colons allowed for artifacts
echo "date=$CURRENT_DATE" >> $GITHUB_OUTPUT
- uses: actions/upload-artifact@v4
- uses: actions/upload-artifact@v3
if: always()
with:
name: logs_${{ matrix.python-version }}_${{ matrix.os }}_${{ matrix.split-group }}_${{ steps.date.outputs.date }}
name: logs_${{ matrix.python-version }}_${{ matrix.os }}_${{ steps.date.outputs.date }}
path: ./logs
- name: Upload Integration Test Coverage to Codecov
if: ${{ matrix.python-version == '3.11' }}
uses: codecov/codecov-action@v5
with:
token: ${{ secrets.CODECOV_TOKEN }}
flags: integration
integration-report:
if: ${{ always() }}
name: Integration Test Suite
runs-on: ubuntu-latest
needs: integration
steps:
- name: "Integration Tests Failed"
if: ${{ contains(needs.integration.result, 'failure') || contains(needs.integration.result, 'cancelled') }}
# when this is true the next step won't execute
run: |
echo "::notice title='Integration test suite failed'"
exit 1
- name: "Integration Tests Passed"
run: |
echo "::notice title='Integration test suite passed'"
uses: codecov/codecov-action@v3
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
build:
name: build packages
@@ -259,12 +192,12 @@ jobs:
steps:
- name: Check out the repository
uses: actions/checkout@v4
uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v5
uses: actions/setup-python@v4
with:
python-version: '3.9'
python-version: '3.8'
- name: Install python dependencies
run: |
@@ -297,7 +230,7 @@ jobs:
- name: Install source distributions
# ignore dbt-1.0.0, which intentionally raises an error when installed from source
run: |
find ./dist/*.gz -maxdepth 1 -type f | xargs python -m pip install --force-reinstall --find-links=dist/
find ./dist/dbt-[a-z]*.gz -maxdepth 1 -type f | xargs python -m pip install --force-reinstall --find-links=dist/
- name: Check source distributions
run: |


@@ -48,7 +48,7 @@ jobs:
# explicitly checkout the performance runner from main regardless of which
# version we are modeling.
- name: Checkout
uses: actions/checkout@v4
uses: actions/checkout@v3
with:
ref: main
@@ -87,12 +87,12 @@ jobs:
# explicitly checkout the performance runner from main regardless of which
# version we are modeling.
- name: Checkout
uses: actions/checkout@v4
uses: actions/checkout@v3
with:
ref: main
# attempts to access a previously cached runner
- uses: actions/cache@v4
- uses: actions/cache@v3
id: cache
with:
path: ${{ env.RUNNER_CACHE_PATH }}
@@ -148,9 +148,9 @@ jobs:
echo "release_branch: ${{ needs.set-variables.outputs.release_branch }}"
- name: Setup Python
uses: actions/setup-python@v5
uses: actions/setup-python@v4
with:
python-version: "3.9"
python-version: "3.8"
- name: Install dbt
run: pip install dbt-postgres==${{ needs.set-variables.outputs.release_id }}
@@ -160,13 +160,13 @@ jobs:
# explicitly checkout main to get the latest project definitions
- name: Checkout
uses: actions/checkout@v4
uses: actions/checkout@v3
with:
ref: main
# this was built in the previous job so it will be there.
- name: Fetch Runner
uses: actions/cache@v4
uses: actions/cache@v3
id: cache
with:
path: ${{ env.RUNNER_CACHE_PATH }}
@@ -195,7 +195,7 @@ jobs:
- name: '[DEBUG] ls baseline directory after run'
run: ls -R performance/baselines/
- uses: actions/upload-artifact@v4
- uses: actions/upload-artifact@v3
with:
name: baseline
path: performance/baselines/${{ needs.set-variables.outputs.release_id }}/
@@ -225,7 +225,7 @@ jobs:
echo "release_branch: ${{ needs.set-variables.outputs.release_branch }}"
- name: Checkout
uses: actions/checkout@v4
uses: actions/checkout@v3
with:
ref: ${{ matrix.base-branch }}
@@ -235,7 +235,7 @@ jobs:
git push origin ${{ matrix.target-branch }}
git branch --set-upstream-to=origin/${{ matrix.target-branch }} ${{ matrix.target-branch }}
- uses: actions/download-artifact@v4
- uses: actions/download-artifact@v3
with:
name: baseline
path: performance/baselines/${{ needs.set-variables.outputs.release_id }}
@@ -253,7 +253,7 @@ jobs:
push: 'origin origin/${{ matrix.target-branch }}'
- name: Create Pull Request
uses: peter-evans/create-pull-request@v6
uses: peter-evans/create-pull-request@v5
with:
author: 'Github Build Bot <buildbot@fishtownanalytics.com>'
base: ${{ matrix.base-branch }}


@@ -20,7 +20,6 @@ on:
permissions:
contents: write # this is the permission that allows creating a new release
packages: write # this is the permission that allows Docker release
defaults:
run:
@@ -34,15 +33,22 @@ jobs:
runs-on: ubuntu-latest
outputs:
commit_sha: ${{ steps.resolve-commit-sha.outputs.release_commit }}
version_number: ${{ steps.nightly-release-version.outputs.number }}
release_branch: ${{ steps.release-branch.outputs.name }}
steps:
- name: "Checkout ${{ github.repository }} Branch ${{ env.RELEASE_BRANCH }}"
uses: actions/checkout@v4
uses: actions/checkout@v3
with:
ref: ${{ env.RELEASE_BRANCH }}
- name: "Resolve Commit To Release"
id: resolve-commit-sha
run: |
commit_sha=$(git rev-parse HEAD)
echo "release_commit=$commit_sha" >> $GITHUB_OUTPUT
- name: "Get Current Version Number"
id: version-number-sources
run: |
@@ -82,6 +88,7 @@ jobs:
steps:
- name: "[DEBUG] Log Outputs"
run: |
echo commit_sha : ${{ needs.aggregate-release-data.outputs.commit_sha }}
echo version_number: ${{ needs.aggregate-release-data.outputs.version_number }}
echo release_branch: ${{ needs.aggregate-release-data.outputs.release_branch }}
@@ -90,8 +97,13 @@ jobs:
uses: ./.github/workflows/release.yml
with:
sha: ${{ needs.aggregate-release-data.outputs.commit_sha }}
target_branch: ${{ needs.aggregate-release-data.outputs.release_branch }}
version_number: ${{ needs.aggregate-release-data.outputs.version_number }}
build_script_path: "scripts/build-dist.sh"
env_setup_script_path: "scripts/env-setup.sh"
s3_bucket_name: "core-team-artifacts"
package_test_command: "dbt --version"
test_run: true
nightly_release: true
secrets: inherit

.github/workflows/release-docker.yml (new file)

@@ -0,0 +1,118 @@
# **what?**
# This workflow will generate a series of docker images for dbt and push them to the github container registry
# **why?**
# Docker images for dbt are used in a number of important places throughout the dbt ecosystem. This is how we keep those images up-to-date.
# **when?**
# This is triggered manually
# **next steps**
# - build this into the release workflow (or conversely, break out the different release methods into their own workflow files)
name: Docker release
permissions:
packages: write
on:
workflow_dispatch:
inputs:
package:
description: The package to release. _One_ of [dbt-core, dbt-redshift, dbt-bigquery, dbt-snowflake, dbt-spark, dbt-postgres]
required: true
version_number:
description: The release version number (i.e. 1.0.0b1). Do not include `latest` tags or a leading `v`!
required: true
jobs:
get_version_meta:
name: Get version meta
runs-on: ubuntu-latest
outputs:
major: ${{ steps.version.outputs.major }}
minor: ${{ steps.version.outputs.minor }}
patch: ${{ steps.version.outputs.patch }}
latest: ${{ steps.latest.outputs.latest }}
minor_latest: ${{ steps.latest.outputs.minor_latest }}
steps:
- uses: actions/checkout@v3
- name: Split version
id: version
run: |
IFS="." read -r MAJOR MINOR PATCH <<< ${{ github.event.inputs.version_number }}
echo "major=$MAJOR" >> $GITHUB_OUTPUT
echo "minor=$MINOR" >> $GITHUB_OUTPUT
echo "patch=$PATCH" >> $GITHUB_OUTPUT
- name: Is pkg 'latest'
id: latest
uses: ./.github/actions/latest-wrangler
with:
package: ${{ github.event.inputs.package }}
new_version: ${{ github.event.inputs.version_number }}
gh_token: ${{ secrets.GITHUB_TOKEN }}
halt_on_missing: False
setup_image_builder:
name: Set up docker image builder
runs-on: ubuntu-latest
needs: [get_version_meta]
steps:
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
build_and_push:
name: Build images and push to GHCR
runs-on: ubuntu-latest
needs: [setup_image_builder, get_version_meta]
steps:
- name: Get docker build arg
id: build_arg
run: |
BUILD_ARG_NAME=$(echo ${{ github.event.inputs.package }} | sed 's/\-/_/g')
BUILD_ARG_VALUE=$(echo ${{ github.event.inputs.package }} | sed 's/postgres/core/g')
echo "build_arg_name=$BUILD_ARG_NAME" >> $GITHUB_OUTPUT
echo "build_arg_value=$BUILD_ARG_VALUE" >> $GITHUB_OUTPUT
- name: Log in to the GHCR
uses: docker/login-action@v2
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build and push MAJOR.MINOR.PATCH tag
uses: docker/build-push-action@v4
with:
file: docker/Dockerfile
push: True
target: ${{ github.event.inputs.package }}
build-args: |
${{ steps.build_arg.outputs.build_arg_name }}_ref=${{ steps.build_arg.outputs.build_arg_value }}@v${{ github.event.inputs.version_number }}
tags: |
ghcr.io/dbt-labs/${{ github.event.inputs.package }}:${{ github.event.inputs.version_number }}
- name: Build and push MINOR.latest tag
uses: docker/build-push-action@v4
if: ${{ needs.get_version_meta.outputs.minor_latest == 'True' }}
with:
file: docker/Dockerfile
push: True
target: ${{ github.event.inputs.package }}
build-args: |
${{ steps.build_arg.outputs.build_arg_name }}_ref=${{ steps.build_arg.outputs.build_arg_value }}@v${{ github.event.inputs.version_number }}
tags: |
ghcr.io/dbt-labs/${{ github.event.inputs.package }}:${{ needs.get_version_meta.outputs.major }}.${{ needs.get_version_meta.outputs.minor }}.latest
- name: Build and push latest tag
uses: docker/build-push-action@v4
if: ${{ needs.get_version_meta.outputs.latest == 'True' }}
with:
file: docker/Dockerfile
push: True
target: ${{ github.event.inputs.package }}
build-args: |
${{ steps.build_arg.outputs.build_arg_name }}_ref=${{ steps.build_arg.outputs.build_arg_value }}@v${{ github.event.inputs.version_number }}
tags: |
ghcr.io/dbt-labs/${{ github.event.inputs.package }}:latest


@@ -7,7 +7,6 @@
# - run unit and integration tests against given commit;
# - build and package that SHA;
# - release it to GitHub and PyPI with that specific build;
# - release it to Docker
#
# **why?**
# Ensure an automated and tested release process
@@ -15,12 +14,15 @@
# **when?**
# This workflow can be run manually on demand or can be called by other workflows
name: "Release to GitHub, PyPI & Docker"
run-name: "Release ${{ inputs.version_number }} to GitHub, PyPI & Docker"
name: Release to GitHub and PyPI
on:
workflow_dispatch:
inputs:
sha:
description: "The last commit sha in the release"
type: string
required: true
target_branch:
description: "The branch to release from"
type: string
@@ -29,6 +31,26 @@ on:
description: "The release version number (i.e. 1.0.0b1)"
type: string
required: true
build_script_path:
description: "Build script path"
type: string
default: "scripts/build-dist.sh"
required: true
env_setup_script_path:
description: "Environment setup script path"
type: string
default: "scripts/env-setup.sh"
required: false
s3_bucket_name:
description: "AWS S3 bucket name"
type: string
default: "core-team-artifacts"
required: true
package_test_command:
description: "Package test command"
type: string
default: "dbt --version"
required: true
test_run:
description: "Test run (Publish release as draft)"
type: boolean
@@ -39,13 +61,12 @@ on:
type: boolean
default: false
required: false
only_docker:
description: "Only release Docker image, skip GitHub & PyPI"
type: boolean
default: false
required: false
workflow_call:
inputs:
sha:
description: "The last commit sha in the release"
type: string
required: true
target_branch:
description: "The branch to release from"
type: string
@@ -54,6 +75,26 @@ on:
description: "The release version number (i.e. 1.0.0b1)"
type: string
required: true
build_script_path:
description: "Build script path"
type: string
default: "scripts/build-dist.sh"
required: true
env_setup_script_path:
description: "Environment setup script path"
type: string
default: "scripts/env-setup.sh"
required: false
s3_bucket_name:
description: "AWS S3 bucket name"
type: string
default: "core-team-artifacts"
required: true
package_test_command:
description: "Package test command"
type: string
default: "dbt --version"
required: true
test_run:
description: "Test run (Publish release as draft)"
type: boolean
@@ -73,47 +114,32 @@ defaults:
shell: bash
jobs:
job-setup:
log-inputs:
name: Log Inputs
runs-on: ubuntu-latest
outputs:
starting_sha: ${{ steps.set_sha.outputs.starting_sha }}
steps:
- name: "[DEBUG] Print Variables"
run: |
echo Inputs
echo The last commit sha in the release: ${{ inputs.sha }}
echo The branch to release from: ${{ inputs.target_branch }}
echo The release version number: ${{ inputs.version_number }}
echo Build script path: ${{ inputs.build_script_path }}
echo Environment setup script path: ${{ inputs.env_setup_script_path }}
echo AWS S3 bucket name: ${{ inputs.s3_bucket_name }}
echo Package test command: ${{ inputs.package_test_command }}
echo Test run: ${{ inputs.test_run }}
echo Nightly release: ${{ inputs.nightly_release }}
echo Only Docker: ${{ inputs.only_docker }}
- name: "Checkout target branch"
uses: actions/checkout@v4
with:
ref: ${{ inputs.target_branch }}
# release-prep.yml really shouldn't take in the sha, but since core and all adapters
# depend on it now, this workaround lets us avoid entering it manually, with the attendant risk of error.
# The changes always get merged into the head, so we can't use a specific commit for
# releases anyway.
- name: "Capture sha"
id: set_sha
run: |
echo "starting_sha=$(git rev-parse HEAD)" >> $GITHUB_OUTPUT
bump-version-generate-changelog:
name: Bump package version, Generate changelog
needs: [job-setup]
if: ${{ !inputs.only_docker }}
uses: dbt-labs/dbt-release/.github/workflows/release-prep.yml@main
with:
sha: ${{ needs.job-setup.outputs.starting_sha }}
sha: ${{ inputs.sha }}
version_number: ${{ inputs.version_number }}
target_branch: ${{ inputs.target_branch }}
env_setup_script_path: "scripts/env-setup.sh"
env_setup_script_path: ${{ inputs.env_setup_script_path }}
test_run: ${{ inputs.test_run }}
nightly_release: ${{ inputs.nightly_release }}
@@ -121,7 +147,7 @@ jobs:
log-outputs-bump-version-generate-changelog:
name: "[Log output] Bump package version, Generate changelog"
if: ${{ !failure() && !cancelled() && !inputs.only_docker }}
if: ${{ !failure() && !cancelled() }}
needs: [bump-version-generate-changelog]
@@ -135,8 +161,8 @@ jobs:
build-test-package:
name: Build, Test, Package
if: ${{ !failure() && !cancelled() && !inputs.only_docker }}
needs: [job-setup, bump-version-generate-changelog]
if: ${{ !failure() && !cancelled() }}
needs: [bump-version-generate-changelog]
uses: dbt-labs/dbt-release/.github/workflows/build.yml@main
@@ -144,9 +170,9 @@ jobs:
sha: ${{ needs.bump-version-generate-changelog.outputs.final_sha }}
version_number: ${{ inputs.version_number }}
changelog_path: ${{ needs.bump-version-generate-changelog.outputs.changelog_path }}
build_script_path: "scripts/build-dist.sh"
s3_bucket_name: "core-team-artifacts"
package_test_command: "dbt --version"
build_script_path: ${{ inputs.build_script_path }}
s3_bucket_name: ${{ inputs.s3_bucket_name }}
package_test_command: ${{ inputs.package_test_command }}
test_run: ${{ inputs.test_run }}
nightly_release: ${{ inputs.nightly_release }}
@@ -156,7 +182,7 @@ jobs:
github-release:
name: GitHub Release
if: ${{ !failure() && !cancelled() && !inputs.only_docker }}
if: ${{ !failure() && !cancelled() }}
needs: [bump-version-generate-changelog, build-test-package]
@@ -183,51 +209,6 @@ jobs:
PYPI_API_TOKEN: ${{ secrets.PYPI_API_TOKEN }}
TEST_PYPI_API_TOKEN: ${{ secrets.TEST_PYPI_API_TOKEN }}
determine-docker-package:
# dbt-postgres exists within dbt-core for versions 1.7 and earlier but is a separate package for 1.8 and later.
# determine if we need to release dbt-core or both dbt-core and dbt-postgres
name: Determine Docker Package
if: ${{ !failure() && !cancelled() }}
runs-on: ubuntu-latest
needs: [pypi-release]
outputs:
matrix: ${{ steps.determine-docker-package.outputs.matrix }}
steps:
- name: "Audit Version And Parse Into Parts"
id: semver
uses: dbt-labs/actions/parse-semver@v1.1.0
with:
version: ${{ inputs.version_number }}
- name: "Determine Packages to Release"
id: determine-docker-package
run: |
if [ ${{ steps.semver.outputs.minor }} -ge 8 ]; then
json_output={\"package\":[\"dbt-core\"]}
else
json_output={\"package\":[\"dbt-core\",\"dbt-postgres\"]}
fi
echo "matrix=$json_output" >> $GITHUB_OUTPUT
docker-release:
name: "Docker Release for ${{ matrix.package }}"
needs: [determine-docker-package]
# We cannot release to docker on a test run because it uses the tag in GitHub as
# what we need to release but draft releases don't actually tag the commit so it
# finds nothing to release
if: ${{ !failure() && !cancelled() && (!inputs.test_run || inputs.only_docker) }}
strategy:
matrix: ${{fromJson(needs.determine-docker-package.outputs.matrix)}}
permissions:
packages: write
uses: dbt-labs/dbt-release/.github/workflows/release-docker.yml@main
with:
package: ${{ matrix.package }}
version_number: ${{ inputs.version_number }}
test_run: ${{ inputs.test_run }}
slack-notification:
name: Slack Notification
if: ${{ failure() && (!inputs.test_run || inputs.nightly_release) }}
@@ -238,7 +219,6 @@ jobs:
build-test-package,
github-release,
pypi-release,
docker-release,
]
uses: dbt-labs/dbt-release/.github/workflows/slack-post-notification.yml@main
@@ -247,24 +227,3 @@ jobs:
secrets:
SLACK_WEBHOOK_URL: ${{ secrets.SLACK_DEV_CORE_ALERTS }}
testing-slack-notification:
# sends notifications to #slackbot-test
name: Testing - Slack Notification
if: ${{ failure() && inputs.test_run && !inputs.nightly_release }}
needs:
[
bump-version-generate-changelog,
build-test-package,
github-release,
pypi-release,
docker-release,
]
uses: dbt-labs/dbt-release/.github/workflows/slack-post-notification.yml@main
with:
status: "failure"
secrets:
SLACK_WEBHOOK_URL: ${{ secrets.SLACK_TESTING_WEBHOOK_URL }}

View File

@@ -1,30 +0,0 @@
# **what?**
# Cleanup branches left over from automation and testing. Also cleanup
# draft releases from release testing.
# **why?**
# The automations are leaving behind branches and releases that clutter
# the repository. Sometimes we need them to debug processes so we don't
# want them immediately deleted. Running on Saturday to avoid running
# at the same time as an actual release to prevent breaking a release
# mid-release.
# **when?**
# Mainly on a schedule of 12:00 Saturday.
# Manual trigger can also run on demand
name: Repository Cleanup
on:
schedule:
- cron: '0 12 * * SAT' # At 12:00 on Saturday - details in `why` above
workflow_dispatch: # for manual triggering
permissions:
contents: write
jobs:
cleanup-repo:
uses: dbt-labs/actions/.github/workflows/repository-cleanup.yml@main
secrets: inherit

View File

@@ -13,63 +13,48 @@
name: Artifact Schema Check
on:
pull_request:
types: [ opened, reopened, labeled, unlabeled, synchronize ]
paths-ignore: [ '.changes/**', '.github/**', 'tests/**', '**.md', '**.yml' ]
workflow_dispatch:
pull_request: #TODO: remove before merging
push:
branches:
- "develop"
- "*.latest"
- "releases/*"
# no special access is needed
permissions: read-all
env:
LATEST_SCHEMA_PATH: ${{ github.workspace }}/new_schemas
SCHEMA_DIFF_ARTIFACT: ${{ github.workspace }}/schema_changes.txt
SCHEMA_DIFF_ARTIFACT: ${{ github.workspace }}//schema_schanges.txt
DBT_REPO_DIRECTORY: ${{ github.workspace }}/dbt
SCHEMA_REPO_DIRECTORY: ${{ github.workspace }}/schemas.getdbt.com
jobs:
checking-schemas:
name: "Post-merge schema changes required"
name: "Checking schemas"
runs-on: ubuntu-latest
steps:
- name: Set up Python
uses: actions/setup-python@v5
uses: actions/setup-python@v4
with:
python-version: 3.9
python-version: 3.8
- name: Checkout dbt repo
uses: actions/checkout@v4
uses: actions/checkout@v3
with:
path: ${{ env.DBT_REPO_DIRECTORY }}
- name: Check for changes in core/dbt/artifacts
# https://github.com/marketplace/actions/paths-changes-filter
uses: dorny/paths-filter@v3
id: check_artifact_changes
with:
filters: |
artifacts_changed:
- 'core/dbt/artifacts/**'
list-files: shell
working-directory: ${{ env.DBT_REPO_DIRECTORY }}
- name: Succeed if no artifacts have changed
if: steps.check_artifact_changes.outputs.artifacts_changed == 'false'
run: |
echo "No artifact changes found in core/dbt/artifacts. CI check passed."
- name: Checkout schemas.getdbt.com repo
if: steps.check_artifact_changes.outputs.artifacts_changed == 'true'
uses: actions/checkout@v4
uses: actions/checkout@v3
with:
repository: dbt-labs/schemas.getdbt.com
ref: 'main'
ssh-key: ${{ secrets.SCHEMA_SSH_PRIVATE_KEY }}
path: ${{ env.SCHEMA_REPO_DIRECTORY }}
- name: Generate current schema
if: steps.check_artifact_changes.outputs.artifacts_changed == 'true'
run: |
cd ${{ env.DBT_REPO_DIRECTORY }}
python3 -m venv env
@@ -80,17 +65,26 @@ jobs:
# Copy generated schema files into the schemas.getdbt.com repo
# Do a git diff to find any changes
# Ignore any lines with date-like (yyyy-mm-dd) or version-like (x.y.z) changes
# Ignore any date or version changes though
- name: Compare schemas
if: steps.check_artifact_changes.outputs.artifacts_changed == 'true'
run: |
cp -r ${{ env.LATEST_SCHEMA_PATH }}/dbt ${{ env.SCHEMA_REPO_DIRECTORY }}
cd ${{ env.SCHEMA_REPO_DIRECTORY }}
git diff -I='*[0-9]{4}-[0-9]{2}-[0-9]{2}' -I='*[0-9]+\.[0-9]+\.[0-9]+' --exit-code > ${{ env.SCHEMA_DIFF_ARTIFACT }}
diff_results=$(git diff -I='*[0-9]{4}-(0[1-9]|1[0-2])-(0[1-9]|[1-2][0-9]|3[0-1])T' \
-I='*[0-9]{1}.[0-9]{2}.[0-9]{1}(rc[0-9]|b[0-9]| )' --compact-summary)
if [[ -n "$diff_results" ]]; then
echo "$diff_results"
echo "Schema changes detected!"
git diff -I='*[0-9]{4}-(0[1-9]|1[0-2])-(0[1-9]|[1-2][0-9]|3[0-1])T' \
-I='*[0-9]{1}.[0-9]{2}.[0-9]{1}(rc[0-9]|b[0-9]| )' > ${{ env.SCHEMA_DIFF_ARTIFACT }}
exit 1
else
echo "No schema changes detected"
fi
- name: Upload schema diff
uses: actions/upload-artifact@v4
if: ${{ failure() && steps.check_artifact_changes.outputs.artifacts_changed == 'true' }}
uses: actions/upload-artifact@v3
if: ${{ failure() }}
with:
name: 'schema_changes.txt'
name: 'schema_schanges.txt'
path: '${{ env.SCHEMA_DIFF_ARTIFACT }}'

View File

@@ -18,41 +18,11 @@ on:
permissions: read-all
# top-level adjustments can be made here
env:
# number of parallel processes to spawn for python testing
PYTHON_INTEGRATION_TEST_WORKERS: 5
jobs:
integration-metadata:
name: integration test metadata generation
runs-on: ubuntu-latest
outputs:
split-groups: ${{ steps.generate-split-groups.outputs.split-groups }}
steps:
- name: generate split-groups
id: generate-split-groups
run: |
MATRIX_JSON="["
for B in $(seq 1 ${{ env.PYTHON_INTEGRATION_TEST_WORKERS }}); do
MATRIX_JSON+=$(sed 's/^/"/;s/$/"/' <<< "${B}")
done
MATRIX_JSON="${MATRIX_JSON//\"\"/\", \"}"
MATRIX_JSON+="]"
echo "split-groups=${MATRIX_JSON}" >> $GITHUB_OUTPUT
# run the performance measurements on the current or default branch
test-schema:
name: Test Log Schema
runs-on: ubuntu-20.04
timeout-minutes: 30
needs:
- integration-metadata
strategy:
fail-fast: false
matrix:
split-group: ${{ fromJson(needs.integration-metadata.outputs.split-groups) }}
env:
# turns warnings into errors
RUSTFLAGS: "-D warnings"
@@ -69,14 +39,14 @@ jobs:
steps:
- name: checkout dev
uses: actions/checkout@v4
uses: actions/checkout@v3
with:
persist-credentials: false
- name: Setup Python
uses: actions/setup-python@v5
uses: actions/setup-python@v4
with:
python-version: "3.9"
python-version: "3.8"
- name: Install python dependencies
run: |
@@ -94,19 +64,4 @@ jobs:
# integration tests generate a ton of logs in different files. the next step will find them all.
# we actually care if these pass, because the normal test run doesn't usually include many json log outputs
- name: Run integration tests
uses: nick-fields/retry@v3
with:
timeout_minutes: 30
max_attempts: 3
command: tox -e integration -- -nauto
env:
PYTEST_ADDOPTS: ${{ format('--splits {0} --group {1}', env.PYTHON_INTEGRATION_TEST_WORKERS, matrix.split-group) }}
test-schema-report:
name: Log Schema Test Suite
runs-on: ubuntu-latest
needs: test-schema
steps:
- name: "[Notification] Log test suite passes"
run: |
echo "::notice title="Log test suite passes""
run: tox -e integration -- -nauto

View File

@@ -27,6 +27,7 @@ on:
description: 'Version of Python to Test Against'
type: choice
options:
- '3.8'
- '3.9'
- '3.10'
- '3.11'
@@ -35,7 +36,7 @@ on:
type: choice
options:
- 'ubuntu-latest'
- 'macos-14'
- 'macos-latest'
- 'windows-latest'
num_runs_per_batch:
description: 'Max number of times to run the test per batch. We always run 10 batches.'
@@ -82,12 +83,12 @@ jobs:
steps:
- name: "Checkout code"
uses: actions/checkout@v4
uses: actions/checkout@v3
with:
ref: ${{ inputs.branch }}
- name: "Setup Python"
uses: actions/setup-python@v5
uses: actions/setup-python@v4
with:
python-version: "${{ inputs.python_version }}"
@@ -100,7 +101,7 @@ jobs:
# mac and windows don't use make due to limitations with docker with those runners in GitHub
- name: "Set up postgres (macos)"
if: inputs.os == 'macos-14'
if: inputs.os == 'macos-latest'
uses: ./.github/actions/setup-postgres-macos
- name: "Set up postgres (windows)"

.github/workflows/test/.actrc vendored Normal file
View File

@@ -0,0 +1 @@
-P ubuntu-latest=ghcr.io/catthehacker/ubuntu:act-latest

.github/workflows/test/.gitignore vendored Normal file
View File

@@ -0,0 +1 @@
.secrets

View File

@@ -0,0 +1 @@
GITHUB_TOKEN=GH_PERSONAL_ACCESS_TOKEN_GOES_HERE

View File

@@ -0,0 +1,6 @@
{
"inputs": {
"version_number": "1.0.1",
"package": "dbt-postgres"
}
}

.github/workflows/version-bump.yml vendored Normal file
View File

@@ -0,0 +1,28 @@
# **what?**
# This workflow will take the new version number to bump to. With that
# it will run versionbump to update the version number everywhere in the
# code base and then run changie to create the corresponding changelog.
# A PR will be created with the changes that can be reviewed before committing.
# **why?**
# This is to aid in releasing dbt and making sure we have updated
# the version in all places and generated the changelog.
# **when?**
# This is triggered manually
name: Version Bump
on:
workflow_dispatch:
inputs:
version_number:
description: 'The version number to bump to (ex. 1.2.0, 1.3.0b1)'
required: true
jobs:
version_bump_and_changie:
uses: dbt-labs/actions/.github/workflows/version-bump.yml@main
with:
version_number: ${{ inputs.version_number }}
secrets: inherit # ok since what we are calling is internally maintained

.gitignore vendored
View File

@@ -57,9 +57,6 @@ test.env
makefile.test.env
*.pytest_cache/
# Unit test artifacts
index.html
# Translations
*.mo
@@ -108,6 +105,3 @@ venv/
# poetry
poetry.lock
# asdf
.tool-versions

View File

@@ -1,4 +0,0 @@
[settings]
profile=black
extend_skip_glob=.github/*,third-party-stubs/*,scripts/*
known_first_party=dbt,dbt_adapters,dbt_common,dbt_extractor,dbt_semantic_interfaces

View File

@@ -1,9 +1,9 @@
# Configuration for pre-commit hooks (see https://pre-commit.com/).
# Eventually the hooks described here will be run as tests before merging each PR.
exclude: ^(core/dbt/docs/build/|core/dbt/common/events/types_pb2.py|core/dbt/events/core_types_pb2.py|core/dbt/adapters/events/adapter_types_pb2.py)
exclude: ^(core/dbt/docs/build/|core/dbt/events/types_pb2.py)
# Force all unspecified python hooks to run python 3.9
# Force all unspecified python hooks to run python 3.8
default_language_version:
python: python3
@@ -15,19 +15,12 @@ repos:
args: [--unsafe]
- id: check-json
- id: end-of-file-fixer
exclude: schemas/dbt/manifest/
- id: trailing-whitespace
exclude_types:
- "markdown"
- id: check-case-conflict
- repo: https://github.com/pycqa/isort
# rev must match what's in dev-requirements.txt
rev: 5.13.2
hooks:
- id: isort
- repo: https://github.com/psf/black
# rev must match what's in dev-requirements.txt
rev: 24.3.0
rev: 22.3.0
hooks:
- id: black
- id: black
@@ -37,7 +30,6 @@ repos:
- "--check"
- "--diff"
- repo: https://github.com/pycqa/flake8
# rev must match what's in dev-requirements.txt
rev: 4.0.1
hooks:
- id: flake8
@@ -45,8 +37,7 @@ repos:
alias: flake8-check
stages: [manual]
- repo: https://github.com/pre-commit/mirrors-mypy
# rev must match what's in dev-requirements.txt
rev: v1.4.1
rev: v1.3.0
hooks:
- id: mypy
# N.B.: Mypy is... a bit fragile.

View File

@@ -26,13 +26,12 @@ Legacy tests are found in the 'test' directory:
The "tasks" map to top-level dbt commands. So `dbt run` => task.run.RunTask, etc. Some are more like abstract base classes (GraphRunnableTask, for example), but all the concrete types outside of task should map to tasks. Currently only one task executes at a time. Each task kicks off its "Runners", and those do execute in parallel; the parallelism is managed via a thread pool in GraphRunnableTask.
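As a rough, illustrative sketch of that task/runner split (not dbt-core source; the class and method names below are assumptions chosen purely for illustration):

```python
# Illustrative only: one task executes at a time, and it fans its runners out
# across a thread pool, mirroring the GraphRunnableTask behavior described above.
from concurrent.futures import ThreadPoolExecutor


class ModelRunner:
    def __init__(self, node: str) -> None:
        self.node = node

    def run(self) -> str:
        # compile and execute a single node against the warehouse
        return f"ran {self.node}"


class RunTask:  # analogous to `dbt run` => task.run.RunTask
    def __init__(self, nodes: list, threads: int = 4) -> None:
        self.nodes = nodes
        self.threads = threads

    def execute(self) -> list:
        # the task itself is serial; its runners execute in parallel
        with ThreadPoolExecutor(max_workers=self.threads) as pool:
            return list(pool.map(lambda node: ModelRunner(node).run(), self.nodes))


if __name__ == "__main__":
    print(RunTask(["model_a", "model_b"]).execute())
```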
core/dbt/task/docs/index.html
core/dbt/include/index.html
This is the docs website code. It comes from the dbt-docs repository, and is generated when a release is packaged.
## Adapters
dbt uses an adapter-plugin pattern to extend support to different databases, warehouses, query engines, etc.
Note: dbt-postgres used to exist in dbt-core but is now in [a separate repo](https://github.com/dbt-labs/dbt-adapters/dbt-postgres)
dbt uses an adapter-plugin pattern to extend support to different databases, warehouses, query engines, etc. For testing and development purposes, the dbt-postgres plugin lives alongside the dbt-core codebase, in the [`plugins`](plugins) subdirectory. Like other adapter plugins, it is a self-contained codebase and package that builds on top of dbt-core.
Each adapter is a mix of python, Jinja2, and SQL. The adapter code also makes heavy use of Jinja2 to wrap modular chunks of SQL functionality, define default implementations, and allow plugins to override it.

View File

@@ -1,6 +1,6 @@
# dbt Core Changelog
- This file provides a full account of all changes to `dbt-core`
- This file provides a full account of all changes to `dbt-core` and `dbt-postgres`
- Changes are listed under the (pre)release in which they first appear. Subsequent releases include changes from previous releases.
- "Breaking changes" listed under a version may require action from end users or external maintainers when upgrading to that version.
- Do not edit this file directly. This file is auto-generated using [changie](https://github.com/miniscruff/changie). For details on how to document a change, see [the contributing guide](https://github.com/dbt-labs/dbt-core/blob/main/CONTRIBUTING.md#adding-changelog-entry)
@@ -10,9 +10,6 @@
For information on prior major and minor releases, see their changelogs:
* [1.9](https://github.com/dbt-labs/dbt-core/blob/1.9.latest/CHANGELOG.md)
* [1.8](https://github.com/dbt-labs/dbt-core/blob/1.8.latest/CHANGELOG.md)
* [1.7](https://github.com/dbt-labs/dbt-core/blob/1.7.latest/CHANGELOG.md)
* [1.6](https://github.com/dbt-labs/dbt-core/blob/1.6.latest/CHANGELOG.md)
* [1.5](https://github.com/dbt-labs/dbt-core/blob/1.5.latest/CHANGELOG.md)
* [1.4](https://github.com/dbt-labs/dbt-core/blob/1.4.latest/CHANGELOG.md)

View File

@@ -10,7 +10,6 @@
6. [Debugging](#debugging)
7. [Adding or modifying a changelog entry](#adding-or-modifying-a-changelog-entry)
8. [Submitting a Pull Request](#submitting-a-pull-request)
9. [Troubleshooting Tips](#troubleshooting-tips)
## About this document
@@ -22,10 +21,10 @@ If you get stuck, we're happy to help! Drop us a line in the `#dbt-core-developm
### Notes
- **Adapters:** Is your issue or proposed code change related to a specific [database adapter](https://docs.getdbt.com/docs/available-adapters)? If so, please open issues, PRs, and discussions in that adapter's repository instead.
- **Adapters:** Is your issue or proposed code change related to a specific [database adapter](https://docs.getdbt.com/docs/available-adapters)? If so, please open issues, PRs, and discussions in that adapter's repository instead. The sole exception is Postgres; the `dbt-postgres` plugin lives in this repository (`dbt-core`).
- **CLA:** Please note that anyone contributing code to `dbt-core` must sign the [Contributor License Agreement](https://docs.getdbt.com/docs/contributor-license-agreements). If you are unable to sign the CLA, the `dbt-core` maintainers will unfortunately be unable to merge any of your Pull Requests. We welcome you to participate in discussions, open issues, and comment on existing ones.
- **Branches:** All pull requests from community contributors should target the `main` branch (default). If the change is needed as a patch for a minor version of dbt that has already been released (or is already a release candidate), a maintainer will backport the changes in your PR to the relevant "latest" release branch (`1.0.latest`, `1.1.latest`, ...). If an issue fix applies to a release branch, that fix should be first committed to the development branch and then to the release branch (rarely release-branch fixes may not apply to `main`).
- **Releases**: Before releasing a new minor version of Core, we prepare a series of alphas and release candidates to allow users (especially employees of dbt Labs!) to test the new version in live environments. This is an important quality assurance step, as it exposes the new code to a wide variety of complicated deployments and can surface bugs before official release. Releases are accessible via our [supported installation methods](https://docs.getdbt.com/docs/core/installation-overview#install-dbt-core).
- **Releases**: Before releasing a new minor version of Core, we prepare a series of alphas and release candidates to allow users (especially employees of dbt Labs!) to test the new version in live environments. This is an important quality assurance step, as it exposes the new code to a wide variety of complicated deployments and can surface bugs before official release. Releases are accessible via pip, homebrew, and dbt Cloud.
## Getting the code
@@ -45,7 +44,9 @@ If you are not a member of the `dbt-labs` GitHub organization, you can contribut
### dbt Labs contributors
If you are a member of the `dbt-labs` GitHub organization, you will have push access to the `dbt-core` repo. Rather than forking `dbt-core` to make your changes, just clone the repository, check out a new branch, and push directly to that branch.
If you are a member of the `dbt-labs` GitHub organization, you will have push access to the `dbt-core` repo. Rather than forking `dbt-core` to make your changes, just clone the repository, check out a new branch, and push directly to that branch. Branch names should be prefixed with `CT-XXX/` where:
* CT stands for 'core team'
* XXX stands for a JIRA ticket number
## Setting up an environment
@@ -170,9 +171,9 @@ Finally, you can also run a specific test or group of tests using [`pytest`](htt
```sh
# run all unit tests in a file
python3 -m pytest tests/unit/test_invocation_id.py
python3 -m pytest tests/unit/test_graph.py
# run a specific unit test
python3 -m pytest tests/unit/test_invocation_id.py::TestInvocationId::test_invocation_id
python3 -m pytest tests/unit/test_graph.py::GraphTest::test__dependency_list
# run specific Postgres functional tests
python3 -m pytest tests/functional/sources
```
@@ -220,12 +221,10 @@ You don't need to worry about which `dbt-core` version your change will go into.
## Submitting a Pull Request
Code can be merged into the current development branch `main` by opening a pull request. If the proposal looks like it's on the right track, then a `dbt-core` maintainer will triage the PR and label it as `ready_for_review`. From this point, two code reviewers will be assigned with the aim of responding to any updates to the PR within about one week. They may suggest code revision for style or clarity, or request that you add unit or integration test(s). These are good things! We believe that, with a little bit of help, anyone can contribute high-quality code. Once merged, your contribution will be available for the next release of `dbt-core`.
Code can be merged into the current development branch `main` by opening a pull request. A `dbt-core` maintainer will review your PR. They may suggest code revision for style or clarity, or request that you add unit or integration test(s). These are good things! We believe that, with a little bit of help, anyone can contribute high-quality code.
Automated tests run via GitHub Actions. If you're a first-time contributor, all tests (including code checks and unit tests) will require a maintainer to approve. Changes in the `dbt-core` repository trigger integration tests against Postgres. dbt Labs also provides CI environments in which to test changes to other adapters, triggered by PRs in those adapters' repositories, as well as periodic maintenance checks of each adapter in concert with the latest `dbt-core` code changes.
Once all tests are passing and your PR has been approved, a `dbt-core` maintainer will merge your changes into the active development branch. And that's it! Happy developing :tada:
## Troubleshooting Tips
Sometimes, the Contributor License Agreement (CLA) auto-check bot doesn't find a user's entry in its roster. If you need to force a rerun, add `@cla-bot check` in a comment on the pull request.

View File

@@ -33,6 +33,9 @@ RUN apt-get update \
python-is-python3 \
python-dev-is-python3 \
python3-pip \
python3.8 \
python3.8-dev \
python3.8-venv \
python3.9 \
python3.9-dev \
python3.9-venv \

View File

@@ -30,22 +30,17 @@ CI_FLAGS =\
.PHONY: dev_req
dev_req: ## Installs dbt-* packages in develop mode along with only development dependencies.
@\
pip install -r dev-requirements.txt -r editable-requirements.txt
pip install -r dev-requirements.txt
pip install -r editable-requirements.txt
.PHONY: dev
dev: dev_req ## Installs dbt-* packages in develop mode along with development dependencies and pre-commit.
@\
pre-commit install
.PHONY: dev-uninstall
dev-uninstall: ## Uninstall all packages in venv except for build tools
@\
pip freeze | grep -v "^-e" | cut -d "@" -f1 | xargs pip uninstall -y; \
pip uninstall -y dbt-core
.PHONY: core_proto_types
core_proto_types: ## generates google protobuf python file from core_types.proto
protoc -I=./core/dbt/events --python_out=./core/dbt/events ./core/dbt/events/core_types.proto
.PHONY: proto_types
proto_types: ## generates google protobuf python file from types.proto
protoc -I=./core/dbt/events --python_out=./core/dbt/events ./core/dbt/events/types.proto
.PHONY: mypy
mypy: .env ## Runs mypy against staged changes for static type checking.
@@ -82,12 +77,12 @@ test: .env ## Runs unit tests with py and code checks against staged changes.
$(DOCKER_CMD) pre-commit run mypy-check --hook-stage manual | grep -v "INFO"
.PHONY: integration
integration: .env ## Runs core integration tests using postgres with py-integration
integration: .env ## Runs postgres integration tests with py-integration
@\
$(CI_FLAGS) $(DOCKER_CMD) tox -e py-integration -- -nauto
.PHONY: integration-fail-fast
integration-fail-fast: .env ## Runs core integration tests using postgres with py-integration in "fail fast" mode.
integration-fail-fast: .env ## Runs postgres integration tests with py-integration in "fail fast" mode.
@\
$(DOCKER_CMD) tox -e py-integration -- -x -nauto
@@ -144,7 +139,3 @@ help: ## Show this help message.
@echo
@echo 'options:'
@echo 'use USE_DOCKER=true to run target in a docker container'
.PHONY: json_schema
json_schema: ## Update generated JSON schema using code changes.
scripts/collect-artifact-schema.py --path schemas

View File

@@ -21,7 +21,7 @@ These select statements, or "models", form a dbt project. Models frequently buil
## Getting started
- [Install dbt Core](https://docs.getdbt.com/docs/get-started/installation) or explore the [dbt Cloud CLI](https://docs.getdbt.com/docs/cloud/cloud-cli-installation), a command-line interface powered by [dbt Cloud](https://docs.getdbt.com/docs/cloud/about-cloud/dbt-cloud-features) that enhances collaboration.
- [Install dbt](https://docs.getdbt.com/docs/get-started/installation)
- Read the [introduction](https://docs.getdbt.com/docs/introduction/) and [viewpoint](https://docs.getdbt.com/docs/about/viewpoint/)
## Join the dbt Community
@@ -31,7 +31,7 @@ These select statements, or "models", form a dbt project. Models frequently buil
## Reporting bugs and contributing code
- Want to report a bug or request a feature? Let us know and open [an issue](https://github.com/dbt-labs/dbt-core/issues/new/choose)
- Want to report a bug or request a feature? Let us know on [Slack](http://community.getdbt.com/), or open [an issue](https://github.com/dbt-labs/dbt-core/issues/new)
- Want to help us build dbt? Check out the [Contributing Guide](https://github.com/dbt-labs/dbt-core/blob/HEAD/CONTRIBUTING.md)
## Code of Conduct

View File

@@ -1 +0,0 @@
[About dbt Core versions](https://docs.getdbt.com/docs/dbt-versions/core)

View File

@@ -1,39 +0,0 @@
ignore:
- ".github"
- ".changes"
coverage:
status:
project:
default:
target: auto
threshold: 0.1% # Reduce noise by ignoring rounding errors in coverage drops
patch:
default:
target: auto
threshold: 80%
comment:
layout: "header, diff, flags, components" # show component info in the PR comment
component_management:
default_rules: # default rules that will be inherited by all components
statuses:
- type: project # in this case every component that doesn't have a status defined will have a project type one
target: auto
threshold: 0.1%
- type: patch
target: 80%
individual_components:
- component_id: unittests
name: "Unit Tests"
flag_regexes:
- "unit"
statuses:
- type: patch
target: 80%
threshold: 5%
- component_id: integrationtests
name: "Integration Tests"
flag_regexes:
- "integration"

View File

@@ -1,3 +1,2 @@
recursive-include dbt/include *.py *.sql *.yml *.html *.md .gitkeep .gitignore
include dbt/py.typed
recursive-include dbt/task/docs *.html

View File

@@ -22,6 +22,8 @@
### links.py
### logger.py
### main.py
### node_types.py

View File

@@ -0,0 +1,30 @@
# Adapters README
The Adapters module is responsible for defining database connection methods, caching information from databases, defining how relations are represented, and providing the two major connection types we have: base and sql.
# Directories
## `base`
Defines the base implementation Adapters can use to build out full functionality.
## `sql`
Defines a SQL implementation that inherits from the base implementation above and comes with premade methods and macros that individual adapters can override as needed. (This is the most common type of adapter.)
# Files
## `cache.py`
Caches information retrieved from the database.
## `factory.py`
Defines how we generate adapter objects
## `protocol.py`
Defines various interfaces for various adapter objects. Helps mypy correctly resolve methods.
## `reference_keys.py`
Configures naming scheme for cache elements to be universal.
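As a hedged illustration of the `factory.py` responsibility described above, here is a minimal usage sketch; `register_adapter` and `get_adapter` are assumed helper names whose exact signatures may differ between versions:

```python
# Sketch only: how callers typically obtain an adapter object via the factory.
from dbt.adapters.factory import get_adapter, register_adapter  # assumed helpers


def acquire_adapter(runtime_config):
    # register_adapter looks up the plugin named by the project's credentials
    # (e.g. "postgres") and caches an adapter instance keyed by its type
    register_adapter(runtime_config)
    # get_adapter returns that cached instance so tasks can open connections,
    # run queries, and populate the relation cache
    return get_adapter(runtime_config)
```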

View File

@@ -0,0 +1,7 @@
# N.B.
# This will add to the packages __path__ all subdirectories of directories on sys.path named after the package which effectively combines both modules into a single namespace (dbt.adapters)
# The matching statement is in plugins/postgres/dbt/adapters/__init__.py
from pkgutil import extend_path
__path__ = extend_path(__path__, __name__)
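For reference, a sketch of the matching statement the comment points to, assuming the plugin side mirrors these same two lines:

```python
# plugins/postgres/dbt/adapters/__init__.py (sketch): the same pkgutil hook,
# so both distributions contribute modules to the shared dbt.adapters namespace
from pkgutil import extend_path

__path__ = extend_path(__path__, __name__)
```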

View File

@@ -0,0 +1,10 @@
## Base adapters
### impl.py
The class `BaseAdapter` in [base/impl.py](https://github.com/dbt-labs/dbt-core/blob/main/core/dbt/adapters/base/impl.py) is a (mostly) abstract class that adapter implementations inherit from. The base class scaffolds out the methods that every adapter project usually should implement for smooth communication between dbt and the database.
Some target databases require more or fewer methods, depending on the warehouse's feature set.
Look into the class for function-level comments.
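A minimal, hedged sketch of what inheriting from this base class tends to look like in an adapter plugin; `MyWarehouseAdapter`, `MyWarehouseConnectionManager`, and the specific overrides shown are illustrative assumptions rather than the full abstract surface:

```python
# Sketch only: a hypothetical adapter building on the base implementation.
from dbt.adapters.base import BaseAdapter, BaseConnectionManager, Column, available


class MyWarehouseConnectionManager(BaseConnectionManager):
    TYPE = "mywarehouse"
    # open/cancel/exception-handling methods would be implemented here


class MyWarehouseAdapter(BaseAdapter):
    ConnectionManager = MyWarehouseConnectionManager
    Column = Column

    @classmethod
    def date_function(cls) -> str:
        # assumed hook: the SQL expression this warehouse uses for "now"
        return "current_timestamp()"

    @available
    def render_relation(self, relation) -> str:
        # @available exposes the method to Jinja macros shipped with the adapter
        return relation.render()
```

The base class defines many more hooks than shown here (type conversions, relation caching, and so on); the point of the sketch is only the inheritance pattern.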

View File

@@ -0,0 +1,19 @@
# these are all just exports, #noqa them so flake8 will be happy
# TODO: Should we still include this in the `adapters` namespace?
from dbt.contracts.connection import Credentials # noqa: F401
from dbt.adapters.base.meta import available # noqa: F401
from dbt.adapters.base.connections import BaseConnectionManager # noqa: F401
from dbt.adapters.base.relation import ( # noqa: F401
BaseRelation,
RelationType,
SchemaSearchMap,
)
from dbt.adapters.base.column import Column # noqa: F401
from dbt.adapters.base.impl import ( # noqa: F401
AdapterConfig,
BaseAdapter,
PythonJobHelper,
ConstraintSupport,
)
from dbt.adapters.base.plugin import AdapterPlugin # noqa: F401

View File

@@ -0,0 +1,161 @@
from dataclasses import dataclass
import re
from typing import Dict, ClassVar, Any, Optional
from dbt.exceptions import DbtRuntimeError
@dataclass
class Column:
TYPE_LABELS: ClassVar[Dict[str, str]] = {
"STRING": "TEXT",
"TIMESTAMP": "TIMESTAMP",
"FLOAT": "FLOAT",
"INTEGER": "INT",
"BOOLEAN": "BOOLEAN",
}
column: str
dtype: str
char_size: Optional[int] = None
numeric_precision: Optional[Any] = None
numeric_scale: Optional[Any] = None
@classmethod
def translate_type(cls, dtype: str) -> str:
return cls.TYPE_LABELS.get(dtype.upper(), dtype)
@classmethod
def create(cls, name, label_or_dtype: str) -> "Column":
column_type = cls.translate_type(label_or_dtype)
return cls(name, column_type)
@property
def name(self) -> str:
return self.column
@property
def quoted(self) -> str:
return '"{}"'.format(self.column)
@property
def data_type(self) -> str:
if self.is_string():
return self.string_type(self.string_size())
elif self.is_numeric():
return self.numeric_type(self.dtype, self.numeric_precision, self.numeric_scale)
else:
return self.dtype
def is_string(self) -> bool:
return self.dtype.lower() in ["text", "character varying", "character", "varchar"]
def is_number(self):
return any([self.is_integer(), self.is_numeric(), self.is_float()])
def is_float(self):
return self.dtype.lower() in [
# floats
"real",
"float4",
"float",
"double precision",
"float8",
"double",
]
def is_integer(self) -> bool:
return self.dtype.lower() in [
# real types
"smallint",
"integer",
"bigint",
"smallserial",
"serial",
"bigserial",
# aliases
"int2",
"int4",
"int8",
"serial2",
"serial4",
"serial8",
]
def is_numeric(self) -> bool:
return self.dtype.lower() in ["numeric", "decimal"]
def string_size(self) -> int:
if not self.is_string():
raise DbtRuntimeError("Called string_size() on non-string field!")
if self.dtype == "text" or self.char_size is None:
# char_size should never be None. Handle it reasonably just in case
return 256
else:
return int(self.char_size)
def can_expand_to(self, other_column: "Column") -> bool:
"""returns True if this column can be expanded to the size of the
other column"""
if not self.is_string() or not other_column.is_string():
return False
return other_column.string_size() > self.string_size()
def literal(self, value: Any) -> str:
return "{}::{}".format(value, self.data_type)
@classmethod
def string_type(cls, size: int) -> str:
return "character varying({})".format(size)
@classmethod
def numeric_type(cls, dtype: str, precision: Any, scale: Any) -> str:
# This could be decimal(...), numeric(...), number(...)
# Just use whatever was fed in here -- don't try to get too clever
if precision is None or scale is None:
return dtype
else:
return "{}({},{})".format(dtype, precision, scale)
def __repr__(self) -> str:
return "<Column {} ({})>".format(self.name, self.data_type)
@classmethod
def from_description(cls, name: str, raw_data_type: str) -> "Column":
match = re.match(r"([^(]+)(\([^)]+\))?", raw_data_type)
if match is None:
raise DbtRuntimeError(f'Could not interpret data type "{raw_data_type}"')
data_type, size_info = match.groups()
char_size = None
numeric_precision = None
numeric_scale = None
if size_info is not None:
# strip out the parentheses
size_info = size_info[1:-1]
parts = size_info.split(",")
if len(parts) == 1:
try:
char_size = int(parts[0])
except ValueError:
raise DbtRuntimeError(
f'Could not interpret data_type "{raw_data_type}": '
f'could not convert "{parts[0]}" to an integer'
)
elif len(parts) == 2:
try:
numeric_precision = int(parts[0])
except ValueError:
raise DbtRuntimeError(
f'Could not interpret data_type "{raw_data_type}": '
f'could not convert "{parts[0]}" to an integer'
)
try:
numeric_scale = int(parts[1])
except ValueError:
raise DbtRuntimeError(
f'Could not interpret data_type "{raw_data_type}": '
f'could not convert "{parts[1]}" to an integer'
)
return cls(name, data_type, char_size, numeric_precision, numeric_scale)
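A short usage sketch of the `Column` helper above (illustrative values; assumes the module lives at `dbt.adapters.base.column`, as in the imports shown earlier):

```python
# Sketch: parsing warehouse type strings with Column.from_description
from dbt.adapters.base.column import Column

amount = Column.from_description("amount", "numeric(12,2)")
print(amount.data_type)           # numeric(12,2)

name = Column.create("name", "STRING")  # TYPE_LABELS maps STRING -> TEXT
print(name.data_type)             # character varying(256)

short = Column.from_description("code", "character varying(16)")
wide = Column.from_description("code", "character varying(32)")
print(short.can_expand_to(wide))  # True: both are strings and 32 > 16
```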

Some files were not shown because too many files have changed in this diff.