Compare commits

...

5 Commits

Author SHA1 Message Date
Callum McCann
2da925aa25 Adding entity node to core (#6648)
* first draft

* finishing first commit

* adding testing project

* adding changie

* cleaning

* removing blocks

* fixing proto error message

* updates to events

* fixing issues

* adding test dimension

* updating schemas

* updating manifest.json

* removing old versions from compatibility

* updating

* fixes

* fixing more bugs caught by tests

* updating tests
2023-01-31 09:03:06 -06:00
David Bloss
43e24c5ae6 update gh action set-output variables (#6635)
* update gh action set-output variables

* add changie file
2023-01-18 11:23:13 -06:00
Gerda Shank
89d111a5f6 CT 1440 Fix code to emit ConnectionReused event (#6605)
* Refactor "set_connection_name" to properly handle reused connection

* Update test

* Changie

* Limit test of ConnectionUsed events to non-Windows
2023-01-17 13:18:07 -05:00
Gerda Shank
e1b5e68904 Convert 068_partial_parsing_tests (#6614)
* Convert partial parsing tests

* reformat
2023-01-17 12:22:31 -05:00
Jeremy Cohen
065ab2ebc2 Reformat tests/ (#6622)
* Run black + flake8 on tests dir

* Run pre-commit
2023-01-16 16:39:54 +01:00
188 changed files with 4153 additions and 21359 deletions

View File

@@ -0,0 +1,6 @@
kind: Features
body: Adding the entity node
time: 2023-01-18T13:48:04.487817-06:00
custom:
Author: callum-mcdata
Issue: "6627"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Fix use of ConnectionReused logging event
time: 2023-01-13T13:25:13.023168-05:00
custom:
Author: gshank
Issue: "6168"

View File

@@ -0,0 +1,6 @@
kind: Under the Hood
body: Update deprecated github action command
time: 2023-01-17T11:17:37.046095-06:00
custom:
Author: davidbloss
Issue: "6153"

View File

@@ -9,4 +9,4 @@ ignore =
E203 # makes Flake8 work like black
E741
E501 # long line checking is done in black
exclude = test
exclude = test/

.github/_README.md
View File

@@ -63,12 +63,12 @@ permissions:
contents: read
pull-requests: write
```
### Secrets
- When to use a [Personal Access Token (PAT)](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/creating-a-personal-access-token) vs the [GITHUB_TOKEN](https://docs.github.com/en/actions/security-guides/automatic-token-authentication) generated for the action?
The `GITHUB_TOKEN` is used by default. In most cases it is sufficient for what you need.
If you expect the workflow to result in a commit that should retrigger workflows, you will need to use a Personal Access Token for the bot to commit the file. When using the GITHUB_TOKEN, the resulting commit will not trigger another GitHub Actions Workflow run. This is due to limitations set by GitHub. See [the docs](https://docs.github.com/en/actions/security-guides/automatic-token-authentication#using-the-github_token-in-a-workflow) for a more detailed explanation.
For example, we must use a PAT in our workflow to commit a new changelog yaml file for bot PRs. Once the file has been committed to the branch, it should retrigger the check to validate that a changelog exists on the PR. Otherwise, it would stay in a failed state since the check would never retrigger.
@@ -105,7 +105,7 @@ Some triggers of note that we use:
```
# **what?**
# Describe what the action does.
# **why?**
# Why does this action exist?
@@ -138,7 +138,7 @@ Some triggers of note that we use:
id: fp
run: |
FILEPATH=.changes/unreleased/Dependencies-${{ steps.filename_time.outputs.time }}.yaml
echo "::set-output name=FILEPATH::$FILEPATH"
echo "FILEPATH=$FILEPATH" >> $GITHUB_OUTPUT
```
- Print out all variables you will reference as the first step of a job. This allows for easier debugging. The first job should log all inputs. Subsequent jobs should reference outputs of other jobs, if present.
@@ -158,14 +158,14 @@ Some triggers of note that we use:
echo "The build_script_path: ${{ inputs.build_script_path }}"
echo "The s3_bucket_name: ${{ inputs.s3_bucket_name }}"
echo "The package_test_command: ${{ inputs.package_test_command }}"
# collect all the variables that need to be used in subsequent jobs
- name: Set Variables
id: variables
run: |
echo "::set-output name=important_path::'performance/runner/Cargo.toml'"
echo "::set-output name=release_id::${{github.event.inputs.release_id}}"
echo "::set-output name=open_prs::${{github.event.inputs.open_prs}}"
echo "important_path='performance/runner/Cargo.toml'" >> $GITHUB_OUTPUT
echo "release_id=${{github.event.inputs.release_id}}" >> $GITHUB_OUTPUT
echo "open_prs=${{github.event.inputs.open_prs}}" >> $GITHUB_OUTPUT
job2:
needs: [job1]
@@ -190,7 +190,7 @@ ___
### Actions from the Marketplace
- Don't use external actions for things that can easily be accomplished manually.
- Always read through what an external action does before using it! Often an action in the GitHub Actions Marketplace can be replaced with a few lines of bash. This is much more maintainable (and won't change under us) and clearer as to what's actually happening. It also prevents any surprises from code we don't control.
- Pin actions _we don't control_ to tags.
### Connecting to AWS
- Authenticate with the AWS managed workflow
@@ -208,7 +208,7 @@ ___
```yaml
- name: Copy Artifacts from S3 via CLI
run: aws s3 cp ${{ env.s3_bucket }} . --recursive
```
### Testing

View File

@@ -28,11 +28,12 @@ if __name__ == "__main__":
if package_request.status_code == 404:
if halt_on_missing:
sys.exit(1)
else:
# everything is the latest if the package doesn't exist
print(f"::set-output name=latest::{True}")
print(f"::set-output name=minor_latest::{True}")
sys.exit(0)
# everything is the latest if the package doesn't exist
github_output = os.environ.get("GITHUB_OUTPUT")
with open(github_output, "at", encoding="utf-8") as gh_output:
gh_output.write("latest=True")
gh_output.write("minor_latest=True")
sys.exit(0)
# TODO: verify package meta is "correct"
# https://github.com/dbt-labs/dbt-core/issues/4640
@@ -91,5 +92,7 @@ if __name__ == "__main__":
latest = is_latest(pre_rel, new_version, current_latest)
minor_latest = is_latest(pre_rel, new_version, current_minor_latest)
print(f"::set-output name=latest::{latest}")
print(f"::set-output name=minor_latest::{minor_latest}")
github_output = os.environ.get("GITHUB_OUTPUT")
with open(github_output, "at", encoding="utf-8") as gh_output:
gh_output.write(f"latest={latest}")
gh_output.write(f"minor_latest={minor_latest}")

View File

@@ -101,7 +101,9 @@ jobs:
- name: Get current date
if: always()
id: date
run: echo "::set-output name=date::$(date +'%Y-%m-%dT%H_%M_%S')" #no colons allowed for artifacts
run: |
CURRENT_DATE=$(date +'%Y-%m-%dT%H_%M_%S') # no colons allowed for artifacts
echo "date=$CURRENT_DATE" >> $GITHUB_OUTPUT
- uses: actions/upload-artifact@v2
if: always()
@@ -168,7 +170,9 @@ jobs:
- name: Get current date
if: always()
id: date
run: echo "::set-output name=date::$(date +'%Y_%m_%dT%H_%M_%S')" #no colons allowed for artifacts
run: |
CURRENT_DATE=$(date +'%Y-%m-%dT%H_%M_%S') # no colons allowed for artifacts
echo "date=$CURRENT_DATE" >> $GITHUB_OUTPUT
- uses: actions/upload-artifact@v2
if: always()

View File

@@ -41,9 +41,9 @@ jobs:
id: version
run: |
IFS="." read -r MAJOR MINOR PATCH <<< ${{ github.event.inputs.version_number }}
echo "::set-output name=major::$MAJOR"
echo "::set-output name=minor::$MINOR"
echo "::set-output name=patch::$PATCH"
echo "major=$MAJOR" >> $GITHUB_OUTPUT
echo "minor=$MINOR" >> $GITHUB_OUTPUT
echo "patch=$PATCH" >> $GITHUB_OUTPUT
- name: Is pkg 'latest'
id: latest
@@ -70,8 +70,10 @@ jobs:
- name: Get docker build arg
id: build_arg
run: |
echo "::set-output name=build_arg_name::"$(echo ${{ github.event.inputs.package }} | sed 's/\-/_/g')
echo "::set-output name=build_arg_value::"$(echo ${{ github.event.inputs.package }} | sed 's/postgres/core/g')
BUILD_ARG_NAME=$(echo ${{ github.event.inputs.package }} | sed 's/\-/_/g')
BUILD_ARG_VALUE=$(echo ${{ github.event.inputs.package }} | sed 's/postgres/core/g')
echo "build_arg_name=$BUILD_ARG_NAME" >> $GITHUB_OUTPUT
echo "build_arg_value=$BUILD_ARG_VALUE" >> $GITHUB_OUTPUT
- name: Log in to the GHCR
uses: docker/login-action@v1

View File

@@ -165,7 +165,7 @@ jobs:
env:
IS_PRERELEASE: ${{ contains(github.event.inputs.version_number, 'rc') || contains(github.event.inputs.version_number, 'b') }}
run: |
echo ::set-output name=isPrerelease::$IS_PRERELEASE
echo "isPrerelease=$IS_PRERELEASE" >> $GITHUB_OUTPUT
- name: Creating GitHub Release
uses: softprops/action-gh-release@v1

View File

@@ -65,7 +65,7 @@ jobs:
- name: Set branch value
id: variables
run: |
echo "::set-output name=BRANCH_NAME::prep-release/${{ github.event.inputs.version_number }}_$GITHUB_RUN_ID"
echo "BRANCH_NAME=prep-release/${{ github.event.inputs.version_number }}_$GITHUB_RUN_ID" >> $GITHUB_OUTPUT
- name: Create PR branch
run: |

View File

@@ -142,44 +142,44 @@ class BaseConnectionManager(metaclass=abc.ABCMeta):
)
def set_connection_name(self, name: Optional[str] = None) -> Connection:
conn_name: str
if name is None:
# if a name isn't specified, we'll re-use a single handle
# named 'master'
conn_name = "master"
else:
if not isinstance(name, str):
raise dbt.exceptions.CompilerException(
f"For connection name, got {name} - not a string!"
)
assert isinstance(name, str)
conn_name = name
"""Called by 'acquire_connection' in BaseAdapter, which is called by
'connection_named', called by 'connection_for(node)'.
Creates a connection for this thread if one doesn't already
exist, and will rename an existing connection."""
conn_name: str = "master" if name is None else name
# Get a connection for this thread
conn = self.get_if_exists()
if conn and conn.name == conn_name and conn.state == "open":
# Found a connection and nothing to do, so just return it
return conn
if conn is None:
# Create a new connection
conn = Connection(
type=Identifier(self.TYPE),
name=None,
name=conn_name,
state=ConnectionState.INIT,
transaction_open=False,
handle=None,
credentials=self.profile.credentials,
)
self.set_thread_connection(conn)
if conn.name == conn_name and conn.state == "open":
return conn
fire_event(
NewConnection(conn_name=conn_name, conn_type=self.TYPE, node_info=get_node_info())
)
if conn.state == "open":
fire_event(ConnectionReused(conn_name=conn_name))
else:
conn.handle = LazyHandle(self.open)
# Add the connection to thread_connections for this thread
self.set_thread_connection(conn)
fire_event(
NewConnection(conn_name=conn_name, conn_type=self.TYPE, node_info=get_node_info())
)
else: # existing connection either wasn't open or didn't have the right name
if conn.state != "open":
conn.handle = LazyHandle(self.open)
if conn.name != conn_name:
orig_conn_name: str = conn.name or ""
conn.name = conn_name
fire_event(ConnectionReused(orig_conn_name=orig_conn_name, conn_name=conn_name))
conn.name = conn_name
return conn
@classmethod
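
To summarize the refactored branching in `set_connection_name` above, here is a hedged standalone sketch of which event each starting state produces, using a stand-in namedtuple rather than dbt's actual `Connection` object:

```python
from collections import namedtuple

# Stand-in for dbt's Connection; only the fields the branching logic reads.
Conn = namedtuple("Conn", ["name", "state"])

def describe_transition(conn, conn_name):
    if conn and conn.name == conn_name and conn.state == "open":
        return "no-op: reuse as-is"
    if conn is None:
        return "NewConnection fired; fresh connection created"
    events = []
    if conn.state != "open":
        events.append("handle lazily re-opened")
    if conn.name != conn_name:
        events.append("ConnectionReused fired with orig_conn_name")
    return "; ".join(events)

print(describe_transition(None, "master"))                     # new connection
print(describe_transition(Conn("master", "open"), "master"))   # no-op
print(describe_transition(Conn("master", "open"), "model.a"))  # reused + renamed
```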

View File

@@ -48,6 +48,7 @@ def print_compile_stats(stats):
NodeType.Source: "source",
NodeType.Exposure: "exposure",
NodeType.Metric: "metric",
NodeType.Entity: "entity",
}
results = {k: 0 for k in names.keys()}
@@ -83,6 +84,8 @@ def _generate_stats(manifest: Manifest):
stats[exposure.resource_type] += 1
for metric in manifest.metrics.values():
stats[metric.resource_type] += 1
for entity in manifest.entities.values():
stats[entity.resource_type] += 1
for macro in manifest.macros.values():
stats[macro.resource_type] += 1
return stats
@@ -398,6 +401,8 @@ class Compiler:
linker.dependency(node.unique_id, (manifest.sources[dependency].unique_id))
elif dependency in manifest.metrics:
linker.dependency(node.unique_id, (manifest.metrics[dependency].unique_id))
elif dependency in manifest.entities:
linker.dependency(node.unique_id, (manifest.entities[dependency].unique_id))
else:
raise GraphDependencyNotFoundError(node, dependency)
@@ -410,6 +415,8 @@ class Compiler:
self.link_node(linker, exposure, manifest)
for metric in manifest.metrics.values():
self.link_node(linker, metric, manifest)
for entity in manifest.entities.values():
self.link_node(linker, entity, manifest)
cycle = linker.find_cycles()

View File

@@ -381,6 +381,7 @@ class PartialProject(RenderComponents):
sources: Dict[str, Any]
tests: Dict[str, Any]
metrics: Dict[str, Any]
entities: Dict[str, Any]
exposures: Dict[str, Any]
vars_value: VarProvider
@@ -391,6 +392,7 @@ class PartialProject(RenderComponents):
sources = cfg.sources
tests = cfg.tests
metrics = cfg.metrics
entities = cfg.entities
exposures = cfg.exposures
if cfg.vars is None:
vars_dict: Dict[str, Any] = {}
@@ -446,6 +448,7 @@ class PartialProject(RenderComponents):
sources=sources,
tests=tests,
metrics=metrics,
entities=entities,
exposures=exposures,
vars=vars_value,
config_version=cfg.config_version,
@@ -550,6 +553,7 @@ class Project:
sources: Dict[str, Any]
tests: Dict[str, Any]
metrics: Dict[str, Any]
entities: Dict[str, Any]
exposures: Dict[str, Any]
vars: VarProvider
dbt_version: List[VersionSpecifier]
@@ -624,6 +628,7 @@ class Project:
"sources": self.sources,
"tests": self.tests,
"metrics": self.metrics,
"entities": self.entities,
"exposures": self.exposures,
"vars": self.vars.to_dict(),
"require-dbt-version": [v.to_version_string() for v in self.dbt_version],

View File

@@ -117,6 +117,7 @@ class RuntimeConfig(Project, Profile, AdapterRequiredConfig):
sources=project.sources,
tests=project.tests,
metrics=project.metrics,
entities=project.entities,
exposures=project.exposures,
vars=project.vars,
config_version=project.config_version,
@@ -312,6 +313,7 @@ class RuntimeConfig(Project, Profile, AdapterRequiredConfig):
"sources": self._get_config_paths(self.sources),
"tests": self._get_config_paths(self.tests),
"metrics": self._get_config_paths(self.metrics),
"entities": self._get_config_paths(self.entities),
"exposures": self._get_config_paths(self.exposures),
}
@@ -500,6 +502,7 @@ class UnsetProfileConfig(RuntimeConfig):
"sources": self.sources,
"tests": self.tests,
"metrics": self.metrics,
"entities": self.entities,
"exposures": self.exposures,
"vars": self.vars.to_dict(),
"require-dbt-version": [v.to_version_string() for v in self.dbt_version],
@@ -562,6 +565,7 @@ class UnsetProfileConfig(RuntimeConfig):
sources=project.sources,
tests=project.tests,
metrics=project.metrics,
entities=project.entities,
exposures=project.exposures,
vars=project.vars,
config_version=project.config_version,

View File

@@ -45,6 +45,8 @@ class UnrenderedConfig(ConfigSource):
model_configs = unrendered.get("tests")
elif resource_type == NodeType.Metric:
model_configs = unrendered.get("metrics")
elif resource_type == NodeType.Entity:
model_configs = unrendered.get("entities")
elif resource_type == NodeType.Exposure:
model_configs = unrendered.get("exposures")
else:
@@ -70,6 +72,8 @@ class RenderedConfig(ConfigSource):
model_configs = self.project.tests
elif resource_type == NodeType.Metric:
model_configs = self.project.metrics
elif resource_type == NodeType.Entity:
model_configs = self.project.entities
elif resource_type == NodeType.Exposure:
model_configs = self.project.exposures
else:

View File

@@ -33,6 +33,7 @@ from dbt.contracts.graph.nodes import (
Macro,
Exposure,
Metric,
Entity,
SeedNode,
SourceDefinition,
Resource,
@@ -1504,6 +1505,44 @@ def generate_parse_metrics(
}
class EntityRefResolver(BaseResolver):
def __call__(self, *args) -> str:
package = None
if len(args) == 1:
name = args[0]
elif len(args) == 2:
package, name = args
else:
raise RefArgsError(node=self.model, args=args)
self.validate_args(name, package)
self.model.refs.append(list(args))
return ""
def validate_args(self, name, package):
if not isinstance(name, str):
raise ParsingError(
f"In the entity associated with {self.model.original_file_path} "
"the name argument to ref() must be a string"
)
def generate_parse_entities(
entity: Entity,
config: RuntimeConfig,
manifest: Manifest,
package_name: str,
) -> Dict[str, Any]:
project = config.load_dependencies()[package_name]
return {
"ref": EntityRefResolver(
None,
entity,
project,
manifest,
),
}
# This class is currently used by the schema parser in order
# to limit the number of macros in the context by using
# the TestMacroNamespace

View File

@@ -227,6 +227,7 @@ class SchemaSourceFile(BaseSourceFile):
sources: List[str] = field(default_factory=list)
exposures: List[str] = field(default_factory=list)
metrics: List[str] = field(default_factory=list)
entities: List[str] = field(default_factory=list)
# node patches contain models, seeds, snapshots, analyses
ndp: List[str] = field(default_factory=list)
# any macro patches in this file by macro unique_id.

View File

@@ -29,6 +29,7 @@ from dbt.contracts.graph.nodes import (
GenericTestNode,
Exposure,
Metric,
Entity,
UnpatchedSourceDefinition,
ManifestNode,
GraphMemberNode,
@@ -212,6 +213,39 @@ class MetricLookup(dbtClassMixin):
return manifest.metrics[unique_id]
class EntityLookup(dbtClassMixin):
def __init__(self, manifest: "Manifest"):
self.storage: Dict[str, Dict[PackageName, UniqueID]] = {}
self.populate(manifest)
def get_unique_id(self, search_name, package: Optional[PackageName]):
return find_unique_id_for_package(self.storage, search_name, package)
def find(self, search_name, package: Optional[PackageName], manifest: "Manifest"):
unique_id = self.get_unique_id(search_name, package)
if unique_id is not None:
return self.perform_lookup(unique_id, manifest)
return None
def add_entity(self, entity: Entity):
if entity.search_name not in self.storage:
self.storage[entity.search_name] = {}
self.storage[entity.search_name][entity.package_name] = entity.unique_id
def populate(self, manifest):
for entity in manifest.entities.values():
if hasattr(entity, "name"):
self.add_entity(entity)
def perform_lookup(self, unique_id: UniqueID, manifest: "Manifest") -> Entity:
if unique_id not in manifest.entities:
raise dbt.exceptions.DbtInternalError(
f"Entity {unique_id} found in cache but not found in manifest"
)
return manifest.entities[unique_id]
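
`EntityLookup` follows the same two-level storage shape as the other lookups: search name first, then package. A sketch of what `storage` might hold after `populate()`, assuming two hypothetical packages each define an entity named "customer" (unique IDs follow the `entity.<package>.<name>` format used by the entity parser later in this diff):

```python
# Illustrative contents only; package and entity names are hypothetical.
storage = {
    "customer": {
        "my_project": "entity.my_project.customer",
        "shared_pkg": "entity.shared_pkg.customer",
    }
}

# find() resolves a name within an optional package scope:
assert storage["customer"]["my_project"] == "entity.my_project.customer"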
# This handles both models/seeds/snapshots and sources/metrics/exposures
class DisabledLookup(dbtClassMixin):
def __init__(self, manifest: "Manifest"):
@@ -456,6 +490,9 @@ class Disabled(Generic[D]):
MaybeMetricNode = Optional[Union[Metric, Disabled[Metric]]]
MaybeEntityNode = Optional[Union[Entity, Disabled[Entity]]]
MaybeDocumentation = Optional[Documentation]
@@ -599,6 +636,7 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
docs: MutableMapping[str, Documentation] = field(default_factory=dict)
exposures: MutableMapping[str, Exposure] = field(default_factory=dict)
metrics: MutableMapping[str, Metric] = field(default_factory=dict)
entities: MutableMapping[str, Entity] = field(default_factory=dict)
selectors: MutableMapping[str, Any] = field(default_factory=dict)
files: MutableMapping[str, AnySourceFile] = field(default_factory=dict)
metadata: ManifestMetadata = field(default_factory=ManifestMetadata)
@@ -620,6 +658,9 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
_metric_lookup: Optional[MetricLookup] = field(
default=None, metadata={"serialize": lambda x: None, "deserialize": lambda x: None}
)
_entity_lookup: Optional[EntityLookup] = field(
default=None, metadata={"serialize": lambda x: None, "deserialize": lambda x: None}
)
_disabled_lookup: Optional[DisabledLookup] = field(
default=None, metadata={"serialize": lambda x: None, "deserialize": lambda x: None}
)
@@ -670,6 +711,9 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
def update_metric(self, new_metric: Metric):
_update_into(self.metrics, new_metric)
def update_entity(self, new_entity: Entity):
_update_into(self.entities, new_entity)
def update_node(self, new_node: ManifestNode):
_update_into(self.nodes, new_node)
@@ -685,6 +729,7 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
self.flat_graph = {
"exposures": {k: v.to_dict(omit_none=False) for k, v in self.exposures.items()},
"metrics": {k: v.to_dict(omit_none=False) for k, v in self.metrics.items()},
"entities": {k: v.to_dict(omit_none=False) for k, v in self.entities.items()},
"nodes": {k: v.to_dict(omit_none=False) for k, v in self.nodes.items()},
"sources": {k: v.to_dict(omit_none=False) for k, v in self.sources.items()},
}
@@ -747,6 +792,7 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
self.nodes.values(),
self.sources.values(),
self.metrics.values(),
self.entities.values(),
)
for resource in all_resources:
resource_type_plural = resource.resource_type.pluralize()
@@ -775,6 +821,7 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
docs={k: _deepcopy(v) for k, v in self.docs.items()},
exposures={k: _deepcopy(v) for k, v in self.exposures.items()},
metrics={k: _deepcopy(v) for k, v in self.metrics.items()},
entities={k: _deepcopy(v) for k, v in self.entities.items()},
selectors={k: _deepcopy(v) for k, v in self.selectors.items()},
metadata=self.metadata,
disabled={k: _deepcopy(v) for k, v in self.disabled.items()},
@@ -791,6 +838,7 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
self.sources.values(),
self.exposures.values(),
self.metrics.values(),
self.entities.values(),
)
)
forward_edges, backward_edges = build_node_edges(edge_members)
@@ -816,6 +864,7 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
docs=self.docs,
exposures=self.exposures,
metrics=self.metrics,
entities=self.entities,
selectors=self.selectors,
metadata=self.metadata,
disabled=self.disabled,
@@ -837,6 +886,8 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
return self.exposures[unique_id]
elif unique_id in self.metrics:
return self.metrics[unique_id]
elif unique_id in self.entities:
return self.entities[unique_id]
else:
# something terrible has happened
raise dbt.exceptions.DbtInternalError(
@@ -873,6 +924,12 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
self._metric_lookup = MetricLookup(self)
return self._metric_lookup
@property
def entity_lookup(self) -> EntityLookup:
if self._entity_lookup is None:
self._entity_lookup = EntityLookup(self)
return self._entity_lookup
def rebuild_ref_lookup(self):
self._ref_lookup = RefableLookup(self)
@@ -973,6 +1030,31 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
return Disabled(disabled[0])
return None
def resolve_entity(
self,
target_entity_name: str,
target_entity_package: Optional[str],
current_project: str,
node_package: str,
) -> MaybeEntityNode:
entity: Optional[Entity] = None
disabled: Optional[List[Entity]] = None
candidates = _search_packages(current_project, node_package, target_entity_package)
for pkg in candidates:
entity = self.entity_lookup.find(target_entity_name, pkg, self)
if entity is not None and entity.config.enabled:
return entity
# it's possible that the node is disabled
if disabled is None:
disabled = self.disabled_lookup.find(f"{target_entity_name}", pkg)
if disabled:
return Disabled(disabled[0])
return None
# Called by DocsRuntimeContext.doc
def resolve_doc(
self,
@@ -1083,6 +1165,11 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
self.metrics[metric.unique_id] = metric
source_file.metrics.append(metric.unique_id)
def add_entity(self, source_file: SchemaSourceFile, entity: Entity):
_check_duplicates(entity, self.entities)
self.entities[entity.unique_id] = entity
source_file.entities.append(entity.unique_id)
def add_disabled_nofile(self, node: GraphMemberNode):
# There can be multiple disabled nodes for the same unique_id
if node.unique_id in self.disabled:
@@ -1098,6 +1185,8 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
source_file.add_test(node.unique_id, test_from)
if isinstance(node, Metric):
source_file.metrics.append(node.unique_id)
if isinstance(node, Entity):
source_file.entities.append(node.unique_id)
if isinstance(node, Exposure):
source_file.exposures.append(node.unique_id)
else:
@@ -1125,6 +1214,7 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
self.docs,
self.exposures,
self.metrics,
self.entities,
self.selectors,
self.files,
self.metadata,
@@ -1137,6 +1227,7 @@ class Manifest(MacroMethods, DataClassMessagePackMixin, dbtClassMixin):
self._source_lookup,
self._ref_lookup,
self._metric_lookup,
self._entity_lookup,
self._disabled_lookup,
self._analysis_lookup,
)
@@ -1178,6 +1269,9 @@ class WritableManifest(ArtifactMixin):
metrics: Mapping[UniqueID, Metric] = field(
metadata=dict(description=("The metrics defined in the dbt project and its dependencies"))
)
entities: Mapping[UniqueID, Entity] = field(
metadata=dict(description=("The entities defined in the dbt project and its dependencies"))
)
selectors: Mapping[UniqueID, Any] = field(
metadata=dict(description=("The selectors defined in selectors.yml"))
)
@@ -1202,7 +1296,8 @@ class WritableManifest(ArtifactMixin):
@classmethod
def compatible_previous_versions(self):
return [("manifest", 4), ("manifest", 5), ("manifest", 6), ("manifest", 7)]
# return [("manifest", 4), ("manifest", 5), ("manifest", 6), ("manifest", 7)]
return []
def __post_serialize__(self, dct):
for unique_id, node in dct["nodes"].items():

View File

@@ -368,6 +368,11 @@ class MetricConfig(BaseConfig):
enabled: bool = True
@dataclass
class EntityConfig(BaseConfig):
enabled: bool = True
@dataclass
class ExposureConfig(BaseConfig):
enabled: bool = True
@@ -604,6 +609,7 @@ class SnapshotConfig(EmptySnapshotConfig):
RESOURCE_TYPES: Dict[NodeType, Type[BaseConfig]] = {
NodeType.Metric: MetricConfig,
NodeType.Entity: EntityConfig,
NodeType.Exposure: ExposureConfig,
NodeType.Source: SourceConfig,
NodeType.Seed: SeedConfig,

View File

@@ -55,6 +55,7 @@ from .model_config import (
TestConfig,
SourceConfig,
MetricConfig,
EntityConfig,
ExposureConfig,
EmptySnapshotConfig,
SnapshotConfig,
@@ -272,7 +273,7 @@ class ParsedNode(NodeInfoMixin, ParsedNodeMandatory, SerializableType):
@classmethod
def _deserialize(cls, dct: Dict[str, int]):
# The serialized ParsedNodes do not differ from each other
# in fields that would allow 'from_dict' to distinguis
# in fields that would allow 'from_dict' to distinguish
# between them.
resource_type = dct["resource_type"]
if resource_type == "model":
@@ -392,6 +393,7 @@ class CompiledNode(ParsedNode):
refs: List[List[str]] = field(default_factory=list)
sources: List[List[str]] = field(default_factory=list)
metrics: List[List[str]] = field(default_factory=list)
entities: List[List[str]] = field(default_factory=list)
depends_on: DependsOn = field(default_factory=DependsOn)
compiled_path: Optional[str] = None
compiled: bool = False
@@ -906,6 +908,7 @@ class Exposure(GraphNode):
refs: List[List[str]] = field(default_factory=list)
sources: List[List[str]] = field(default_factory=list)
metrics: List[List[str]] = field(default_factory=list)
entities: List[List[str]] = field(default_factory=list)
created_at: float = field(default_factory=lambda: time.time())
@property
@@ -997,6 +1000,7 @@ class Metric(GraphNode):
depends_on: DependsOn = field(default_factory=DependsOn)
refs: List[List[str]] = field(default_factory=list)
metrics: List[List[str]] = field(default_factory=list)
entities: List[List[str]] = field(default_factory=list)
created_at: float = field(default_factory=lambda: time.time())
@property
@@ -1065,6 +1069,63 @@ class Metric(GraphNode):
)
@dataclass
class Entity(GraphNode):
name: str
model: str
description: str
dimensions: List[str]
resource_type: NodeType = field(metadata={"restrict": [NodeType.Entity]})
model_unique_id: Optional[str] = None
meta: Dict[str, Any] = field(default_factory=dict)
tags: List[str] = field(default_factory=list)
config: EntityConfig = field(default_factory=EntityConfig)
unrendered_config: Dict[str, Any] = field(default_factory=dict)
sources: List[List[str]] = field(default_factory=list)
depends_on: DependsOn = field(default_factory=DependsOn)
refs: List[List[str]] = field(default_factory=list)
entities: List[List[str]] = field(default_factory=list)
metrics: List[List[str]] = field(default_factory=list)
created_at: float = field(default_factory=lambda: time.time())
@property
def depends_on_nodes(self):
return self.depends_on.nodes
@property
def search_name(self):
return self.name
def same_model(self, old: "Entity") -> bool:
return self.model == old.model
def same_dimensions(self, old: "Entity") -> bool:
return self.dimensions == old.dimensions
def same_description(self, old: "Entity") -> bool:
return self.description == old.description
def same_config(self, old: "Entity") -> bool:
return self.config.same_contents(
self.unrendered_config,
old.unrendered_config,
)
def same_contents(self, old: Optional["Entity"]) -> bool:
# existing when it didn't before is a change!
# metadata/tags changes are not "changes"
if old is None:
return True
return (
self.same_model(old)
and self.same_dimensions(old)
and self.same_description(old)
and self.same_config(old)
and True
)
# ====================================
# Patches
# ====================================
@@ -1126,6 +1187,7 @@ GraphMemberNode = Union[
ResultNode,
Exposure,
Metric,
Entity,
]
# All "nodes" (or node-like objects) in this file

View File

@@ -533,3 +533,21 @@ class UnparsedMetric(dbtClassMixin, Replaceable):
if data.get("model") is not None and data.get("calculation_method") == "derived":
raise ValidationError("Derived metrics cannot have a 'model' property")
@dataclass
class UnparsedEntity(dbtClassMixin, Replaceable):
"""This class is used for entity information"""
name: str
model: str
description: str = ""
dimensions: List[str] = field(default_factory=list)
meta: Dict[str, Any] = field(default_factory=dict)
tags: List[str] = field(default_factory=list)
config: Dict[str, Any] = field(default_factory=dict)
@classmethod
def validate(cls, data):
super(UnparsedEntity, cls).validate(data)
# TODO: Add validation here around include/exclude and others
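
As a rough illustration, the schema parser (shown further down in this diff) feeds each `entities:` YAML entry through `validate` and `from_dict`. A minimal sketch with hypothetical field values, assuming dbt's internal `dbtClassMixin` API:

```python
from dbt.contracts.graph.unparsed import UnparsedEntity

# Hypothetical entity definition as it would appear in a schema YAML file
data = {
    "name": "customer",
    "model": "ref('dim_customers')",
    "description": "One record per customer",
    "dimensions": ["region", "plan_tier"],
}

UnparsedEntity.validate(data)              # jsonschema-style validation
unparsed = UnparsedEntity.from_dict(data)  # hydrate the dataclass
assert unparsed.dimensions == ["region", "plan_tier"]
```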

View File

@@ -214,6 +214,7 @@ class Project(HyphenatedDbtClassMixin, Replaceable):
sources: Dict[str, Any] = field(default_factory=dict)
tests: Dict[str, Any] = field(default_factory=dict)
metrics: Dict[str, Any] = field(default_factory=dict)
entities: Dict[str, Any] = field(default_factory=dict)
exposures: Dict[str, Any] = field(default_factory=dict)
vars: Optional[Dict[str, Any]] = field(
default=None,

View File

@@ -531,6 +531,7 @@ class ConnectionReused(betterproto.Message):
"""E006"""
conn_name: str = betterproto.string_field(1)
orig_conn_name: str = betterproto.string_field(2)
@dataclass

View File

@@ -419,6 +419,7 @@ message NewConnectionMsg {
// E006
message ConnectionReused {
string conn_name = 1;
string orig_conn_name = 2;
}
message ConnectionReusedMsg {

View File

@@ -449,7 +449,7 @@ class ConnectionReused(DebugLevel, pt.ConnectionReused):
return "E006"
def message(self) -> str:
return f"Re-using an available connection from the pool (formerly {self.conn_name})"
return f"Re-using an available connection from the pool (formerly {self.orig_conn_name}, now {self.conn_name})"
@dataclass
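
With the new `orig_conn_name` field, the rendered message distinguishes the connection's former name from its new one. A hedged sketch of constructing the event directly (field values are illustrative):

```python
from dbt.events.types import ConnectionReused

event = ConnectionReused(conn_name="model.my_project.my_model", orig_conn_name="master")
print(event.message())
# -> Re-using an available connection from the pool (formerly master, now model.my_project.my_model)
```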

View File

@@ -20,7 +20,7 @@ from .selector_spec import (
INTERSECTION_DELIMITER = ","
DEFAULT_INCLUDES: List[str] = ["fqn:*", "source:*", "exposure:*", "metric:*"]
DEFAULT_INCLUDES: List[str] = ["fqn:*", "source:*", "exposure:*", "metric:*", "entity:*"]
DEFAULT_EXCLUDES: List[str] = []

View File

@@ -9,6 +9,7 @@ from dbt.contracts.graph.nodes import (
SourceDefinition,
Exposure,
Metric,
Entity,
GraphMemberNode,
)
from dbt.contracts.graph.manifest import Manifest
@@ -51,8 +52,8 @@ class GraphQueue:
node = self.manifest.expect(node_id)
if node.resource_type != NodeType.Model:
return False
# must be a Model - tell mypy this won't be a Source or Exposure or Metric
assert not isinstance(node, (SourceDefinition, Exposure, Metric))
# must be a Model - tell mypy this won't be a Source or Exposure or Metric or Entity
assert not isinstance(node, (SourceDefinition, Exposure, Metric, Entity))
if node.is_ephemeral:
return False
return True

View File

@@ -163,6 +163,9 @@ class NodeSelector(MethodManager):
elif unique_id in self.manifest.metrics:
metric = self.manifest.metrics[unique_id]
return metric.config.enabled
elif unique_id in self.manifest.entities:
entity = self.manifest.entities[unique_id]
return entity.config.enabled
node = self.manifest.nodes[unique_id]
return not node.empty and node.config.enabled
@@ -182,6 +185,8 @@ class NodeSelector(MethodManager):
node = self.manifest.exposures[unique_id]
elif unique_id in self.manifest.metrics:
node = self.manifest.metrics[unique_id]
elif unique_id in self.manifest.entities:
node = self.manifest.entities[unique_id]
else:
raise DbtInternalError(f"Node {unique_id} not found in the manifest!")
return self.node_is_match(node)

View File

@@ -12,6 +12,7 @@ from dbt.contracts.graph.nodes import (
SingularTestNode,
Exposure,
Metric,
Entity,
GenericTestNode,
SourceDefinition,
ResultNode,
@@ -43,6 +44,7 @@ class MethodName(StrEnum):
State = "state"
Exposure = "exposure"
Metric = "metric"
Entity = "entity"
Result = "result"
SourceStatus = "source_status"
@@ -71,7 +73,7 @@ def is_selected_node(fqn: List[str], node_selector: str):
return True
SelectorTarget = Union[SourceDefinition, ManifestNode, Exposure, Metric]
SelectorTarget = Union[SourceDefinition, ManifestNode, Exposure, Metric, Entity]
class SelectorMethod(metaclass=abc.ABCMeta):
@@ -118,6 +120,14 @@ class SelectorMethod(metaclass=abc.ABCMeta):
continue
yield unique_id, metric
def entity_nodes(self, included_nodes: Set[UniqueId]) -> Iterator[Tuple[UniqueId, Entity]]:
for key, entity in self.manifest.entities.items():
unique_id = UniqueId(key)
if unique_id not in included_nodes:
continue
yield unique_id, entity
def all_nodes(
self, included_nodes: Set[UniqueId]
) -> Iterator[Tuple[UniqueId, SelectorTarget]]:
@@ -126,6 +136,7 @@ class SelectorMethod(metaclass=abc.ABCMeta):
self.source_nodes(included_nodes),
self.exposure_nodes(included_nodes),
self.metric_nodes(included_nodes),
self.entity_nodes(included_nodes),
)
def configurable_nodes(
@@ -136,11 +147,12 @@ class SelectorMethod(metaclass=abc.ABCMeta):
def non_source_nodes(
self,
included_nodes: Set[UniqueId],
) -> Iterator[Tuple[UniqueId, Union[Exposure, ManifestNode, Metric]]]:
) -> Iterator[Tuple[UniqueId, Union[Exposure, ManifestNode, Metric, Entity]]]:
yield from chain(
self.parsed_nodes(included_nodes),
self.exposure_nodes(included_nodes),
self.metric_nodes(included_nodes),
self.entity_nodes(included_nodes),
)
@abc.abstractmethod
@@ -270,6 +282,33 @@ class MetricSelectorMethod(SelectorMethod):
yield node
class EntitySelectorMethod(SelectorMethod):
"""TODO: Add a description of what this selector method is doing"""
def search(self, included_nodes: Set[UniqueId], selector: str) -> Iterator[UniqueId]:
parts = selector.split(".")
target_package = SELECTOR_GLOB
if len(parts) == 1:
target_name = parts[0]
elif len(parts) == 2:
target_package, target_name = parts
else:
msg = (
'Invalid entity selector value "{}". Entities must be of '
"the form ${{entity_name}} or "
"${{entity_package.entity_name}}"
).format(selector)
raise DbtRuntimeError(msg)
for node, real_node in self.entity_nodes(included_nodes):
if target_package not in (real_node.package_name, SELECTOR_GLOB):
continue
if target_name not in (real_node.name, SELECTOR_GLOB):
continue
yield node
class PathSelectorMethod(SelectorMethod):
def search(self, included_nodes: Set[UniqueId], selector: str) -> Iterator[UniqueId]:
"""Yields nodes from included that match the given path."""
@@ -530,6 +569,8 @@ class StateSelectorMethod(SelectorMethod):
previous_node = manifest.exposures[node]
elif node in manifest.metrics:
previous_node = manifest.metrics[node]
elif node in manifest.entities:
previous_node = manifest.entities[node]
if checker(previous_node, real_node):
yield node
@@ -616,6 +657,7 @@ class MethodManager:
MethodName.State: StateSelectorMethod,
MethodName.Exposure: ExposureSelectorMethod,
MethodName.Metric: MetricSelectorMethod,
MethodName.Entity: EntitySelectorMethod,
MethodName.Result: ResultSelectorMethod,
MethodName.SourceStatus: SourceStatusSelectorMethod,
}
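
The selector grammar that `EntitySelectorMethod.search` accepts is the same dotted form used for metrics: either `entity_name` or `package_name.entity_name`. A standalone sketch of that parsing step, assuming `SELECTOR_GLOB` is the `"*"` wildcard used elsewhere in dbt's selector code:

```python
SELECTOR_GLOB = "*"  # assumed wildcard, matching any package or name

def split_entity_selector(selector: str):
    parts = selector.split(".")
    if len(parts) == 1:
        return SELECTOR_GLOB, parts[0]   # any package, this entity name
    if len(parts) == 2:
        return parts[0], parts[1]        # package-qualified entity
    raise ValueError(f'Invalid entity selector value "{selector}"')

assert split_entity_selector("customer") == ("*", "customer")
assert split_entity_selector("my_pkg.customer") == ("my_pkg", "customer")
```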

View File

@@ -18,6 +18,7 @@ class NodeType(StrEnum):
Macro = "macro"
Exposure = "exposure"
Metric = "metric"
Entity = "entity"
@classmethod
def executable(cls) -> List["NodeType"]:
@@ -52,11 +53,14 @@ class NodeType(StrEnum):
cls.Analysis,
cls.Exposure,
cls.Metric,
cls.Entity,
]
def pluralize(self) -> str:
if self is self.Analysis:
return "analyses"
if self is self.Entity:
return "entities"
return f"{self}s"

View File

@@ -272,7 +272,7 @@ class TestBuilder(Generic[Testable]):
column_name=column_name,
name=self.name,
key=key,
err_msg=e.msg
err_msg=e.msg,
)
if value is not None:

View File

@@ -56,6 +56,7 @@ from dbt.contracts.graph.nodes import (
ColumnInfo,
Exposure,
Metric,
Entity,
SeedNode,
ManifestNode,
ResultNode,
@@ -340,7 +341,7 @@ class ManifestLoader:
project, project_parser_files[project.project_name], parser_types
)
# Now that we've loaded most of the nodes (except for schema tests, sources, metrics)
# Now that we've loaded most of the nodes (except for schema tests, sources, metrics, entities)
# load up the Lookup objects to resolve them by name, so the SourceFiles store
# the unique_id instead of the name. Sources are loaded from yaml files, so
# aren't in place yet
@@ -376,7 +377,7 @@ class ManifestLoader:
# copy the selectors from the root_project to the manifest
self.manifest.selectors = self.root_project.manifest_selectors
# update the refs, sources, docs and metrics depends_on.nodes
# update the refs, sources, docs, entities and metrics depends_on.nodes
# These check the created_at time on the nodes to
# determine whether they need processing.
start_process = time.perf_counter()
@@ -384,6 +385,7 @@ class ManifestLoader:
self.process_refs(self.root_project.project_name)
self.process_docs(self.root_project)
self.process_metrics(self.root_project)
self.process_entities(self.root_project)
# update tracking data
self._perf_info.process_manifest_elapsed = time.perf_counter() - start_process
@@ -838,6 +840,10 @@ class ManifestLoader:
if metric.created_at < self.started_at:
continue
_process_refs_for_metric(self.manifest, current_project, metric)
for entity in self.manifest.entities.values():
if entity.created_at < self.started_at:
continue
_process_refs_for_entity(self.manifest, current_project, entity)
# Takes references in 'metrics' array of nodes and exposures, finds the target
# node, and updates 'depends_on.nodes' with the unique id
@@ -858,6 +864,23 @@ class ManifestLoader:
continue
_process_metrics_for_node(self.manifest, current_project, exposure)
# Takes references in 'entities' array of nodes and exposures, finds the target
# node, and updates 'depends_on.nodes' with the unique id
def process_entities(self, config: RuntimeConfig):
current_project = config.project_name
for node in self.manifest.nodes.values():
if node.created_at < self.started_at:
continue
_process_entities_for_node(self.manifest, current_project, node)
for entity in self.manifest.entities.values():
if entity.created_at < self.started_at:
continue
_process_entities_for_node(self.manifest, current_project, entity)
for exposure in self.manifest.exposures.values():
if exposure.created_at < self.started_at:
continue
_process_entities_for_node(self.manifest, current_project, exposure)
# nodes: node and column descriptions
# sources: source and table descriptions, column descriptions
# macros: macro argument descriptions
@@ -913,6 +936,16 @@ class ManifestLoader:
config.project_name,
)
_process_docs_for_metrics(ctx, metric)
for entity in self.manifest.entities.values():
if entity.created_at < self.started_at:
continue
ctx = generate_runtime_docs_context(
config,
entity,
self.manifest,
config.project_name,
)
_process_docs_for_entities(ctx, entity)
# Loops through all nodes and exposures, for each element in
# 'sources' array finds the source node and updates the
@@ -1103,6 +1136,10 @@ def _process_docs_for_metrics(context: Dict[str, Any], metric: Metric) -> None:
metric.description = get_rendered(metric.description, context)
def _process_docs_for_entities(context: Dict[str, Any], entity: Entity) -> None:
entity.description = get_rendered(entity.description, context)
def _process_refs_for_exposure(manifest: Manifest, current_project: str, exposure: Exposure):
"""Given a manifest and exposure in that manifest, process its refs"""
for ref in exposure.refs:
@@ -1190,6 +1227,48 @@ def _process_refs_for_metric(manifest: Manifest, current_project: str, metric: M
manifest.update_metric(metric)
def _process_refs_for_entity(manifest: Manifest, current_project: str, entity: Entity):
"""Given a manifest and an entity in that manifest, process its refs"""
for ref in entity.refs:
target_model: Optional[Union[Disabled, ManifestNode]] = None
target_model_name: str
target_model_package: Optional[str] = None
if len(ref) == 1:
target_model_name = ref[0]
elif len(ref) == 2:
target_model_package, target_model_name = ref
else:
raise dbt.exceptions.DbtInternalError(
f"Refs should always be 1 or 2 arguments - got {len(ref)}"
)
target_model = manifest.resolve_ref(
target_model_name,
target_model_package,
current_project,
entity.package_name,
)
if target_model is None or isinstance(target_model, Disabled):
# This may raise. Even if it doesn't, we don't want to add
# this entity to the graph b/c there is no destination entity
entity.config.enabled = False
invalid_target_fail_unless_test(
node=entity,
target_name=target_model_name,
target_kind="node",
target_package=target_model_package,
disabled=(isinstance(target_model, Disabled)),
)
continue
target_model_id = target_model.unique_id
entity.depends_on.nodes.append(target_model_id)
manifest.update_entity(entity)
def _process_metrics_for_node(
manifest: Manifest,
current_project: str,
@@ -1239,6 +1318,55 @@ def _process_metrics_for_node(
node.depends_on.nodes.append(target_metric_id)
def _process_entities_for_node(
manifest: Manifest,
current_project: str,
node: Union[ManifestNode, Entity, Exposure],
):
"""Given a manifest and a node in that manifest, process its entities"""
if isinstance(node, SeedNode):
return
for entity in node.entities:
target_entity: Optional[Union[Disabled, Entity]] = None
target_entity_name: str
target_entity_package: Optional[str] = None
if len(entity) == 1:
target_entity_name = entity[0]
elif len(entity) == 2:
target_entity_package, target_entity_name = entity
else:
raise dbt.exceptions.DbtInternalError(
f"Entity references should always be 1 or 2 arguments - got {len(entity)}"
)
target_entity = manifest.resolve_entity(
target_entity_name,
target_entity_package,
current_project,
node.package_name,
)
if target_entity is None or isinstance(target_entity, Disabled):
# This may raise. Even if it doesn't, we don't want to add
# this node to the graph b/c there is no destination node
node.config.enabled = False
invalid_target_fail_unless_test(
node=node,
target_name=target_entity_name,
target_kind="source",
target_package=target_entity_package,
disabled=(isinstance(target_entity, Disabled)),
)
continue
target_entity_id = target_entity.unique_id
node.depends_on.nodes.append(target_entity_id)
def _process_refs_for_node(manifest: Manifest, current_project: str, node: ManifestNode):
"""Given a manifest and a node in that manifest, process its refs"""
@@ -1313,6 +1441,7 @@ def _process_sources_for_exposure(manifest: Manifest, current_project: str, expo
manifest.update_exposure(exposure)
# TODO: Remove this code because metrics can't be based on sources
def _process_sources_for_metric(manifest: Manifest, current_project: str, metric: Metric):
target_source: Optional[Union[Disabled, SourceDefinition]] = None
for source_name, table_name in metric.sources:

View File

@@ -8,6 +8,7 @@ from dbt.contracts.files import (
parse_file_type_to_parser,
)
from dbt.events.functions import fire_event
from dbt.events.base_types import EventLevel
from dbt.events.types import (
PartialParsingEnabled,
PartialParsingFile,
@@ -155,7 +156,11 @@ class PartialParsing:
self.macro_child_map = self.saved_manifest.build_macro_child_map()
deleted = len(deleted) + len(deleted_schema_files)
changed = len(changed) + len(changed_schema_files)
fire_event(PartialParsingEnabled(deleted=deleted, added=len(added), changed=changed))
event = PartialParsingEnabled(deleted=deleted, added=len(added), changed=changed)
if os.environ.get("DBT_PP_TEST"):
fire_event(event, level=EventLevel.INFO)
else:
fire_event(event)
self.file_diff = file_diff
# generate the list of files that need parsing
@@ -237,7 +242,7 @@ class PartialParsing:
self.remove_source_override_target(source)
def delete_disabled(self, unique_id, file_id):
# This node/metric/exposure is disabled. Find it and remove it from disabled dictionary.
# This node/metric/entity/exposure is disabled. Find it and remove it from disabled dictionary.
for dis_index, dis_node in enumerate(self.saved_manifest.disabled[unique_id]):
if dis_node.file_id == file_id:
node = dis_node
@@ -436,6 +441,18 @@ class PartialParsing:
if metric_element:
self.delete_schema_metric(schema_file, metric_element)
self.merge_patch(schema_file, "metrics", metric_element)
elif unique_id in self.saved_manifest.entities:
entity = self.saved_manifest.entities[unique_id]
file_id = entity.file_id
if file_id in self.saved_files and file_id not in self.file_diff["deleted"]:
schema_file = self.saved_files[file_id]
entities = []
if "entities" in schema_file.dict_from_yaml:
entities = schema_file.dict_from_yaml["entities"]
entity_element = self.get_schema_element(entities, entity.name)
if entity_element:
self.delete_schema_entity(schema_file, entity_element)
self.merge_patch(schema_file, "entities", entity_element)
elif unique_id in self.saved_manifest.macros:
macro = self.saved_manifest.macros[unique_id]
file_id = macro.file_id
@@ -741,6 +758,29 @@ class PartialParsing:
self.delete_schema_metric(schema_file, elem)
self.merge_patch(schema_file, dict_key, elem)
# entities
dict_key = "entities"
entity_diff = self.get_diff_for("entities", saved_yaml_dict, new_yaml_dict)
if entity_diff["changed"]:
for entity in entity_diff["changed"]:
self.delete_schema_entity(schema_file, entity)
self.merge_patch(schema_file, dict_key, entity)
if entity_diff["deleted"]:
for entity in entity_diff["deleted"]:
self.delete_schema_entity(schema_file, entity)
if entity_diff["added"]:
for entity in entity_diff["added"]:
self.merge_patch(schema_file, dict_key, entity)
# Handle schema file updates due to env_var changes
if dict_key in env_var_changes and dict_key in new_yaml_dict:
for name in env_var_changes[dict_key]:
if name in entity_diff["changed_or_deleted_names"]:
continue
elem = self.get_schema_element(new_yaml_dict[dict_key], name)
if elem:
self.delete_schema_entity(schema_file, elem)
self.merge_patch(schema_file, dict_key, elem)
# Take a "section" of the schema file yaml dictionary from saved and new schema files
# and determine which parts have changed
def get_diff_for(self, key, saved_yaml_dict, new_yaml_dict):
@@ -916,6 +956,24 @@ class PartialParsing:
elif unique_id in self.saved_manifest.disabled:
self.delete_disabled(unique_id, schema_file.file_id)
# entities are created only from schema files, but can also be referred to by other nodes
def delete_schema_entity(self, schema_file, entity_dict):
entity_name = entity_dict["name"]
entities = schema_file.entities.copy()
for unique_id in entities:
if unique_id in self.saved_manifest.entities:
entity = self.saved_manifest.entities[unique_id]
if entity.name == entity_name:
# Need to find everything that referenced this entity and schedule for parsing
if unique_id in self.saved_manifest.child_map:
self.schedule_nodes_for_parsing(self.saved_manifest.child_map[unique_id])
self.deleted_manifest.entities[unique_id] = self.saved_manifest.entities.pop(
unique_id
)
schema_file.entities.remove(unique_id)
elif unique_id in self.saved_manifest.disabled:
self.delete_disabled(unique_id, schema_file.file_id)
def get_schema_element(self, elem_list, elem_name):
for element in elem_list:
if "name" in element and element["name"] == elem_name:

View File

@@ -22,11 +22,12 @@ from dbt.context.configured import generate_schema_yml_context, SchemaYamlVars
from dbt.context.providers import (
generate_parse_exposure,
generate_parse_metrics,
generate_parse_entities,
generate_test_context,
)
from dbt.context.macro_resolver import MacroResolver
from dbt.contracts.files import FileHash, SchemaSourceFile
from dbt.contracts.graph.model_config import MetricConfig, ExposureConfig
from dbt.contracts.graph.model_config import MetricConfig, ExposureConfig, EntityConfig
from dbt.contracts.graph.nodes import (
ParsedNodePatch,
ColumnInfo,
@@ -35,6 +36,7 @@ from dbt.contracts.graph.nodes import (
UnpatchedSourceDefinition,
Exposure,
Metric,
Entity,
)
from dbt.contracts.graph.unparsed import (
HasColumnDocs,
@@ -47,6 +49,7 @@ from dbt.contracts.graph.unparsed import (
UnparsedNodeUpdate,
UnparsedExposure,
UnparsedMetric,
UnparsedEntity,
UnparsedSourceDefinition,
)
from dbt.exceptions import (
@@ -94,6 +97,7 @@ schema_file_keys = (
"analyses",
"exposures",
"metrics",
"entities",
)
@@ -114,6 +118,7 @@ class ParserRef:
def __init__(self):
self.column_info: Dict[str, ColumnInfo] = {}
# TODO: Mimic this for dimension information at the entity level
def add(
self,
column: Union[HasDocs, UnparsedColumn],
@@ -536,6 +541,11 @@ class SchemaParser(SimpleParser[GenericTestBlock, GenericTestNode]):
metric_parser = MetricParser(self, yaml_block)
metric_parser.parse()
# parse entities
if "entities" in dct:
entity_parser = EntityParser(self, yaml_block)
entity_parser.parse()
def check_format_version(file_path, yaml_dct) -> None:
if "version" not in yaml_dct:
@@ -1183,3 +1193,107 @@ class MetricParser(YamlReader):
except (ValidationError, JSONValidationError) as exc:
raise YamlParseDictError(self.yaml.path, self.key, data, exc)
self.parse_metric(unparsed)
class EntityParser(YamlReader):
def __init__(self, schema_parser: SchemaParser, yaml: YamlBlock):
super().__init__(schema_parser, yaml, NodeType.Entity.pluralize())
self.schema_parser = schema_parser
self.yaml = yaml
def parse_entity(self, unparsed: UnparsedEntity):
package_name = self.project.project_name
unique_id = f"{NodeType.Entity}.{package_name}.{unparsed.name}"
path = self.yaml.path.relative_path
fqn = self.schema_parser.get_fqn_prefix(path)
fqn.append(unparsed.name)
config = self._generate_entity_config(
target=unparsed,
fqn=fqn,
package_name=package_name,
rendered=True,
)
config = config.finalize_and_validate()
unrendered_config = self._generate_entity_config(
target=unparsed,
fqn=fqn,
package_name=package_name,
rendered=False,
)
if not isinstance(config, EntityConfig):
raise DbtInternalError(
f"Calculated a {type(config)} for an entity, but expected a EntityConfig"
)
parsed = Entity(
resource_type=NodeType.Entity,
package_name=package_name,
path=path,
original_file_path=self.yaml.path.original_file_path,
unique_id=unique_id,
fqn=fqn,
model=unparsed.model,
name=unparsed.name,
description=unparsed.description,
dimensions=unparsed.dimensions,
meta=unparsed.meta,
tags=unparsed.tags,
config=config,
unrendered_config=unrendered_config,
)
ctx = generate_parse_entities(
parsed,
self.root_project,
self.schema_parser.manifest,
package_name,
)
if parsed.model is not None:
model_ref = "{{ " + parsed.model + " }}"
get_rendered(model_ref, ctx, parsed)
# if the entity is disabled we do not want it included in the manifest, only in the disabled dict
if parsed.config.enabled:
self.manifest.add_entity(self.yaml.file, parsed)
else:
self.manifest.add_disabled(self.yaml.file, parsed)
def _generate_entity_config(
self, target: UnparsedEntity, fqn: List[str], package_name: str, rendered: bool
):
generator: BaseContextConfigGenerator
if rendered:
generator = ContextConfigGenerator(self.root_project)
else:
generator = UnrenderedConfigGenerator(self.root_project)
# configs with precedence set
precedence_configs = dict()
# first apply entity configs
precedence_configs.update(target.config)
return generator.calculate_node_config(
config_call_dict={},
fqn=fqn,
resource_type=NodeType.Entity,
project_name=package_name,
base=False,
patch_config_dict=precedence_configs,
)
def parse(self):
for data in self.get_key_dicts():
try:
UnparsedEntity.validate(data)
unparsed = UnparsedEntity.from_dict(data)
except (ValidationError, JSONValidationError) as exc:
raise YamlParseDictError(self.yaml.path, self.key, data, exc)
self.parse_entity(unparsed)

View File

@@ -1,6 +1,6 @@
import json
from dbt.contracts.graph.nodes import Exposure, SourceDefinition, Metric
from dbt.contracts.graph.nodes import Exposure, SourceDefinition, Metric, Entity
from dbt.graph import ResourceTypeSelector
from dbt.task.runnable import GraphRunnableTask, ManifestTask
from dbt.task.test import TestSelector
@@ -22,6 +22,7 @@ class ListTask(GraphRunnableTask):
NodeType.Source,
NodeType.Exposure,
NodeType.Metric,
NodeType.Entity,
)
)
ALL_RESOURCE_VALUES = DEFAULT_RESOURCE_VALUES | frozenset((NodeType.Analysis,))
@@ -82,6 +83,8 @@ class ListTask(GraphRunnableTask):
yield self.manifest.exposures[node]
elif node in self.manifest.metrics:
yield self.manifest.metrics[node]
elif node in self.manifest.entities:
yield self.manifest.entities[node]
else:
raise DbtRuntimeError(
f'Got an unexpected result from node selection: "{node}"'
@@ -105,6 +108,11 @@ class ListTask(GraphRunnableTask):
# metrics are searched for by pkg.metric_name
metric_selector = ".".join([node.package_name, node.name])
yield f"metric:{metric_selector}"
elif node.resource_type == NodeType.Entity:
assert isinstance(node, Entity)
# entities are searched for by pkg.entity_name
entity_selector = ".".join([node.package_name, node.name])
yield f"entity:{entity_selector}"
else:
# everything else is from `fqn`
yield ".".join(node.fqn)

View File

@@ -91,9 +91,7 @@ class TestRunner(CompileRunner):
def before_execute(self):
self.print_start_line()
def execute_test(
self, test: TestNode, manifest: Manifest
) -> TestResultData:
def execute_test(self, test: TestNode, manifest: Manifest) -> TestResultData:
context = generate_runtime_model_context(test, self.config, manifest)
materialization_macro = manifest.find_materialization_macro_by_name(
@@ -101,7 +99,9 @@ class TestRunner(CompileRunner):
)
if materialization_macro is None:
raise MissingMaterializationError(materialization=test.get_materialization(), adapter_type=self.adapter.type())
raise MissingMaterializationError(
materialization=test.get_materialization(), adapter_type=self.adapter.type()
)
if "config" not in context:
raise DbtInternalError(

View File

@@ -249,7 +249,9 @@ def clean_up_logging():
# otherwise this will fail. So to test errors in those areas, you need to copy the files
# into the project in the tests instead of putting them in the fixtures.
@pytest.fixture(scope="class")
def adapter(unique_schema, project_root, profiles_root, profiles_yml, dbt_project_yml, clean_up_logging):
def adapter(
unique_schema, project_root, profiles_root, profiles_yml, dbt_project_yml, clean_up_logging
):
# The profiles.yml and dbt_project.yml should already be written out
args = Namespace(
profiles_dir=str(profiles_root), project_dir=str(project_root), target=None, profile=None

View File

@@ -12,7 +12,12 @@ from dbt.adapters.factory import Adapter
from dbt.main import handle_and_check
from dbt.logger import log_manager
from dbt.contracts.graph.manifest import Manifest
from dbt.events.functions import fire_event, capture_stdout_logs, stop_capture_stdout_logs, reset_metadata_vars
from dbt.events.functions import (
fire_event,
capture_stdout_logs,
stop_capture_stdout_logs,
reset_metadata_vars,
)
from dbt.events.test_types import IntegrationTestDebug
# =============================================================================

View File

@@ -6,6 +6,6 @@ namespace_packages = true
[tool.black]
# TODO: remove global exclusion of tests when testing overhaul is complete
force-exclude = 'test'
force-exclude = 'test/'
line-length = 99
target-version = ['py38']

View File

@@ -36,7 +36,7 @@
}
},
"additionalProperties": false,
"description": "CatalogArtifact(metadata: dbt.contracts.results.CatalogMetadata, nodes: Dict[str, dbt.contracts.results.CatalogTable], sources: Dict[str, dbt.contracts.results.CatalogTable], errors: Union[List[str], NoneType] = None, _compile_results: Union[Any, NoneType] = None)",
"description": "CatalogArtifact(metadata: dbt.contracts.results.CatalogMetadata, nodes: Dict[str, dbt.contracts.results.CatalogTable], sources: Dict[str, dbt.contracts.results.CatalogTable], errors: Optional[List[str]] = None, _compile_results: Optional[Any] = None)",
"definitions": {
"CatalogMetadata": {
"type": "object",
@@ -48,12 +48,12 @@
},
"dbt_version": {
"type": "string",
"default": "1.2.0a1"
"default": "1.5.0a1"
},
"generated_at": {
"type": "string",
"format": "date-time",
"default": "2022-04-15T20:38:22.701177Z"
"default": "2023-01-23T21:56:17.789289Z"
},
"invocation_id": {
"oneOf": [
@@ -64,7 +64,7 @@
"type": "null"
}
],
"default": "34abf75e-59d3-442f-920c-fa3843d98014"
"default": "10c9c26b-6682-4d46-84d2-12f641a070e5"
},
"env": {
"type": "object",
@@ -75,7 +75,7 @@
}
},
"additionalProperties": false,
"description": "CatalogMetadata(dbt_schema_version: str = <factory>, dbt_version: str = '1.2.0a1', generated_at: datetime.datetime = <factory>, invocation_id: Union[str, NoneType] = <factory>, env: Dict[str, str] = <factory>)"
"description": "CatalogMetadata(dbt_schema_version: str = <factory>, dbt_version: str = '1.5.0a1', generated_at: datetime.datetime = <factory>, invocation_id: Optional[str] = <factory>, env: Dict[str, str] = <factory>)"
},
"CatalogTable": {
"type": "object",
@@ -112,7 +112,7 @@
}
},
"additionalProperties": false,
"description": "CatalogTable(metadata: dbt.contracts.results.TableMetadata, columns: Dict[str, dbt.contracts.results.ColumnMetadata], stats: Dict[str, dbt.contracts.results.StatsItem], unique_id: Union[str, NoneType] = None)"
"description": "CatalogTable(metadata: dbt.contracts.results.TableMetadata, columns: Dict[str, dbt.contracts.results.ColumnMetadata], stats: Dict[str, dbt.contracts.results.StatsItem], unique_id: Optional[str] = None)"
},
"TableMetadata": {
"type": "object",
@@ -163,7 +163,7 @@
}
},
"additionalProperties": false,
"description": "TableMetadata(type: str, schema: str, name: str, database: Union[str, NoneType] = None, comment: Union[str, NoneType] = None, owner: Union[str, NoneType] = None)"
"description": "TableMetadata(type: str, schema: str, name: str, database: Optional[str] = None, comment: Optional[str] = None, owner: Optional[str] = None)"
},
"ColumnMetadata": {
"type": "object",
@@ -194,7 +194,7 @@
}
},
"additionalProperties": false,
"description": "ColumnMetadata(type: str, index: int, name: str, comment: Union[str, NoneType] = None)"
"description": "ColumnMetadata(type: str, index: int, name: str, comment: Optional[str] = None)"
},
"StatsItem": {
"type": "object",
@@ -241,7 +241,7 @@
}
},
"additionalProperties": false,
"description": "StatsItem(id: str, label: str, value: Union[bool, str, float, NoneType], include: bool, description: Union[str, NoneType] = None)"
"description": "StatsItem(id: str, label: str, value: Union[bool, str, float, NoneType], include: bool, description: Optional[str] = None)"
}
},
"$schema": "http://json-schema.org/draft-07/schema#",

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -8,6 +8,7 @@
"docs",
"exposures",
"metrics",
"entities",
"selectors"
],
"properties": {
@@ -85,6 +86,13 @@
},
"description": "The metrics defined in the dbt project and its dependencies"
},
"entities": {
"type": "object",
"additionalProperties": {
"$ref": "#/definitions/Entity"
},
"description": "The entities defined in the dbt project and its dependencies"
},
"selectors": {
"type": "object",
"description": "The selectors defined in selectors.yml"
@@ -173,7 +181,7 @@
}
},
"additionalProperties": false,
"description": "WritableManifest(metadata: dbt.contracts.graph.manifest.ManifestMetadata, nodes: Mapping[str, Union[dbt.contracts.graph.nodes.AnalysisNode, dbt.contracts.graph.nodes.SingularTestNode, dbt.contracts.graph.nodes.HookNode, dbt.contracts.graph.nodes.ModelNode, dbt.contracts.graph.nodes.RPCNode, dbt.contracts.graph.nodes.SqlNode, dbt.contracts.graph.nodes.GenericTestNode, dbt.contracts.graph.nodes.SnapshotNode, dbt.contracts.graph.nodes.SeedNode]], sources: Mapping[str, dbt.contracts.graph.nodes.SourceDefinition], macros: Mapping[str, dbt.contracts.graph.nodes.Macro], docs: Mapping[str, dbt.contracts.graph.nodes.Documentation], exposures: Mapping[str, dbt.contracts.graph.nodes.Exposure], metrics: Mapping[str, dbt.contracts.graph.nodes.Metric], selectors: Mapping[str, Any], disabled: Optional[Mapping[str, List[Union[dbt.contracts.graph.nodes.AnalysisNode, dbt.contracts.graph.nodes.SingularTestNode, dbt.contracts.graph.nodes.HookNode, dbt.contracts.graph.nodes.ModelNode, dbt.contracts.graph.nodes.RPCNode, dbt.contracts.graph.nodes.SqlNode, dbt.contracts.graph.nodes.GenericTestNode, dbt.contracts.graph.nodes.SnapshotNode, dbt.contracts.graph.nodes.SeedNode, dbt.contracts.graph.nodes.SourceDefinition]]]], parent_map: Optional[Dict[str, List[str]]], child_map: Optional[Dict[str, List[str]]])",
"description": "WritableManifest(metadata: dbt.contracts.graph.manifest.ManifestMetadata, nodes: Mapping[str, Union[dbt.contracts.graph.nodes.AnalysisNode, dbt.contracts.graph.nodes.SingularTestNode, dbt.contracts.graph.nodes.HookNode, dbt.contracts.graph.nodes.ModelNode, dbt.contracts.graph.nodes.RPCNode, dbt.contracts.graph.nodes.SqlNode, dbt.contracts.graph.nodes.GenericTestNode, dbt.contracts.graph.nodes.SnapshotNode, dbt.contracts.graph.nodes.SeedNode]], sources: Mapping[str, dbt.contracts.graph.nodes.SourceDefinition], macros: Mapping[str, dbt.contracts.graph.nodes.Macro], docs: Mapping[str, dbt.contracts.graph.nodes.Documentation], exposures: Mapping[str, dbt.contracts.graph.nodes.Exposure], metrics: Mapping[str, dbt.contracts.graph.nodes.Metric], entities: Mapping[str, dbt.contracts.graph.nodes.Entity], selectors: Mapping[str, Any], disabled: Optional[Mapping[str, List[Union[dbt.contracts.graph.nodes.AnalysisNode, dbt.contracts.graph.nodes.SingularTestNode, dbt.contracts.graph.nodes.HookNode, dbt.contracts.graph.nodes.ModelNode, dbt.contracts.graph.nodes.RPCNode, dbt.contracts.graph.nodes.SqlNode, dbt.contracts.graph.nodes.GenericTestNode, dbt.contracts.graph.nodes.SnapshotNode, dbt.contracts.graph.nodes.SeedNode, dbt.contracts.graph.nodes.SourceDefinition]]]], parent_map: Optional[Dict[str, List[str]]], child_map: Optional[Dict[str, List[str]]])",
"definitions": {
"ManifestMetadata": {
"type": "object",
@@ -185,12 +193,12 @@
},
"dbt_version": {
"type": "string",
"default": "1.4.0a1"
"default": "1.5.0a1"
},
"generated_at": {
"type": "string",
"format": "date-time",
"default": "2022-12-13T03:30:15.966964Z"
"default": "2023-01-23T21:56:17.790304Z"
},
"invocation_id": {
"oneOf": [
@@ -201,7 +209,7 @@
"type": "null"
}
],
"default": "4f2b967b-7e02-46de-a7ea-268a05e3fab1"
"default": "10c9c26b-6682-4d46-84d2-12f641a070e5"
},
"env": {
"type": "object",
@@ -262,7 +270,6 @@
"AnalysisNode": {
"type": "object",
"required": [
"database",
"schema",
"name",
"resource_type",
@@ -276,7 +283,14 @@
],
"properties": {
"database": {
"type": "string"
"oneOf": [
{
"type": "string"
},
{
"type": "null"
}
]
},
"schema": {
"type": "string"
@@ -400,7 +414,7 @@
},
"created_at": {
"type": "number",
"default": 1670902215.970579
"default": 1674510977.792257
},
"config_call_dict": {
"type": "object",
@@ -454,6 +468,16 @@
},
"default": []
},
"entities": {
"type": "array",
"items": {
"type": "array",
"items": {
"type": "string"
}
},
"default": []
},
"depends_on": {
"$ref": "#/definitions/DependsOn",
"default": {
@@ -498,7 +522,7 @@
}
},
"additionalProperties": false,
"description": "AnalysisNode(database: str, schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.NodeConfig = <factory>, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Optional[str] = None, build_path: Optional[str] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Optional[str] = None, raw_code: str = '', language: str = 'sql', refs: List[List[str]] = <factory>, sources: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, compiled_path: Optional[str] = None, compiled: bool = False, compiled_code: Optional[str] = None, extra_ctes_injected: bool = False, extra_ctes: List[dbt.contracts.graph.nodes.InjectedCTE] = <factory>, _pre_injected_sql: Optional[str] = None)"
"description": "AnalysisNode(database: Optional[str], schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.NodeConfig = <factory>, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Optional[str] = None, build_path: Optional[str] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Optional[str] = None, raw_code: str = '', language: str = 'sql', refs: List[List[str]] = <factory>, sources: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, entities: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, compiled_path: Optional[str] = None, compiled: bool = False, compiled_code: Optional[str] = None, extra_ctes_injected: bool = False, extra_ctes: List[dbt.contracts.graph.nodes.InjectedCTE] = <factory>, _pre_injected_sql: Optional[str] = None)"
},
"FileHash": {
"type": "object",
@@ -811,7 +835,6 @@
"SingularTestNode": {
"type": "object",
"required": [
"database",
"schema",
"name",
"resource_type",
@@ -825,7 +848,14 @@
],
"properties": {
"database": {
"type": "string"
"oneOf": [
{
"type": "string"
},
{
"type": "null"
}
]
},
"schema": {
"type": "string"
@@ -941,7 +971,7 @@
},
"created_at": {
"type": "number",
"default": 1670902215.973521
"default": 1674510977.79368
},
"config_call_dict": {
"type": "object",
@@ -995,6 +1025,16 @@
},
"default": []
},
"entities": {
"type": "array",
"items": {
"type": "array",
"items": {
"type": "string"
}
},
"default": []
},
"depends_on": {
"$ref": "#/definitions/DependsOn",
"default": {
@@ -1039,7 +1079,7 @@
}
},
"additionalProperties": false,
"description": "SingularTestNode(database: str, schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.TestConfig = <factory>, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Optional[str] = None, build_path: Optional[str] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Optional[str] = None, raw_code: str = '', language: str = 'sql', refs: List[List[str]] = <factory>, sources: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, compiled_path: Optional[str] = None, compiled: bool = False, compiled_code: Optional[str] = None, extra_ctes_injected: bool = False, extra_ctes: List[dbt.contracts.graph.nodes.InjectedCTE] = <factory>, _pre_injected_sql: Optional[str] = None)"
"description": "SingularTestNode(database: Optional[str], schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.TestConfig = <factory>, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Optional[str] = None, build_path: Optional[str] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Optional[str] = None, raw_code: str = '', language: str = 'sql', refs: List[List[str]] = <factory>, sources: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, entities: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, compiled_path: Optional[str] = None, compiled: bool = False, compiled_code: Optional[str] = None, extra_ctes_injected: bool = False, extra_ctes: List[dbt.contracts.graph.nodes.InjectedCTE] = <factory>, _pre_injected_sql: Optional[str] = None)"
},
"TestConfig": {
"type": "object",
@@ -1156,7 +1196,6 @@
"HookNode": {
"type": "object",
"required": [
"database",
"schema",
"name",
"resource_type",
@@ -1170,7 +1209,14 @@
],
"properties": {
"database": {
"type": "string"
"oneOf": [
{
"type": "string"
},
{
"type": "null"
}
]
},
"schema": {
"type": "string"
@@ -1294,7 +1340,7 @@
},
"created_at": {
"type": "number",
"default": 1670902215.975156
"default": 1674510977.795094
},
"config_call_dict": {
"type": "object",
@@ -1348,6 +1394,16 @@
},
"default": []
},
"entities": {
"type": "array",
"items": {
"type": "array",
"items": {
"type": "string"
}
},
"default": []
},
"depends_on": {
"$ref": "#/definitions/DependsOn",
"default": {
@@ -1402,12 +1458,11 @@
}
},
"additionalProperties": false,
"description": "HookNode(database: str, schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.NodeConfig = <factory>, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Optional[str] = None, build_path: Optional[str] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Optional[str] = None, raw_code: str = '', language: str = 'sql', refs: List[List[str]] = <factory>, sources: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, compiled_path: Optional[str] = None, compiled: bool = False, compiled_code: Optional[str] = None, extra_ctes_injected: bool = False, extra_ctes: List[dbt.contracts.graph.nodes.InjectedCTE] = <factory>, _pre_injected_sql: Optional[str] = None, index: Optional[int] = None)"
"description": "HookNode(database: Optional[str], schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.NodeConfig = <factory>, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Optional[str] = None, build_path: Optional[str] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Optional[str] = None, raw_code: str = '', language: str = 'sql', refs: List[List[str]] = <factory>, sources: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, entities: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, compiled_path: Optional[str] = None, compiled: bool = False, compiled_code: Optional[str] = None, extra_ctes_injected: bool = False, extra_ctes: List[dbt.contracts.graph.nodes.InjectedCTE] = <factory>, _pre_injected_sql: Optional[str] = None, index: Optional[int] = None)"
},
"ModelNode": {
"type": "object",
"required": [
"database",
"schema",
"name",
"resource_type",
@@ -1421,7 +1476,14 @@
],
"properties": {
"database": {
"type": "string"
"oneOf": [
{
"type": "string"
},
{
"type": "null"
}
]
},
"schema": {
"type": "string"
@@ -1545,7 +1607,7 @@
},
"created_at": {
"type": "number",
"default": 1670902215.976732
"default": 1674510977.7959611
},
"config_call_dict": {
"type": "object",
@@ -1599,6 +1661,16 @@
},
"default": []
},
"entities": {
"type": "array",
"items": {
"type": "array",
"items": {
"type": "string"
}
},
"default": []
},
"depends_on": {
"$ref": "#/definitions/DependsOn",
"default": {
@@ -1643,12 +1715,11 @@
}
},
"additionalProperties": false,
"description": "ModelNode(database: str, schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.NodeConfig = <factory>, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Optional[str] = None, build_path: Optional[str] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Optional[str] = None, raw_code: str = '', language: str = 'sql', refs: List[List[str]] = <factory>, sources: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, compiled_path: Optional[str] = None, compiled: bool = False, compiled_code: Optional[str] = None, extra_ctes_injected: bool = False, extra_ctes: List[dbt.contracts.graph.nodes.InjectedCTE] = <factory>, _pre_injected_sql: Optional[str] = None)"
"description": "ModelNode(database: Optional[str], schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.NodeConfig = <factory>, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Optional[str] = None, build_path: Optional[str] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Optional[str] = None, raw_code: str = '', language: str = 'sql', refs: List[List[str]] = <factory>, sources: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, entities: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, compiled_path: Optional[str] = None, compiled: bool = False, compiled_code: Optional[str] = None, extra_ctes_injected: bool = False, extra_ctes: List[dbt.contracts.graph.nodes.InjectedCTE] = <factory>, _pre_injected_sql: Optional[str] = None)"
},
"RPCNode": {
"type": "object",
"required": [
"database",
"schema",
"name",
"resource_type",
@@ -1662,7 +1733,14 @@
],
"properties": {
"database": {
"type": "string"
"oneOf": [
{
"type": "string"
},
{
"type": "null"
}
]
},
"schema": {
"type": "string"
@@ -1786,7 +1864,7 @@
},
"created_at": {
"type": "number",
"default": 1670902215.978195
"default": 1674510977.796774
},
"config_call_dict": {
"type": "object",
@@ -1840,6 +1918,16 @@
},
"default": []
},
"entities": {
"type": "array",
"items": {
"type": "array",
"items": {
"type": "string"
}
},
"default": []
},
"depends_on": {
"$ref": "#/definitions/DependsOn",
"default": {
@@ -1884,12 +1972,11 @@
}
},
"additionalProperties": false,
"description": "RPCNode(database: str, schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.NodeConfig = <factory>, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Optional[str] = None, build_path: Optional[str] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Optional[str] = None, raw_code: str = '', language: str = 'sql', refs: List[List[str]] = <factory>, sources: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, compiled_path: Optional[str] = None, compiled: bool = False, compiled_code: Optional[str] = None, extra_ctes_injected: bool = False, extra_ctes: List[dbt.contracts.graph.nodes.InjectedCTE] = <factory>, _pre_injected_sql: Optional[str] = None)"
"description": "RPCNode(database: Optional[str], schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.NodeConfig = <factory>, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Optional[str] = None, build_path: Optional[str] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Optional[str] = None, raw_code: str = '', language: str = 'sql', refs: List[List[str]] = <factory>, sources: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, entities: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, compiled_path: Optional[str] = None, compiled: bool = False, compiled_code: Optional[str] = None, extra_ctes_injected: bool = False, extra_ctes: List[dbt.contracts.graph.nodes.InjectedCTE] = <factory>, _pre_injected_sql: Optional[str] = None)"
},
"SqlNode": {
"type": "object",
"required": [
"database",
"schema",
"name",
"resource_type",
@@ -1903,7 +1990,14 @@
],
"properties": {
"database": {
"type": "string"
"oneOf": [
{
"type": "string"
},
{
"type": "null"
}
]
},
"schema": {
"type": "string"
@@ -2027,7 +2121,7 @@
},
"created_at": {
"type": "number",
"default": 1670902215.979718
"default": 1674510977.797567
},
"config_call_dict": {
"type": "object",
@@ -2081,6 +2175,16 @@
},
"default": []
},
"entities": {
"type": "array",
"items": {
"type": "array",
"items": {
"type": "string"
}
},
"default": []
},
"depends_on": {
"$ref": "#/definitions/DependsOn",
"default": {
@@ -2125,13 +2229,12 @@
}
},
"additionalProperties": false,
"description": "SqlNode(database: str, schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.NodeConfig = <factory>, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Optional[str] = None, build_path: Optional[str] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Optional[str] = None, raw_code: str = '', language: str = 'sql', refs: List[List[str]] = <factory>, sources: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, compiled_path: Optional[str] = None, compiled: bool = False, compiled_code: Optional[str] = None, extra_ctes_injected: bool = False, extra_ctes: List[dbt.contracts.graph.nodes.InjectedCTE] = <factory>, _pre_injected_sql: Optional[str] = None)"
"description": "SqlNode(database: Optional[str], schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.NodeConfig = <factory>, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Optional[str] = None, build_path: Optional[str] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Optional[str] = None, raw_code: str = '', language: str = 'sql', refs: List[List[str]] = <factory>, sources: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, entities: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, compiled_path: Optional[str] = None, compiled: bool = False, compiled_code: Optional[str] = None, extra_ctes_injected: bool = False, extra_ctes: List[dbt.contracts.graph.nodes.InjectedCTE] = <factory>, _pre_injected_sql: Optional[str] = None)"
},
"GenericTestNode": {
"type": "object",
"required": [
"test_metadata",
"database",
"schema",
"name",
"resource_type",
@@ -2148,7 +2251,14 @@
"$ref": "#/definitions/TestMetadata"
},
"database": {
"type": "string"
"oneOf": [
{
"type": "string"
},
{
"type": "null"
}
]
},
"schema": {
"type": "string"
@@ -2264,7 +2374,7 @@
},
"created_at": {
"type": "number",
"default": 1670902215.981434
"default": 1674510977.79852
},
"config_call_dict": {
"type": "object",
@@ -2318,6 +2428,16 @@
},
"default": []
},
"entities": {
"type": "array",
"items": {
"type": "array",
"items": {
"type": "string"
}
},
"default": []
},
"depends_on": {
"$ref": "#/definitions/DependsOn",
"default": {
@@ -2382,7 +2502,7 @@
}
},
"additionalProperties": false,
"description": "GenericTestNode(test_metadata: dbt.contracts.graph.nodes.TestMetadata, database: str, schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.TestConfig = <factory>, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Optional[str] = None, build_path: Optional[str] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Optional[str] = None, raw_code: str = '', language: str = 'sql', refs: List[List[str]] = <factory>, sources: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, compiled_path: Optional[str] = None, compiled: bool = False, compiled_code: Optional[str] = None, extra_ctes_injected: bool = False, extra_ctes: List[dbt.contracts.graph.nodes.InjectedCTE] = <factory>, _pre_injected_sql: Optional[str] = None, column_name: Optional[str] = None, file_key_name: Optional[str] = None)"
"description": "GenericTestNode(test_metadata: dbt.contracts.graph.nodes.TestMetadata, database: Optional[str], schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.TestConfig = <factory>, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Optional[str] = None, build_path: Optional[str] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Optional[str] = None, raw_code: str = '', language: str = 'sql', refs: List[List[str]] = <factory>, sources: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, entities: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, compiled_path: Optional[str] = None, compiled: bool = False, compiled_code: Optional[str] = None, extra_ctes_injected: bool = False, extra_ctes: List[dbt.contracts.graph.nodes.InjectedCTE] = <factory>, _pre_injected_sql: Optional[str] = None, column_name: Optional[str] = None, file_key_name: Optional[str] = None)"
},
"TestMetadata": {
"type": "object",
@@ -2414,7 +2534,6 @@
"SnapshotNode": {
"type": "object",
"required": [
"database",
"schema",
"name",
"resource_type",
@@ -2429,7 +2548,14 @@
],
"properties": {
"database": {
"type": "string"
"oneOf": [
{
"type": "string"
},
{
"type": "null"
}
]
},
"schema": {
"type": "string"
@@ -2529,7 +2655,7 @@
},
"created_at": {
"type": "number",
"default": 1670902215.984685
"default": 1674510977.79998
},
"config_call_dict": {
"type": "object",
@@ -2583,6 +2709,16 @@
},
"default": []
},
"entities": {
"type": "array",
"items": {
"type": "array",
"items": {
"type": "string"
}
},
"default": []
},
"depends_on": {
"$ref": "#/definitions/DependsOn",
"default": {
@@ -2627,7 +2763,7 @@
}
},
"additionalProperties": false,
"description": "SnapshotNode(database: str, schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.SnapshotConfig, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Optional[str] = None, build_path: Optional[str] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Optional[str] = None, raw_code: str = '', language: str = 'sql', refs: List[List[str]] = <factory>, sources: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, compiled_path: Optional[str] = None, compiled: bool = False, compiled_code: Optional[str] = None, extra_ctes_injected: bool = False, extra_ctes: List[dbt.contracts.graph.nodes.InjectedCTE] = <factory>, _pre_injected_sql: Optional[str] = None)"
"description": "SnapshotNode(database: Optional[str], schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.SnapshotConfig, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Optional[str] = None, build_path: Optional[str] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Optional[str] = None, raw_code: str = '', language: str = 'sql', refs: List[List[str]] = <factory>, sources: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, entities: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, compiled_path: Optional[str] = None, compiled: bool = False, compiled_code: Optional[str] = None, extra_ctes_injected: bool = False, extra_ctes: List[dbt.contracts.graph.nodes.InjectedCTE] = <factory>, _pre_injected_sql: Optional[str] = None)"
},
"SnapshotConfig": {
"type": "object",
@@ -2837,7 +2973,6 @@
"SeedNode": {
"type": "object",
"required": [
"database",
"schema",
"name",
"resource_type",
@@ -2851,7 +2986,14 @@
],
"properties": {
"database": {
"type": "string"
"oneOf": [
{
"type": "string"
},
{
"type": "null"
}
]
},
"schema": {
"type": "string"
@@ -2976,7 +3118,7 @@
},
"created_at": {
"type": "number",
"default": 1670902215.987447
"default": 1674510977.801306
},
"config_call_dict": {
"type": "object",
@@ -3008,7 +3150,7 @@
}
},
"additionalProperties": false,
"description": "SeedNode(database: str, schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.SeedConfig = <factory>, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Optional[str] = None, build_path: Optional[str] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Optional[str] = None, raw_code: str = '', root_path: Optional[str] = None)"
"description": "SeedNode(database: Optional[str], schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], alias: str, checksum: dbt.contracts.files.FileHash, config: dbt.contracts.graph.model_config.SeedConfig = <factory>, _event_status: Dict[str, Any] = <factory>, tags: List[str] = <factory>, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, docs: dbt.contracts.graph.unparsed.Docs = <factory>, patch_path: Optional[str] = None, build_path: Optional[str] = None, deferred: bool = False, unrendered_config: Dict[str, Any] = <factory>, created_at: float = <factory>, config_call_dict: Dict[str, Any] = <factory>, relation_name: Optional[str] = None, raw_code: str = '', root_path: Optional[str] = None)"
},
"SeedConfig": {
"type": "object",
@@ -3178,7 +3320,6 @@
"SourceDefinition": {
"type": "object",
"required": [
"database",
"schema",
"name",
"resource_type",
@@ -3194,7 +3335,14 @@
],
"properties": {
"database": {
"type": "string"
"oneOf": [
{
"type": "string"
},
{
"type": "null"
}
]
},
"schema": {
"type": "string"
@@ -3335,11 +3483,11 @@
},
"created_at": {
"type": "number",
"default": 1670902215.989922
"default": 1674510977.802621
}
},
"additionalProperties": false,
"description": "SourceDefinition(database: str, schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], source_name: str, source_description: str, loader: str, identifier: str, _event_status: Dict[str, Any] = <factory>, quoting: dbt.contracts.graph.unparsed.Quoting = <factory>, loaded_at_field: Optional[str] = None, freshness: Optional[dbt.contracts.graph.unparsed.FreshnessThreshold] = None, external: Optional[dbt.contracts.graph.unparsed.ExternalTable] = None, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, source_meta: Dict[str, Any] = <factory>, tags: List[str] = <factory>, config: dbt.contracts.graph.model_config.SourceConfig = <factory>, patch_path: Optional[str] = None, unrendered_config: Dict[str, Any] = <factory>, relation_name: Optional[str] = None, created_at: float = <factory>)"
"description": "SourceDefinition(database: Optional[str], schema: str, name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], source_name: str, source_description: str, loader: str, identifier: str, _event_status: Dict[str, Any] = <factory>, quoting: dbt.contracts.graph.unparsed.Quoting = <factory>, loaded_at_field: Optional[str] = None, freshness: Optional[dbt.contracts.graph.unparsed.FreshnessThreshold] = None, external: Optional[dbt.contracts.graph.unparsed.ExternalTable] = None, description: str = '', columns: Dict[str, dbt.contracts.graph.nodes.ColumnInfo] = <factory>, meta: Dict[str, Any] = <factory>, source_meta: Dict[str, Any] = <factory>, tags: List[str] = <factory>, config: dbt.contracts.graph.model_config.SourceConfig = <factory>, patch_path: Optional[str] = None, unrendered_config: Dict[str, Any] = <factory>, relation_name: Optional[str] = None, created_at: float = <factory>)"
},
"Quoting": {
"type": "object",
@@ -3445,12 +3593,12 @@
},
"dbt_version": {
"type": "string",
"default": "1.4.0a1"
"default": "1.5.0a1"
},
"generated_at": {
"type": "string",
"format": "date-time",
"default": "2022-12-13T03:30:15.961825Z"
"default": "2023-01-23T21:56:17.787436Z"
},
"invocation_id": {
"oneOf": [
@@ -3461,7 +3609,7 @@
"type": "null"
}
],
"default": "4f2b967b-7e02-46de-a7ea-268a05e3fab1"
"default": "10c9c26b-6682-4d46-84d2-12f641a070e5"
},
"env": {
"type": "object",
@@ -3472,7 +3620,7 @@
}
},
"additionalProperties": false,
"description": "FreshnessMetadata(dbt_schema_version: str = <factory>, dbt_version: str = '1.4.0a1', generated_at: datetime.datetime = <factory>, invocation_id: Optional[str] = <factory>, env: Dict[str, str] = <factory>)"
"description": "FreshnessMetadata(dbt_schema_version: str = <factory>, dbt_version: str = '1.5.0a1', generated_at: datetime.datetime = <factory>, invocation_id: Optional[str] = <factory>, env: Dict[str, str] = <factory>)"
},
"SourceFreshnessRuntimeError": {
"type": "object",
@@ -3814,7 +3962,7 @@
},
"created_at": {
"type": "number",
"default": 1670902215.990816
"default": 1674510977.8031092
},
"supported_languages": {
"oneOf": [
@@ -4070,13 +4218,23 @@
},
"default": []
},
"entities": {
"type": "array",
"items": {
"type": "array",
"items": {
"type": "string"
}
},
"default": []
},
"created_at": {
"type": "number",
"default": 1670902215.993354
"default": 1674510977.8040562
}
},
"additionalProperties": false,
"description": "Exposure(name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], type: dbt.contracts.graph.unparsed.ExposureType, owner: dbt.contracts.graph.unparsed.ExposureOwner, description: str = '', label: Optional[str] = None, maturity: Optional[dbt.contracts.graph.unparsed.MaturityType] = None, meta: Dict[str, Any] = <factory>, tags: List[str] = <factory>, config: dbt.contracts.graph.model_config.ExposureConfig = <factory>, unrendered_config: Dict[str, Any] = <factory>, url: Optional[str] = None, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, refs: List[List[str]] = <factory>, sources: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, created_at: float = <factory>)"
"description": "Exposure(name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], type: dbt.contracts.graph.unparsed.ExposureType, owner: dbt.contracts.graph.unparsed.ExposureOwner, description: str = '', label: Optional[str] = None, maturity: Optional[dbt.contracts.graph.unparsed.MaturityType] = None, meta: Dict[str, Any] = <factory>, tags: List[str] = <factory>, config: dbt.contracts.graph.model_config.ExposureConfig = <factory>, unrendered_config: Dict[str, Any] = <factory>, url: Optional[str] = None, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, refs: List[List[str]] = <factory>, sources: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, entities: List[List[str]] = <factory>, created_at: float = <factory>)"
},
"ExposureOwner": {
"type": "object",
@@ -4126,7 +4284,6 @@
"description",
"label",
"calculation_method",
"timestamp",
"expression",
"filters",
"time_grains",
@@ -4169,9 +4326,6 @@
"calculation_method": {
"type": "string"
},
"timestamp": {
"type": "string"
},
"expression": {
"type": "string"
},
@@ -4193,6 +4347,16 @@
"type": "string"
}
},
"timestamp": {
"oneOf": [
{
"type": "string"
},
{
"type": "null"
}
]
},
"window": {
"oneOf": [
{
@@ -4281,13 +4445,23 @@
},
"default": []
},
"entities": {
"type": "array",
"items": {
"type": "array",
"items": {
"type": "string"
}
},
"default": []
},
"created_at": {
"type": "number",
"default": 1670902215.995033
"default": 1674510977.804972
}
},
"additionalProperties": false,
"description": "Metric(name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], description: str, label: str, calculation_method: str, timestamp: str, expression: str, filters: List[dbt.contracts.graph.unparsed.MetricFilter], time_grains: List[str], dimensions: List[str], window: Optional[dbt.contracts.graph.unparsed.MetricTime] = None, model: Optional[str] = None, model_unique_id: Optional[str] = None, meta: Dict[str, Any] = <factory>, tags: List[str] = <factory>, config: dbt.contracts.graph.model_config.MetricConfig = <factory>, unrendered_config: Dict[str, Any] = <factory>, sources: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, refs: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, created_at: float = <factory>)"
"description": "Metric(name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], description: str, label: str, calculation_method: str, expression: str, filters: List[dbt.contracts.graph.unparsed.MetricFilter], time_grains: List[str], dimensions: List[str], timestamp: Optional[str] = None, window: Optional[dbt.contracts.graph.unparsed.MetricTime] = None, model: Optional[str] = None, model_unique_id: Optional[str] = None, meta: Dict[str, Any] = <factory>, tags: List[str] = <factory>, config: dbt.contracts.graph.model_config.MetricConfig = <factory>, unrendered_config: Dict[str, Any] = <factory>, sources: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, refs: List[List[str]] = <factory>, metrics: List[List[str]] = <factory>, entities: List[List[str]] = <factory>, created_at: float = <factory>)"
},
"MetricFilter": {
"type": "object",
@@ -4355,6 +4529,148 @@
},
"additionalProperties": true,
"description": "MetricConfig(_extra: Dict[str, Any] = <factory>, enabled: bool = True)"
},
"Entity": {
"type": "object",
"required": [
"name",
"resource_type",
"package_name",
"path",
"original_file_path",
"unique_id",
"fqn",
"model",
"description",
"dimensions"
],
"properties": {
"name": {
"type": "string"
},
"resource_type": {
"type": "string",
"enum": [
"entity"
]
},
"package_name": {
"type": "string"
},
"path": {
"type": "string"
},
"original_file_path": {
"type": "string"
},
"unique_id": {
"type": "string"
},
"fqn": {
"type": "array",
"items": {
"type": "string"
}
},
"model": {
"type": "string"
},
"description": {
"type": "string"
},
"dimensions": {
"type": "array",
"items": {
"type": "string"
}
},
"model_unique_id": {
"oneOf": [
{
"type": "string"
},
{
"type": "null"
}
]
},
"meta": {
"type": "object",
"default": {}
},
"tags": {
"type": "array",
"items": {
"type": "string"
},
"default": []
},
"config": {
"$ref": "#/definitions/EntityConfig",
"default": {
"enabled": true
}
},
"unrendered_config": {
"type": "object",
"default": {}
},
"sources": {
"type": "array",
"items": {
"type": "array",
"items": {
"type": "string"
}
},
"default": []
},
"depends_on": {
"$ref": "#/definitions/DependsOn",
"default": {
"macros": [],
"nodes": []
}
},
"refs": {
"type": "array",
"items": {
"type": "array",
"items": {
"type": "string"
}
},
"default": []
},
"entities": {
"type": "array",
"items": {
"type": "array",
"items": {
"type": "string"
}
},
"default": []
},
"created_at": {
"type": "number",
"default": 1674510977.805523
}
},
"additionalProperties": false,
"description": "Entity(name: str, resource_type: dbt.node_types.NodeType, package_name: str, path: str, original_file_path: str, unique_id: str, fqn: List[str], model: str, description: str, dimensions: List[str], model_unique_id: Optional[str] = None, meta: Dict[str, Any] = <factory>, tags: List[str] = <factory>, config: dbt.contracts.graph.model_config.EntityConfig = <factory>, unrendered_config: Dict[str, Any] = <factory>, sources: List[List[str]] = <factory>, depends_on: dbt.contracts.graph.nodes.DependsOn = <factory>, refs: List[List[str]] = <factory>, entities: List[List[str]] = <factory>, created_at: float = <factory>)"
},
"EntityConfig": {
"type": "object",
"required": [],
"properties": {
"enabled": {
"type": "boolean",
"default": true
}
},
"additionalProperties": true,
"description": "EntityConfig(_extra: Dict[str, Any] = <factory>, enabled: bool = True)"
}
},
"$schema": "http://json-schema.org/draft-07/schema#",

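For readers skimming the new `Entity` definition above: a hand-rolled example of what a conforming manifest entry could look like. All field values here are invented for illustration; only the key set comes from the schema's `required` list, and the check below is a plain-Python stand-in rather than a real JSON Schema validation:

```python
entity = {
    "name": "customer",
    "resource_type": "entity",
    "package_name": "my_project",
    "path": "entities.yml",
    "original_file_path": "models/entities.yml",
    "unique_id": "entity.my_project.customer",
    "fqn": ["my_project", "customer"],
    "model": "ref('dim_customers')",
    "description": "One record per customer",
    "dimensions": ["favorite_color", "loves_dbt"],
}

# the required keys from the Entity definition above
REQUIRED = [
    "name", "resource_type", "package_name", "path", "original_file_path",
    "unique_id", "fqn", "model", "description", "dimensions",
]

missing = [key for key in REQUIRED if key not in entity]
assert not missing, f"entity entry is missing required keys: {missing}"
```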

@@ -37,12 +37,12 @@
},
"dbt_version": {
"type": "string",
"default": "1.2.0a1"
"default": "1.5.0a1"
},
"generated_at": {
"type": "string",
"format": "date-time",
"default": "2022-04-15T20:38:22.700175Z"
"default": "2023-01-23T21:56:17.788708Z"
},
"invocation_id": {
"oneOf": [
@@ -53,7 +53,7 @@
"type": "null"
}
],
"default": "34abf75e-59d3-442f-920c-fa3843d98014"
"default": "10c9c26b-6682-4d46-84d2-12f641a070e5"
},
"env": {
"type": "object",
@@ -64,7 +64,7 @@
}
},
"additionalProperties": false,
"description": "BaseArtifactMetadata(dbt_schema_version: str, dbt_version: str = '1.2.0a1', generated_at: datetime.datetime = <factory>, invocation_id: Union[str, NoneType] = <factory>, env: Dict[str, str] = <factory>)"
"description": "BaseArtifactMetadata(dbt_schema_version: str, dbt_version: str = '1.5.0a1', generated_at: datetime.datetime = <factory>, invocation_id: Optional[str] = <factory>, env: Dict[str, str] = <factory>)"
},
"RunResultOutput": {
"type": "object",
@@ -148,7 +148,7 @@
}
},
"additionalProperties": false,
"description": "RunResultOutput(status: Union[dbt.contracts.results.RunStatus, dbt.contracts.results.TestStatus, dbt.contracts.results.FreshnessStatus], timing: List[dbt.contracts.results.TimingInfo], thread_id: str, execution_time: float, adapter_response: Dict[str, Any], message: Union[str, NoneType], failures: Union[int, NoneType], unique_id: str)"
"description": "RunResultOutput(status: Union[dbt.contracts.results.RunStatus, dbt.contracts.results.TestStatus, dbt.contracts.results.FreshnessStatus], timing: List[dbt.contracts.results.TimingInfo], thread_id: str, execution_time: float, adapter_response: Dict[str, Any], message: Optional[str], failures: Optional[int], unique_id: str)"
},
"TimingInfo": {
"type": "object",
@@ -183,7 +183,7 @@
}
},
"additionalProperties": false,
"description": "TimingInfo(name: str, started_at: Union[datetime.datetime, NoneType] = None, completed_at: Union[datetime.datetime, NoneType] = None)"
"description": "TimingInfo(name: str, started_at: Optional[datetime.datetime] = None, completed_at: Optional[datetime.datetime] = None)"
},
"FreshnessMetadata": {
"type": "object",
@@ -195,12 +195,12 @@
},
"dbt_version": {
"type": "string",
"default": "1.2.0a1"
"default": "1.5.0a1"
},
"generated_at": {
"type": "string",
"format": "date-time",
"default": "2022-04-15T20:38:22.697740Z"
"default": "2023-01-23T21:56:17.787436Z"
},
"invocation_id": {
"oneOf": [
@@ -211,7 +211,7 @@
"type": "null"
}
],
"default": "34abf75e-59d3-442f-920c-fa3843d98014"
"default": "10c9c26b-6682-4d46-84d2-12f641a070e5"
},
"env": {
"type": "object",
@@ -222,7 +222,7 @@
}
},
"additionalProperties": false,
"description": "FreshnessMetadata(dbt_schema_version: str = <factory>, dbt_version: str = '1.2.0a1', generated_at: datetime.datetime = <factory>, invocation_id: Union[str, NoneType] = <factory>, env: Dict[str, str] = <factory>)"
"description": "FreshnessMetadata(dbt_schema_version: str = <factory>, dbt_version: str = '1.5.0a1', generated_at: datetime.datetime = <factory>, invocation_id: Optional[str] = <factory>, env: Dict[str, str] = <factory>)"
},
"SourceFreshnessRuntimeError": {
"type": "object",
@@ -361,7 +361,7 @@
}
},
"additionalProperties": false,
"description": "FreshnessThreshold(warn_after: Union[dbt.contracts.graph.unparsed.Time, NoneType] = <factory>, error_after: Union[dbt.contracts.graph.unparsed.Time, NoneType] = <factory>, filter: Union[str, NoneType] = None)"
"description": "FreshnessThreshold(warn_after: Optional[dbt.contracts.graph.unparsed.Time] = <factory>, error_after: Optional[dbt.contracts.graph.unparsed.Time] = <factory>, filter: Optional[str] = None)"
},
"Time": {
"type": "object",
@@ -394,7 +394,7 @@
}
},
"additionalProperties": false,
"description": "Time(count: Union[int, NoneType] = None, period: Union[dbt.contracts.graph.unparsed.TimePeriod, NoneType] = None)"
"description": "Time(count: Optional[int] = None, period: Optional[dbt.contracts.graph.unparsed.TimePeriod] = None)"
}
},
"$schema": "http://json-schema.org/draft-07/schema#",


@@ -39,12 +39,12 @@
},
"dbt_version": {
"type": "string",
"default": "1.2.0a1"
"default": "1.5.0a1"
},
"generated_at": {
"type": "string",
"format": "date-time",
"default": "2022-04-15T20:38:22.697740Z"
"default": "2023-01-23T21:56:17.787436Z"
},
"invocation_id": {
"oneOf": [
@@ -55,7 +55,7 @@
"type": "null"
}
],
"default": "34abf75e-59d3-442f-920c-fa3843d98014"
"default": "10c9c26b-6682-4d46-84d2-12f641a070e5"
},
"env": {
"type": "object",
@@ -66,7 +66,7 @@
}
},
"additionalProperties": false,
"description": "FreshnessMetadata(dbt_schema_version: str = <factory>, dbt_version: str = '1.2.0a1', generated_at: datetime.datetime = <factory>, invocation_id: Union[str, NoneType] = <factory>, env: Dict[str, str] = <factory>)"
"description": "FreshnessMetadata(dbt_schema_version: str = <factory>, dbt_version: str = '1.5.0a1', generated_at: datetime.datetime = <factory>, invocation_id: Optional[str] = <factory>, env: Dict[str, str] = <factory>)"
},
"SourceFreshnessRuntimeError": {
"type": "object",
@@ -205,7 +205,7 @@
}
},
"additionalProperties": false,
"description": "FreshnessThreshold(warn_after: Union[dbt.contracts.graph.unparsed.Time, NoneType] = <factory>, error_after: Union[dbt.contracts.graph.unparsed.Time, NoneType] = <factory>, filter: Union[str, NoneType] = None)"
"description": "FreshnessThreshold(warn_after: Optional[dbt.contracts.graph.unparsed.Time] = <factory>, error_after: Optional[dbt.contracts.graph.unparsed.Time] = <factory>, filter: Optional[str] = None)"
},
"Time": {
"type": "object",
@@ -238,7 +238,7 @@
}
},
"additionalProperties": false,
"description": "Time(count: Union[int, NoneType] = None, period: Union[dbt.contracts.graph.unparsed.TimePeriod, NoneType] = None)"
"description": "Time(count: Optional[int] = None, period: Optional[dbt.contracts.graph.unparsed.TimePeriod] = None)"
},
"TimingInfo": {
"type": "object",
@@ -273,7 +273,7 @@
}
},
"additionalProperties": false,
"description": "TimingInfo(name: str, started_at: Union[datetime.datetime, NoneType] = None, completed_at: Union[datetime.datetime, NoneType] = None)"
"description": "TimingInfo(name: str, started_at: Optional[datetime.datetime] = None, completed_at: Optional[datetime.datetime] = None)"
}
},
"$schema": "http://json-schema.org/draft-07/schema#",


@@ -1,23 +0,0 @@
name: 'local_dep'
version: '1.0'
config-version: 2
profile: 'default'
model-paths: ["models"]
analysis-paths: ["analyses"]
test-paths: ["tests"]
seed-paths: ["seeds"]
macro-paths: ["macros"]
require-dbt-version: '>=0.1.0'
target-path: "target" # directory which will store compiled SQL files
clean-targets: # directories to be removed by `dbt clean`
- "target"
- "dbt_packages"
seeds:
quote_columns: False


@@ -1,3 +0,0 @@
{% macro some_overridden_macro() -%}
100
{%- endmacro %}


@@ -1 +0,0 @@
select * from {{ ref('seed') }}


@@ -1,10 +0,0 @@
version: 2
sources:
- name: seed_source
schema: "{{ var('schema_override', target.schema) }}"
tables:
- name: "seed"
columns:
- name: id
tests:
- unique


@@ -1,19 +0,0 @@
{% test type_one(model) %}
select * from (
select * from {{ model }}
union all
select * from {{ ref('model_b') }}
) as Foo
{% endtest %}
{% test type_two(model) %}
{{ config(severity = "WARN") }}
select * from {{ model }}
{% endtest %}


@@ -1,19 +0,0 @@
{% test type_one(model) %}
select * from (
select * from {{ model }}
union all
select * from {{ ref('model_b') }}
) as Foo
{% endtest %}
{% test type_two(model) %}
{{ config(severity = "ERROR") }}
select * from {{ model }}
{% endtest %}


@@ -1,19 +0,0 @@
with source as (
select * from {{ source('seed_sources', 'raw_customers') }}
),
renamed as (
select
id as customer_id,
first_name,
last_name,
email
from source
)
select * from renamed


@@ -1,5 +0,0 @@
{% docs customer_table %}
This table contains customer data
{% enddocs %}


@@ -1,5 +0,0 @@
{% docs customer_table %}
LOTS of customer data
{% enddocs %}


@@ -1,18 +0,0 @@
version: 2
sources:
- name: seed_sources
schema: "{{ target.schema }}"
database: "{{ env_var('ENV_VAR_DATABASE') }}"
tables:
- name: raw_customers
columns:
- name: id
tests:
- not_null:
severity: "{{ env_var('ENV_VAR_SEVERITY') }}"
- unique
- name: first_name
- name: last_name
- name: email


@@ -1,7 +0,0 @@
{% macro do_something(foo2, bar2) %}
select
'{{ foo2 }}' as foo2,
'{{ bar2 }}' as bar2
{% endmacro %}


@@ -1,7 +0,0 @@
version: 2
macros:
- name: do_something
description: "This is a test macro"
meta:
some_key: "{{ env_var('ENV_VAR_SOME_KEY') }}"


@@ -1,30 +0,0 @@
version: 2
metrics:
- model: "ref('people')"
name: number_of_people
description: Total count of people
label: "Number of people"
calculation_method: count
expression: "*"
timestamp: created_at
time_grains: [day, week, month]
dimensions:
- favorite_color
- loves_dbt
meta:
my_meta: '{{ env_var("ENV_VAR_METRICS") }}'
- model: "ref('people')"
name: collective_tenure
description: Total number of years of team experience
label: "Collective tenure"
calculation_method: sum
expression: tenure
timestamp: created_at
time_grains: [day]
filters:
- field: loves_dbt
operator: is
value: 'true'
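Note how this (now removed) fixture pairs with the `Metric` schema change earlier in this compare: `timestamp` moved out of the required list, so a metric that omits it entirely now passes the contract. A small stand-in check of the relaxed rule (field names mirror the fixture; the required subset below is illustrative, not dbt's full parser validation):

```python
# illustrative subset of the Metric required fields, with timestamp dropped
REQUIRED = ["name", "label", "calculation_method", "expression", "time_grains", "dimensions"]

metrics = [
    {"name": "number_of_people", "label": "Number of people",
     "calculation_method": "count", "expression": "*",
     "timestamp": "created_at", "time_grains": ["day", "week", "month"],
     "dimensions": ["favorite_color", "loves_dbt"]},
    {"name": "collective_tenure", "label": "Collective tenure",
     "calculation_method": "sum", "expression": "tenure",
     "time_grains": ["day"], "dimensions": []},  # no timestamp: allowed now
]

for m in metrics:
    missing = [k for k in REQUIRED if k not in m]
    assert not missing, missing
print("both metrics pass the relaxed required-field check")
```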


@@ -1 +0,0 @@
select '{{ env_var('ENV_VAR_TEST') }}' as vartest


@@ -1 +0,0 @@
select 'blue' as fun


@@ -1,8 +0,0 @@
version: 2
models:
- name: model_color
columns:
- name: fun
tests:
- unique:
enabled: "{{ env_var('ENV_VAR_ENABLED', True) }}"


@@ -1,6 +0,0 @@
version: 2
models:
- name: model_one
config:
materialized: "{{ env_var('TEST_SCHEMA_VAR') }}"


@@ -1,11 +0,0 @@
version: 2
models:
- name: model_one
config:
materialized: "{{ env_var('TEST_SCHEMA_VAR') }}"
tests:
- check_color:
column_name: fun
color: "env_var('ENV_VAR_COLOR')"


@@ -1,21 +0,0 @@
version: 2
models:
- name: model_one
config:
materialized: "{{ env_var('TEST_SCHEMA_VAR') }}"
tests:
- check_color:
column_name: fun
color: "env_var('ENV_VAR_COLOR')"
exposures:
- name: proxy_for_dashboard
description: "This is for the XXX dashboard"
type: "dashboard"
owner:
name: "{{ env_var('ENV_VAR_OWNER') }}"
email: "tester@dashboard.com"
depends_on:
- ref("model_color")
- source("seed_sources", "raw_customers")


@@ -1,9 +0,0 @@
version: 2
models:
- name: orders
description: "Some order data"
columns:
- name: id
tests:
- unique


@@ -1,26 +0,0 @@
{% test is_odd(model, column_name) %}
with validation as (
select
{{ column_name }} as odd_field
from {{ model }}
),
validation_errors as (
select
odd_field
from validation
-- if this is true, then odd_field is actually even!
where (odd_field % 2) = 0
)
select *
from validation_errors
{% endtest %}


@@ -1,26 +0,0 @@
{% test is_odd(model, column_name) %}

with validation as (

    select
        {{ column_name }} as odd_field2

    from {{ model }}

),

validation_errors as (

    select
        odd_field2

    from validation
    -- if this is true, then odd_field2 is actually even!
    where (odd_field2 % 2) = 0

)

select *
from validation_errors

{% endtest %}

View File

@@ -1,10 +0,0 @@
version: 2

models:
  - name: orders
    description: "Some order data"
    columns:
      - name: id
        tests:
          - unique
          - is_odd

View File

@@ -1,6 +0,0 @@
-- custom macro
{% macro generate_schema_name(schema_name, node) %}

    {{ schema_name }}_{{ target.schema }}

{% endmacro %}

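The override above unconditionally appends the target schema to the configured schema name. For comparison, dbt's built-in default, paraphrased here from dbt-core's global project macros (whitespace may differ), only adds a suffix when a custom schema is configured:

    {% macro generate_schema_name(custom_schema_name, node) -%}
        {%- set default_schema = target.schema -%}
        {%- if custom_schema_name is none -%}
            {{ default_schema }}
        {%- else -%}
            {{ default_schema }}_{{ custom_schema_name | trim }}
        {%- endif -%}
    {%- endmacro %}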
View File

@@ -1,6 +0,0 @@
-- custom macro xxxx
{% macro generate_schema_name(schema_name, node) %}

    {{ schema_name }}_{{ target.schema }}

{% endmacro %}

View File

@@ -1,8 +0,0 @@
version: 2

models:
  - name: model_a
    tests:
      - type_one
      - type_two

View File

@@ -1,4 +0,0 @@
version: 2
macros:
  - name: do_something
    description: "This is a test macro"

View File

@@ -1,21 +0,0 @@
{%
    set metric_list = [
        metric('number_of_people'),
        metric('collective_tenure')
    ]
%}

{% if not execute %}

    {% set metric_names = [] %}

    {% for m in metric_list %}
        {% do metric_names.append(m.metric_name) %}
    {% endfor %}

    -- this config does nothing, but it lets us check these values
    {{ config(metric_names = metric_names) }}

{% endif %}

select 1 as fun

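In the model above, {% if not execute %} restricts the loop to the parse phase, and the config() call does nothing at runtime beyond stashing the collected names on the node so a test can assert on them. With the two metrics defined in this project, the block renders roughly as if it were written:

    -- equivalent parse-time result (values shown for illustration)
    {{ config(metric_names = ['number_of_people', 'collective_tenure']) }}
    select 1 as fun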
View File

@@ -1 +0,0 @@
select 1 as fun

View File

@@ -1 +0,0 @@
select 1 as notfun

View File

@@ -1 +0,0 @@
select 'blue' as fun

View File

@@ -1 +0,0 @@
select * from {{ ref('model_three') }}

View File

@@ -1 +0,0 @@
select fun from {{ ref('model_one') }}

View File

@@ -1 +0,0 @@
select 1 as fun

View File

@@ -1,12 +0,0 @@
{{ config(materialized='table') }}

with source_data as (

    select 1 as id
    union all
    select null as id

)

select *
from source_data

View File

@@ -1,12 +0,0 @@
{{ config(materialized='table', enabled=False) }}

with source_data as (

    select 1 as id
    union all
    select null as id

)

select *
from source_data

View File

@@ -1,13 +0,0 @@
-- Disabled model
{{ config(materialized='table', enabled=False) }}

with source_data as (

    select 1 as id
    union all
    select null as id

)

select *
from source_data

View File

@@ -1,14 +0,0 @@
{{ config(materialized='table') }}

with source_data as (

    {#- This is model three #}

    select 1 as id
    union all
    select null as id

)

select *
from source_data

View File

@@ -1 +0,0 @@
select 1 as notfun

View File

@@ -1,5 +0,0 @@
version: 2

models:
  - name: model_one
    description: "The first model"

View File

@@ -1,11 +0,0 @@
version: 2

models:
  - name: model_one
    description: "The first model"
  - name: model_three
    description: "The third model"
    columns:
      - name: id
        tests:
          - unique

View File

@@ -1,11 +0,0 @@
version: 2

models:
  - name: model_one
    description: "The first model"
  - name: model_three
    description: "The third model"
    columns:
      - name: id
        tests:
          - not_null

View File

@@ -1,12 +0,0 @@
version: 2

models:
  - name: model_one
    description: "The first model"
  - name: model_three
    description: "The third model"
    tests:
      - unique
macros:
  - name: do_something
    description: "This is a test macro"

View File

@@ -1,13 +0,0 @@
version: 2

models:
  - name: model_one
    description: "The first model"
  - name: model_three
    description: "The third model"
    config:
      enabled: false
    columns:
      - name: id
        tests:
          - unique

View File

@@ -1,13 +0,0 @@
version: 2
models:
- name: model_one
description: "The first model"
- name: model_three
description: "The third model"
config:
enabled: true
columns:
- name: id
tests:
- unique

View File

@@ -1 +0,0 @@
select * from customers

View File

@@ -1,7 +0,0 @@
{% macro do_something(foo2, bar2) %}

    select
        '{{ foo2 }}' as foo2,
        '{{ bar2 }}' as bar2

{% endmacro %}

View File

@@ -1,7 +0,0 @@
{% macro do_something(foo2, bar2) %}

    select
        'foo' as foo2,
        'var' as bar2

{% endmacro %}

View File

@@ -1,23 +0,0 @@
version: 2

metrics:

  - name: new_customers
    label: New Customers
    model: customers
    description: "The number of paid customers who are using the product"
    calculation_method: count
    expression: user_id
    timestamp: signup_date
    time_grains: [day, week, month]
    dimensions:
      - plan
      - country
    filters:
      - field: is_paying
        value: True
        operator: '='
    +meta:
      is_okr: True
    tags:
      - okrs

View File

@@ -1,2 +0,0 @@
select
* from {{ ref('customers') }} where first_name = '{{ macro_something() }}'

View File

@@ -1 +0,0 @@
select 1 as id, 101 as user_id, 'pending' as status

Some files were not shown because too many files have changed in this diff.