
[Bug]: File format is not set correctly when using non-standard target name for databricks connection #519

@diederikperdok

Description


Overview

The file format is set based on the target name instead of (as it should be, IMO) the target type. If you have a connection to Databricks but the target name is not set to "databricks", creation of the dbt_artifacts models fails.
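A minimal sketch of the problematic pattern (macro and format names here are hypothetical, not the package's actual code): keying the file format off `target.name` only works when the target happens to be named "databricks", whereas `target.type` identifies the adapter regardless of what the target is called.

```sql
{# Hypothetical illustration of the bug: format chosen by target NAME #}
{% macro get_file_format_buggy() %}
    {{ return('delta' if target.name == 'databricks' else none) }}
{% endmacro %}

{# Suggested direction for a fix: format chosen by target TYPE, #}
{# so a target named e.g. "dev" with type "databricks" still gets delta #}
{% macro get_file_format_fixed() %}
    {{ return('delta' if target.type == 'databricks' else none) }}
{% endmacro %}
```

With the buggy variant, a Databricks target named anything other than "databricks" falls through and no valid file format is provided, which matches the error shown below.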

How to reproduce

Create a target in your `profiles.yml` not named "databricks" with type "databricks". Then run `dbt run --select dbt_artifacts`.
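For example, a `profiles.yml` along these lines reproduces the issue (profile and catalog names are placeholders):

```yaml
my_project:
  target: dev          # target NAME is "dev", not "databricks"
  outputs:
    dev:
      type: databricks # target TYPE is still databricks
      catalog: my_catalog
      schema: analytics
      host: "{{ env_var('DATABRICKS_HOST') }}"
      http_path: "{{ env_var('DATABRICKS_HTTP_PATH') }}"
      token: "{{ env_var('DATABRICKS_TOKEN') }}"
```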

Expected behaviour

dbt_artifacts models are created without errors.

Screenshots

Creation of the dbt_artifacts models fails with:

  Invalid file format provided: 
      Expected one of: text, csv, json, jdbc, parquet, orc, hive, delta, libsvm, hudi

Environment

Results of running dbt --version:

Core:
  - installed: 1.8.7 
  - latest:    1.10.4 - Update available!

  Your version of dbt-core is out of date!
  You can find instructions for upgrading here:
  https://docs.getdbt.com/docs/installation

Plugins:
  - spark:      1.8.0 - Update available!
  - databricks: 1.8.7 - Update available!

  At least one plugin is out of date or incompatible with dbt-core.
  You can find instructions for upgrading here:
  https://docs.getdbt.com/docs/installation

Contents of `packages.yml`:

packages:
  - package: dbt-labs/dbt_utils
    version: 0.8.2
  - package: dbt-labs/dbt_external_tables
    version: 0.8.0
  - package: brooklyn-data/dbt_artifacts
    version: 2.9.3
