Terraform Databricks "cannot configure default credentials" – Databricks

by
Ali Hasan
azure-databricks azure-pipelines terraform-provider-azure terraform-provider-databricks

Quick Fix: Configure the provider authentication correctly by providing the host and other required attributes for service principal authentication. The Databricks Terraform provider uses the same environment variables as the azurerm provider.

The Solutions:

Solution 1: Use Service Principal Authentication

If you created the Databricks workspace with a service principal, you can continue using it to access or create Databricks resources and data sources. You don’t need to specify databricks_connection_profile. Instead, configure provider authentication correctly by providing the host and the other attributes required for service principal authentication. The Databricks Terraform provider reads the same environment variables as the azurerm provider.
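As a minimal sketch, assuming the workspace is created in the same configuration (the resource name `this` and all name/location values are illustrative), the provider setup might look like this:

```hcl
provider "azurerm" {
  features {}
}

# Illustrative workspace resource; names and location are assumptions.
resource "azurerm_databricks_workspace" "this" {
  name                = "example-workspace"
  resource_group_name = "example-rg"
  location            = "westeurope"
  sku                 = "premium"
}

# The Databricks provider only needs the workspace host here; the service
# principal credentials are picked up from the same environment variables
# the azurerm provider reads:
#   ARM_CLIENT_ID, ARM_CLIENT_SECRET, ARM_TENANT_ID, ARM_SUBSCRIPTION_ID
provider "databricks" {
  alias = "workspace"
  host  = azurerm_databricks_workspace.this.workspace_url
}
```

With this in place, no databricks_connection_profile (or personal access token) is needed in the pipeline.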

Solution 2: Explicitly pass the workspace provider

If you’re using modules and have multiple Databricks providers defined, you must explicitly pass the workspace provider. To do this:

  1. Add the following to your module:
...
providers = {
    databricks.workspace = databricks.workspace
}
...
  2. Then, in the module, use the latest_spark_version as follows:
// Inside module.foo
data "databricks_spark_version" "latest_spark_version" {
  provider          = databricks.workspace
  long_term_support = true
}

Q&A

We are running Terraform through an Azure pipeline to create a Databricks workspace and related resources. However, when the apply stage reaches the step that fetches the latest Spark version, the process throws an error.

If you’re creating the Databricks workspace using the service principal, you can continue to use it to access/create Databricks resources and data sources. You don’t need to specify databricks_connection_profile; you just need to configure provider authentication correctly by providing the host and the other attributes necessary for service principal authentication. The Databricks Terraform provider uses the same environment variables as the azurerm provider.

If you are using modules and have multiple Databricks providers defined, you need to explicitly pass the workspace provider to the module.

In our case, we pass the provider to the module where we define the data.latest_lts_version data source this way:
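A sketch of such a module call, assuming an aliased `databricks.workspace` provider is configured in the root module (the module name `foo` and its source path are illustrative):

```hcl
# Root module: forward the aliased workspace provider into the module.
module "foo" {
  source = "./modules/foo"

  providers = {
    databricks.workspace = databricks.workspace
  }
}
```

For the module to accept the aliased provider, it must declare the alias via configuration_aliases in its own required_providers block:

```hcl
# Inside modules/foo: declare that this module expects an aliased
# databricks.workspace provider from its caller.
terraform {
  required_providers {
    databricks = {
      source                = "databricks/databricks"
      configuration_aliases = [databricks.workspace]
    }
  }
}
```

Resources and data sources in the module can then set `provider = databricks.workspace`, as shown in the latest_spark_version example above.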

Video Explanation:

The following video, titled "Getting started with Databricks Terraform Modules - YouTube", provides additional insights and in-depth exploration related to the topics discussed in this post.
