Databricks Refresh Fails with Empty Catalog Error

Note

Starting with Tabular Editor 3.26.1, the Table Import Wizard uses Implementation 2.0 by default for Databricks connections. This issue only affects M-queries created with earlier versions of Tabular Editor or M-queries where the third parameter of Databricks.Catalogs() is null.

When refreshing a semantic model that imports data from Databricks, you may encounter the following error:

"[Microsoft][ThriftExtension] (38) An attempt was made to set an empty string as the current catalog. This type of operation is not allowed."

This error occurs when the M-query generated by the Table Import Wizard uses the legacy connector implementation against a Databricks workspace that requires the newer Arrow Database Connectivity (ADBC) driver, also known as Implementation 2.0.


Understanding the Issue

The Databricks.Catalogs() Power Query function accepts an optional third parameter: an options record whose Implementation field selects the connector implementation. When this parameter is null (or the Implementation field is omitted), the connector falls back to the legacy implementation (1.0).

Why this happens

  1. Newer Databricks workspaces require Implementation 2.0. Recent Databricks instances enforce stricter catalog handling that is incompatible with the legacy connector.

  2. The Table Import Wizard in Tabular Editor versions before 3.26.1 generates M-queries with null as the third parameter. This means the legacy implementation is used, which fails on newer Databricks workspaces.

  3. Power BI Desktop already defaults to Implementation 2.0. Microsoft has made the Arrow Database Connectivity driver the default for new Databricks connections in Power BI Desktop.


Resolution

Edit the M-query on each affected partition to include [Implementation="2.0"] as the third parameter of the Databricks.Catalogs() call.

Before (legacy implementation)

let
    Source = Databricks.Catalogs("adb-xxxx.1.azuredatabricks.net", "/sql/1.0/warehouses/xxxx", null),
    Database = Source{[Name="my_catalog",Kind="Database"]}[Data],
    Schema = Database{[Name="my_schema",Kind="Schema"]}[Data],
    Data = Schema{[Name="my_table",Kind="Table"]}[Data]
in
    Data

After (Implementation 2.0)

let
    Source = Databricks.Catalogs("adb-xxxx.1.azuredatabricks.net", "/sql/1.0/warehouses/xxxx", [Implementation="2.0"]),
    Database = Source{[Name="my_catalog",Kind="Database"]}[Data],
    Schema = Database{[Name="my_schema",Kind="Schema"]}[Data],
    Data = Schema{[Name="my_table",Kind="Table"]}[Data]
in
    Data

The only change is replacing null with [Implementation="2.0"] in the third parameter.

Steps

  1. Open your model in Tabular Editor 3.
  2. In the TOM Explorer, expand the affected table and select its partition.
  3. In the Expression Editor, locate the Databricks.Catalogs(...) call.
  4. Replace null (the third parameter) with [Implementation="2.0"].
  5. Repeat for each Databricks partition in your model.
  6. Save the model and retry the refresh.

Tip

If your model has many Databricks partitions, use Edit > Find and Replace (Ctrl+H) to locate every Databricks.Catalogs( call across all partition expressions and update its third parameter. Alternatively, use the C# script below to update all Databricks partitions at once.

Bulk update with C# script

The following script finds all M-partitions that call Databricks.Catalogs with null as the third parameter and replaces it with [Implementation="2.0"]:

// Match a Databricks.Catalogs(...) call whose third parameter is the
// literal null, capturing the text before and after it.
var pattern = new System.Text.RegularExpressions.Regex(
    @"(Databricks\.Catalogs\([^,]+,\s*""[^""]*"",\s*)null(\s*\))");

var updated = 0;
foreach (var partition in Model.AllPartitions.OfType<MPartition>())
{
    if (partition.Expression == null) continue;

    // Swap the null third parameter for an options record with Implementation 2.0.
    var newExpr = pattern.Replace(partition.Expression, "$1[Implementation=\"2.0\"]$2");
    if (newExpr != partition.Expression)
    {
        partition.Expression = newExpr;
        updated++;
    }
}

Info($"Updated {updated} partition(s).");
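If you want to sanity-check the substitution before running the script against a model, the same regular expression works outside Tabular Editor. The following Python sketch (not part of the original article; the sample expression is hypothetical) applies an equivalent pattern to an M-expression and shows that only the third parameter changes:

```python
import re

# Equivalent of the C# regex: capture everything up to the third
# parameter, match the literal null, and capture the closing parenthesis.
pattern = re.compile(r'(Databricks\.Catalogs\([^,]+,\s*"[^"]*",\s*)null(\s*\))')

expr = (
    'let\n'
    '    Source = Databricks.Catalogs("adb-xxxx.1.azuredatabricks.net", '
    '"/sql/1.0/warehouses/xxxx", null),\n'
    '    Data = Source{[Name="my_catalog",Kind="Database"]}[Data]\n'
    'in\n'
    '    Data'
)

# Replace the null third parameter with the options record.
new_expr = pattern.sub(r'\1[Implementation="2.0"]\2', expr)

print('[Implementation="2.0"]' in new_expr)  # True: null was replaced
```

Because the pattern anchors on the full Databricks.Catalogs( call and its first two string arguments, it leaves any other occurrence of the word null in the expression untouched.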

Prevention

  • Tabular Editor 3.26.1 and later: The Table Import Wizard generates M-queries with Implementation 2.0 by default. Update to 3.26.1 or later to avoid this issue for new imports.
  • Existing models: Review Databricks partition expressions after upgrading. Any expressions with null as the third parameter should be updated.

Additional Resources