Read Databricks tables from Delta clients

Use the Unity REST API to read Unity Catalog-registered tables on Databricks from supported Delta clients, including Apache Spark and DuckDB.

Read using the Unity REST API

The Unity REST API provides external clients read access to Delta tables registered to Unity Catalog. Some clients also support creating tables and writing to existing tables.

Configure access using the endpoint /api/2.1/unity-catalog.
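For a quick connectivity check before configuring a client, you can call the endpoint directly. The following is a minimal sketch using Python's requests library; the catalog and schema names are placeholders, not values from this guide.

# Sketch: list tables in a Unity Catalog schema via the Unity REST API.
# <workspace-url>, <token>, and the catalog/schema names are placeholders.
import requests

resp = requests.get(
    "<workspace-url>/api/2.1/unity-catalog/tables",
    headers={"Authorization": "Bearer <token>"},
    params={"catalog_name": "<uc-catalog-name>", "schema_name": "<uc-schema-name>"},
)
resp.raise_for_status()

for table in resp.json().get("tables", []):
    print(table["full_name"], table.get("table_type"))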

Requirements

Unity REST API access to tables is provided as part of Unity Catalog, so your workspace must be enabled for Unity Catalog to use these endpoints. The following table types are eligible for Unity REST API reads:

  • Unity Catalog managed tables.

  • Unity Catalog external tables stored with Delta Lake.

You must complete the following configuration steps to configure access to read Databricks tables from Delta clients using the Unity REST API:

  • Enable external data access on the metastore that contains the tables.

  • Grant the principal configuring the integration the EXTERNAL USE SCHEMA privilege on the schema containing the tables (see the sketch after this list).

  • Authenticate using a Databricks personal access token (PAT) generated for that principal.
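The grant step runs as a SQL statement inside Databricks, for example in the SQL editor or a notebook. As a minimal sketch, the snippet below issues it remotely through the databricks-sql-connector Python package; the warehouse HTTP path, catalog, schema, and principal name are placeholder assumptions.

# Sketch: issue the EXTERNAL USE SCHEMA grant through a SQL warehouse using
# the databricks-sql-connector package (pip install databricks-sql-connector).
# All <...> values and the principal name are placeholders.
from databricks import sql

conn = sql.connect(
    server_hostname="<workspace-hostname>",
    http_path="<sql-warehouse-http-path>",
    access_token="<token>",
)
cur = conn.cursor()
cur.execute(
    "GRANT EXTERNAL USE SCHEMA ON SCHEMA <uc-catalog-name>.<uc-schema-name> "
    "TO `integration-principal@example.com`"
)
cur.close()
conn.close()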

Read Delta tables with Apache Spark

The following example settings configure Apache Spark to read Unity Catalog managed and external Delta tables (a runnable sketch follows the variable list):

"spark.sql.extensions": "io.delta.sql.DeltaSparkSessionExtension",
"spark.sql.catalog.spark_catalog": "io.unitycatalog.spark.UCSingleCatalog",
"spark.sql.catalog.<uc-catalog-name>": "io.unitycatalog.spark.UCSingleCatalog",
"spark.sql.catalog.<uc-catalog-name>.uri": "<workspace-url>/api/2.1/unity-catalog",
"spark.sql.catalog.<uc-catalog-name>.token":"<token>",
"spark.sql.defaultCatalog":"<uc-catalog-name>"

Substitute the following variables:

  • <uc-catalog-name>: The name of the catalog in Unity Catalog that contains your tables.

  • <workspace-url>: The URL of the Databricks workspace.

  • <token>: Personal access token (PAT) for the principal configuring the integration.
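As an illustrative sketch, the snippet below supplies the same settings when building a PySpark session and then reads a table. The package versions for io.delta:delta-spark and io.unitycatalog:unitycatalog-spark, as well as the schema and table names, are assumptions to adjust for your environment.

# Sketch: build a PySpark session with the settings above and read a Delta
# table through the Unity REST API. All <...> values are placeholders, and
# the package versions are assumptions; match them to your Spark/Scala build.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("uc-delta-read")
    .config(
        "spark.jars.packages",
        "io.delta:delta-spark_2.12:3.2.1,"
        "io.unitycatalog:unitycatalog-spark_2.12:0.2.0",
    )
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog", "io.unitycatalog.spark.UCSingleCatalog")
    .config("spark.sql.catalog.<uc-catalog-name>", "io.unitycatalog.spark.UCSingleCatalog")
    .config("spark.sql.catalog.<uc-catalog-name>.uri", "<workspace-url>/api/2.1/unity-catalog")
    .config("spark.sql.catalog.<uc-catalog-name>.token", "<token>")
    .config("spark.sql.defaultCatalog", "<uc-catalog-name>")
    .getOrCreate()
)

# With the default catalog set, tables resolve as <schema>.<table>.
df = spark.table("<uc-schema-name>.<table-name>")
df.show()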

Important

Specific configurations vary depending on the type of cloud object storage backing the catalog. See the OSS Unity Catalog docs for additional configurations.