October 2023

These features and Databricks platform improvements were released in October 2023.

Note

Releases are staged. Your Databricks workspace might not be updated until a week or more after the initial release date.

Configure the availability zone of instance pools

October 31, 2023

Instance pools now support availability zone configuration through the Instance Pools API. See Configure the availability zone.
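The new setting can be sketched as a pool-creation payload. The attribute block below (`gcp_attributes` with `zone_id`) and all names are assumptions for illustration; verify the exact field names against the Instance Pools API reference for your cloud.

```python
# Hypothetical Instance Pools API payload pinning the pool to one availability
# zone. Field names in gcp_attributes are an assumption -- check the API docs.
pool_spec = {
    "instance_pool_name": "etl-pool",        # illustrative name
    "node_type_id": "n2-highmem-4",          # illustrative node type
    "min_idle_instances": 1,
    "gcp_attributes": {"zone_id": "us-central1-a"},  # the new zone setting
}

# In a workspace you would POST this payload to
# /api/2.0/instance-pools/create (for example via the Databricks SDK).
```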

View the YAML source for a Databricks job

October 30, 2023

You can now view and copy the YAML source for a job by clicking Jobs Vertical Ellipsis on the job details page and selecting View YAML/JSON. You can use the YAML source to create CI/CD workflows with Databricks Asset Bundles. See What are Databricks Asset Bundles?.

Add conditional logic to your Databricks workflows

October 30, 2023

You can now use the If/else condition task to conditionally run tasks in a Databricks job based on the results of a boolean expression. See Add branching logic to your job with the If/else condition task.
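As a sketch, an If/else condition task in a Jobs API 2.1 task list looks roughly like the following; the comparison, task keys, and notebook path are illustrative, not a copy-paste payload.

```python
# Sketch of a Jobs API task list using an If/else condition task. A downstream
# task runs only when the condition's outcome matches its depends_on entry.
tasks = [
    {
        "task_key": "check_row_count",
        "condition_task": {
            "op": "GREATER_THAN",                        # boolean comparison
            "left": "{{tasks.ingest.values.row_count}}",  # value reference
            "right": "0",
        },
    },
    {
        "task_key": "process",
        # Runs only if the condition evaluated to true.
        "depends_on": [{"task_key": "check_row_count", "outcome": "true"}],
        "notebook_task": {"notebook_path": "/Workspace/etl/process"},
    },
]
```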

Configure parameters on a Databricks job that can be referenced by all job tasks

October 30, 2023

You can now add parameters to your Databricks jobs that are automatically passed to all job tasks that accept key-value pairs. See Add parameters for all job tasks. Additionally, you can now use an expanded set of value references to pass context and state between job tasks. See Pass context about job runs into job tasks.
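The two features above can be sketched together: job-level parameters in the job spec, and task values passed between tasks at run time. All names and paths here are illustrative.

```python
# Sketch: job-level parameters are defined once and passed to every task that
# accepts key-value pairs.
job_spec = {
    "name": "nightly-etl",
    "parameters": [{"name": "env", "default": "dev"}],  # visible to all tasks
    "tasks": [
        {
            "task_key": "ingest",
            "notebook_task": {"notebook_path": "/Workspace/etl/ingest"},
        }
    ],
}

# Inside a notebook task, dbutils (available in Databricks notebooks) passes
# state to downstream tasks:
#   dbutils.jobs.taskValues.set(key="row_count", value=1024)
# and a downstream task reads it back:
#   dbutils.jobs.taskValues.get(taskKey="ingest", key="row_count", default=0)
# A task can also reference the job parameter as {{job.parameters.env}}.
```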

Init scripts are now supported on Unity Catalog volumes

October 30, 2023

You can now use init scripts stored on Unity Catalog volumes on clusters running Databricks Runtime 13.3 LTS and above. See Where can init scripts be installed?.
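In a cluster spec, a volume-backed init script is referenced by its `/Volumes/...` path. The fragment below is a sketch; the catalog, schema, volume, and script names are illustrative.

```python
# Sketch of the cluster-spec fragment for an init script stored on a Unity
# Catalog volume. The destination path is illustrative.
cluster_spec = {
    "spark_version": "13.3.x-scala2.12",  # Databricks Runtime 13.3 LTS
    "init_scripts": [
        {"volumes": {"destination": "/Volumes/main/ops/scripts/install_deps.sh"}}
    ],
}
```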

Unity Catalog support for UNDROP TABLE is GA

October 25, 2023

You can undrop a dropped managed or external table in an existing schema within seven days of dropping it. This feature requires Databricks Runtime 12.1 or above. See UNDROP TABLE and SHOW TABLES DROPPED.
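The two statements look like this; the catalog, schema, and table names are illustrative, and in a notebook or the SQL editor you would run the SQL directly.

```python
# Sketch: list recently dropped tables, then recover one.
show_dropped = "SHOW TABLES DROPPED IN my_catalog.my_schema"
undrop = "UNDROP TABLE my_catalog.my_schema.sales"

# In a notebook: spark.sql(show_dropped).show()
#                spark.sql(undrop)
```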

Partner Connect supports Dataiku

October 25, 2023

You can now use Partner Connect to connect your Databricks workspace to Dataiku. See Connect to Dataiku.

Databricks AutoML generated notebooks are now saved as MLflow artifacts

October 24, 2023

Notebooks generated by Databricks AutoML are now saved as MLflow artifacts in all Databricks Runtime for Machine Learning versions.

Unity Catalog support for Python and PySpark UDFs

October 23, 2023

In Databricks Runtime 14.1 and above, you can now use PySpark UDFs on shared clusters and Python UDFs on all Unity Catalog-enabled compute. See User-defined functions (UDFs) in Unity Catalog.
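A Unity Catalog Python UDF is defined in SQL DDL; the function below is a minimal sketch with illustrative catalog, schema, and function names, run via `spark.sql` in a notebook.

```python
# Sketch of a Unity Catalog Python UDF. The DDL is SQL with a Python body.
create_udf = """
CREATE OR REPLACE FUNCTION main.default.redact(s STRING)
RETURNS STRING
LANGUAGE PYTHON
AS $$
  return s[:2] + "***" if s else s
$$
"""

# In a notebook: spark.sql(create_udf)
# then, for example: SELECT main.default.redact(email) FROM customers
```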

Feature Engineering in Unity Catalog is GA

October 19, 2023

With Feature Engineering in Unity Catalog, Unity Catalog becomes your feature store. You can use any Delta table with a primary key as a feature table for model training or inference. Unity Catalog provides feature discovery and governance.
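Since any Delta table with a primary key can serve as a feature table, creating one is plain DDL. The names below are illustrative; run the statement via `spark.sql` in a Unity Catalog-enabled workspace.

```python
# Sketch: a Delta table with a primary key constraint, usable as a feature
# table for training or inference under Feature Engineering in Unity Catalog.
ddl = """
CREATE TABLE IF NOT EXISTS main.features.customer_features (
  customer_id BIGINT NOT NULL,
  lifetime_value DOUBLE,
  CONSTRAINT customer_pk PRIMARY KEY (customer_id)
)
"""
```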

AI-generated table comments (Public Preview)

October 18, 2023

As part of Databricks' initiative to use AI to assist you as you work, AI-generated table and column comments are now in Public Preview. In Catalog Explorer, you can view, edit, and add an AI-generated comment for any table or table column managed by Unity Catalog. Comments are powered by a large language model (LLM) that takes into account table metadata, such as the table schema and column names. In workspaces enabled for HIPAA compliance, AI-generated comments may use external model partners to provide responses. Data sent to these services is not used for model training. For all other workspaces on GCP, AI-generated comments use an internal model.

See Add AI-generated comments to a table.

Models in Unity Catalog is GA

October 17, 2023

ML Models in Unity Catalog are now generally available. Unity Catalog provides centralized access control, auditing, lineage, model sharing across workspaces, and better MLOps deployment workflows. Databricks recommends using Models in Unity Catalog instead of the Workspace Model Registry. See Manage model lifecycle in Unity Catalog for details.

Partner Connect supports Monte Carlo

October 16, 2023

You can now use Partner Connect to connect your Databricks workspace to Monte Carlo. For more information, see Connect Databricks to Monte Carlo.

Semantic search (Public Preview)

October 16, 2023

You can now use natural language to search Unity Catalog tables in the advanced Search dialog. See Semantic search.

Enable Databricks Assistant at the workspace level

October 11, 2023

A workspace admin can now enable or disable Databricks Assistant for an individual workspace if the account admin has allowed it. For details, see How do I enable Databricks Assistant?.

IP access lists for the account console are GA

October 11, 2023

IP access lists for the account console are now generally available. This feature allows you to control access to the account console by IP address range. See Configure IP access lists for the account console.

New Photon defaults

October 11, 2023

When you create a new cluster through the UI, Photon is now enabled by default as the Databricks Runtime engine. This applies to both all-purpose and job clusters.

New clusters created with a Photon-compatible cluster policy also have Photon enabled by default. A cluster policy is Photon-compatible if its Databricks Runtime version supports Photon, its node type is supported, and runtime_engine is not explicitly set to STANDARD.
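To opt a cluster out of the new default, the runtime engine can be set explicitly in the cluster spec. The sketch below uses illustrative field values; `runtime_engine` accepts PHOTON or STANDARD per the Clusters API.

```python
# Sketch: explicitly opting out of the Photon default in a cluster spec.
cluster_spec = {
    "spark_version": "14.1.x-scala2.12",  # illustrative runtime version
    "node_type_id": "n2-highmem-4",       # illustrative node type
    "runtime_engine": "STANDARD",         # overrides the new Photon default
}
```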

Databricks Runtime 14.1 is GA

October 11, 2023

Databricks Runtime 14.1 and Databricks Runtime 14.1 ML are now generally available.

See Databricks Runtime 14.1 and Databricks Runtime 14.1 for Machine Learning.

Developer tools release notes have moved

October 10, 2023

Release notes for Databricks developer tools after October 10, 2023 are now posted in the Databricks developer tools and SDKs release notes instead of the Databricks platform release notes.

Databricks extension for Visual Studio Code updated to version 1.1.5

October 9, 2023

The Databricks extension for Visual Studio Code version 1.1.5 contains a few minor fixes. For details, see the changelog for version 1.1.5.

Predictive I/O for updates is GA

October 9, 2023

Predictive I/O for updates is now generally available on Databricks Runtime 14.0 and above. See What is predictive I/O?.

Deletion vectors are GA

October 9, 2023

Deletion vectors are now generally available on Databricks Runtime 14.0 and above. See What are deletion vectors?.
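Deletion vectors are enabled on an existing Delta table through a table property; the statement below is a sketch with an illustrative table name, run via `spark.sql` in a notebook.

```python
# Sketch: enabling deletion vectors on an existing Delta table.
enable_dv = (
    "ALTER TABLE main.default.events "
    "SET TBLPROPERTIES ('delta.enableDeletionVectors' = true)"
)

# In a notebook: spark.sql(enable_dv)
```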

Customer-managed keys are generally available

October 6, 2023

Some services and data support adding a customer-managed key to help protect and control access to encrypted data. Databricks has two customer-managed key features that involve different types of data and locations. Both features are now generally available. See Customer-managed keys for encryption.

Personal Compute cluster policy is available by default

October 6, 2023

The Personal Compute policy has started rolling out to all accounts. The policy allows users to easily create single-machine compute resources for their individual use so they can start running workloads immediately, minimizing compute management overhead.

Account admins have a 28-day grace period (ending on November 6, 2023) to disable the account-wide enablement of the policy. If the setting remains enabled, all account users will have access to the policy by default. If you disable the setting, workspace admins can still manually assign permissions to the Personal Compute policy. See Manage the Personal Compute policy.

Partner Connect supports RudderStack

October 5, 2023

You can now use Partner Connect to connect your Databricks workspace to RudderStack. For more information, see Connect to RudderStack.

Databricks CLI updated to version 0.207.0 (Public Preview)

October 4, 2023

The Databricks command-line interface (Databricks CLI) has been updated to version 0.207.0. This release contains feature updates and fixes for Databricks Asset Bundles, makes additions and changes to several command groups and commands, and more. For details, see the changelog for version 0.207.0.

Capture data lineage using Unity Catalog

October 4, 2023

Unity Catalog support for capturing and viewing lineage data is now generally available in Databricks on Google Cloud. Databricks began capturing lineage data in existing workspaces on July 24. See Capture and view data lineage using Unity Catalog.

Run selected cells in a notebook

October 4, 2023

You can now run only selected cells in a notebook. See Run selected cells.

Use workspace-catalog binding to give read-only access to a catalog

October 4, 2023

When you use workspace-catalog binding to limit catalog access to specific workspaces in your account, you can now make that access read-only. Read-only workspace-catalog binding is helpful for scenarios like giving users read-only access to production data from a developer workspace to enable development and testing.

This update also deprecates the /api/2.1/unity-catalog/workspace-bindings/ API endpoint and replaces it with /api/2.1/unity-catalog/bindings/.

See (Optional) Assign a catalog to specific workspaces.
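A read-only binding request against the new endpoint can be sketched as follows. The body shape (an `add` list with a `binding_type` of BINDING_TYPE_READ_ONLY) follows the Unity Catalog bindings API, but treat the exact field names and path as assumptions to verify against the API reference; the catalog name and workspace ID are illustrative.

```python
# Sketch of a read-only workspace-catalog binding via the new endpoint.
endpoint = "/api/2.1/unity-catalog/bindings/catalog/prod_catalog"
body = {
    "add": [
        {"workspace_id": 1234567890, "binding_type": "BINDING_TYPE_READ_ONLY"}
    ]
}

# In practice: PATCH {workspace-url}{endpoint} with this JSON body.
```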

New in-product Help experience (Public Preview)

October 4, 2023

The new in-product Help experience is now in Public Preview. See Get help.

Databricks SDK for Python updated to version 0.10.0 (Beta)

October 3, 2023

Databricks SDK for Python version 0.10.0 introduces 7 breaking changes, adds 10 dataclasses, adds 6 fields, and adds one service. For details, see the changelog for version 0.10.0.

Databricks SDK for Go updated to version 0.22.0 (Beta)

October 3, 2023

Databricks SDK for Go version 0.22.0 introduces one breaking API change and adds one API. For details, see the changelog for version 0.22.0.

Databricks extension for Visual Studio Code updated to version 1.1.4

October 2, 2023

The Databricks extension for Visual Studio Code version 1.1.4 adds support for custom Databricks workspace URLs, and more. For details, see the changelog for version 1.1.4.