June 2022

These features and Databricks platform improvements were released in June 2022.

Note

Releases are staged. Your Databricks account may not be updated until a week or more after the initial release date.

Databricks Runtime 6.4 Extended Support reaches end of support

June 30, 2022

Support for Databricks Runtime 6.4 Extended Support ended on June 30. See Databricks support lifecycles.

Databricks Runtime 10.2 series support ends

June 22, 2022

Support for Databricks Runtime 10.2 and Databricks Runtime 10.2 for Machine Learning ended on June 22. See Databricks support lifecycles.

Databricks ODBC driver 2.6.24

June 22, 2022

We have released version 2.6.24 of the Databricks ODBC driver (download). This release adds support for configuring query translation to CTAS syntax, allows users to override SQL_ATTR_QUERY_TIMEOUT in the connector, and updates the OpenSSL library.

This release also resolves the following issues:

  • The connector does not allow the use of server and intermediate certificates that do not have a CRL Distribution Points (CDP) entry.

  • When using a proxy, the connector sets the incorrect host name for SSL Server Name Indication (SNI).

Databricks Terraform provider is now GA

June 22, 2022

The Databricks Terraform provider is now generally available.

The provider enables you to fully automate the deployment of your data platforms by using Terraform’s existing infrastructure-as-code (IaC) processes.

You can use the Databricks Terraform provider to define assets in Databricks workspaces, such as clusters and jobs, and to enforce access control through permissions for users, groups, and other identities.

The Databricks Terraform provider gives you a complete audit trail of deployments, and you can use it as a backbone for your disaster recovery and business continuity strategies.
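As a sketch of what defining assets and permissions looks like, the configuration fragment below declares a cluster and grants an existing group permission to attach to it. The resource names, group name, and field values are illustrative, not a recommended configuration:

```hcl
terraform {
  required_providers {
    databricks = {
      source = "databricks/databricks"
    }
  }
}

# Hypothetical cluster definition; versions and sizes are illustrative.
resource "databricks_cluster" "shared" {
  cluster_name            = "shared-autoscaling"
  spark_version           = "10.4.x-scala2.12"
  node_type_id            = "n2-highmem-4"
  autotermination_minutes = 30

  autoscale {
    min_workers = 1
    max_workers = 4
  }
}

# Enforce access control: let a (hypothetical) group attach to the cluster.
resource "databricks_permissions" "cluster_usage" {
  cluster_id = databricks_cluster.shared.id

  access_control {
    group_name       = "data-engineers"
    permission_level = "CAN_ATTACH_TO"
  }
}
```

Because the configuration lives in version control, every change to these resources is reviewable and repeatable, which is what makes the audit-trail and disaster-recovery use cases possible.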

HIPAA compliance controls (Public Preview)

June 21, 2022

HIPAA compliance controls provide enhancements that help you meet HIPAA compliance requirements for your workspace. Using this feature requires signing additional agreements.

Databricks Runtime 11.0 and 11.0 ML are GA; 11.0 Photon is Public Preview

June 16, 2022

Databricks Runtime 11.0 and Databricks Runtime 11.0 ML are now generally available. Databricks Runtime 11.0 Photon is in Public Preview.

See Databricks Runtime 11.0 (unsupported) and Databricks Runtime 11.0 for Machine Learning (unsupported).

Change to Repos default working directory in Databricks Runtime 11.0

June 16, 2022

The Python working directory for notebooks in a Repo defaults to the directory containing the notebooks. For example, instead of /databricks/driver, the default working directory is /Workspace/Repos/<user>/<repo>/<path-to-notebook>. This allows importing and reading from Files in Repos to work by default on Databricks Runtime 11.0 clusters.

This also means that writing to the current working directory fails with a Read-only filesystem error message. If you want to continue writing to the local file system for a cluster, write to /tmp/<filename> or /databricks/driver/<filename>.
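A minimal sketch of the new behavior (the output filename is hypothetical): relative writes into the Repo directory fail because the filesystem is read-only, so write to a local scratch path such as /tmp instead.

```python
import os

# On a Databricks Runtime 11.0 cluster, a notebook in a Repo runs with its
# working directory inside /Workspace/Repos/..., which is read-only.
# Writing a relative path there raises an OSError (read-only filesystem),
# so target a writable local path such as /tmp instead.
output_path = os.path.join("/tmp", "results.csv")  # hypothetical filename

with open(output_path, "w") as f:
    f.write("id,value\n1,42\n")

# Reading the file back works as usual.
with open(output_path) as f:
    print(f.read().splitlines()[0])  # prints the header row: id,value
```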

Databricks Runtime 10.1 series support ends

June 14, 2022

Support for Databricks Runtime 10.1 and Databricks Runtime 10.1 for Machine Learning ended on June 14. See Databricks support lifecycles.

Delta Live Tables now supports SCD type 2

June 13-21, 2022: Version 3.74

Your Delta Live Tables pipelines can now use SCD type 2 to capture source data changes and retain the full history of updates to records. This enhances the existing Delta Live Tables support for SCD type 1. See APPLY CHANGES API: Simplify change data capture in Delta Live Tables.
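SCD type 2 retains every version of a record and marks each version with the interval during which it was current. APPLY CHANGES does this bookkeeping for you inside a pipeline; the pure-Python sketch below only illustrates the semantics. The __START_AT/__END_AT column names follow the Delta Live Tables convention, but the function itself is an illustration, not the DLT implementation:

```python
def apply_scd2(history, change, key="id", seq="ts"):
    """Append an SCD type 2 version row for one upsert event.

    history: list of version dicts carrying __START_AT / __END_AT
    change:  incoming record, e.g. {"id": 1, "city": "Oakland", "ts": 1}
    """
    # Close the currently open version of this record, if any.
    for row in history:
        if row[key] == change[key] and row["__END_AT"] is None:
            row["__END_AT"] = change[seq]
    # Open a new version that is current from this change onward.
    new_row = dict(change)
    new_row["__START_AT"] = change[seq]
    new_row["__END_AT"] = None
    history.append(new_row)
    return history

history = []
apply_scd2(history, {"id": 1, "city": "Oakland", "ts": 1})
apply_scd2(history, {"id": 1, "city": "Seattle", "ts": 2})

# The old version is closed at ts=2; the new version stays open,
# so the full history of the record is retained.
print(history[0]["__END_AT"], history[1]["__END_AT"])  # 2 None
```

With SCD type 1, by contrast, the first row would simply be overwritten and the Oakland version lost.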

Create Delta Live Tables pipelines directly in the Databricks UI

June 13-21, 2022: Version 3.74

You can now create a Delta Live Tables pipeline from the Create menu on the sidebar of the Databricks UI.

Select the Delta Live Tables channel when you create or edit a pipeline

June 13-21, 2022: Version 3.74

You can now configure the channel for your Delta Live Tables pipeline with the Create pipeline and Edit pipeline settings dialogs. Previously, configuring the channel required editing the settings in the pipeline’s JSON configuration.
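For reference, the JSON route sets the channel as a top-level field in the pipeline settings. The fragment below is illustrative, with other settings omitted and the name and notebook path hypothetical:

```json
{
  "name": "my-pipeline",
  "channel": "PREVIEW",
  "libraries": [
    { "notebook": { "path": "/Repos/user/repo/pipeline-notebook" } }
  ]
}
```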

Communicate between tasks in your Databricks jobs with task values

June 13, 2022

You can now communicate values between tasks in your Databricks jobs with task values. For example, you can use task values to pass the output of a machine learning model to downstream tasks in the same job run. See taskValues subutility (dbutils.jobs.taskValues).
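In a job, an upstream task publishes a value with dbutils.jobs.taskValues.set and a downstream task reads it with dbutils.jobs.taskValues.get. Because dbutils exists only on Databricks, the stand-in class below is purely illustrative of the set/get pattern; the task keys, value, and stub itself are hypothetical, and the real calls appear in the comments:

```python
class TaskValuesStub:
    """Illustrative stand-in for dbutils.jobs.taskValues (not the real
    implementation): values are set under a task key and read downstream."""

    def __init__(self):
        self._values = {}

    def set(self, task_key, key, value):
        # Real API inside the upstream task:
        #   dbutils.jobs.taskValues.set(key="model_uri", value=...)
        # (the task key is implicit there; it is explicit in this stub).
        self._values[(task_key, key)] = value

    def get(self, task_key, key, default=None):
        # Real API inside the downstream task:
        #   dbutils.jobs.taskValues.get(taskKey="train", key="model_uri",
        #                               default=None, debugValue=None)
        return self._values.get((task_key, key), default)

task_values = TaskValuesStub()

# An upstream "train" task publishes the model it produced...
task_values.set("train", "model_uri", "runs:/abc123/model")  # hypothetical URI

# ...and a downstream "deploy" task in the same job run reads it.
print(task_values.get("train", "model_uri"))  # runs:/abc123/model
```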

Enable account switching in the Databricks UI

June 8, 2022

If users belong to more than one account, they can now switch between accounts in the Databricks UI. To use the account switcher, click your email address at the top of the Databricks UI, hover over Switch account, and then select the account you want to navigate to.

Default instance type is now n2-highmem-4

June 7, 2022

The default instance type for new Google Cloud clusters changed from n1-highmem-4 to n2-highmem-4. This affects clusters created using the UI, and clusters created from API requests that omit the node type. Existing clusters are not affected.
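For example, a Clusters API create request like the illustrative fragment below omits node_type_id, so the cluster now receives the n2-highmem-4 default (the name, version, and worker count are hypothetical):

```json
{
  "cluster_name": "example-cluster",
  "spark_version": "10.4.x-scala2.12",
  "num_workers": 2
}
```

To keep the previous behavior, set node_type_id to n1-highmem-4 explicitly in the request or in the UI.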