August 2022
These features and Databricks platform improvements were released in August 2022.
Note
Releases are staged. Your Databricks account may not be updated until a week or more after the initial release date.
Databricks ODBC driver 2.6.26
August 29, 2022
We have released version 2.6.26 of the Databricks ODBC driver (download). This release improves query support: queries on HTTP connections can now be canceled asynchronously on API request.
This release also resolves the following issue:
When using custom queries in Spotfire, the connector becomes unresponsive.
Databricks JDBC driver 2.6.29
August 29, 2022
We have released version 2.6.29 of the Databricks JDBC driver (download). This release resolves the following issues:
When using an HTTP proxy with Cloud Fetch enabled, the connector does not return large data set results.
Minor text issues in the Databricks license text: documentation links were missing.
The JAR file names were incorrect: SparkJDBC41.jar should have been DatabricksJDBC41.jar, and SparkJDBC42.jar should have been DatabricksJDBC42.jar.
Databricks Feature Store client now available on PyPI
August 26, 2022
The Feature Store client is now available on PyPI. The client requires Databricks Runtime 9.1 LTS or above, and can be installed using:
%pip install databricks-feature-store
The client is already packaged with Databricks Runtime for Machine Learning 9.1 LTS and above.
The client cannot be run outside of Databricks; however, you can install it locally to aid in unit testing and for additional IDE support (for example, autocompletion). For more information, see Databricks Feature Store Python client.
Databricks Runtime 11.2 (Beta)
August 23, 2022
Databricks Runtime 11.2, 11.2 Photon, and 11.2 ML are now available as Beta releases.
See the full release notes at Databricks Runtime 11.2 (EoS) and Databricks Runtime 11.2 for Machine Learning (EoS).
Decreased storage costs by reducing default boot disk size
August 23, 2022
To save disk space and alleviate quota pressure, the default boot disk size of Databricks clusters was reduced from 500 GB to 100 GB. This doesn’t affect most node types, which use local SSDs as storage. For node families that don’t have local SSD support, such as the E2 family, you can override the boot disk size with the API.
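As a minimal sketch, an override on Google Cloud might look like the cluster spec below. The `boot_disk_size` field (in GB) under `gcp_attributes` is assumed from the Clusters API on Google Cloud, and the cluster name and node type are illustrative; verify the field names against your workspace’s API reference.

```python
import json

# Sketch of a Clusters API create request body that overrides the
# default boot disk size on Google Cloud. The `boot_disk_size` value is
# in GB; the node type is from the E2 family, which has no local SSDs.
cluster_spec = {
    "cluster_name": "e2-cluster-with-larger-boot-disk",  # illustrative
    "spark_version": "11.1.x-scala2.12",
    "node_type_id": "e2-highmem-4",  # E2 family: no local SSD support
    "num_workers": 2,
    "gcp_attributes": {
        "boot_disk_size": 500  # restore the previous 500 GB default
    },
}

print(json.dumps(cluster_spec, indent=2))
```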
Reduced message volume in the Delta Live Tables UI for continuous pipelines
August 22-29, 2022: Version 3.79
With this release, the state transitions for live tables in a Delta Live Tables continuous pipeline are displayed in the UI only until the tables enter the running state. Any transitions related to successful recomputation of the tables are not displayed in the UI, but are available in the Delta Live Tables event log at the METRICS level. Any transitions to failure states are still displayed in the UI. Previously, all state transitions were displayed in the UI for live tables. This change reduces the volume of pipeline events displayed in the UI and makes it easier to find important messages for your pipelines. To learn more about querying the event log, see What is the Delta Live Tables event log?.
Easier cluster configuration for your Delta Live Tables pipelines
August 22-29, 2022: Version 3.79
You can now select a cluster mode, either autoscaling or fixed size, directly in the Delta Live Tables UI when you create a pipeline. Previously, configuring an autoscaling cluster required changes to the pipeline’s JSON settings. For more information on creating a pipeline and the new Cluster mode setting, see Run an update on a Delta Live Tables pipeline.
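For reference, a sketch of the kind of JSON settings edit that was previously required to enable autoscaling is shown below; the pipeline name, cluster label, and worker bounds are illustrative.

```python
import json

# Sketch of Delta Live Tables pipeline settings with an autoscaling
# default cluster, as would previously have been edited by hand in the
# pipeline's JSON settings.
pipeline_settings = {
    "name": "my-dlt-pipeline",  # illustrative
    "clusters": [
        {
            "label": "default",
            "autoscale": {"min_workers": 1, "max_workers": 5},
        }
    ],
}

print(json.dumps(pipeline_settings, indent=2))
```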
Orchestrate dbt tasks in your Databricks workflows (Public Preview)
August 22-29, 2022: Version 3.79
You can run your dbt core project as a task in a Databricks job with the new dbt task, allowing you to include your dbt transformations in a data processing workflow. For example, your workflow can ingest data with Auto Loader, transform the data with dbt, and analyze the data with a notebook task. For more information about the dbt task, including an example, see Use dbt transformations in a Databricks job. For more information on creating, running, and scheduling a workflow that includes a dbt task, see Schedule and orchestrate workflows.
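As a sketch of the workflow described above, a job’s task list might pair a dbt task with a downstream notebook task. The task keys, paths, and the exact `dbt_task` fields shown are assumptions to be checked against the Jobs API reference.

```python
import json

# Sketch of a Jobs API task list: a dbt task followed by a dependent
# notebook task that analyzes the transformed data.
tasks = [
    {
        "task_key": "transform_with_dbt",
        "dbt_task": {
            # Commands run in order against the dbt project.
            "commands": ["dbt deps", "dbt run"],
        },
    },
    {
        "task_key": "analyze",
        "depends_on": [{"task_key": "transform_with_dbt"}],
        "notebook_task": {"notebook_path": "/Shared/analyze_results"},
    },
]

print(json.dumps(tasks, indent=2))
```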
Control workspace access with Customer Approved Workspace Login
August 22, 2022
The Customer Approved Workspace Login feature allows admins to grant Databricks engineers and support staff temporary access to their workspace.
Deploy a workspace in a customer-managed VPC (GA)
August 18, 2022
The customer-managed VPC feature is now generally available. See Configure a customer-managed VPC.
Databricks is now available in region europe-west3
August 16, 2022
Databricks is now available in the Google Cloud region europe-west3. See Databricks clouds and regions.
Use generated columns when you create Delta Live Tables datasets
August 8-15, 2022: Version 3.78
You can now use generated columns when you define tables in your Delta Live Tables pipelines. Generated columns are supported by the Delta Live Tables Python and SQL interfaces.
Improved editing for notebooks with Monaco-based editor (Experimental)
August 8-15, 2022
A new Monaco-based code editor is available for Python notebooks. To enable it, check the option Turn on the new notebook editor on the Editor settings tab on the User Settings page.
The new editor includes parameter type hints, object inspection on hover, code folding, multi-cursor support, column (box) selection, and side-by-side diffs in the notebook version history.
You can share feedback with the Give feedback link on the User Settings page.
Compliance controls for HIPAA (GA)
August 4, 2022
The compliance controls for HIPAA are now generally available.
Databricks Runtime 10.3 series support ends
August 2, 2022
Support for Databricks Runtime 10.3 and Databricks Runtime 10.3 for Machine Learning ended on August 2. See Databricks support lifecycles.
Delta Live Tables now supports refreshing only selected tables in pipeline updates
August 2-24, 2022
You can now start an update for only selected tables in a Delta Live Tables pipeline. This feature accelerates testing of pipelines and resolution of errors by allowing you to start a pipeline update that refreshes only selected tables. To learn how to start an update of only selected tables, see Run an update on a Delta Live Tables pipeline.
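As a sketch, a selective update might be started by sending a request body like the one below to the pipeline updates endpoint. The `refresh_selection` field name and the table names are assumptions based on the Pipelines API; confirm them in the API reference.

```python
import json

# Sketch of a request body for starting a Delta Live Tables pipeline
# update that refreshes only the two named tables instead of the whole
# pipeline. Table names are illustrative.
update_request = {
    "refresh_selection": ["sales_orders_cleaned", "sales_order_in_la"]
}

print(json.dumps(update_request))
```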
Job execution now waits for cluster libraries to finish installing
August 1, 2022
When a cluster is starting, your Databricks jobs now wait for cluster libraries to complete installation before executing. Previously, job runs would wait for libraries to install on all-purpose clusters only if they were specified as a dependent library for the job. For more information on configuring dependent libraries for tasks, see Configure and edit Databricks tasks.
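To illustrate the previous behavior, a dependent library is declared on the task itself, as sketched below; the cluster ID, package, and notebook path are illustrative.

```python
import json

# Sketch of a job task with a dependent library. Declaring the library
# here was previously the only way a run on an all-purpose cluster
# would wait for installation before executing.
task = {
    "task_key": "etl",
    "existing_cluster_id": "1234-567890-abcde123",  # illustrative
    "libraries": [{"pypi": {"package": "simplejson==3.17.6"}}],
    "notebook_task": {"notebook_path": "/Shared/etl"},
}

print(json.dumps(task, indent=2))
```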