March 2021

These features and Databricks platform improvements were released in March 2021.

Note

Releases are staged. Your Databricks account may not be updated until a week or more after the initial release date.

Databricks now supports dark mode for viewing notebooks

March 23-29, 2021: Version 3.42

You can now view notebooks with a dark background. See View notebooks in dark mode for instructions on how to change this setting.

Databricks on Google Cloud (Public Preview)

March 22, 2021

Databricks is pleased to announce the public preview of Databricks on Google Cloud, which brings deep integration with Google Cloud technologies. To get started, go to Google Cloud Platform Marketplace, select Databricks, and follow the instructions in Get started: Account and workspace setup.

Databricks on Google Cloud runs on Google Kubernetes Engine (GKE) and provides a built-in integration with Google Cloud technologies:

  • Google Cloud Identity: Databricks workspace users can authenticate with their Google Cloud Identity account (or GSuite account) using Google’s OAuth 2.0 implementation, which conforms to the OpenID Connect spec and is OpenID certified.

  • Google Cloud Storage: Databricks notebooks can use Google Cloud Storage (GCS) buckets as data sources.

  • BigQuery: Databricks on Google Cloud notebooks can read from and write to BigQuery (see the example sketch after this list).

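The following is a minimal sketch of reading from both sources in a Python notebook, assuming the workspace (or credentials you configure) already has access to the placeholder bucket, project, and tables named below; spark is the SparkSession that Databricks notebooks provide.

    # Read CSV files from a Google Cloud Storage bucket (placeholder bucket and path).
    gcs_df = (spark.read.format("csv")
        .option("header", "true")
        .load("gs://my-example-bucket/events/2021/03/*.csv"))

    # Read a BigQuery table through the BigQuery connector
    # (placeholder project, dataset, and table names).
    bq_df = (spark.read.format("bigquery")
        .option("table", "my-example-project.sales.transactions")
        .load())

    # Write a small sample back to BigQuery; the connector stages data through a
    # temporary GCS bucket (placeholder name).
    (bq_df.limit(100).write.format("bigquery")
        .option("table", "my-example-project.sales.transactions_sample")
        .option("temporaryGcsBucket", "my-example-temp-bucket")
        .mode("append")
        .save())
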
The full public preview of Databricks on Google Cloud includes all of the features provided in the February 17 public preview, with these additions:

  • MLflow support (requires Databricks Runtime 8.1; model serving is not supported)

  • Databricks Runtime for Machine Learning support

  • Added region europe-west2

Some features available in Databricks on other clouds are not available in Databricks on Google Cloud as of this release. For a list of supported and unsupported features, along with known issues, see Databricks on Google Cloud features.

Easier job creation and management with the enhanced jobs user interface (Public Preview)

March 22-29, 2021: Version 3.42

Databricks has redesigned the jobs user interface to make it easier to create and manage jobs (see Create and run Databricks Jobs). The new UI lets you:

  • Create and edit jobs with a streamlined interface.

  • View job configuration in a separate tab (enhancing readability).

  • Create new jobs by cloning existing jobs.

  • Run a job with different parameters from the user interface.

  • View the JSON configuration of a job (an example sketch follows below).

The new jobs UI is the default for all users. While the new UI is in public preview, you can switch back to the previous UI by clicking the NEW tab at the top of the page; reload the page to return to the new UI.
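
To give a sense of the job definition behind that JSON view, here is a hypothetical sketch that creates a simple notebook job through the Jobs API 2.0 create endpoint; the workspace URL, token, notebook path, and cluster settings are all placeholders, not values tied to this release.

    import requests

    # Placeholder workspace URL and personal access token.
    HOST = "https://<your-workspace-url>"
    TOKEN = "<personal-access-token>"

    # Hypothetical job settings: the same fields you would see in the job's JSON view.
    job_settings = {
        "name": "Example nightly ETL",
        "new_cluster": {
            "spark_version": "8.0.x-scala2.12",
            "node_type_id": "<node-type-id>",
            "num_workers": 2,
        },
        "notebook_task": {"notebook_path": "/Users/someone@example.com/etl"},
        "max_retries": 1,
    }

    resp = requests.post(
        f"{HOST}/api/2.0/jobs/create",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json=job_settings,
    )
    resp.raise_for_status()
    print("Created job:", resp.json()["job_id"])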

Track job retry attempts with a new sequential value returned for each job run attempt

March 22-29, 2021: Version 3.42

The Jobs API Runs list and Runs get responses now return an attempt_number field in the Run data structure. When you configure a retry policy for a job, attempt_number is a sequential value that increments with each attempt of a run, so you can use it to track retry attempts, as in the sketch below.
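
As a sketch, the snippet below fetches a single run with the Runs get endpoint and reads attempt_number from the response; the workspace URL, token, and run ID are placeholders.

    import requests

    HOST = "https://<your-workspace-url>"   # placeholder workspace URL
    TOKEN = "<personal-access-token>"       # placeholder token

    # Fetch one run and report which attempt it was (0 for the original attempt,
    # then 1, 2, ... for retries when the job has a retry policy).
    resp = requests.get(
        f"{HOST}/api/2.0/jobs/runs/get",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"run_id": 12345},           # placeholder run ID
    )
    resp.raise_for_status()
    run = resp.json()
    print("Run", run["run_id"], "attempt", run.get("attempt_number"))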

Easier way to connect to Databricks from your favorite BI tools and SQL clients

March 16, 2021

The Databricks JDBC and ODBC driver download pages have been revamped. They now offer:

  • One-click download of the latest Databricks drivers from dedicated ODBC and JDBC download areas, with links to release notes, documentation, and knowledge base troubleshooting articles.

  • “Hot off the press” drivers: new driver versions are available for download immediately after they are released.

  • Driver versions: access to the newest driver versions as well as selected older versions.

See JDBC driver download and ODBC driver download.

Databricks Runtime 8.1 (Beta)

March 10, 2021

Databricks Runtime 8.1 and Databricks Runtime 8.1 ML are now available as Beta releases.

For information, see the full release notes at Databricks Runtime 8.1 (unsupported) and Databricks Runtime 8.1 for ML (unsupported).

Increased limit for the number of terminated all-purpose clusters

March 8-15, 2021: Version 3.41

You can now have up to 100 terminated all-purpose clusters in a Databricks workspace. Previously the limit was 70. For details, see Terminate a compute. The limit on the number of all-purpose clusters returned by the Clusters API request is also now 100.
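
As a rough illustration, the following sketch lists clusters with the Clusters API and counts the terminated ones; the workspace URL and token are placeholders, and filtering on state alone is only approximate because the response can also include job clusters.

    import requests

    HOST = "https://<your-workspace-url>"   # placeholder workspace URL
    TOKEN = "<personal-access-token>"       # placeholder token

    resp = requests.get(
        f"{HOST}/api/2.0/clusters/list",
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    resp.raise_for_status()
    clusters = resp.json().get("clusters", [])

    terminated = [c for c in clusters if c.get("state") == "TERMINATED"]
    print(f"{len(terminated)} terminated clusters returned (the limit is now 100).")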

Increased limit for the number of pinned clusters in a workspace

March 8-15, 2021: Version 3.41

You can now have up to 40 pinned clusters in a Databricks workspace. Previously the limit was 20. For details, see Pin a compute.

Databricks Runtime 8.0 (GA)

March 2, 2021

Databricks Runtime 8.0 and Databricks Runtime 8.0 ML are now generally available.

For information, see the full release notes at Databricks Runtime 8.0 (unsupported) and Databricks Runtime 8.0 for ML (unsupported).