August 2024

These features and Databricks platform improvements were released in August 2024.

Note

Releases are staged. Your Databricks account might not be updated until a week or more after the initial release date.

Monitor Unity Catalog object usage against quotas using the new Resource Quotas APIs

August 30, 2024

The new Resource Quotas APIs enable you to monitor your usage of Unity Catalog securable objects against resource quotas. Soon, you’ll also be able to receive email notifications when you approach quota limits. See Monitor your usage of Unity Catalog resource quotas and the Resource Quotas API reference.
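A minimal sketch of checking current usage from a script, assuming a personal access token; the endpoint path is my reading of the Resource Quotas API reference and should be verified there, and the host and token are placeholders:

```python
import requests

# Placeholders: your workspace URL and a valid token.
HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

# Assumed endpoint; confirm the exact path in the Resource Quotas API reference.
resp = requests.get(
    f"{HOST}/api/2.1/unity-catalog/resource-quotas/all-resource-quotas",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()

# Each entry reports a quota name, its limit, and current usage for a parent securable.
for quota in resp.json().get("quotas", []):
    print(quota)
```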

Compute system tables are now available (Public Preview)

August 30, 2024

The system.compute schema contains three tables you can use to monitor the all-purpose and jobs compute in your account: clusters, node_types, and node_timeline. See Compute system tables reference.

If you don’t have access to these system tables, ensure you have enabled the compute schema in your account (see Enable system table schemas).
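As a quick illustration, you can explore the new tables directly from a notebook; the query below is a sketch, and the full column list is in the Compute system tables reference:

```python
# Sketch: browse the new compute system tables from a notebook.
# Requires the compute system schema to be enabled and SELECT access on it.
display(spark.sql("SELECT * FROM system.compute.clusters LIMIT 10"))

# Per-node utilization samples; see the reference for the available columns.
display(spark.sql("""
    SELECT *
    FROM system.compute.node_timeline
    WHERE start_time >= current_date() - INTERVAL 1 DAY
    LIMIT 10
"""))
```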

AskSupport replaced by Databricks Assistant

August 29, 2024

AskSupport, the Databricks Slack-based support channel, has been replaced with Databricks Assistant, which is available in your Databricks workspace. Use Databricks Assistant to search technical documentation, create tickets, and get context-aware support.

To use Databricks Assistant, it must be enabled in your account console.

Lakeflow system tables are extended with additional columns

August 23, 2024

The tables in the system.lakeflow schema have been extended with additional columns. The following changes have been made:

  • jobs is extended with a description column.

  • job_run_timeline is extended with run_type, run_name, compute_ids, termination_code and job_parameters columns.

  • job_task_run_timeline is extended with job_run_id, parent_run_id and termination_code columns.

The schema change is non-breaking and won’t interrupt your existing workflows. The new columns will not be backfilled for already emitted rows. For more information, see Jobs system table reference.
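As a minimal sketch, the new columns can be queried like any other system table column; rows emitted before this change will return NULLs because they are not backfilled:

```python
# Sketch: query the new job_run_timeline columns from a notebook.
display(spark.sql("""
    SELECT run_type, run_name, compute_ids, termination_code, job_parameters
    FROM system.lakeflow.job_run_timeline
    LIMIT 10
"""))
```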

Python code formatting error highlights

August 23, 2024

The notebook and file editors can now highlight formatting errors and warnings in Python code, such as unexpected indentation, overly long lines, and more. See Python formatting highlighting.

Delta Sharing: More Delta Lake features now supported by the Python and Power BI connectors

August 21, 2024

Delta Sharing Python connector 1.1.0+ and Power BI v2.132.908.0+ now support:

  • Column mapping name mode

  • Deletion vectors

  • Uniform format

These Delta Lake features were already supported on Databricks Runtime 14.1+ and open-source Delta Sharing Apache Spark connector 3.1+.

See Delta Lake feature support matrix.
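As an illustration, reading a shared table that uses these features with the Python connector is unchanged once you are on 1.1.0 or above; the profile path and the share, schema, and table names below are placeholders:

```python
import delta_sharing

# Placeholders: your Delta Sharing profile file and the shared table coordinates.
profile = "/path/to/config.share"
table_url = f"{profile}#<share>.<schema>.<table>"

# With connector 1.1.0+, tables using column mapping, deletion vectors, or
# Uniform format load like any other shared table.
df = delta_sharing.load_as_pandas(table_url)
print(df.head())
```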

Delta Sharing adds support for TimestampNTZ

August 21, 2024

Delta Sharing adds support for TimestampNTZ on Databricks Runtime 14.1 and above and open-source Delta Sharing Apache Spark connector 3.3 and above.

See Delta Lake feature support matrix.

The Databricks Jobs For each task is GA

August 21, 2024

The For each task is now generally available. You can use the For each task to run another task in a loop, passing a different set of parameters to each iteration of the task. The For each task can iterate over any of the standard job tasks, such as a notebook, JAR, Python script, or SQL task. See Run a parameterized Databricks job task in a loop.
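A rough sketch of a For each task in a Jobs API payload, assuming a notebook task as the iterated task; the field names (for_each_task, inputs, concurrency, task) and the {{input}} reference should be verified against the Jobs API documentation, and the notebook path and parameter names are placeholders:

```python
# Sketch of a Jobs API task definition that runs a notebook once per input value.
for_each_task = {
    "task_key": "process_each_country",
    "for_each_task": {
        # JSON array of values to iterate over.
        "inputs": '["US", "DE", "JP"]',
        # Maximum number of iterations that may run concurrently.
        "concurrency": 2,
        "task": {
            "task_key": "process_one_country",
            "notebook_task": {
                "notebook_path": "/Workspace/Users/<you>/process_country",
                # {{input}} resolves to the value for the current iteration.
                "base_parameters": {"country": "{{input}}"},
            },
        },
    },
}
```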

Databricks Runtime 15.4 LTS is GA

August 19, 2024

Databricks Runtime 15.4 LTS and Databricks Runtime 15.4 LTS ML are now generally available.

See Databricks Runtime 15.4 LTS and Databricks Runtime 15.4 LTS for Machine Learning.

Personalized notebook autocomplete

August 19, 2024

Notebook autocomplete now prioritizes suggestions based on your individual Unity Catalog metadata and usage, providing custom suggestion rankings for each user. See Personalized autocomplete.

Configure your workspace’s default access mode for jobs compute

August 16, 2024

Workspace admins can now configure the default access mode for jobs compute in their workspace. This default access mode is applied to compute resources without a defined access mode. For more information, see Default access mode for jobs compute.

New slash commands for Databricks Assistant

August 14, 2024

Databricks Assistant has added the following slash commands as shortcuts for common tasks:

  • /findTables: Searches for relevant tables based on Unity Catalog metadata.

  • /findQueries: Searches for relevant queries based on Unity Catalog metadata.

  • /prettify: Formats code for readability.

  • /rename: Suggests updated names for notebook cells and other elements, depending on the context.

  • /settings: Adjusts your notebook settings directly from Assistant.

For more information, see Use slash commands for prompts.

Workspace search now supports volumes

August 14, 2024

Volumes are now included in search results. See Search for workspace objects.

Databricks JDBC driver 2.6.40

August 13, 2024

Databricks JDBC Driver version 2.6.40 is now available from JDBC driver download. This release removes redundant WARNING log messages to increase logging usability and security.

This release includes the following enhancements and new features:

  • OIDC discovery endpoint support. The driver can now set an OIDC discovery endpoint to fetch a token and retrieve an authorization endpoint.

  • Updated Arrow support. The driver now uses Apache Arrow version 14.0.2. Earlier versions of the driver used Apache Arrow version 9.0.0.

  • ProxyIgnoreList support. The driver now supports the ProxyIgnoreList property when UseProxy is set to 1.

  • Refresh token support. The driver now supports an optional refresh token. It saves the access token and reuses it for new connections as long as it is valid. If the driver cannot renew the access token using the refresh token, it will sign in again.

  • Updated authentication support. The driver now supports browser-based (U2M) and client credentials (M2M) authentication on Google Cloud.

  • Added unified default OAuth options.

  • You can now configure the OAuth redirect port. To do this, set the OAuth2RedirectUrlPort property to your port.

For complete configuration information, see the Databricks JDBC Driver Guide installed with the driver download package.
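As a hedged illustration of the new connection properties, a JDBC URL might be assembled as below; the host, HTTP path, proxy hosts, and port are placeholders, and the JDBC Driver Guide remains the authoritative reference for property names and values:

```python
# Illustrative only: property names come from this release note; check the
# JDBC Driver Guide for exact semantics and defaults.
host = "<your-workspace>.cloud.databricks.com"
http_path = "/sql/1.0/warehouses/<warehouse-id>"

jdbc_url = (
    f"jdbc:databricks://{host}:443;"
    f"httpPath={http_path};"
    "AuthMech=11;Auth_Flow=2;"                          # OAuth browser-based (U2M) login
    "UseProxy=1;ProxyIgnoreList=localhost,127.0.0.1;"   # bypass the proxy for these hosts
    "OAuth2RedirectUrlPort=8123"                        # custom OAuth redirect port
)
print(jdbc_url)
```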

Databricks personal access tokens revoked if unused after 90 days

August 13, 2024

Databricks now automatically revokes any personal access tokens (PATs) that have not been used in 90 or more days. For more details, see Automatic revocation of old access tokens.

Wrap lines in notebook cells

August 12, 2024

You can now enable or disable line wrapping in notebook cells, allowing text to either wrap onto multiple lines or remain on a single line with horizontal scrolling. See Line wrapping.

Create budgets to monitor account spending (Public Preview)

August 9, 2024

Account admins can now create budgets to track spending in their Databricks account. Budgets can include customized filters to track spending based on workspace and custom tags. See Use budgets to monitor account spending.

Files can no longer have identical names in workspace folders

August 9, 2024

Databricks now prevents you from creating or renaming assets in workspace folders when the asset’s name, including its file extension, exactly matches the name of another asset in the same folder. For example, you can no longer create a file named test.py if there is already a notebook with a base name of test and a .py extension in the same workspace folder.

For more details, see Naming assets in workspace folders.

Compute policy enforcement now available

August 8, 2024

Policy compliance enforcement enables workspace admins to update their workspace’s compute resources to comply with the latest version of a policy. This feature can be used in the UI or through the Cluster Policies API.

See Enforce policy compliance or Cluster Policies API.
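A rough sketch of doing this over REST, assuming a personal access token; the compliance endpoint paths below are assumptions on my part and should be confirmed in the Cluster Policies API reference:

```python
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"
headers = {"Authorization": f"Bearer {TOKEN}"}
cluster_id = "<cluster-id>"

# Assumed endpoint: check whether the cluster complies with the latest policy version.
compliance = requests.get(
    f"{HOST}/api/2.0/policies/clusters/get-compliance",
    headers=headers,
    params={"cluster_id": cluster_id},
)
print(compliance.json())

# Assumed endpoint: update the cluster so it complies with the latest policy version.
requests.post(
    f"{HOST}/api/2.0/policies/clusters/enforce-compliance",
    headers=headers,
    json={"cluster_id": cluster_id},
).raise_for_status()
```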

Format columns in notebook and query results tables

August 6, 2024

Customize your results tables to be more readable with column-formatting options like Currency, Percentage, URL, control over decimal places, and more. See Format columns.

Row filters and column masks are now GA, with improvements

August 6, 2024

The ability to apply row filters and column masks to tables is now generally available on Databricks Runtime 12.2 and above. Row filters and column masks restrict access to sensitive data for specified users. These filters and masks are implemented as SQL user-defined functions (UDFs). GA brings support for the following functionality that was not available in the public preview:

  • Constant expressions in policy parameters (strings, numeric, intervals, booleans, nulls).

  • MERGE statements.

  • Table sampling.

See Filter sensitive table data using row filters and column masks.
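A minimal sketch from a notebook, assuming a hypothetical sales table with region and email columns and an admins group; both the filter and the mask are ordinary SQL UDFs:

```python
# Row filter: members of `admins` see every row, everyone else only US rows.
spark.sql("""
    CREATE OR REPLACE FUNCTION us_rows_only(region STRING)
    RETURN IF(is_account_group_member('admins'), TRUE, region = 'US')
""")
spark.sql("ALTER TABLE sales SET ROW FILTER us_rows_only ON (region)")

# Column mask: redact the email column for anyone outside `admins`.
spark.sql("""
    CREATE OR REPLACE FUNCTION mask_email(email STRING)
    RETURN IF(is_account_group_member('admins'), email, 'REDACTED')
""")
spark.sql("ALTER TABLE sales ALTER COLUMN email SET MASK mask_email")
```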

Lakehouse Federation is generally available (GA)

August 1, 2024

In Databricks Runtime 15.2 and later and Databricks SQL version 2024.30 and later, Lakehouse Federation connectors for the following database types are generally available (GA):

  • MySQL

  • PostgreSQL

  • Amazon Redshift

  • Snowflake

  • Microsoft SQL Server

  • Azure Synapse (SQL Data Warehouse)

  • Databricks

This release also introduces the following improvements:

  • Support for additional pushdowns (string, math, and miscellaneous functions).

  • Improved pushdown success rate across different query shapes.

  • Additional pushdown debugging capabilities:

    • The EXPLAIN FORMATTED output displays the pushed-down query text.

    • The query profile UI displays the pushed-down query text, federated node identifiers, and JDBC query execution times (in verbose mode). See View system-generated federated queries.
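As an end-to-end sketch, creating a federated PostgreSQL catalog and then inspecting the pushed-down query text might look like the following; the host, credentials, and object names are placeholders:

```python
# Sketch: create a federation connection and a foreign catalog, then use
# EXPLAIN FORMATTED to see the query text pushed down to the remote database.
spark.sql("""
    CREATE CONNECTION pg_conn TYPE postgresql
    OPTIONS (host '<host>', port '5432', user '<user>', password '<password>')
""")
spark.sql("""
    CREATE FOREIGN CATALOG pg_catalog
    USING CONNECTION pg_conn
    OPTIONS (database '<database>')
""")

spark.sql("""
    EXPLAIN FORMATTED
    SELECT upper(name), round(amount, 2)
    FROM pg_catalog.public.orders
    WHERE amount > 100
""").show(truncate=False)
```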