November 2022
These features and Databricks platform improvements were released in November 2022.
Note
Releases are staged. Your Databricks account might not be updated until a week or more after the initial release date.
- Enhanced notifications for your Databricks jobs (Public Preview)
- Databricks Runtime 12.0 (Beta)
- Enforce user isolation cluster types on a workspace
- New Google Cloud region: asia-northeast1 (Tokyo)
- Upload data UI can now be disabled via admin settings
- Work with large repositories with Sparse Checkout
- Databricks Terraform provider updated to version 1.6.5
- Databricks Terraform provider updated to versions 1.6.3 and 1.6.4
- Create or modify table from file upload page now supports multiple files
- Create or modify table from file upload page now supports overwrite
- Search for jobs by name with the Jobs API 2.1
- Databricks Terraform provider updated to version 1.6.2
- Use a Google ID token to authenticate to workspace REST APIs (Public Preview)
Enhanced notifications for your Databricks jobs (Public Preview)
November 30 - December 7, 2022
You can now send notifications for important job run events using webhooks or native Slack notifications. This feature adds to the existing email notification support for job run events. For more information, see Add email and system notifications for job events.
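A hedged sketch of attaching a webhook notification to an existing job through the Jobs API 2.1. The workspace URL, token, job ID, and destination ID below are placeholders; the webhook or Slack destination itself is configured separately by a workspace admin.

```python
# Minimal sketch: add a webhook notification to a job via Jobs API 2.1.
import requests

WORKSPACE_URL = "https://<workspace-instance>.gcp.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"  # placeholder

resp = requests.post(
    f"{WORKSPACE_URL}/api/2.1/jobs/update",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "job_id": 123,  # placeholder job ID
        "new_settings": {
            # Each entry references a notification destination (for example,
            # a Slack or generic webhook destination) by its ID.
            "webhook_notifications": {
                "on_failure": [{"id": "<destination-id>"}],
            },
        },
    },
)
resp.raise_for_status()
```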
Databricks Runtime 12.0 (Beta)
November 28, 2022
Databricks Runtime 12.0 is now available as a Beta release.
See Databricks Runtime 12.0 (EoS) and Databricks Runtime 12.0 for Machine Learning (EoS).
Enforce user isolation cluster types on a workspace
November 18, 2022
An admin can now prevent a user from creating or starting a cluster with a “No isolation shared” cluster access type or equivalent legacy cluster type. Use the new workspace setting called Enforce User Isolation.
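If you prefer to script workspace settings, a hedged sketch using the workspace-conf API follows. The admin settings UI is the documented path; the key name `enforceUserIsolation` here is an assumption, not confirmed by this release note.

```python
# Hedged sketch: toggle the setting via the workspace-conf API.
import requests

WORKSPACE_URL = "https://<workspace-instance>.gcp.databricks.com"  # placeholder
TOKEN = "<admin-personal-access-token>"  # placeholder

resp = requests.patch(
    f"{WORKSPACE_URL}/api/2.0/workspace-conf",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"enforceUserIsolation": "true"},  # assumed setting key name
)
resp.raise_for_status()
```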
New Google Cloud region: asia-northeast1 (Tokyo)
November 18, 2022
Databricks on Google Cloud is now available in the asia-northeast1 (Tokyo) region.
Upload data UI can now be disabled via admin settings
November 17, 2022
The workspace administrator setting to disable the upload data UI now applies to the new upload data UI as well as the legacy DBFS file upload UI. This setting applies to the Data Science & Engineering, Databricks Mosaic AI, and Databricks SQL personas.
Work with large repositories with Sparse Checkout
November 14, 2022
Sparse Checkout support in Repos enables you to clone and work with a subset of the remote repository’s directories in Databricks. This is useful when you work with a monorepo, or when your repository exceeds Databricks’ supported size limits.
For more information, see Configure Sparse checkout mode in Repos.
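A minimal sketch of cloning a repo in sparse checkout mode with the Repos API, so only the listed directories are checked out. The Git URL, workspace path, and patterns are placeholders.

```python
# Minimal sketch: create a repo in sparse checkout mode via the Repos API.
import requests

WORKSPACE_URL = "https://<workspace-instance>.gcp.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"  # placeholder

resp = requests.post(
    f"{WORKSPACE_URL}/api/2.0/repos",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "url": "https://github.com/<org>/<monorepo>.git",  # placeholder
        "provider": "gitHub",
        "path": "/Repos/<user>/<monorepo>",  # placeholder
        # Only these directories are checked out in the workspace clone.
        "sparse_checkout": {"patterns": ["services/etl", "libs/shared"]},
    },
)
resp.raise_for_status()
```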
Databricks Terraform provider updated to version 1.6.5
November 11, 2022
Version 1.6.5 adds a `query_plan` parameter to the `databricks_sql_visualization` resource, uses a new `name` filter to search for jobs by name in the `databricks_job` data source, and more. For details, see the changelog for version 1.6.5.
Databricks Terraform provider updated to versions 1.6.3 and 1.6.4
November 7-9, 2022
Versions 1.6.3 and 1.6.4 add the `warehouse_type` parameter to the `databricks_sql_endpoint` resource to support additional Databricks SQL warehouse types, and more. For details, see the changelogs for versions 1.6.3 and 1.6.4.
Create or modify table from file upload page now supports multiple files
November 8, 2022
You can now use the Create or modify table from file upload page to load up to 10 files into a Delta table simultaneously. See Create or modify a table using file upload.
Create or modify table from file upload page now supports overwrite
November 8, 2022
You can now use the Create or modify table from file upload page to create or overwrite managed Delta tables. See Create or modify a table using file upload.
Search for jobs by name with the Jobs API 2.1
November 3, 2022
You can now filter by job name with the List all jobs operation (`GET /jobs/list`) in the Jobs API.
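A minimal sketch of the new name filter on the List all jobs operation. The workspace URL, token, and job name are placeholders.

```python
# Minimal sketch: list jobs matching a name via Jobs API 2.1.
import requests

WORKSPACE_URL = "https://<workspace-instance>.gcp.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"  # placeholder

resp = requests.get(
    f"{WORKSPACE_URL}/api/2.1/jobs/list",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"name": "nightly-etl"},  # placeholder job name to match
)
resp.raise_for_status()
for job in resp.json().get("jobs", []):
    print(job["job_id"], job["settings"]["name"])
```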
Databricks Terraform provider updated to version 1.6.2
November 2, 2022
Version 1.6.2 adds `runtime_engine` to the `databricks_cluster` resource, validation for `path` in the `databricks_repo` resource, auto-detection of AWS CodeCommit URLs in the `databricks_repo` resource, and much more. For details, see the changelog.
Use a Google ID token to authenticate to workspace REST APIs (Public Preview)
November 1, 2022
Instead of using a Databricks personal access token for a user or service principal to authenticate to workspace REST APIs, you can now use a Google ID token. A Google ID token is the common name for a Google-issued OIDC token. See Authentication with Google ID tokens.
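A hedged sketch of calling a workspace REST API with a Google ID token instead of a personal access token. It assumes the code runs where the google-auth library can find credentials (for example, under a service account), and it assumes the token audience is the workspace URL; check the linked authentication docs for the exact audience your setup requires.

```python
# Hedged sketch: authenticate to a workspace REST API with a Google ID token.
import requests
import google.auth.transport.requests
from google.oauth2 import id_token

WORKSPACE_URL = "https://<workspace-instance>.gcp.databricks.com"  # placeholder

# Mint a Google-issued OIDC (ID) token for the assumed audience.
google_request = google.auth.transport.requests.Request()
token = id_token.fetch_id_token(google_request, audience=WORKSPACE_URL)

resp = requests.get(
    f"{WORKSPACE_URL}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()
print(resp.json())
```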