Required permissions
This page explains the permissions required for creating and managing a Databricks workspace on Google Cloud.
On Google Cloud, each workspace runs inside a customer-owned workspace project. Databricks creates and owns a per-workspace service account with the minimal permissions needed to manage the workspace. Databricks uses the credentials of the workspace creator to grant permissions to the service account on the workspace project. A Databricks account admin must have the required permissions on the workspace project to successfully create a workspace.
The Legacy permissions section of this article contains the list of permissions previously required by Databricks to launch compute on GKE. For more information about the migration from GKE to GCE, see Update permissions for GCE compute deployment.
Required permissions for the workspace creator
Databricks uses the workspace creator’s credentials to validate settings, grant permissions, enable required services, and provision the workspace.
The following table lists the minimal set of permissions required on the workspace and VPC projects. Databricks recommends that the workspace creator have the roles/owner role on both projects.
Note
Workspace creation typically takes less than a minute to complete. Databricks does not retain or use these permissions after workspace creation.
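If you want to grant the recommended roles/owner role to the workspace creator programmatically, the following is a minimal sketch using the Cloud Resource Manager v1 API with Application Default Credentials. The project IDs and member email are placeholders, not values defined by Databricks.

```python
# Sketch: grant roles/owner to the workspace creator on the workspace and VPC
# projects, using a read-modify-write of each project's IAM policy.
# The project IDs and the member email below are placeholders.
import googleapiclient.discovery

crm = googleapiclient.discovery.build("cloudresourcemanager", "v1")

def grant_owner(project_id: str, member: str) -> None:
    # Read the current policy, add the member to the roles/owner binding,
    # and write the policy back.
    policy = crm.projects().getIamPolicy(resource=project_id, body={}).execute()
    bindings = policy.setdefault("bindings", [])
    binding = next((b for b in bindings if b["role"] == "roles/owner"), None)
    if binding is None:
        binding = {"role": "roles/owner", "members": []}
        bindings.append(binding)
    if member not in binding["members"]:
        binding["members"].append(member)
    crm.projects().setIamPolicy(resource=project_id, body={"policy": policy}).execute()

for project_id in ("my-workspace-project", "my-vpc-project"):  # placeholders
    grant_owner(project_id, "user:workspace-creator@example.com")
```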
| Google permission | Purpose | Required for workspace project | Required for VPC project | Use case |
|---|---|---|---|---|
|  | Create the custom role. | ✓ | ✓ | Create and manage a custom role for granting permissions to the workspace’s service account. |
|  | Delete the custom role. | ✓ |  | Create and manage a custom role for granting permissions to the workspace’s service account. |
|  | Get the custom role. | ✓ | ✓ | Create and manage a custom role for granting permissions to the workspace’s service account. |
|  | Update the custom role. | ✓ | ✓ | Create and manage a custom role for granting permissions to the workspace’s service account. |
|  | Create the Databricks-compute service account. | ✓ |  | Create the Databricks-compute service account used by all clusters in the workspace that do not have a custom service account attached. This service account has minimal permissions, limited to logging and metrics. |
|  | Get the Databricks-compute service account. | ✓ |  | Used to check if the required Databricks-compute service account used by all clusters in the workspace exists. |
|  | Get IAM policy. | ✓ |  | Grant workspace service account the Service Account User role on Google Compute Engine (GCE) service account for launching GKE clusters. |
|  | Set IAM policy. | ✓ |  | Grant workspace service account the Service Account User role on Google Compute Engine (GCE) service account for launching GKE clusters. |
|  | Get a project number from its project ID. | ✓ | ✓ | Get basic information about the workspace project. |
|  | Get IAM policy. | ✓ | ✓ | Get basic information about the workspace project. |
|  | Set IAM policy. | ✓ |  | Get basic information about the workspace project. |
|  | Validate whether the customer project has enabled the required Google Cloud APIs. | ✓ | ✓ | Enable Google Cloud services needed for Databricks workloads. |
|  | Validate whether the customer project has enabled the required Google Cloud APIs. | ✓ | ✓ | Enable Google Cloud services needed for Databricks workloads. |
|  | Enable the required Google Cloud APIs on the project if they are not already enabled. | ✓ |  | Enable Google Cloud services needed for Databricks workloads. |
|  | Validate the existence of a VPC network. |  | ✓ | Validate network resources for the customer-provided VPC network, which might belong to a project other than the workspace project. |
|  | Update the firewall policy on the VPC network. |  | ✓ | Updates the firewall policy on the customer-provided VPC network, which might belong to a project other than the workspace project. |
|  | Get the host project of a VPC network. |  | ✓ | Validate network resources for the customer-provided VPC network, which might belong to a project other than the workspace project. |
|  | Validate subnets of a VPC network. |  | ✓ | Validate network resources for the customer-provided VPC network, which might belong to a project other than the workspace project. Required if you use a customer-managed VPC. |
|  | Get the IAM policy on the VPC subnet. |  | ✓ | Validate the grants on the subnetwork for the customer-provided VPC network, which might belong to a project other than the workspace project. Required if you use a customer-managed VPC. |
|  | Set the IAM policy on the VPC subnet. |  | ✓ | Sets the IAM policy on the subnetwork for the customer-provided VPC network, which might belong to a project other than the workspace project. Required if you use a customer-managed VPC. |
|  | List forwarding rules for Private Service Connect. |  | ✓ | Required if you enable Private Service Connect. |
|  | Get forwarding rules for Private Service Connect. |  | ✓ | Required if you enable Private Service Connect. |
|  | Get a firewall rule. |  | ✓ | Gets the required firewall rule in the customer-provided VPC network to check if it exists. |
|  | Create a firewall rule. |  | ✓ | Creates a firewall rule in the customer-provided VPC network, which might belong to a project other than the workspace project. |
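To confirm that the workspace creator holds the permissions listed above before you attempt workspace creation, you can call the Cloud Resource Manager testIamPermissions method with the creator’s credentials. The following is a minimal sketch; the project ID is a placeholder, and REQUIRED_PERMISSIONS should be populated with the permission IDs from the table (the two entries shown are examples only).

```python
# Sketch: report which of a set of permissions the current credentials are
# missing on a project. Run this as the intended workspace creator.
import googleapiclient.discovery

# Populate with the permission IDs from the table above; these are examples.
REQUIRED_PERMISSIONS = [
    "iam.roles.create",
    "resourcemanager.projects.getIamPolicy",
]

crm = googleapiclient.discovery.build("cloudresourcemanager", "v1")

def missing_permissions(project_id: str, permissions: list[str]) -> list[str]:
    # testIamPermissions returns only the subset of permissions the caller has.
    response = crm.projects().testIamPermissions(
        resource=project_id, body={"permissions": permissions}
    ).execute()
    granted = set(response.get("permissions", []))
    return [p for p in permissions if p not in granted]

print(missing_permissions("my-workspace-project", REQUIRED_PERMISSIONS))  # placeholder project ID
```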
Required permissions for the workspace service account
The workspace service account requires permissions in the following IAM roles on the workspace project to operate and manage a workspace:
Databricks Project Role v2: This role is required to operate and manage project-level resources such as instances, disks, cloud operations, and service accounts managed by Databricks. It is granted at the project level to the workspace service account.
Databricks Resource Role v2: This role is required to operate and manage Google Compute Engine (GCE) instances, storage disks, and other workspace-level resources managed by Databricks. It is granted at the project level to the workspace service account, and workspace-level scoping is enforced using an IAM condition on the workspace ID. The following example uses 1234567890 in place of an actual workspace ID:
resource.name.extract("{x}databricks") != "" && resource.name.extract("{x}1234567890") != ""
Databricks Network Role v2: This role is required to use subnetwork resources under a customer-managed VPC network. It is granted to the workspace service account on the specific subnet.
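For reference, a conditional binding of this kind looks roughly like the following when expressed as an IAM policy binding. Databricks creates the binding for you during workspace creation; the sketch only illustrates how the condition above is attached, and the role name, service account email, and workspace ID are placeholders.

```python
# Sketch: the shape of a conditional role binding that scopes a role to the
# resources belonging to a single workspace. All names below are placeholders.
binding = {
    "role": "projects/my-workspace-project/roles/databricks_resource_role_v2",
    "members": [
        "serviceAccount:workspace-sa@my-workspace-project.iam.gserviceaccount.com"
    ],
    "condition": {
        "title": "databricks-workspace-1234567890",
        "expression": (
            'resource.name.extract("{x}databricks") != "" && '
            'resource.name.extract("{x}1234567890") != ""'
        ),
    },
}
# Note: a project IAM policy that carries conditional bindings must set
# "version": 3 before it is written back with setIamPolicy.
```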
Permissions for Databricks Project Role v2
| Permission | Purpose | Use case |
|---|---|---|
|  | List disks | Manage Google Compute Engine (GCE) resources to run workloads |
|  | List cloud operations | Manage Google Compute Engine (GCE) resources to run workloads |
|  | List regional cloud operations | Manage Google Compute Engine (GCE) resources to run workloads |
|  | List zonal cloud operations | Manage Google Compute Engine (GCE) resources to run workloads |
|  | List GCE instances | Manage Google Compute Engine (GCE) resources to run workloads |
|  | List available zones | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Get zone description | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Get region description | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Get quota details | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Get quota details | Manage Google Compute Engine (GCE) resources to run workloads |
|  | List Databricks-managed GCS buckets | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Get on-demand capacity recommendations | Get zone/machine type recommendations for on-demand instances based on available capacity in the region |
|  | Get spot capacity recommendations | Get zone/machine type recommendations for spot instances based on available capacity in the region |
|  | Get details of a GCE reservation | Get details of a GCE reservation for use in zone/machine type selection |
|  | List all GCE reservations | List details of all GCE reservations for use in zone/machine type selection |
Permissions for Databricks Resource Role v2
| Permission | Purpose | Use case |
|---|---|---|
|  | Create Databricks-managed disks | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Delete Databricks-managed disks | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Get Databricks-managed disk info | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Resize Databricks-managed disks | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Set labels on Databricks-managed disks | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Update Databricks-managed disks | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Attach Databricks-managed disks to a VM | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Attach Databricks-managed disks to a VM in read-only mode | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Create Databricks-managed instances | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Delete Databricks-managed instances | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Attach a disk to a Databricks-managed instance | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Detach a disk from a Databricks-managed instance | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Get instance details | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Get instance guest attributes | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Get instance serial port logs | Debug failed Google Compute Engine (GCE) resources |
|  | Set labels on an instance | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Set tags on an instance | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Update an instance | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Set metadata on an instance | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Set service account on an instance | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Cancel a multipart upload to a Databricks-managed GCS bucket | Manage Google Cloud Storage (GCS) upload sessions when uploading large files |
|  | Create a multipart upload to a Databricks-managed GCS bucket | Manage Google Cloud Storage (GCS) upload sessions when uploading large files |
|  | List multipart uploads to a Databricks-managed GCS bucket | Manage Google Cloud Storage (GCS) upload sessions when uploading large files |
|  | List parts uploaded for a specific multipart upload to a Databricks-managed GCS bucket | Manage Google Cloud Storage (GCS) upload sessions when uploading large files |
|  | Create a Databricks-managed GCS bucket | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Delete a Databricks-managed GCS bucket | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Get details of a Databricks-managed GCS bucket | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Get IAM policy of a Databricks-managed GCS bucket | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Set IAM policy of a Databricks-managed GCS bucket | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Update a Databricks-managed GCS bucket | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Create a Databricks-managed GCS bucket | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Delete a Databricks-managed GCS bucket | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Get details for a Databricks-managed GCS bucket | Manage Google Compute Engine (GCE) resources to run workloads |
|  | List objects in a Databricks-managed GCS bucket | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Update objects in a Databricks-managed GCS bucket | Manage Google Compute Engine (GCE) resources to run workloads |
Additional permissions for workspaces on a Databricks-managed VPC network
The following permissions are also required for workspaces that use a Databricks-managed VPC network:
| Permission | Purpose | Use case |
|---|---|---|
|  | Launch VMs in the Databricks-managed VPC | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Get details of the Databricks-managed VPC | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Launch VMs in the Databricks-managed VPC | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Launch VMs in the Databricks-managed VPC | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Get details of the Databricks-managed router | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Get details of the Databricks-managed subnet | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Launch VMs in the Databricks-managed VPC | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Launch VMs in the Databricks-managed VPC | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Get IAM policy for Databricks-managed subnet | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Launch VMs in the Databricks-managed VPC | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Create the Databricks-managed VPC | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Delete the Databricks-managed VPC | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Update the Databricks-managed VPC | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Update the Databricks-managed VPC | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Create the Databricks-managed subnet | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Delete the Databricks-managed subnet | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Expand CIDR range on the Databricks-managed subnet | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Set IAM policy on the Databricks-managed subnet | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Configure Private Google API Access on the Databricks-managed subnet | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Update the Databricks-managed subnet | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Create the Databricks-managed router | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Delete the Databricks-managed router | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Update the Databricks-managed router | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Create the ingress firewall rule to allow Databricks VMs to communicate | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Delete the ingress firewall rule on workspace teardown in order to clean up the VPC | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Get the ingress firewall rule details | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Update ingress firewall rule details | Manage Google Compute Engine (GCE) resources to run workloads |
Permissions for Databricks Network Role v2
| Permission | Purpose | Use case |
|---|---|---|
|  | Use the subnet in the customer-managed network | Manage Google Compute Engine (GCE) resources to run workloads |
|  | Get information about the subnet in the customer-managed network | Manage Google Compute Engine (GCE) resources to run workloads |
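The three tables above describe the permissions that Databricks includes in the custom roles it creates. To audit what the custom roles in your workspace project actually contain, you can list the project’s custom roles with the IAM API and inspect their included permissions. A minimal sketch follows, with a placeholder project ID.

```python
# Sketch: list the custom roles defined in a project and print the permissions
# each one includes. The project ID is a placeholder.
import googleapiclient.discovery

iam = googleapiclient.discovery.build("iam", "v1")

request = iam.projects().roles().list(
    parent="projects/my-workspace-project", view="FULL"
)
while request is not None:
    response = request.execute()
    for role in response.get("roles", []):
        print(role["name"], "-", role.get("title", ""))
        for permission in role.get("includedPermissions", []):
            print("   ", permission)
    request = iam.projects().roles().list_next(request, response)
```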
Legacy permissions
The following permissions are legacy and were required when Databricks launched compute on GKE clusters. Reference this list only if your account has not yet been updated with the permissions for GCE compute deployment. See Update permissions for GCE compute deployment.
Required permissions for the workspace creator
| Google permission | Purpose | Use case |
|---|---|---|
|  | Create the custom role. | Create and manage a custom role for granting permissions to the workspace’s service account. |
|  | Delete the custom role. | Create and manage a custom role for granting permissions to the workspace’s service account. |
|  | Get the custom role. | Create and manage a custom role for granting permissions to the workspace’s service account. |
|  | Update the custom role. | Create and manage a custom role for granting permissions to the workspace’s service account. |
|  | Get IAM policy. | Grant workspace service account the Service Account User role on Google Compute Engine (GCE) service account for launching GKE clusters. |
|  | Set IAM policy. | Grant workspace service account the Service Account User role on Google Compute Engine (GCE) service account for launching GKE clusters. |
|  | Get a project number from its project ID. | Get basic information about the workspace project. |
|  | Get IAM policy. | Get basic information about the workspace project. |
|  | Set IAM policy. | Get basic information about the workspace project. |
|  | Validate whether the customer project has enabled the required Google Cloud APIs. | Enable Google Cloud services needed for Databricks workloads. |
|  | Validate whether the customer project has enabled the required Google Cloud APIs. | Enable Google Cloud services needed for Databricks workloads. |
|  | Enable the required Google Cloud APIs on the project if they are not already enabled. | Enable Google Cloud services needed for Databricks workloads. |
|  | Validate the existence of a VPC network. | Validate network resources for the customer-provided VPC network, which might belong to a project other than the workspace project. |
|  | Get the host project of a VPC network. | Validate network resources for the customer-provided VPC network, which might belong to a project other than the workspace project. |
|  | Validate subnets of a VPC network. | Validate network resources for the customer-provided VPC network, which might belong to a project other than the workspace project. |
|  | List forwarding rules for Private Service Connect. | Required if you enable Private Service Connect. |
|  | Get forwarding rules for Private Service Connect. | Required if you enable Private Service Connect. |
|  | Get the access control policy for a Cloud KMS resource. | Required on the Cloud KMS key if you enable customer-managed keys. |
|  | Set the access control policy on a Cloud KMS resource. | Required on the Cloud KMS key if you enable customer-managed keys. |
Required permissions for the workspace service account
The workspace service account requires permissions in the following IAM roles on the workspace project to operate and manage a workspace:
GKE Admin Role: This is required to operate and manage customer workloads running on GKE.
GCE Storage Admin Role: This is required to operate and manage Google Compute Engine (GCE) persistent storage associated with GKE nodes.
Databricks Workspace Role: A per-workspace custom role for granting additional permissions needed to manage a workspace.
| Permission | Purpose | Use case |
|---|---|---|
|  | Get operation data for visibility into GCE operations during GCE outages. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Get instance groups for GCE troubleshooting. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | List instance groups for GCE troubleshooting. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Get compute instances. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | List compute instances for GCE troubleshooting. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Set compute instance labels. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Get disks. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Set disk labels. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. If you use a customer-managed VPC, this permission is not in the custom role that Databricks grants to the service account. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. If you use a customer-managed VPC, this permission is not in the custom role that Databricks grants to the service account. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. If you use a customer-managed VPC, this permission is not in the custom role that Databricks grants to the service account. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. If you use a customer-managed VPC, this permission is not in the custom role that Databricks grants to the service account. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Get region operations for visibility into Google Compute Engine (GCE) operations during GCE outages. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. If you use a customer-managed VPC, this permission is not in the custom role that Databricks grants to the service account. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. If you use a customer-managed VPC, this permission is not in the custom role that Databricks grants to the service account. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. If you use a customer-managed VPC, this permission is not in the custom role that Databricks grants to the service account. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. If you use a customer-managed VPC, this permission is not in the custom role that Databricks grants to the service account. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. If you use a customer-managed VPC, this permission is not in the custom role that Databricks grants to the service account. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. If you use a customer-managed VPC, this permission is not in the custom role that Databricks grants to the service account. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. If you use a customer-managed VPC, this permission is not in the custom role that Databricks grants to the service account. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. If you use a customer-managed VPC, this permission is not in the custom role that Databricks grants to the service account. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. If you use a customer-managed VPC, this permission is not in the custom role that Databricks grants to the service account. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Manage network resources. | Manage Google Compute Engine (GCE) resources to run workloads. |
|  | Create cluster role bindings. | Manage GKE clusters to run Databricks workloads. |
|  | Get cluster role bindings. | Manage GKE clusters to run Databricks workloads. |
|  | Bind cluster role bindings. | Manage GKE clusters to run Databricks workloads. |
|  | Create cluster roles. | Manage GKE clusters to run Databricks workloads. |
|  | Get cluster roles. | Manage GKE clusters to run Databricks workloads. |
|  | Create cluster roles. | Manage GKE clusters to run Databricks workloads. |
|  | Delete cluster roles. | Manage GKE clusters to run Databricks workloads. |
|  | Get clusters. | Manage GKE clusters to run Databricks workloads. |
|  | Get cluster credentials. | Manage GKE clusters to run Databricks workloads. |
|  | List clusters. | Manage GKE clusters to run Databricks workloads. |
|  | Update clusters. | Manage GKE clusters to run Databricks workloads. |
|  | Create configMaps. | Manage GKE clusters to run Databricks workloads. |
|  | Get configMaps. | Manage GKE clusters to run Databricks workloads. |
|  | Update configMaps. | Manage GKE clusters to run Databricks workloads. |
|  | Create custom resource definitions. | Manage GKE clusters to run Databricks workloads. |
|  | Get custom resource definitions. | Manage GKE clusters to run Databricks workloads. |
|  | Update custom resource definitions. | Manage GKE clusters to run Databricks workloads. |
|  | Create daemon sets. | Manage GKE clusters to run Databricks workloads. |
|  | Get daemon sets. | Manage GKE clusters to run Databricks workloads. |
|  | Update daemon sets. | Manage GKE clusters to run Databricks workloads. |
|  | Create deployments. | Manage GKE clusters to run Databricks workloads. |
|  | Get deployments. | Manage GKE clusters to run Databricks workloads. |
|  | Update deployments. | Manage GKE clusters to run Databricks workloads. |
|  | Create job. | Manage GKE clusters to run Databricks workloads. |
|  | Get job. | Manage GKE clusters to run Databricks workloads. |
|  | Update job. | Manage GKE clusters to run Databricks workloads. |
|  | Create namespace. | Manage GKE clusters to run Databricks workloads. |
|  | Get namespace. | Manage GKE clusters to run Databricks workloads. |
|  | List namespaces. | Manage GKE clusters to run Databricks workloads. |
|  | Get operations. | Manage GKE clusters to run Databricks workloads. |
|  | Get pods. | Manage GKE clusters to run Databricks workloads. |
|  | Get pod logs. | Manage GKE clusters to run Databricks workloads. |
|  | List pods. | Manage GKE clusters to run Databricks workloads. |
|  | Create role bindings. | Manage GKE clusters to run Databricks workloads. |
|  | Get role bindings. | Manage GKE clusters to run Databricks workloads. |
|  | Bind roles. | Manage GKE clusters to run Databricks workloads. |
|  | Create roles. | Manage GKE clusters to run Databricks workloads. |
|  | Get roles. | Manage GKE clusters to run Databricks workloads. |
|  | Create secret. | Manage GKE clusters to run Databricks workloads. |
|  | Get a secret. | Manage GKE clusters to run Databricks workloads. |
|  | Update a secret. | Manage GKE clusters to run Databricks workloads. |
|  | Create a service account. | Manage GKE clusters to run Databricks workloads. |
|  | Get a service account. | Manage GKE clusters to run Databricks workloads. |
|  | Create a service. | Manage GKE clusters to run Databricks workloads. |
|  | Get a service. | Manage GKE clusters to run Databricks workloads. |
|  | Create a third-party object. | Manage GKE clusters to run Databricks workloads. |
|  | Delete a third-party object. | Manage GKE clusters to run Databricks workloads. |
|  | Get a third-party object. | Manage GKE clusters to run Databricks workloads. |
|  | List third-party objects. | Manage GKE clusters to run Databricks workloads. |
|  | Update a third-party object. | Manage GKE clusters to run Databricks workloads. |
|  | Inspect service accounts or bind them to a cluster. | Configure GKE Workload Identity for a cluster’s service account to access your data. |
|  | Inspect service accounts or bind them to a cluster. | Configure GKE Workload Identity for a cluster’s service account to access your data. |
|  | Convert customer project ID to a project number. | Validate the project status, such as whether the project is live and whether the workspace service account has enough permissions. |
|  | Check if the project IAM policy is correctly configured. | Validate the project status, such as whether the project is live and whether the workspace service account has enough permissions. |
|  | Create a bucket. | This is required to create and manage GCS buckets for DBFS. |
|  | Delete a bucket. | This is required to create and manage GCS buckets for DBFS. |
|  | Get a bucket. | This is required to create and manage GCS buckets for DBFS. |
|  | Get storage IAM policy. | This is required to create and manage GCS buckets for DBFS. |
|  | List buckets. | This is required to create and manage GCS buckets for DBFS. |
|  | Set storage IAM policy. | This is required to create and manage GCS buckets for DBFS. |
|  | Update storage IAM policy. | This is required to create and manage GCS buckets for DBFS. |
|  | Abort a multipart upload. | Read and write DBFS objects. |
|  | Create a multipart upload. | Read and write DBFS objects. |
|  | List multipart uploads. | Read and write DBFS objects. |
|  | List parts of a multipart upload. | Read and write DBFS objects. |
|  | Create a storage object. | Read and write DBFS objects. |
|  | Delete storage object. | Read and write DBFS objects. |
|  | Get a storage object. | Read and write DBFS objects. |
|  | List storage objects. | Read and write DBFS objects. |
|  | Update a storage object. | Read and write DBFS objects. |