This article describes how Databricks SQL administrators configure a new workspace for access to data objects.
If you are using Databricks managed tables, you do not need to configure access to cloud storage.
All Databricks SQL warehouses in a workspace share the same cloud storage access credentials.
To configure data access for Databricks SQL, follow the steps in this section.

Requirements:

- A Databricks account on the Premium plan
- A Databricks SQL warehouse
- Groups representing the users to whom you will grant access to data
Databricks recommends setting up a new service account with access to all GCS buckets that should be accessed from Databricks SQL.
A Databricks administrator performs one of the following steps in the Google Cloud console:
(Optional) Create a service account to access GCS buckets. If you want to reuse an existing service account, you can skip this step.
If you are reusing a service account, get the service account email address from the Google Cloud console (or from your Databricks cluster configuration).
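The service-account steps above can be sketched with the gcloud CLI. The account name, display name, and project ID below are illustrative placeholders, not values from this article:

```shell
# Placeholder project and account name -- substitute your own.
# Create a service account to access GCS buckets from Databricks SQL.
gcloud iam service-accounts create databricks-sql-access \
  --project=my-project \
  --display-name="Databricks SQL data access"

# The resulting email address, which you will need later, has the form:
#   databricks-sql-access@my-project.iam.gserviceaccount.com
gcloud iam service-accounts list --project=my-project
```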
A Databricks administrator then grants the service account access to the GCS buckets in the Google Cloud console.
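Granting bucket access can be sketched with the gcloud CLI. The bucket name and service account email are placeholders, and the Storage Object Viewer role is one reasonable choice for read-only access (use a broader role if warehouses must write):

```shell
# Placeholder bucket and service account email -- substitute your own.
# Grant the service account read access to a bucket.
gcloud storage buckets add-iam-policy-binding gs://my-data-bucket \
  --member="serviceAccount:databricks-sql-access@my-project.iam.gserviceaccount.com" \
  --role="roles/storage.objectViewer"
```

Repeat for each bucket that should be accessible from Databricks SQL.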
A Databricks administrator performs this step in the SQL admin console:
In the sidebar, use the persona selector to select SQL.
Click Settings at the bottom of the sidebar and select SQL Admin Console.
Click the SQL Warehouse Settings tab.
Enter the Google service account email address to configure data access.
To configure data access privileges, see Data access control.
A Databricks administrator performs this step in a notebook in a Data Science & Engineering workspace.
Administrators set owners using ALTER TABLE statements. The simplest option is to set the owner to a group of admins. Alternatively, to enable a delegated security model, you can select different owners for each database, giving each the ability to manage permissions on the objects in the database.
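For example, ownership assignments like those described above might look as follows; the table, database, and group names are illustrative:

```sql
-- Simplest option: make an admin group the owner of a table
ALTER TABLE mydb.mytable OWNER TO `admins`;

-- Delegated model: give each database its own owning group,
-- which can then manage permissions on the objects it contains
ALTER DATABASE mydb OWNER TO `mydb-owners`;
```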