Configure audit log delivery

Note

This feature requires the Databricks Premium Plan.

Databricks provides access to audit logs of activities performed by Databricks users, allowing your enterprise to monitor detailed Databricks usage patterns. For details about logged events, log schema, delivery latency, and the exact delivery path syntax, see Access audit logs.

To configure audit log delivery, you must set up a GCS bucket, give Databricks access to the bucket, and then use the account console to define a log delivery configuration. Each log delivery configuration includes an optional delivery path prefix that determines where in the bucket the logs are written.

You cannot edit or delete a log delivery configuration after creation, but you can temporarily or permanently disable one using the account console. You can have a maximum of two enabled audit log delivery configurations at a time.

You can use the Google Cloud Console or the Google Cloud CLI to create a Google Cloud Storage bucket in your GCP account. The following instructions assume you use the Google Cloud Console.

Set up audit log delivery

To configure audit log delivery, you must set up a GCS bucket and define a log delivery configuration in Databricks.

Create and configure your GCS bucket

  1. Use the Google Cloud Console to create a Google Cloud Storage bucket in your GCP account.

    • For the region, choose multi-region.

    • For the storage class, choose Standard for typical usage. See the Google article on storage classes.

    • For access control, choose Uniform.

  2. Click the Permissions tab on your new bucket.

  3. Click ADD, and then enter the service account log-delivery@databricks-prod-master.iam.gserviceaccount.com as a new member of the storage bucket. Grant the service account the Storage Admin role under Cloud Storage, without specifying an access condition.

    This role is required for Databricks to write and list the delivered log files in this bucket. You cannot grant the permission on only a bucket subdirectory. See the Google article about access control, which recommends creating multiple buckets for granular access permissions.

    Log delivery bucket permission
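The bucket setup above can also be scripted with the Google Cloud Storage Python client. The following is a minimal sketch, assuming the google-cloud-storage library and application-default credentials are available; the project ID and bucket name are placeholders, and only the service account and role come from the steps above:

```python
# Service account and role taken from the steps above.
DATABRICKS_LOG_DELIVERY_SA = (
    "log-delivery@databricks-prod-master.iam.gserviceaccount.com"
)

def databricks_binding():
    """Build the IAM binding that grants Databricks write/list access."""
    return {
        "role": "roles/storage.admin",
        "members": {f"serviceAccount:{DATABRICKS_LOG_DELIVERY_SA}"},
    }

def create_log_bucket(project_id: str, bucket_name: str):
    """Create a multi-region bucket with uniform access and grant the
    Databricks service account Storage Admin on it.

    Requires the google-cloud-storage package and valid credentials.
    """
    from google.cloud import storage  # deferred: cloud-only dependency

    client = storage.Client(project=project_id)
    bucket = storage.Bucket(client, name=bucket_name)
    bucket.storage_class = "STANDARD"  # Standard storage class
    bucket.iam_configuration.uniform_bucket_level_access_enabled = True  # Uniform
    client.create_bucket(bucket, location="US")  # "US" is a multi-region location

    policy = bucket.get_iam_policy(requested_policy_version=3)
    policy.bindings.append(databricks_binding())
    bucket.set_iam_policy(policy)
    return bucket
```

As in the console flow, the role is granted on the whole bucket; there is no way to scope it to a subdirectory.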

Create a log delivery configuration

A log delivery configuration defines the path to the GCS bucket location where you want Databricks to deliver your audit logs.

  1. As an account admin, log in to the Databricks account console.

  2. Click Settings.

  3. Click Log delivery.

    Log delivery config
  4. Click Add log delivery.

  5. In Log delivery configuration name, enter a name that is unique within your Databricks account. Spaces are allowed.

  6. In GCS bucket name, specify your GCS bucket name.

  7. In Delivery path prefix, optionally specify a prefix to be used in the path. See Location.

    The prefix can include forward slashes but cannot start with one. Otherwise, it can include any valid GCS object path character except the space character, which is not allowed.

  8. Click Add log delivery.
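If you generate delivery path prefixes programmatically, the rules above can be expressed as a small sanity check. `valid_delivery_prefix` is a hypothetical helper, and the check for characters beyond slashes and spaces is a conservative assumption rather than the full GCS object-name grammar:

```python
def valid_delivery_prefix(prefix: str) -> bool:
    """Check a delivery path prefix against the documented rules:
    optional, may contain slashes, must not start with a slash,
    and must not contain spaces."""
    if prefix == "":
        return True  # the prefix is optional
    if prefix.startswith("/") or " " in prefix:
        return False
    # Conservative assumption: also reject control characters.
    return all(ord(c) >= 32 for c in prefix)
```

For example, `audit/logs` is accepted, while `/audit` and `audit logs` are rejected.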

Disable or enable a log delivery configuration

You cannot edit or delete a log delivery configuration after creation, but you can temporarily or permanently disable a log delivery configuration using the account console. You can have a maximum of two enabled audit log delivery configurations at a time.

To disable or re-enable a log delivery configuration:

  1. As an account admin, log in to the Databricks account console.

  2. Click Settings.

  3. Click Log delivery.

  4. Next to the log delivery configuration you want to disable or enable, click the three-dot icon to the right of its name.

    • To disable it, select Disable log delivery.

    • To enable it, select Enable log delivery.
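If you track configurations in scripts (for example, from an export of the account console), the two-enabled-configurations limit above can be guarded with a small pre-check before enabling another one. The `status` field shape here is an assumption for illustration, not a documented schema:

```python
MAX_ENABLED_CONFIGS = 2  # account-level limit stated above

def can_enable(configs) -> bool:
    """Return True if another log delivery configuration may be enabled.

    configs: iterable of dicts, each assumed to carry a "status" key
    with value "ENABLED" or "DISABLED".
    """
    enabled = sum(1 for c in configs if c.get("status") == "ENABLED")
    return enabled < MAX_ENABLED_CONFIGS
```

With one configuration enabled the check passes; with two enabled it fails until you disable one.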