Manage third-party analytics tools

Databricks is constantly working to improve the customer experience. Like all SaaS companies, Databricks generates analytics about how customers use our products so that we can better support them and improve their overall experience. To enable Databricks to make changes quickly and efficiently, some of this data is collected through third-party analytics tools. These usage analytics tools are listed, along with the other processors Databricks uses to deliver its services, at Databricks Subprocessors.

If you are a Databricks admin for your workspace, you can opt your workspace in or out of this usage data collection:

  1. Go to the admin settings page.

  2. Click the Advanced tab.

  3. Click the Usage Analytics toggle.
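
If you manage workspace settings programmatically, a toggle like this can often be read and set through the Databricks Workspace Conf API. The sketch below is illustrative only: the request shape follows the standard `/api/2.0/workspace-conf` endpoint, but the key name `enableUsageAnalytics` is an assumption made for this example, not a documented setting; the article itself only describes the UI toggle.

```python
import os

import requests

# Assumptions (not from the article): the setting is exposed through the
# Workspace Conf API, and "enableUsageAnalytics" is a made-up key name used
# purely for illustration. Verify the real key, if one exists, before use.
HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
TOKEN = os.environ["DATABRICKS_TOKEN"]  # personal access token of a workspace admin
HEADERS = {"Authorization": f"Bearer {TOKEN}"}
KEY = "enableUsageAnalytics"            # hypothetical setting key

# Read the current value of the setting.
resp = requests.get(
    f"{HOST}/api/2.0/workspace-conf",
    headers=HEADERS,
    params={"keys": KEY},
)
resp.raise_for_status()
print("current value:", resp.json().get(KEY))

# Opt the workspace out by setting the key to "false".
resp = requests.patch(
    f"{HOST}/api/2.0/workspace-conf",
    headers=HEADERS,
    json={KEY: "false"},
)
resp.raise_for_status()
```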

FAQ

How does this help me?

We can’t fix what we don’t measure. Seeing how our customers use our products gives us the insight we need to iterate quickly and improve your product experience.

Is this feature enabled by default?

Yes, it is. If you’d like to disable our use of third-party analytics tools to collect usage data, see the instructions at the beginning of this article.

Do these third-party analytics tools collect my customer data or notebooks?

No. Databricks does not use these tools to collect data about your notebooks or customer data.

Is my email address included in the data collected?

No. Databricks takes steps to limit the information sent to these vendors to protect the privacy of our users.

How does Databricks use this usage information?

We look at high-level usage patterns to help identify ways to improve your customer experience. The more customers who share usage data, the faster we can make improvements to our product.

Are the third-party analytics tool vendors allowed to use the data that is being collected for their own purposes?

No. Databricks has agreements in place with the vendors that provide these third-party analytics tools, and those agreements do not permit the vendors to use the data for any purpose other than providing the services to us. These vendors are considered subprocessors under your agreement with Databricks. See Databricks Subprocessors for more details.