Databricks clouds and regions
Databricks workspaces can be hosted on Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform. On any of these hosting platforms, you can use Databricks to access your data wherever you keep it, regardless of cloud.
This article lists:
- The regions supported by Databricks on Google Cloud.
- IP addresses and domains for Databricks services and assets.
You may need this information if you are configuring your Databricks workspace to limit network egress using a firewall.
Supported regions list
Databricks is available in the following Google Cloud Platform regions:
- asia-northeast1 (Tokyo)
- asia-southeast1 (Singapore)
- australia-southeast1 (Sydney, Australia)
- europe-west1 (Belgium, Europe)
- europe-west2 (England, Europe)
- europe-west3 (Frankfurt, Germany)
- us-central1 (Iowa, US)
- us-east1 (South Carolina, US)
- us-east4 (Virginia, US)
- us-west1 (Oregon, US)
- us-west4 (Nevada, US)
All Databricks on Google Cloud features are supported in each of these regions. However, some GCP regions do not support GPU instance types. For a list of regions and zones that support GPU instance types, see the GCP documentation.
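If you want to script the GPU availability check rather than read the GCP documentation, you can query the Compute Engine API directly. The sketch below is a minimal, illustrative example, assuming the google-cloud-compute Python client library is installed, Application Default Credentials are configured, and the `GOOGLE_CLOUD_PROJECT` environment variable (an assumption, not part of Databricks) holds your project ID.

```python
# Minimal sketch: list Compute Engine zones that offer GPU accelerator types.
# Assumes the google-cloud-compute client library is installed and application
# default credentials are configured; the project ID fallback is a placeholder.
import os

from google.cloud import compute_v1

project_id = os.environ.get("GOOGLE_CLOUD_PROJECT", "my-project")  # hypothetical project

client = compute_v1.AcceleratorTypesClient()

# aggregated_list yields (scope, AcceleratorTypesScopedList) pairs keyed by zone;
# zones with no GPU types come back with an empty accelerator_types list.
for zone, scoped_list in client.aggregated_list(project=project_id):
    if scoped_list.accelerator_types:
        names = sorted({a.name for a in scoped_list.accelerator_types})
        print(f"{zone}: {', '.join(names)}")
```

You can then compare the zones printed by this script against the Databricks-supported regions listed above when choosing where to deploy GPU workloads.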
IP addresses and domains
If you want to configure a firewall to block egress, you must define new VPC egress firewall rules and routes that allow traffic to essential services hosted in the Databricks control plane. Use the following table to get the ports and IP addresses of the control plane ingresses for your workspace’s Google Cloud region.
See Limit network egress for your workspace using a firewall.
| Google Cloud region | Web app (port 443) | SCC relay (port 443) | Default metastore (port 3306) |
|---|---|---|---|
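If you prefer to create the egress allow rules programmatically rather than in the Google Cloud console, the following sketch shows one way to do it with the google-cloud-compute Python client. It is an illustrative example under stated assumptions, not an official Databricks or Google procedure: the project ID, VPC network name, rule name, destination IP, and port are placeholders, and the real IP address and port for your region come from the table above.

```python
# Minimal sketch: create a VPC egress allow rule for one Databricks control
# plane endpoint. The project, network, rule name, destination IP, and port
# are placeholders -- substitute the values from the table above.
from google.cloud import compute_v1


def allow_egress(project_id: str, network: str, name: str,
                 destination_ip: str, port: str) -> None:
    """Create an EGRESS ALLOW firewall rule to a single destination IP and port."""
    allowed = compute_v1.Allowed()
    allowed.I_p_protocol = "tcp"
    allowed.ports = [port]

    rule = compute_v1.Firewall()
    rule.name = name
    rule.network = f"projects/{project_id}/global/networks/{network}"
    rule.direction = "EGRESS"
    rule.allowed = [allowed]
    rule.destination_ranges = [f"{destination_ip}/32"]
    # Lower numeric priority takes precedence; this must outrank any deny-all egress rule.
    rule.priority = 1000

    client = compute_v1.FirewallsClient()
    client.insert(project=project_id, firewall_resource=rule).result()


# Hypothetical usage: allow the web app ingress for your region
# (replace the documentation IP 203.0.113.10 with the value from the table above).
# allow_egress("my-project", "my-vpc", "allow-databricks-webapp", "203.0.113.10", "443")
```

You would typically call this once per endpoint in the table for your region (web app, SCC relay, and default metastore), keeping the allow rules at a higher precedence than your deny-all egress rule.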