Clusters CLI

Note

The CLI feature is unavailable on Databricks on Google Cloud as of this release.

You run Databricks clusters CLI subcommands by appending them to databricks clusters.

databricks clusters -h
Usage: databricks clusters [OPTIONS] COMMAND [ARGS]...

  Utility to interact with Databricks clusters.

Options:
  -v, --version  [VERSION]
  -h, --help     Show this message and exit.

Commands:
  create           Creates a Databricks cluster.
    Options:
      --json-file PATH  File containing JSON request to POST to /api/2.0/clusters/create.
      --json JSON       JSON string to POST to /api/2.0/clusters/create.
  delete           Removes a Databricks cluster.
    Options:
      --cluster-id CLUSTER_ID  Can be found in the URL at https://<databricks-instance>/?o=<16-digit-number>#/setting/clusters/$CLUSTER_ID/configuration.
  edit             Edits a Databricks cluster.
    Options:
      --json-file PATH  File containing JSON request to POST to /api/2.0/clusters/edit.
      --json JSON       JSON string to POST to /api/2.0/clusters/edit.
  events           Gets events for a Spark cluster.
    Options:
      --cluster-id CLUSTER_ID  Can be found in the URL at https://<databricks-instance>/#/setting/clusters/$CLUSTER_ID/configuration.  [required]
      --start-time TEXT        The start time in epoch milliseconds. If
                               unprovided, returns events starting from the
                               beginning of time.
      --end-time TEXT          The end time in epoch milliseconds. If unprovided,
                               returns events up to the current time.
      --order TEXT             The order to list events in; either ASC or DESC.
                               Defaults to DESC (most recent first).
      --event-type TEXT        An event type to filter on (specify multiple event
                               types by passing the --event-type option multiple
                               times). If empty, all event types are returned.
      --offset TEXT            The offset in the result set. Defaults to 0 (no
                               offset). When an offset is specified and the
                               results are requested in descending order, the
                               end_time field is required.
      --limit TEXT             The maximum number of events to include in a page
                               of events. Defaults to 50, and maximum allowed
                               value is 500.
      --output FORMAT          Can be "JSON" or "TABLE". Set to TABLE by default.
  get              Retrieves metadata about a cluster.
    Options:
      --cluster-id CLUSTER_ID  Can be found in the URL at https://<databricks-instance>/?o=<16-digit-number>#/setting/clusters/$CLUSTER_ID/configuration.
  list             Lists active and recently terminated clusters.
    Options:
      --output FORMAT          JSON or TABLE. Set to TABLE by default.
  list-node-types  Lists node types for a cluster.
  list-zones       Lists zones where clusters can be created.
  permanent-delete Permanently deletes a cluster.
    Options:
      --cluster-id CLUSTER_ID  Can be found in the URL at https://<databricks-instance>/?o=<16-digit-number>#/setting/clusters/$CLUSTER_ID/configuration.
  resize           Resizes a Databricks cluster given its ID.
    Options:
      --cluster-id CLUSTER_ID  Can be found in the URL at https://<databricks-instance>/?o=<16-digit-number>#/setting/clusters/$CLUSTER_ID/configuration.
      --num-workers INTEGER    Number of workers. [required]
  restart          Restarts a Databricks cluster.
    Options:
      --cluster-id CLUSTER_ID  Can be found in the URL at https://<databricks-instance>/?o=<16-digit-number>#/setting/clusters/$CLUSTER_ID/configuration.
  spark-versions   Lists possible Databricks Runtime versions.
  start            Starts a terminated Databricks cluster.
    Options:
      --cluster-id CLUSTER_ID  Can be found in the URL at https://<databricks-instance>/?o=<16-digit-number>#/setting/clusters/$CLUSTER_ID/configuration.

Create a cluster

To display usage documentation, run databricks clusters create --help.

databricks clusters create --json-file create-cluster.json

create-cluster.json:

{
  "cluster_name": "memoptimized-cluster-1",
  "spark_version": "7.5.x-scala2.12",
  "spark_conf": {},
  "gcp_attributes": {
    "use_preemptible_executors": false
  },
  "node_type_id": "n1-highmem-4",
  "driver_node_type_id": "n1-highmem-4",
  "ssh_public_keys": [],
  "custom_tags": {},
  "spark_env_vars": {
    "PYSPARK_PYTHON": "/databricks/python3/bin/python3"
  },
  "autotermination_minutes": 120,
  "enable_elastic_disk": false,
  "cluster_source": "API",
  "init_scripts": []
}

Example response:

{
  "cluster_id": "1234-567890-batch123"
}
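If you generate the request file in a script rather than writing it by hand, a minimal Python sketch (the field values simply mirror the example above and are not required defaults):

```python
import json

# Request body mirroring create-cluster.json above; adjust the
# values for your workspace before sending.
payload = {
    "cluster_name": "memoptimized-cluster-1",
    "spark_version": "7.5.x-scala2.12",
    "gcp_attributes": {"use_preemptible_executors": False},
    "node_type_id": "n1-highmem-4",
    "driver_node_type_id": "n1-highmem-4",
    "autotermination_minutes": 120,
}

with open("create-cluster.json", "w") as f:
    json.dump(payload, f, indent=2)
```

You would then pass the file with --json-file as shown above, or inline the serialized string with --json.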

Delete a cluster

To display usage documentation, run databricks clusters delete --help.

databricks clusters delete --cluster-id 1234-567890-batch123

If successful, no output is displayed.

Change a cluster’s configuration

To display usage documentation, run databricks clusters edit --help.

databricks clusters edit --json-file edit-cluster.json

edit-cluster.json:

{
  "cluster_id": "1234-567890-batch123",
  "num_workers": 10,
  "spark_version": "7.3.x-scala2.12",
  "node_type_id": "n1-standard-4"
}

If successful, no output is displayed.

List events for a cluster

To display usage documentation, run databricks clusters events --help.

databricks clusters events \
--cluster-id 1234-567890-batch123 \
--start-time 1617238800000 \
--end-time 1619485200000 \
--order DESC \
--limit 5 \
--event-type RUNNING \
--output JSON \
| jq .
{
  "events": [
    {
      "cluster_id": "1234-567890-batch123",
      "timestamp": 1619214150232,
      "type": "RUNNING",
      "details": {
        "current_num_workers": 2,
        "target_num_workers": 2
      }
    },
    ...
    {
      "cluster_id": "1234-567890-batch123",
      "timestamp": 1617895221986,
      "type": "RUNNING",
      "details": {
        "current_num_workers": 2,
        "target_num_workers": 2
      }
    }
  ],
  "next_page": {
    "cluster_id": "1234-567890-batch123",
    "start_time": 1617238800000,
    "end_time": 1619485200000,
    "order": "DESC",
    "event_types": [
      "RUNNING"
    ],
    "offset": 5,
    "limit": 5
  },
  "total_count": 11
}
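The response includes a next_page object whenever more events remain, so paging through all events means repeating the command with --offset taken from it. A sketch of the offset bookkeeping (the page dict mirrors the abridged response above):

```python
def next_offset(page: dict):
    # The events API returns "next_page" only while more events remain;
    # its "offset" value is what you pass to --offset on the next call.
    nxt = page.get("next_page")
    return nxt["offset"] if nxt else None

# First page of the example response: 5 of 11 events consumed so far.
page = {"events": ["..."], "next_page": {"offset": 5, "limit": 5}, "total_count": 11}
print(next_offset(page))  # 5

# A final page carries no "next_page" key.
print(next_offset({"events": ["..."], "total_count": 11}))  # None
```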

Get information about a cluster

To display usage documentation, run databricks clusters get --help.

databricks clusters get --cluster-id 1234-567890-batch123

Or:

databricks clusters get --cluster-name my-cluster
{
  "cluster_id": "1234-567890-batch123",
  "driver": {
    "node_aws_attributes": {
      "is_spot": false
    },
    "private_ip": "127.0.0.1"
  },
  "cluster_name": "my-cluster",
  "spark_version": "8.1.x-scala2.12",
  "node_type_id": "n1-standard-4",
  "driver_node_type_id": "n1-standard-4",
  "autotermination_minutes": 120,
  "enable_elastic_disk": false,
  "disk_spec": {},
  "cluster_source": "API",
  "enable_local_disk_encryption": false,
  "gcp_attributes": {
    "use_preemptible_executors": false
  },
  "instance_source": {
    "node_type_id": "n1-standard-4"
  },
  "driver_instance_source": {
    "node_type_id": "n1-standard-4"
  },
  "state": "TERMINATED",
  "state_message": "Cluster terminated by INACTIVITY",
  "start_time": 1619478205710,
  "terminated_time": 1619487451280,
  "last_state_loss_time": 1619478205710,
  "autoscale": {
    "min_workers": 2,
    "max_workers": 8
  },
  "creator_user_name": "someone@example.com",
  "termination_reason": {
    "code": "INACTIVITY",
    "parameters": {
      "inactivity_duration_min": "120"
    },
    "type": "SUCCESS"
  },
  "init_scripts_safe_mode": false
}

List information about all available clusters

To display usage documentation, run databricks clusters list --help.

databricks clusters list --output JSON | jq .
{
  "clusters": [
    {
      "cluster_id": "1234-567890-batch123",
      "driver": {
        "node_aws_attributes": {
          "is_spot": false
        },
        "private_ip": "127.0.0.1"
      },
      "cluster_name": "my-cluster",
      "spark_version": "8.1.x-scala2.12",
      "node_type_id": "n1-standard-4",
      "driver_node_type_id": "n1-standard-4",
      "autotermination_minutes": 120,
      "enable_elastic_disk": false,
      "disk_spec": {},
      "cluster_source": "API",
      "enable_local_disk_encryption": false,
      "gcp_attributes": {
        "use_preemptible_executors": false
      },
      "instance_source": {
        "node_type_id": "n1-standard-4"
      },
      "driver_instance_source": {
        "node_type_id": "n1-standard-4"
      },
      "state": "TERMINATED",
      "state_message": "Cluster terminated by INACTIVITY",
      "start_time": 1619478205710,
      "terminated_time": 1619487451280,
      "last_state_loss_time": 1619478205710,
      "autoscale": {
        "min_workers": 2,
        "max_workers": 8
      },
      "creator_user_name": "someone@example.com",
      "termination_reason": {
        "code": "INACTIVITY",
        "parameters": {
          "inactivity_duration_min": "120"
        },
        "type": "SUCCESS"
      },
      "init_scripts_safe_mode": false
    },
    ...
  ]
}
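With --output JSON you can also filter the cluster list in a script instead of with jq. A sketch that collects the IDs of terminated clusters (the clusters list stands in for the parsed "clusters" array, abridged to the fields used here):

```python
# Parsed "clusters" array from `databricks clusters list --output JSON`.
clusters = [
    {"cluster_id": "1234-567890-batch123", "cluster_name": "my-cluster", "state": "TERMINATED"},
    {"cluster_id": "2345-678901-batch456", "cluster_name": "etl-cluster", "state": "RUNNING"},
]

terminated_ids = [c["cluster_id"] for c in clusters if c["state"] == "TERMINATED"]
print(terminated_ids)  # ['1234-567890-batch123']
```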

List available cluster node types

To display usage documentation, run databricks clusters list-node-types --help.

databricks clusters list-node-types
{
  "node_types": [
    {
      "node_type_id": "n2d-highcpu-224",
      "memory_mb": 229376,
      "num_cores": 224.0,
      "description": "n2d-highcpu-224",
      "instance_type_id": "n2d-highcpu-224",
      "category": "Compute Optimized",
      "support_ebs_volumes": true,
      "support_cluster_tags": true,
      "num_gpus": 0,
      "support_port_forwarding": true,
      "display_order": 0,
      "is_io_cache_enabled": false
    },
    ...
  ]
}
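The memory_mb and num_cores fields make it straightforward to pick a node type programmatically. A sketch that chooses the smallest machine meeting minimum requirements (the node_types list is an abridged stand-in for the parsed response):

```python
# Abridged "node_types" array from `clusters list-node-types`.
node_types = [
    {"node_type_id": "n2d-highcpu-224", "memory_mb": 229376, "num_cores": 224.0},
    {"node_type_id": "n1-standard-4", "memory_mb": 15360, "num_cores": 4.0},
]

def pick_node_type(node_types, min_cores, min_memory_mb):
    candidates = [
        nt for nt in node_types
        if nt["num_cores"] >= min_cores and nt["memory_mb"] >= min_memory_mb
    ]
    # Prefer the smallest machine that satisfies the requirements.
    return min(candidates, key=lambda nt: (nt["num_cores"], nt["memory_mb"]))["node_type_id"]

print(pick_node_type(node_types, 4, 8192))  # n1-standard-4
```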

List available zones for creating clusters

Note

This command does not work with Databricks on Google Cloud.

To display usage documentation, run databricks clusters list-zones --help.

databricks clusters list-zones

Permanently delete a cluster

To display usage documentation, run databricks clusters permanent-delete --help.

databricks clusters permanent-delete --cluster-id 1234-567890-batch123

If successful, no output is displayed.

Resize a cluster

To display usage documentation, run databricks clusters resize --help.

databricks clusters resize --cluster-id 1234-567890-batch123 --num-workers 10

If successful, no output is displayed.

Restart a cluster

To display usage documentation, run databricks clusters restart --help.

databricks clusters restart --cluster-id 1234-567890-batch123

If successful, no output is displayed.

List available Spark runtime versions

To display usage documentation, run databricks clusters spark-versions --help.

databricks clusters spark-versions
{
  "versions": [
    {
      "key": "8.2.x-scala2.12",
      "name": "8.2 (includes Apache Spark 3.1.1, Scala 2.12)"
    },
    ...
  ]
}
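Runtime keys encode the version as major.minor before the first hyphen, which makes it easy to pick the newest runtime in a script. A sketch (the sample keys are illustrative; real output may also include variants such as ML runtimes):

```python
def runtime_sort_key(key: str):
    # Keys look like "8.2.x-scala2.12"; compare on the numeric
    # major.minor prefix before the first "-".
    major, minor = key.split("-", 1)[0].split(".")[:2]
    return (int(major), int(minor))

versions = ["7.3.x-scala2.12", "8.1.x-scala2.12", "8.2.x-scala2.12"]
print(max(versions, key=runtime_sort_key))  # 8.2.x-scala2.12
```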

Start a cluster

To display usage documentation, run databricks clusters start --help.

databricks clusters start --cluster-id 1234-567890-batch123

If successful, no output is displayed.