Databricks REST API reference

Databricks has three REST APIs that perform different tasks:

  • 2.0 and 2.1 for general administration

  • 1.2 for running commands directly on Databricks

For the latest version of all REST APIs, see REST API (latest). You can also jump directly to the REST API home pages for each version: 2.1, 2.0, or 1.2.

Important

To access Databricks REST APIs, you must authenticate.

  • Clusters API 2.0

  • Cluster Policies API 2.0

  • Databricks SQL Queries and Dashboards API 2.0

  • Databricks SQL Query History API 2.0

  • Databricks SQL Warehouses API 2.0

  • Delta Live Tables API 2.0

  • Git Credentials API 2.0

  • Global Init Scripts API 2.0

  • Groups API 2.0

  • Instance Pools API 2.0

  • IP Access List API 2.0

  • Jobs API 2.1, 2.0

  • Libraries API 2.0

  • MLflow API 2.0

  • Permissions API 2.0

  • Repos API 2.0

  • SCIM API 2.0

  • Secrets API 2.0

  • Token API 2.0

  • Token Management API 2.0

  • Workspace API 2.0

  • API 1.2

Authentication

For information about authenticating to the REST API, see Authentication using Databricks personal access tokens. For API examples, see API examples.
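Several examples in this article authenticate with a .netrc file. Assuming a Databricks personal access token, a minimal entry looks like the following (the hostname and token values are placeholders; the login name is the literal string token):

```
machine 1234567890123456.7.gcp.databricks.com
login token
password <your-personal-access-token>
```

curl reads this file when invoked with the --netrc flag.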

Rate limits

To ensure high quality of service under heavy load, Databricks enforces rate limits for all REST API calls. Limits are set per endpoint and per workspace to ensure fair usage and high availability.

Requests that exceed the rate limit return a 429 response status code.

For information on rate limits for API requests, see API rate limits.
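One common client-side pattern for handling 429 responses is to retry with exponential backoff, honoring a Retry-After header if the server sends one. The following is an illustrative sketch, not part of the Databricks API; it wraps the requests library and, like the other examples in this article, assumes a .netrc file for authentication:

```python
import time
import requests

def get_with_retry(url, params=None, max_retries=5, base_delay=1.0):
    """Call a REST endpoint, retrying with exponential backoff on HTTP 429.

    This is a client-side convenience pattern, not something the service
    provides. Authentication is assumed to come from a .netrc file.
    """
    for attempt in range(max_retries):
        response = requests.get(url, params=params)
        if response.status_code != 429:
            return response
        # Honor Retry-After if present; otherwise back off exponentially.
        delay = float(response.headers.get("Retry-After", base_delay * 2 ** attempt))
        time.sleep(delay)
    return response
```

For example, `get_with_retry('https://1234567890123456.7.gcp.databricks.com/api/2.0/clusters/list')` retries a throttled call up to five times before giving up.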

Parse output

It can be useful to parse out parts of the JSON output. Databricks recommends the utility jq for parsing JSON. You can install jq on Linux through jq Releases, on macOS using Homebrew with brew install jq, or on Windows using Chocolatey with choco install jq. For more information on jq, see the jq Manual.

This example lists the names and IDs of available clusters in the specified workspace. This example uses a .netrc file.

curl --netrc -X GET https://1234567890123456.7.gcp.databricks.com/api/2.0/clusters/list \
| jq '[ .clusters[] | { id: .cluster_id, name: .cluster_name } ]'
[
  {
    "id": "1234-567890-batch123",
    "name": "My Cluster 1"
  },
  {
    "id": "2345-678901-rigs234",
    "name": "My Cluster 2"
  }
]

Compatibility

For a given API version, Databricks never removes fields from the JSON output of a response. However, the API might add new fields to the JSON output without incrementing the API version. Your programmatic workflows must tolerate these additions and ignore unknown fields.

Some STRING fields (which contain error and descriptive messaging intended to be consumed by the UI) are unstructured, and you should not depend on the format of these fields in programmatic workflows.
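A defensive client therefore reads only the fields it needs and ignores everything else, so fields added later in the same API version cannot break it. A minimal sketch (the sample payload below is illustrative, not a real response):

```python
def summarize_cluster(payload: dict) -> dict:
    """Pick out only the fields this client depends on.

    Unknown keys (for example, fields added by newer service releases)
    are silently ignored rather than treated as errors.
    """
    return {
        "id": payload.get("cluster_id"),
        "name": payload.get("cluster_name"),
    }

# A response may carry fields this client has never seen:
response_json = {
    "cluster_id": "1234-567890-batch123",
    "cluster_name": "My Cluster 1",
    "some_future_field": "ignored",
}
print(summarize_cluster(response_json))
```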

Use curl to invoke the Databricks REST API

curl is a popular tool for transferring data to and from servers. This section provides specific information about using curl to invoke the Databricks REST API.

Invoke a GET using a query string

While most API calls require that you specify a JSON body, for GET calls you can specify a query string by appending it after ? and surrounding the URL in quotes. If you use curl, you can specify --get (or -G) and --data (or -d) along with the query string; you do not need to surround the URL or the query string in quotes.

In the following example, replace 1234567890123456.7.gcp.databricks.com with the workspace URL of your Databricks deployment.

This example prints information about the specified cluster. This example uses a .netrc file.

Using ?:

curl --netrc 'https://1234567890123456.7.gcp.databricks.com/api/2.0/clusters/get?cluster_id=1234-567890-batch123'

Using --get and --data:

curl --netrc --get \
https://1234567890123456.7.gcp.databricks.com/api/2.0/clusters/get \
--data cluster_id=1234-567890-batch123
{
  "cluster_id": "1234-567890-batch123",
  "driver": {
    "node_aws_attributes": {
      "is_spot": false
    },
    "private_ip": "127.0.0.1"
  },
  "cluster_name": "My Cluster",
  ...
}

Use Python to invoke the Databricks REST API

requests is a popular library for making HTTP requests in Python. This example uses the requests library to list information about the specified Databricks cluster. This example uses a .netrc file.

import requests
import json

instance_id = '1234567890123456.7.gcp.databricks.com'

api_version = '/api/2.0'
api_command = '/clusters/get'
url = f"https://{instance_id}{api_version}{api_command}"

params = {
  'cluster_id': '1234-567890-batch123'
}

response = requests.get(
  url = url,
  params = params
)

print(json.dumps(json.loads(response.text), indent = 2))
{
  "cluster_id": "1234-567890-batch123",
  "driver": {
    ...
  },
  "spark_context_id": 1234567890123456789,
  ...
}

Use Postman to invoke the Databricks REST API

  1. In the Postman app, create a new HTTP request (File > New > HTTP Request).

  2. In the HTTP verb drop-down list, select the verb that matches the REST API operation you want to call. For example, to list information about a Databricks cluster, select GET.

  3. For Enter request URL, begin by entering https://<databricks-instance-name>, where <databricks-instance-name> is your Databricks workspace instance name, for example 1234567890123456.7.gcp.databricks.com.

  4. Finish the request URL with the path that matches the REST API operation you want to call. For example, to list information about a cluster, use /api/2.0/clusters/get.

  5. On the Authorization tab, in the Type list, select Bearer Token.

  6. For Token, enter your Databricks personal access token.

    Tip

    Instead of entering your workspace instance name, for example 1234567890123456.7.gcp.databricks.com and your Databricks personal access token for every call, you can define variables and use variables in Postman instead.

  7. If the REST API operation that you want to call requires a request body, do the following:

    1. On the Headers tab, add the Key and Value pair of Content-Type and an acceptable content type for the REST API operation. For example, to list information about a cluster, use the content type of application/json.

    2. On the Body tab, select an acceptable body type for the REST API operation. For example, to list information about a cluster, select the body type of raw and then JSON.

    3. Enter the request body. For example, to list information about the specified cluster, enter the following:

      {
        "cluster_id": "1234-567890-batch123"
      }
      
  8. If the REST API operation that you want to call requires any additional headers, enter them as additional Key and Value pairs on the Headers tab. For example, to list information about a cluster, no additional headers are needed.

  9. If the REST API operation that you want to call requires any query parameters, enter them as Key and Value pairs on the Params tab. For example, to list information about a cluster, instead of using a request body, you can use a query parameter with a key of cluster_id and a value of the specified cluster’s ID, such as 1234-567890-batch123.

  10. Click Send. Any response details will appear on the response section’s Body tab.

Use HTTPie to invoke the Databricks REST API

Use the HTTPie desktop app or HTTPie web app to invoke the Databricks REST API as follows.

  1. Open the HTTPie desktop app, or go to the HTTPie web app.

  2. In the HTTP verb drop-down list, select the verb that matches the REST API operation you want to call. For example, to list information about a Databricks cluster, select GET.

  3. In the httpie.io/hello box, begin by entering https://<databricks-instance-name>, where <databricks-instance-name> is your Databricks workspace instance name, for example 1234567890123456.7.gcp.databricks.com.

  4. Finish the request URL with the path that matches the REST API operation you want to call. For example, to list information about a cluster, use /api/2.0/clusters/get.

  5. On the Auth tab, click Bearer Token.

  6. For token, enter your Databricks personal access token.

    Tip

    Instead of entering your workspace instance name, for example 1234567890123456.7.gcp.databricks.com and your Databricks personal access token for every call, you can define environment variables (such as DATABRICKS_HOST and DATABRICKS_TOKEN) and then use those environment variables (such as {{DATABRICKS_HOST}} and {{DATABRICKS_TOKEN}}) in HTTPie instead. See Environments on the HTTPie blog.

  7. If the REST API operation that you want to call requires a request body, do the following:

    1. On the Headers tab, add the name and value pair of Content-Type and an acceptable content type for the REST API operation. For example, to list information about a cluster, use the content type of application/json.

    2. On the Body tab, select an acceptable body type for the REST API operation. For example, to list information about a cluster, select the body type of Text and then JSON.

    3. Enter the request body. For example, to list information about the specified cluster, enter the following:

      {
        "cluster_id": "1234-567890-batch123"
      }
      
  8. If the REST API operation that you want to call requires any additional headers, enter them as additional name and value pairs on the Headers tab. For example, to list information about a cluster, no additional headers are needed.

  9. If the REST API operation that you want to call requires any query parameters, enter them as name and value pairs on the Params tab. For example, to list information about a cluster, instead of using a request body, you can use a query parameter with a name of cluster_id and a value of the specified cluster’s ID, such as 1234-567890-batch123.

  10. Click Send. Any response details will appear on the Response tab.

Use the HTTPie command-line interface to invoke the Databricks REST API

This example uses the HTTPie command-line interface to list information about the specified Databricks cluster. This example uses a .netrc file.

https GET 1234567890123456.7.gcp.databricks.com/api/2.0/clusters/get cluster_id=1234-567890-batch123

# Or...

https 1234567890123456.7.gcp.databricks.com/api/2.0/clusters/get cluster_id==1234-567890-batch123

# Which is equivalent in curl with jq...
# curl --netrc 'https://1234567890123456.7.gcp.databricks.com/api/2.0/clusters/get' -d '{ "cluster_id": "1234-567890-batch123" }' | jq .
# Or...
# curl --netrc 'https://1234567890123456.7.gcp.databricks.com/api/2.0/clusters/get?cluster_id=1234-567890-batch123' | jq .

Tip

You can convert curl to HTTPie syntax with tools such as the CurliPie package on PyPI or the CurliPie web app.

Use PowerShell to invoke the Databricks REST API

This example uses the Invoke-RestMethod cmdlet in PowerShell to list information about the specified Databricks cluster.

$Token = 'dapia1b2345678901c23456defa7bcde8fa9'
$ConvertedToken = $Token | ConvertTo-SecureString -AsPlainText -Force

$InstanceID = '1234567890123456.7.gcp.databricks.com'
$APIVersion = '/api/2.0'
$APICommand = '/clusters/get'
$Uri = "https://$InstanceID$APIVersion$APICommand"

$Body = @{
  'cluster_id' = '1234-567890-batch123'
}

$Response = Invoke-RestMethod `
  -Authentication Bearer `
  -Token $ConvertedToken `
  -Method Get `
  -Uri $Uri `
  -Body $Body

Write-Output $Response
cluster_id       : 1234-567890-batch123
driver           : ...
spark_context_id : 1234567890123456789
...

Runtime version strings

Many API calls require you to specify a Databricks runtime version string. This section describes the structure of a version string in the Databricks REST API.

<M>.<F>.x[-cpu][-esr][-gpu][-ml][-photon]-scala<scala-version>

where

  • M: Databricks Runtime major release

  • F: Databricks Runtime feature release

  • cpu: CPU version (with -ml only)

  • esr: Extended Support

  • gpu: GPU-enabled

  • ml: Machine learning

  • photon: Photon

  • scala-version: version of Scala used to compile Spark: 2.10, 2.11, or 2.12

For example:

  • 7.6.x-gpu-ml-scala2.12 represents Databricks Runtime 7.6 for Machine Learning, which is GPU-enabled and uses Scala 2.12 to compile Spark 3.0.1
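The format above can be validated and taken apart with a short regular expression. This parser is an illustrative sketch, not something the Databricks API provides:

```python
import re

# Pattern mirrors <M>.<F>.x[-cpu][-esr][-gpu][-ml][-photon]-scala<scala-version>.
_RUNTIME = re.compile(
    r"^(?P<major>\d+)\.(?P<feature>\d+)\.x"
    r"(?P<modifiers>(?:-(?:cpu|esr|gpu|ml|photon))*)"
    r"-scala(?P<scala>\d+\.\d+)$"
)

def parse_runtime_version(version: str) -> dict:
    """Split a Databricks runtime version string into its components."""
    m = _RUNTIME.match(version)
    if m is None:
        raise ValueError(f"not a recognized runtime version string: {version!r}")
    modifiers = [mod for mod in m.group("modifiers").split("-") if mod]
    return {
        "major": int(m.group("major")),
        "feature": int(m.group("feature")),
        "modifiers": modifiers,
        "scala": m.group("scala"),
    }

print(parse_runtime_version("7.6.x-gpu-ml-scala2.12"))
```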

The Supported Databricks runtime releases and support schedule and Unsupported releases tables map Databricks Runtime versions to the Spark version contained in the runtime.

You can get a list of available Databricks runtime version strings by calling the Runtime versions API.