Developer tools and guidance

Learn about tools and guidance you can use to work with Databricks resources and data and to develop Databricks applications.


Use this section when you want to do any of the following.


Authentication

Authenticate with Databricks from your tools, scripts, and apps. You must authenticate with Databricks before you can work with Databricks resources and data.
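For example, Databricks personal access token (PAT) authentication sends the token as an HTTP bearer token. A minimal sketch, assuming the token is stored in an environment variable (the `auth_header` helper and the fallback value are illustrative, not part of any Databricks library):

```python
import os

# Hypothetical helper: build the Authorization header that Databricks
# personal access token (PAT) authentication expects.
def auth_header(token: str) -> dict:
    # Databricks REST endpoints accept the token as an HTTP bearer token.
    return {"Authorization": f"Bearer {token}"}

# As a best practice, read the token from an environment variable
# rather than hard-coding it in scripts.
token = os.environ.get("DATABRICKS_TOKEN", "<your-token>")
headers = auth_header(token)
```

Tools such as the Databricks CLI and SDKs read the same kind of environment variables so that credentials stay out of source code.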


IDEs

Connect to Databricks by using popular integrated development environments (IDEs) such as Visual Studio Code, PyCharm, IntelliJ IDEA, Eclipse, and RStudio, as well as automate Databricks by using IDE plugins.


SDKs

Automate Databricks from code libraries written for popular languages such as Python, Java, and Go.

SQL connectors/drivers

Run SQL commands on Databricks from code written in popular languages such as Python, Go, JavaScript, and TypeScript. Connect tools and clients to Databricks through ODBC and JDBC connections.
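A JDBC connection, for instance, is configured through a connection URL that encodes the workspace hostname, the HTTP path of the cluster or SQL warehouse, and driver properties. A minimal sketch of assembling such a URL; the hostname, HTTP path, and property values below are placeholders, and the authoritative property names come from the Databricks JDBC driver documentation:

```python
# Illustrative only: assemble a Databricks JDBC connection URL from its
# parts. Consult the Databricks JDBC driver documentation for the
# authoritative list of connection properties.
def jdbc_url(host: str, http_path: str) -> str:
    props = {
        "transportMode": "http",
        "ssl": "1",
        "AuthMech": "3",       # user/password auth; "token" is the user for PATs
        "httpPath": http_path, # path to the cluster or SQL warehouse
    }
    prop_str = ";".join(f"{k}={v}" for k, v in props.items())
    return f"jdbc:databricks://{host}:443/default;{prop_str}"

url = jdbc_url("example.cloud.databricks.com", "/sql/1.0/warehouses/abc123")
```

A JDBC client such as DataGrip or DBeaver takes this URL plus credentials to open a connection.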


CLIs

Automate Databricks by using command-line interfaces (CLIs).


Databricks Utilities

Use Databricks Utilities from within notebooks to do things such as work with object storage efficiently, chain and parameterize notebooks, and work with sensitive credential information.

REST API Explorer (Beta)

Look up reference information for the Databricks REST APIs.

Because the REST API Explorer is in beta, it is missing some request/response examples and explanations, so you might occasionally need to refer to the legacy Databricks REST API reference. If you find an issue, tell Databricks by using the REST API Explorer's Send feedback link.

REST API (latest)

Call Databricks automation APIs directly by using popular clients such as curl, Postman, and HTTPie, as well as popular libraries such as requests for Python.
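As a sketch of what a direct REST call looks like, the following builds a request to the clusters list endpoint using only the Python standard library. The workspace hostname and token are placeholders, and the request is only constructed here, not actually sent:

```python
import urllib.request

# Sketch of calling a Databricks REST API endpoint directly over HTTP.
# Replace the hostname and token with values for your workspace.
host = "https://example.cloud.databricks.com"
req = urllib.request.Request(
    url=f"{host}/api/2.0/clusters/list",  # list clusters in the workspace
    headers={"Authorization": "Bearer <your-token>"},
    method="GET",
)

# To actually send the request (requires a real workspace and token):
# import json
# with urllib.request.urlopen(req) as resp:
#     clusters = json.load(resp)
```

Clients such as curl, Postman, and HTTPie, or the `requests` library for Python, send the same endpoint, method, and headers.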

Python API

Call Databricks automation APIs directly with Python by using the Databricks CLI package as a Python library.


Infrastructure as code (IaC)

Automate the provisioning and maintenance of Databricks infrastructure and resources by using popular infrastructure-as-code (IaC) products such as Terraform, the Cloud Development Kit for Terraform, and Pulumi.
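For instance, with the Terraform Databricks provider you declare resources in configuration files and let Terraform reconcile the workspace to match. A minimal sketch; the cluster name, Spark version, and node type below are illustrative values to adapt to your workspace:

```hcl
# Minimal sketch of managing a Databricks cluster as code with the
# Terraform Databricks provider; names and values are illustrative.
terraform {
  required_providers {
    databricks = {
      source = "databricks/databricks"
    }
  }
}

resource "databricks_cluster" "example" {
  cluster_name            = "iac-managed-cluster"
  spark_version           = "13.3.x-scala2.12"
  node_type_id            = "i3.xlarge"
  num_workers             = 2
  autotermination_minutes = 30
}
```

Running `terraform plan` and `terraform apply` against this configuration creates or updates the cluster to match the declared state.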


CI/CD

Implement industry-standard continuous integration and continuous delivery (CI/CD) practices for Databricks by using popular systems such as GitHub Actions, Azure Pipelines, GitLab CI/CD, Jenkins, and Apache Airflow.
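As one illustration, a GitHub Actions workflow can install the Databricks CLI and run a validation step on every push. This is a hedged sketch: the `databricks/setup-cli` action reference, the secret names, and the `databricks bundle validate` step are assumptions to adapt to your repository and deployment approach:

```yaml
# Sketch of a CI job that installs the Databricks CLI and validates the
# project; adapt action versions, secrets, and commands to your setup.
name: databricks-ci
on: [push]
jobs:
  validate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: databricks/setup-cli@main
      - run: databricks bundle validate
        env:
          DATABRICKS_HOST: ${{ secrets.DATABRICKS_HOST }}
          DATABRICKS_TOKEN: ${{ secrets.DATABRICKS_TOKEN }}
```

Storing the workspace URL and token as repository secrets keeps credentials out of the workflow file itself.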

SQL tools

Run SQL commands and scripts in Databricks by using Databricks CLIs, as well as popular tools such as DataGrip, DBeaver, and SQL Workbench/J.

Service principals

Use identities called service principals as a security best practice to authenticate automated scripts, tools, apps, and systems with Databricks.
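Service principals commonly authenticate with OAuth machine-to-machine (M2M) credentials, exchanging a client ID and secret for a short-lived access token via the standard OAuth 2.0 client-credentials flow. A sketch of preparing such a token request; the hostname, client ID, and secret are placeholders, and the request is only assembled here, not sent:

```python
import base64
import urllib.parse

# Sketch of preparing an OAuth 2.0 client-credentials token request for
# a service principal. IDs, secrets, and the hostname are placeholders.
client_id = "<service-principal-client-id>"
client_secret = "<oauth-secret>"

token_url = "https://example.cloud.databricks.com/oidc/v1/token"
body = urllib.parse.urlencode({
    "grant_type": "client_credentials",
    "scope": "all-apis",
})
# Client ID and secret are sent as HTTP basic auth on the token request.
basic = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
headers = {
    "Authorization": f"Basic {basic}",
    "Content-Type": "application/x-www-form-urlencoded",
}
```

The access token in the response is then used as a bearer token, in place of a personal access token tied to a human user.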


You can also connect many additional popular third-party tools to clusters and SQL warehouses to access data in Databricks. See Technology partners.