How can I use PyCharm with Databricks?
PyCharm by JetBrains is a dedicated Python integrated development environment (IDE) providing a wide range of essential tools for Python developers, tightly integrated to create a convenient environment for productive Python, web, and data science development. You can use PyCharm on your local development machine to write, run, and debug Python code in remote Databricks workspaces.
The following Databricks tools enable functionality for working with Databricks from PyCharm:
Name | Description
---|---
Databricks plugin for PyCharm | Configure a connection to a remote Databricks workspace and run files on Databricks clusters from PyCharm. This plugin is developed and provided by JetBrains in partnership with Databricks.
Databricks Connect | Write, run, and debug local Python code on a remote Databricks workspace from PyCharm.
Databricks Asset Bundles | Programmatically define, deploy, and run Databricks jobs, Delta Live Tables pipelines, and MLOps Stacks using CI/CD best practices and workflows from PyCharm.
Databricks CLI | Work with Databricks from the command line using the built-in Terminal in PyCharm.
Databricks SDK for Python | Write, run, and debug Python code that works with Databricks in PyCharm.
Databricks SQL Connector for Python | Write, run, and debug Python code that works with Databricks SQL warehouses in remote Databricks workspaces.
Databricks Terraform provider | Provision Databricks infrastructure with Terraform and follow infrastructure-as-code (IaC) best practices using the Terraform and HCL plugin for PyCharm. Write and deploy Python definitions of Databricks infrastructure in PyCharm through third-party offerings such as the Cloud Development Kit for Terraform (CDKTF) and Pulumi.
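For the Asset Bundles row above, a bundle is described by a `databricks.yml` file at the project root. The following is a minimal sketch only; the bundle name, job name, workspace host, and file paths are placeholders, and a real bundle would also specify compute for the task.

```yaml
# Minimal databricks.yml sketch for a bundle that deploys one job.
# All names and the host URL below are placeholders.
bundle:
  name: my_project

targets:
  dev:
    mode: development
    workspace:
      host: https://my-workspace.cloud.databricks.com

resources:
  jobs:
    my_job:
      name: my_job
      tasks:
        - task_key: main
          spark_python_task:
            python_file: src/main.py
```

You can then run `databricks bundle validate` and `databricks bundle deploy` from PyCharm's built-in Terminal.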
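As an illustration of the Databricks SDK for Python row, the sketch below lists the clusters in a workspace. It assumes the `databricks-sdk` package is installed and that credentials are available through environment variables or `~/.databrickscfg`; the `describe_cluster` helper is just a hypothetical formatting function, not part of the SDK.

```python
def describe_cluster(name: str, state: str) -> str:
    """Format one line of cluster status output (helper, not part of the SDK)."""
    return f"{name}: {state}"

try:
    # pip install databricks-sdk
    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient()  # picks up auth from env vars or ~/.databrickscfg
    for c in w.clusters.list():
        print(describe_cluster(c.cluster_name, str(c.state)))
except Exception:
    pass  # SDK not installed or no workspace credentials in this environment
```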
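Similarly, for the Databricks SQL Connector for Python row, a minimal query against a SQL warehouse might look like the following sketch. It assumes the `databricks-sql-connector` package is installed and that the `DATABRICKS_SERVER_HOSTNAME`, `DATABRICKS_WAREHOUSE_ID`, and `DATABRICKS_TOKEN` environment variables are set; `warehouse_http_path` is a small helper that builds the standard warehouse HTTP path from a warehouse ID.

```python
import os

def warehouse_http_path(warehouse_id: str) -> str:
    """Build the HTTP path for a SQL warehouse from its ID."""
    return f"/sql/1.0/warehouses/{warehouse_id}"

try:
    # pip install databricks-sql-connector
    from databricks import sql

    with sql.connect(
        server_hostname=os.environ["DATABRICKS_SERVER_HOSTNAME"],
        http_path=warehouse_http_path(os.environ["DATABRICKS_WAREHOUSE_ID"]),
        access_token=os.environ["DATABRICKS_TOKEN"],
    ) as conn:
        with conn.cursor() as cur:
            cur.execute("SELECT 1")
            print(cur.fetchall())
except (ImportError, KeyError):
    pass  # connector not installed or credentials not set in this environment
```

Because PyCharm runs this as ordinary local Python, you can set breakpoints and step through it with the PyCharm debugger.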