Develop Delta Live Tables pipelines

The articles in this section describe steps and recommendations for developing and testing Delta Live Tables pipelines in a Databricks notebook, in the Databricks file editor, or locally in an integrated development environment (IDE).

You can write pipeline code in either Python or SQL. You can configure a pipeline with multiple source code assets (notebooks and files) written in Python and SQL, but each individual notebook or file can use only one language.
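In Python, pipeline code declares datasets with the `dlt` module. The sketch below is illustrative only: the table names and source path are hypothetical, and the code runs only as pipeline source code inside a Delta Live Tables pipeline (where the `spark` session is provided), not as a standalone script.

```python
# Hypothetical Delta Live Tables pipeline source; runs only inside a pipeline.
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Raw events loaded from cloud storage (hypothetical path).")
def raw_events():
    # `spark` is provided by the pipeline runtime.
    return spark.read.format("json").load("/data/events/")

@dlt.table(comment="Events that pass a basic validity check.")
@dlt.expect_or_drop("valid_id", "id IS NOT NULL")
def clean_events():
    return dlt.read("raw_events").select(col("id"), col("email"))
```

An equivalent pipeline can be written in SQL using `CREATE OR REFRESH LIVE TABLE` statements; each notebook or file sticks to one language.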

Create a pipeline in the Databricks UI

For UI steps to configure a Delta Live Tables pipeline from code in a notebook, see Configure a Delta Live Tables pipeline.

Notebook experience for Delta Live Tables code development (Public Preview)

When you work in a Python or SQL notebook that is the source code for an existing Delta Live Tables pipeline, you can connect the notebook to the pipeline and access a set of notebook features that assist in developing and debugging Delta Live Tables code. See Notebook experience for Delta Live Tables code development.

Tips, recommendations, and features for developing and testing pipelines

For pipeline development and testing tips, recommendations, and features, see Tips, recommendations, and features for developing and testing Delta Live Tables pipelines.

CI/CD for pipelines

Databricks Asset Bundles enable you to programmatically validate, deploy, and run Databricks resources such as Delta Live Tables pipelines. For steps that you can complete from your local development machine to use a bundle that programmatically manages a Delta Live Tables pipeline, see Develop Delta Live Tables pipelines with Databricks Asset Bundles.
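A bundle declares the pipeline as a resource in a configuration file. The following `databricks.yml` fragment is a hedged sketch: the bundle name, resource key, source path, and workspace host are all placeholders.

```yaml
# Hypothetical databricks.yml declaring a Delta Live Tables pipeline resource.
bundle:
  name: dlt-example

resources:
  pipelines:
    example_pipeline:
      name: example-pipeline
      libraries:
        - notebook:
            path: ./src/dlt_pipeline.ipynb

targets:
  dev:
    workspace:
      host: https://example.cloud.databricks.com
```

With a configuration like this in place, the `databricks bundle validate`, `databricks bundle deploy`, and `databricks bundle run` CLI commands manage the pipeline from your local machine.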

Develop pipeline code in your local development environment

In addition to using notebooks or the file editor in your Databricks workspace, you can develop pipeline code that uses the Delta Live Tables Python interface in your local development environment, for example in your favorite IDE, such as Visual Studio Code or PyCharm. After writing pipeline code locally, you can manually move it into your Databricks workspace or use Databricks tools to operationalize the pipeline, including deploying and running it.
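One pattern that works well with local development (a general practice, not a Databricks-specific API; the function names below are hypothetical) is to keep row-level business logic in plain Python functions, separate from the `dlt`-decorated table definitions, so you can unit test it in your IDE without a cluster:

```python
# Hypothetical helpers kept free of dlt/Spark dependencies so they can be
# unit-tested locally before the pipeline is deployed.

def normalize_email(raw: str) -> str:
    """Lowercase and trim an email address (hypothetical cleaning rule)."""
    return raw.strip().lower()

def is_valid_event(event: dict) -> bool:
    """Basic validity check mirrored by a pipeline expectation (hypothetical)."""
    return event.get("id") is not None and "@" in event.get("email", "")

# These can be exercised with a plain test runner on your machine:
assert normalize_email("  Ada@Example.COM ") == "ada@example.com"
assert is_valid_event({"id": 1, "email": "ada@example.com"})
```

The pipeline source file then imports these helpers and applies them inside `dlt` table functions, keeping the cluster-dependent surface small.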

See Develop Delta Live Tables pipeline code in your local development environment.