Repos for Git integration


This feature is in Public Preview.

To support best practices for data science and engineering code development, Databricks Repos provides repository-level integration with Git providers. You can develop code in a Databricks notebook and sync it with a remote Git repository. Databricks Repos lets you use Git functionality such as cloning a remote repo, managing branches, pushing and pulling changes, and visually comparing differences upon commit.

Databricks Repos also provides an API that you can integrate with your CI/CD pipeline. For example, you can programmatically update a Databricks Repo so that it always has the most recent code version.

Databricks Repos provides security features such as allow lists to control access to Git repositories and detection of clear text secrets in source code.

For more information about best practices for code development using Databricks Repos, see Best practices for integrating repos with CI/CD workflows.


Databricks supports these Git providers:

  • GitHub
  • Bitbucket
  • GitLab
  • Azure DevOps

The Git server must be accessible from Databricks. Databricks does not support private Git servers, such as Git servers behind a VPN.

Configure your Git integration with Databricks

  1. Click your profile icon in your Databricks workspace and select User Settings from the menu.

  2. On the User Settings page, go to the Git Integration tab.

  3. Follow the instructions for integration with GitHub, Bitbucket Cloud, GitLab, or Azure DevOps.

    For Azure DevOps, Git integration does not support Azure Active Directory tokens. You must use an Azure DevOps personal access token.

  4. If your organization has SAML SSO enabled in GitHub, ensure that you have authorized your personal access token for SSO.

Work with repos

After you have created a repo, you can develop notebooks in the repo and sync with your remote Git repository.

Work with notebooks and folders in a Databricks repo

To create a new notebook or folder in a repo, click the down arrow next to the repo name, and select Create > Notebook or Create > Folder from the menu.

Repo create menu

To move a notebook or folder in your workspace into a repo, navigate to the notebook or folder and select Move from the drop-down menu:

Move object

In the dialog, select the repo to which you want to move the object:

Move repo

Import a SQL or Python file as a Databricks notebook

You can import a SQL or Python file as a single-cell Databricks notebook.

  • Add the comment line -- Databricks notebook source at the top of a SQL file.
  • Add the comment line # Databricks notebook source at the top of a Python file.
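Adding the marker line can be scripted when converting many files at once. The sketch below is illustrative; the helper name is hypothetical and only the marker strings come from the documentation above.

```python
# Hypothetical helper: prepend the Databricks notebook source marker so a
# plain .sql or .py file imports as a single-cell Databricks notebook.

MARKERS = {
    ".py": "# Databricks notebook source",
    ".sql": "-- Databricks notebook source",
}

def add_notebook_marker(source: str, extension: str) -> str:
    """Return the file content with the marker comment as its first line."""
    marker = MARKERS[extension]
    if source.startswith(marker):
        return source  # already marked; avoid duplicating the header
    return f"{marker}\n{source}"
```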

Sync a repo with Git

To sync with Git, use the Git dialog. The Git dialog lets you pull changes from your remote Git repository and push and commit changes. You can also change the branch you are working on or create a new branch.


Git operations that pull in upstream changes clear the notebook state. For more information, see Incoming changes clear the notebook state.

Open the Git dialog

You can access the Git dialog from a notebook or from the repos browser.

  • From a notebook, click the button at the top left of the notebook that identifies the current Git branch.

    Git dialog button on notebook
  • From the repos browser, you can click the button next to the repo name:

    Git dialog button in repo browser

    You can also click the down arrow next to the repo name, and select Git… from the menu.

    Repos menu 2

Pull changes from the remote Git repository

To pull changes from the remote Git repository, click Pull in the Git dialog. Notebooks are updated automatically to the latest version in your remote repository.

A message appears if there are merge conflicts. Databricks recommends that you resolve the merge conflict using your Git provider interface.

Commit and push changes to the remote Git repository

When you have added new notebooks or made changes to existing notebooks, the Git dialog indicates the files that have changed.

git dialog

Add a required Summary of the changes, and click Commit & Push to push these changes to the remote Git repository.

If you don’t have permission to commit to the default branch, such as main, create a new branch and use your Git provider interface to create a pull request (PR) to merge it into the default branch.


If there are merge conflicts, Databricks recommends that you create a new branch, commit and push your changes to that branch, work in your own branch, and resolve the merge conflict using your Git provider interface.

Create a new branch

You can create a new branch based on an existing branch from the Git dialog:

Git dialog new branch

Manage permissions

When you create a repo, you have Can Manage permission. This lets you perform Git operations or modify the remote repository. You can clone public remote repositories without Git credentials (personal access token and username). To modify a public remote repository, or to clone or modify a private remote repository, you must have a Git provider username and personal access token with read and write permissions for the remote repository.

Control access to repos with allow lists

An admin can limit which remote repos users can commit and push to.

  1. Go to the Admin Console.
  2. Click the Workspace Settings tab.
  3. In the Advanced section, click the Enable Repos Git URL Allow List toggle.
  4. Click Confirm.
  5. In the field next to Repos Git URL Allow List: Empty list, enter a comma-separated list of URL prefixes.
  6. Click Save.

Users can only commit and push to Git repositories that start with one of the URL prefixes you specify. The default setting is “Empty list”, which disables access to all repositories. To allow access to all repositories, disable Enable Repos Git URL Allow List.


  • The list you save overwrites the existing set of saved URL prefixes.
  • It may take about 15 minutes for changes to take effect.
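The matching rule is plain prefix comparison, as described above: a repository URL passes only if it starts with one of the saved prefixes, and an empty list blocks everything. A minimal sketch of that semantics:

```python
# Illustrative sketch of the allow-list check described above: a commit or
# push to repo_url is permitted only if the URL starts with one of the
# saved prefixes. An empty allow list disables access to all repositories.

def url_allowed(repo_url: str, allow_list: list[str]) -> bool:
    """Return True if repo_url matches at least one allowed URL prefix."""
    return any(repo_url.startswith(prefix) for prefix in allow_list)
```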

Secrets detection

Repos scans code for access key IDs that begin with the prefix AKIA and warns the user before committing.
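The kind of secret this scan targets can be illustrated with a simple pattern match. This is a sketch, not the actual scanner: it assumes the standard AWS access key ID shape (20 uppercase alphanumeric characters beginning with AKIA).

```python
import re

# Illustrative check for the secret type described above: AWS access key
# IDs are 20 uppercase alphanumeric characters beginning with "AKIA".
AWS_KEY_ID = re.compile(r"\bAKIA[0-9A-Z]{16}\b")

def find_access_key_ids(source: str) -> list[str]:
    """Return any candidate AWS access key IDs found in the source text."""
    return AWS_KEY_ID.findall(source)
```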

Repos API

The Repos API update endpoint allows you to update a repo to the latest version of a specific Git branch or to a tag. This enables you to update the repo before you run a job against a notebook in the repo. For details, see Repos API.
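Per the Repos API reference, the update endpoint is a PATCH to /api/2.0/repos/{repo_id} with a branch or tag in the JSON body. The sketch below only builds the request pieces rather than sending them; the workspace URL and repo ID are placeholders.

```python
import json

# Sketch of the Repos API update call described above. It constructs the
# HTTP request components rather than sending them. Endpoint shape per the
# Repos API reference:
#   PATCH /api/2.0/repos/{repo_id}  with body {"branch": ...} or {"tag": ...}

def build_update_request(workspace_url: str, repo_id: int, branch: str):
    """Return (method, url, json_body) for updating a repo to a branch head."""
    url = f"{workspace_url}/api/2.0/repos/{repo_id}"
    body = json.dumps({"branch": branch})
    return "PATCH", url, body
```

In a CI job you would send this request with any HTTP client, authenticating with a Databricks personal access token.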

Best practices for integrating repos with CI/CD workflows

This section includes best practices for integrating Databricks repos with your CI/CD workflow. The following figure shows an overview of the steps.

Best practices overview

Admin workflow

Repos has user-level folders and non-user top-level folders. User-level folders are automatically created when users first clone a remote repository. You can think of repos in user folders as “local checkouts” that are individual for each user and where users make changes to their code.

Set up top-level repo folders

Admins can create non-user top-level folders. The most common use case for these top-level folders is to create Dev, Staging, and Production folders that contain repos for the appropriate versions or branches for development, staging, and production. For example, if your company uses the Main branch for production, the Production folder would contain repos configured to be at the Main branch.

Typically permissions on these top-level folders are read-only for all non-admin users within the workspace.

Top-level repo folders

Set up Git automation to update repos on merge

To ensure that repos are always at the latest version, you can set up Git automation to call the Repos API. In your Git provider, set up automation that, after every successful merge of a PR into the Main branch, calls the Repos API endpoint on the appropriate repo in the Production folder to bring that repo to the latest version.

For example, on GitHub this can be achieved with GitHub Actions. For more information, see the Repos API.

User workflow

To start a workflow, clone your remote repository into a user folder. A best practice is to create a new feature branch, or select a previously created branch, for your work, instead of directly committing and pushing changes to the main branch. You can make changes, commit, and push changes in that branch. When you are ready to merge your code, create a pull request and follow the review and merge processes in Git.

Production job workflow

You can point jobs directly to notebooks in repos. When a job kicks off a run, it uses the current version of the code in the repo.

If the automation is set up as described in Admin workflow, every successful merge calls the Repos API to update the repo. As a result, jobs that are configured to run code from a repo always use the latest version available when the job run was created.

Limitations and FAQ

Incoming changes clear the notebook state

Git operations that alter the notebook source code result in the loss of the notebook state, including cell results, comments, revision history, and widgets. For example, Git pull can change the source code of a notebook. In this case, Databricks repos must overwrite the existing notebook to import the changes. Git commit and push or creating a new branch do not affect the notebook source code, so the notebook state is preserved in these operations.

Repos can contain only Databricks notebooks and folders

  • Libraries and MLflow experiments are not supported. You can use notebook experiments in repos.
  • Non-notebook files such as .txt, .csv, .md, or .yaml files are not supported.
  • The remote Git repository may contain other files, but they will not appear in Databricks.

What happens if a job starts running on a notebook while a Git operation is in progress?

At any point while a Git operation is in progress, some notebooks in the Repo may have been updated while others have not. This can cause unpredictable behavior.

For example, suppose notebook A calls notebook Z using a %run command. If a job running during a Git operation starts the most recent version of notebook A, but notebook Z has not yet been updated, the %run command in notebook A might start the older version of notebook Z. During the Git operation, the notebook states are not predictable and the job might fail or run notebook A and notebook Z from different commits.

Why do I see .py files in my repo but can’t sync my own .py files?

  • Databricks exports the notebook source for notebooks as .py files for easier readability and diffing in your Git provider. However, those files contain additional metadata that identifies them as Databricks notebook source files. Arbitrary .py files are not available or referenceable.
  • In Databricks Runtime 7.1 and above and Databricks Runtime 7.1 ML and above, %pip install support allows you to access private repositories to load Python libraries into notebooks.
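Based on the marker metadata described above, exported notebook sources can be told apart from arbitrary .py files by their first line. A minimal sketch:

```python
# Sketch of how Databricks-notebook .py exports can be distinguished from
# arbitrary .py files: exported notebook sources begin with the marker
# comment "# Databricks notebook source" (the metadata mentioned above).

def is_notebook_source(first_line: str) -> bool:
    """True if a .py file's first line marks it as Databricks notebook source."""
    return first_line.strip() == "# Databricks notebook source"
```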

How can I run non-Databricks notebook files in a repo? For example, a .py file?

You can use any of the following:

  • Bundle and deploy as a library on the cluster.
  • Pip install the Git repository directly. This requires a credential in secrets manager.
  • Use %run with inline code in a notebook.

Can I create top-level folders that are not user folders?

Yes, admins can create top-level folders one level deep. Repos does not support additional folder levels.

How and where are the GitHub tokens stored in Databricks? Who has access from Databricks?

  • The authentication tokens are stored in the Databricks control plane, and a Databricks employee can only gain access through a temporary credential that is audited.
  • Databricks logs the creation and deletion of these tokens, but not their usage. Databricks logs Git operations, which can be used to audit token usage by the Databricks application.
  • GitHub Enterprise audits token usage. Other Git services may also offer Git server auditing.

Does Repos support SSH?

No, only HTTPS.

Can I pull the latest version of a repository from Git before running a job without relying on an external orchestration tool?

No. Typically you would set up a webhook or server-side hook in your Git provider so that every push to a branch (such as main or prod) calls the Repos API to update the Production repo.

Are there limits on the size of a repo or the number of files?

Databricks does not enforce a limit on the size of a repo. Working branches are limited to 200 MB. Individual files are limited to 10 MB.

Databricks recommends that the total number of notebooks and files in a repo not exceed 1000.

You may receive an error message if these limits are exceeded. You may also receive a timeout error on the initial clone of the repo, but the operation might complete in the background.
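A local sanity check can catch these limits before a push. The script below is a hypothetical helper, not a Databricks tool; the thresholds come from the limits stated above.

```python
import os

# Hypothetical pre-push check against the limits described above: warn when
# the working tree exceeds 200 MB, any single file exceeds 10 MB, or the
# file count passes the recommended maximum of 1,000.

MAX_REPO_BYTES = 200 * 1024 * 1024
MAX_FILE_BYTES = 10 * 1024 * 1024
RECOMMENDED_MAX_FILES = 1000

def check_repo_limits(root: str) -> list[str]:
    """Return a list of warnings for limits the tree at `root` exceeds."""
    warnings = []
    total_bytes, file_count = 0, 0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            size = os.path.getsize(os.path.join(dirpath, name))
            total_bytes += size
            file_count += 1
            if size > MAX_FILE_BYTES:
                warnings.append(f"{name} exceeds the 10 MB file limit")
    if total_bytes > MAX_REPO_BYTES:
        warnings.append("working tree exceeds the 200 MB branch limit")
    if file_count > RECOMMENDED_MAX_FILES:
        warnings.append("more than 1,000 files; sync may be slow or fail")
    return warnings
```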

Does Repos support branch merging?

No. Databricks recommends that you create a pull request and merge through your Git provider.

Are the contents of Databricks repos encrypted?

The contents of repos are encrypted by Databricks using a default key.

Can I delete a branch from a Databricks repo?

No. To delete a branch, you must work in your Git provider.

Where is Databricks repo content stored?

The contents of a repo are temporarily cloned onto disk in the control plane. Databricks notebook files are stored in the control plane database just like notebooks in the main workspace. Non-notebook files may be stored on disk for up to 30 days.

How can I disable Repos in my workspace?

Follow these steps to disable Repos for Git in your workspace.

  1. Go to the Admin Console.
  2. Click the Workspace Settings tab.
  3. In the Advanced section, click the Repos toggle.
  4. Click Confirm.
  5. Refresh your browser.


Error message: Invalid credentials

Try the following:

  • Confirm that the settings in the Git integration tab (User Settings > Git Integration) are correct.

    • You must enter both your Git provider username and token. Legacy Git integrations did not require a username, so you may need to add a username to work with repos.
  • Confirm that you have selected the correct Git provider in the Add Repo dialog.

  • Ensure your personal access token or app password has the correct repo access.

  • If SSO is enabled on your Git provider, authorize your tokens for SSO.

  • Test your token with command line Git. Both of these options should work:

    git clone https://<username>:<personal-access-token>@<git-provider-url>/<org>/<repo-name>.git
    git clone -c http.sslVerify=false -c http.extraHeader='Authorization: Bearer <personal-access-token>' https://<git-provider-url>/<org>/<repo-name>.git

Error message: Secure connection could not be established because of SSL problems

<link>: Secure connection to <link> could not be established because of SSL problems

This error occurs if your Git server is not accessible from Databricks. Private Git servers are not supported.