Configure task parameters

Task parameters allow you to parameterize tasks using values that can be static, dynamic, or set by upstream tasks.

For information on using dynamic values, see What is a dynamic value reference?.

For information on passing context between tasks, see Use task values to pass information between tasks.

The assets configured by tasks use different syntax to refer to values passed as parameters. See Configure and edit Databricks tasks.

Note

Some tasks support parameterization but do not have dedicated parameter fields.

Configure key-value parameters

Configure parameters for the following tasks as key-value pairs:

  • Notebook

  • Python wheel (only when configured with keyword arguments)

  • SQL query, legacy dashboard, or file

  • Run Job

Job parameters are automatically pushed down to tasks that support key-value parameters. A warning is shown in the UI if you attempt to add a task parameter with the same key as a job parameter. See Job parameter pushdown.
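As a sketch of how key-value parameters appear in a job definition, the following fragment shows a notebook task with two parameters (the task key, notebook path, and parameter names here are hypothetical examples; the `notebook_task` and `base_parameters` field names follow the Databricks Jobs API):

```json
{
  "tasks": [
    {
      "task_key": "process_data",
      "notebook_task": {
        "notebook_path": "/Workspace/path/to/notebook",
        "base_parameters": {
          "env": "dev",
          "run_date": "2024-01-01"
        }
      }
    }
  ]
}
```

Inside the notebook, each key is available by name, so the code can read the value of `env` or `run_date` directly.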

Configure JSON array parameters

Configure parameters for the following tasks as a JSON-formatted array of strings:

  • Python script

  • Python wheel (only when configured with positional arguments)

  • JAR

  • Spark Submit

  • For each

The For each task iterates over this array, running its nested task once for each value.

All other task types pass the contents of the JSON-formatted array as arguments, as if the configured code assets were being run from the command line.
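For example, a Python script task might be configured with a JSON array like the following (the task key, file path, and argument values are hypothetical; the `spark_python_task` and `parameters` field names follow the Databricks Jobs API):

```json
{
  "task_key": "ingest",
  "spark_python_task": {
    "python_file": "/Workspace/scripts/ingest.py",
    "parameters": ["--source", "raw_events", "--limit", "1000"]
  }
}
```

The script receives these strings exactly as it would command-line arguments, so it can parse them with a standard argument parser.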

Job parameters are not pushed down to tasks that use JSON arrays. You can reference job parameters using the dynamic value reference {{job.parameters.<name>}}.
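Because job parameters are not pushed down automatically, a JSON-array task must reference them explicitly. A minimal sketch, assuming a job parameter named environment is defined on the job:

```json
{
  "parameters": ["--env", "{{job.parameters.environment}}"]
}
```

The dynamic value reference is resolved at run time, so the script receives the job parameter's current value as an ordinary string argument.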

Note

Job parameter values can include any valid JSON construct. This means that you can use dynamic value references to job parameters to conditionalize tasks.