DIVIDE_BY_ZERO error class

Division by zero. To return NULL instead, use try_divide. If necessary set <ansiConfig> to false (except for ANSI interval type) to bypass this error.

Parameters

  • ansiConfig: The name of the configuration to change the behavior.

Explanation

Databricks raises this error whenever it attempts to divide an INTERVAL or numeric value by 0. The context information provided with this error isolates the object and the expression within which the error occurred. Functions and operators that can cause this error include those, such as mod, that perform division as part of more complex formulas.
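For example, the mod operator raises this same error class when its divisor is 0 (a sketch, assuming ANSI mode is enabled; the message follows the template shown above):

> SELECT 3 % 0;
  [DIVIDE_BY_ZERO] Division by zero. To return NULL instead, use `try_divide`. If necessary set "spark.sql.ansi.enabled" to false (except for ANSI interval type) to bypass this error.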

Mitigation

The mitigation of the error depends on the cause:

  • Is the expression causing the error correct?

    If the expression is incorrect, fix it so that the 0 value cannot occur, and retry the query.

  • Is the data correct?

    If the input data can legitimately contain 0 values, you may need to either fix the data at the source or clean it before passing it to the function as an argument.

    Data cleaning can mean excluding the offending rows, converting the 0 values to NULL using nullif(expr, 0), or converting the data to another acceptable value using if(expr = 0, alt, expr).

If the expression and the data are correct, and you want to tolerate the division by zero, you can use try_divide. As an alternative, change the argument to nullif(expr, 0). Either way, the expression returns NULL instead of raising an error. If you prefer, you can use nvl(try_divide(expr1, expr2), alt) to turn the resulting NULL into an alternative value, such as the neutral element for addition (0) or multiplication (1).

As a last resort, when the expression or dataflow cannot be changed, you can disable this ANSI behavior by setting the provided ansiConfig to false. Note that this setting has consequences beyond the immediate error condition.

Examples

-- A DIVIDE_BY_ZERO embedded in a view. The context information isolates the failing function.
> CREATE OR REPLACE TEMPORARY VIEW v(c1) AS SELECT 1/val FROM VALUES(1), (0) AS T(val);
> SELECT c1 FROM v;
  [DIVIDE_BY_ZERO] Division by zero. To return NULL instead, use `try_divide`. If necessary set "spark.sql.ansi.enabled" to false (except for ANSI interval type) to bypass this error.
  == SQL of VIEW v(line 1, position 7) ==
  SELECT 1/val FROM VALUES(1), (0) AS T(val)
         ^^^^^

-- Tolerating division by zero by turning the result to NULL using try_divide.
> CREATE OR REPLACE TEMPORARY VIEW v(c1) AS SELECT try_divide(1, val) FROM VALUES(1), (0) AS T(val);
> SELECT c1 FROM v;
  1
  NULL

-- Tolerating division by zero by turning the result to NULL using nullif.
> CREATE OR REPLACE TEMPORARY VIEW v(c1) AS SELECT 1 / nullif(val, 0) FROM VALUES(1), (0) AS T(val);
> SELECT c1 FROM v;
  1
  NULL

-- Filtering out offending rows.
> CREATE OR REPLACE TEMPORARY VIEW v(c1) AS SELECT 1/val FROM VALUES(1), (0) AS T(val) WHERE val != 0;
> SELECT c1 FROM v;
  1

-- Turning division by zero into division by a small number.
> CREATE OR REPLACE TEMPORARY VIEW v(c1) AS SELECT 1 / if(val = 0, 1e-10, val) FROM VALUES(1), (0) AS T(val);
> SELECT c1 FROM v;
  1
  10000000000

-- Turning division by zero into a neutral element for addition.
> CREATE OR REPLACE TEMPORARY VIEW v(c1) AS SELECT nvl(try_divide(1, val), 0) FROM VALUES(1), (0) AS T(val);
> SELECT c1 FROM v;
  1
  0

-- Disabling ANSI mode in Databricks SQL for the view definition only.
> SET ANSI_MODE = false;
> CREATE OR REPLACE TEMPORARY VIEW v(c1) AS SELECT 1/val FROM VALUES(1), (0) AS T(val);
> SET ANSI_MODE = true;

> SELECT c1 FROM v;
  1
  NULL

-- Disabling ANSI mode in Databricks Runtime for the view definition only.
> SET spark.sql.ansi.enabled = false;
> CREATE OR REPLACE TEMPORARY VIEW v(c1) AS SELECT 1/val FROM VALUES(1), (0) AS T(val);
> SET spark.sql.ansi.enabled = true;

> SELECT c1 FROM v;
  1
  NULL