Driver capability settings for the Databricks JDBC Driver

This article describes how to configure special and advanced driver capability settings for the Databricks JDBC Driver.

This article supplements the information in the following Databricks JDBC Driver articles:

To configure a Databricks connection for the Databricks JDBC Driver, you must combine your compute resource settings, your authentication settings, and any of the following driver capability settings into a JDBC connection URL or a programmatic collection of JDBC connection properties. Whether you use a connection URL or a collection of connection properties depends on the requirements of your target app, tool, client, SDK, or API. For examples of JDBC connection URLs and programmatic collections of JDBC connection properties, see Authentication settings for the Databricks JDBC Driver.
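For example, the following Java sketch shows both approaches for a connection that uses a Databricks personal access token (AuthMech=3). The server hostname, HTTP path, and token values are placeholders, and the Databricks JDBC Driver must be on the classpath; replace the placeholders with your own compute resource and authentication settings, and append any driver capability settings from this article as additional key-value pairs.

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class DatabricksConnectExample {
  public static void main(String[] args) throws Exception {
    // Option 1: put every setting into the JDBC connection URL.
    String fullUrl = "jdbc:databricks://<server-hostname>:443;httpPath=<http-path>;"
        + "AuthMech=3;UID=token;PWD=<personal-access-token>";
    try (Connection connection = DriverManager.getConnection(fullUrl)) {
      System.out.println("Connected using a connection URL only.");
    }

    // Option 2: keep the URL minimal and pass the remaining settings as a
    // programmatic collection of JDBC connection properties.
    String baseUrl = "jdbc:databricks://<server-hostname>:443";
    Properties properties = new Properties();
    properties.put("httpPath", "<http-path>");
    properties.put("AuthMech", "3");
    properties.put("UID", "token");
    properties.put("PWD", "<personal-access-token>");
    try (Connection connection = DriverManager.getConnection(baseUrl, properties)) {
      System.out.println("Connected using connection properties.");
    }
  }
}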

The Databricks JDBC Driver provides the following special and advanced driver capability settings.

ANSI SQL-92 query support in JDBC

Legacy Spark JDBC drivers accept SQL queries in the ANSI SQL-92 dialect and translate the queries to the Databricks SQL dialect before sending them to the server. However, if your application generates Databricks SQL directly, or if your application uses any non-ANSI SQL-92 standard SQL syntax specific to Databricks, Databricks recommends that you set UseNativeQuery=1 in your connection configuration. With that setting, the driver passes SQL queries verbatim to Databricks.
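For example, in a connection URL the setting is appended as one more key-value pair; the hostname, HTTP path, and token below are placeholders:

jdbc:databricks://<server-hostname>:443;httpPath=<http-path>;AuthMech=3;UID=token;PWD=<personal-access-token>;UseNativeQuery=1

With a programmatic collection of connection properties, the equivalent is properties.put("UseNativeQuery", "1").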

Default catalog and schema

To specify the default catalog and schema, add ConnCatalog=<catalog-name>;ConnSchema=<schema-name> to the JDBC connection URL.
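For example, a connection URL that sets the default catalog and schema looks like the following, where everything in angle brackets is a placeholder for your own values:

jdbc:databricks://<server-hostname>:443;httpPath=<http-path>;AuthMech=3;UID=token;PWD=<personal-access-token>;ConnCatalog=<catalog-name>;ConnSchema=<schema-name>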

Extract large query results in JDBC

To achieve the best performance when you extract large query results, use the latest version of the JDBC driver, which includes the following optimizations.

Arrow serialization in JDBC

JDBC driver version 2.6.16 and above supports an optimized query results serialization format that uses Apache Arrow.
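If you want to confirm which driver version an application actually loads before relying on this optimization, the standard JDBC DatabaseMetaData API reports the driver name and version. The following is a minimal sketch that reuses the placeholder connection settings from the earlier example:

import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;

public class DriverVersionCheck {
  public static void main(String[] args) throws Exception {
    String url = "jdbc:databricks://<server-hostname>:443;httpPath=<http-path>;"
        + "AuthMech=3;UID=token;PWD=<personal-access-token>";
    try (Connection connection = DriverManager.getConnection(url)) {
      // DatabaseMetaData reports the name and version of the loaded driver,
      // so you can verify that it is 2.6.16 or above.
      DatabaseMetaData metadata = connection.getMetaData();
      System.out.println(metadata.getDriverName() + " " + metadata.getDriverVersion());
    }
  }
}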

Enable logging

To enable logging in the JDBC driver, set the LogLevel property to a value from 1 (log only severe events) through 6 (log all driver activity). Set the LogPath property to the full path to the folder where you want to save log files.
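For example, the following connection URL turns on the most verbose logging and writes log files to a hypothetical /tmp/databricks-jdbc-logs folder; adjust both values for your environment, and note that the values in angle brackets remain placeholders:

jdbc:databricks://<server-hostname>:443;httpPath=<http-path>;AuthMech=3;UID=token;PWD=<personal-access-token>;LogLevel=6;LogPath=/tmp/databricks-jdbc-logs

In a programmatic collection of connection properties, the same settings are properties.put("LogLevel", "6") and properties.put("LogPath", "/tmp/databricks-jdbc-logs").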

For more information, see the Configuring Logging section in the Databricks JDBC Driver Guide.