SQL language reference
This is a SQL command reference for Databricks SQL and Databricks Runtime.
For information about using SQL with Delta Live Tables, see Delta Live Tables SQL language reference.
General reference
This general reference describes data types, functions, identifiers, literals, and semantics (a brief example follows the list):
- "Applies to" label
- How to read a syntax diagram
- How to add comments to SQL statements
- Configuration parameters
- Data types and literals
- Functions
- SQL data type rules
- Datetime patterns
- H3 geospatial functions
- Lambda functions
- Window functions
- Identifiers
- Names
- IDENTIFIER clause
- NULL semantics
- Expressions
- Parameter markers
- Variables
- Name resolution
- JSON path expressions
- Collation
- Partitions
- ANSI compliance in Databricks Runtime
- Apache Hive compatibility
- Principals
- Privileges and securable objects in Unity Catalog
- Privileges and securable objects in the Hive metastore
- Refresh Unity Catalog metadata
- External locations
- External tables
- Credentials
- Volumes
- Delta Sharing
- Federated queries (Lakehouse Federation)
- Information schema
- Reserved words
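As a quick illustration of a few of these topics (typed literals, datetime patterns, and lambda functions passed to higher-order functions), the following sketch uses built-in functions with placeholder values:

```sql
SELECT
  DATE'2024-01-15'                                                 AS d,          -- typed date literal
  CAST('123' AS INT)                                               AS n,          -- explicit cast between data types
  date_format(TIMESTAMP'2024-01-15 10:30:00', 'yyyy-MM-dd HH:mm')  AS formatted,  -- datetime pattern
  transform(array(1, 2, 3), x -> x * 10)                           AS scaled;     -- lambda with a higher-order function
```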
DDL statements
You use data definition statements to create or modify the structure of objects in a database (see the sketch after this list):
- ALTER CATALOG
- ALTER CONNECTION
- ALTER CREDENTIAL
- ALTER DATABASE
- ALTER LOCATION
- ALTER PROVIDER
- ALTER RECIPIENT
- ALTER TABLE
- ALTER SCHEMA
- ALTER SHARE
- ALTER VIEW
- ALTER VOLUME
- COMMENT ON
- CREATE BLOOMFILTER INDEX
- CREATE CATALOG
- CREATE CONNECTION
- CREATE DATABASE
- CREATE FUNCTION (SQL)
- CREATE FUNCTION (External)
- CREATE LOCATION
- CREATE RECIPIENT
- CREATE SCHEMA
- CREATE SERVER
- CREATE SHARE
- CREATE STREAMING TABLE
- CREATE TABLE
- CREATE VIEW
- CREATE VOLUME
- DECLARE VARIABLE
- DROP BLOOMFILTER INDEX
- DROP CATALOG
- DROP CONNECTION
- DROP DATABASE
- DROP CREDENTIAL
- DROP FUNCTION
- DROP LOCATION
- DROP PROVIDER
- DROP RECIPIENT
- DROP SCHEMA
- DROP SHARE
- DROP TABLE
- DROP VARIABLE
- DROP VIEW
- DROP VOLUME
- MSCK REPAIR TABLE
- REFRESH FOREIGN (CATALOG, SCHEMA, or TABLE)
- REFRESH (MATERIALIZED VIEW or STREAMING TABLE)
- SYNC
- TRUNCATE TABLE
- UNDROP TABLE
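As a minimal sketch of how several DDL statements fit together (the schema and table names are hypothetical):

```sql
CREATE SCHEMA IF NOT EXISTS demo;
CREATE TABLE demo.events (id BIGINT, ts TIMESTAMP, payload STRING);
ALTER TABLE demo.events ADD COLUMN source STRING;
COMMENT ON TABLE demo.events IS 'Raw application events';
```

Dropping objects follows the same pattern, for example DROP TABLE demo.events or DROP SCHEMA demo CASCADE.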
DML statements
You use data manipulation statements to add, change, or delete data in a Delta Lake table:
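For example, a minimal sketch of inserting, updating, merging, and deleting rows in the hypothetical demo.events table from the DDL sketch above:

```sql
INSERT INTO demo.events VALUES (1, current_timestamp(), 'created', 'api');

UPDATE demo.events SET payload = 'updated' WHERE id = 1;

MERGE INTO demo.events AS t
USING (SELECT 2 AS id, 'merged' AS payload) AS s
ON t.id = s.id
WHEN MATCHED THEN UPDATE SET t.payload = s.payload
WHEN NOT MATCHED THEN INSERT (id, payload) VALUES (s.id, s.payload);

DELETE FROM demo.events WHERE id = 1;
```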
Data retrieval statements
You use a query to retrieve rows from one or more tables according to the specified clauses. The full syntax and a brief description of the supported clauses are explained in the Query article. The related SQL statements SELECT and VALUES are also included in this section. Databricks SQL also provides the ability to generate the logical and physical plan for a query using the EXPLAIN statement.
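For instance, a query with several common clauses, a standalone VALUES statement, and an EXPLAIN of a query (reusing the hypothetical demo.events table):

```sql
SELECT source, count(*) AS event_count
FROM demo.events
WHERE ts >= DATE'2024-01-01'
GROUP BY source
ORDER BY event_count DESC
LIMIT 10;

VALUES (1, 'created'), (2, 'updated');

EXPLAIN SELECT * FROM demo.events WHERE id = 1;
```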
Delta Lake statements
You use Delta Lake SQL statements to manage tables stored in Delta Lake format:
For details on using Delta Lake statements, see What is Delta Lake?
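As a hedged sketch, routine maintenance on the same hypothetical Delta table might look like this:

```sql
OPTIMIZE demo.events ZORDER BY (id);            -- compact small files and co-locate data by id
DESCRIBE HISTORY demo.events;                   -- inspect the table's transaction history
RESTORE TABLE demo.events TO VERSION AS OF 1;   -- roll back to an earlier version (assumes version 1 exists)
VACUUM demo.events;                             -- remove data files no longer referenced by the table
```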
Auxiliary statements
You use auxiliary statements to collect statistics, manage caching, explore metadata, set configurations, and manage resources:
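For example, a short sketch that touches a few of these tasks on the hypothetical demo.events table, collecting statistics, exploring metadata, and setting a session configuration:

```sql
ANALYZE TABLE demo.events COMPUTE STATISTICS FOR ALL COLUMNS;  -- collect table and column statistics
DESCRIBE TABLE EXTENDED demo.events;                           -- explore table metadata
SET TIME ZONE 'UTC';                                           -- set a session configuration
```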
Apache Spark Cache statements
Applies to: Databricks Runtime
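A minimal sketch of the cache lifecycle (table name hypothetical):

```sql
CACHE TABLE demo.events;     -- cache the table for repeated queries in this session
UNCACHE TABLE demo.events;   -- release the cached entry
CLEAR CACHE;                 -- remove all cached tables and views
```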
Show statements
- LIST
- SHOW ALL IN SHARE
- SHOW CATALOGS
- SHOW COLUMNS
- SHOW CONNECTIONS
- SHOW CREATE TABLE
- SHOW CREDENTIALS
- SHOW DATABASES
- SHOW FUNCTIONS
- SHOW GROUPS
- SHOW LOCATIONS
- SHOW PARTITIONS
- SHOW PROVIDERS
- SHOW RECIPIENTS
- SHOW SCHEMAS
- SHOW SHARES
- SHOW SHARES IN PROVIDER
- SHOW TABLE EXTENDED
- SHOW TABLES
- SHOW TABLES DROPPED
- SHOW TBLPROPERTIES
- SHOW USERS
- SHOW VIEWS
- SHOW VOLUMES
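For instance, a few SHOW statements used together to explore metadata (names hypothetical):

```sql
SHOW SCHEMAS;
SHOW TABLES IN demo;
SHOW COLUMNS IN demo.events;
SHOW CREATE TABLE demo.events;
```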
Security statements
You use security SQL statements to manage access to data:
For details about using these statements, see Hive metastore privileges and securable objects (legacy).
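As a hedged sketch, granting, inspecting, and revoking a privilege might look like this (the group name and table are placeholders):

```sql
GRANT SELECT ON TABLE demo.events TO `analysts`;
SHOW GRANTS ON TABLE demo.events;
REVOKE SELECT ON TABLE demo.events FROM `analysts`;
```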