Delta Lake is an open source storage layer that brings reliability to data lakes. Delta Lake provides ACID transactions and scalable metadata handling, and unifies streaming and batch data processing. Delta Lake runs on top of your existing data lake and is fully compatible with Apache Spark APIs.
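As a concrete illustration of the Spark compatibility mentioned above, the sketch below writes and reads a Delta table through the standard Spark DataFrame API. It is a minimal example, not Databricks-specific: it assumes the `pyspark` and `delta-spark` packages are installed locally, and the table path `/tmp/delta-demo` is hypothetical. On Databricks itself, the session configuration below is unnecessary because Delta Lake is preconfigured.

```python
# Minimal sketch: a locally configured Spark session with Delta Lake support.
# Assumes pyspark and delta-spark are installed; on Databricks, use the
# provided `spark` session directly instead of building one.
from pyspark.sql import SparkSession
from delta import configure_spark_with_delta_pip

builder = (
    SparkSession.builder.appName("delta-demo")
    .config("spark.sql.extensions",
            "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# Write a DataFrame as a Delta table, then read it back.
# The path /tmp/delta-demo is illustrative only.
spark.range(5).write.format("delta").mode("overwrite").save("/tmp/delta-demo")
spark.read.format("delta").load("/tmp/delta-demo").show()
```

Because Delta tables are written with the same `format(...)` mechanism as Parquet or CSV, existing Spark read and write code typically needs only the format string changed to `"delta"`.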
See the Delta Lake website for API references for Scala, Java, and Python.
To learn how to use the Delta Lake APIs on Databricks, see the Delta Lake API documentation in the Databricks documentation.