Serve models with Databricks
In this section, you learn how to use Mosaic AI Model Serving to serve AI and ML models through REST endpoints, and how to use MLflow for batch and streaming inference.
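A served model is queried by sending a JSON payload to its endpoint's invocations URL. The sketch below builds such a request using only the standard library; the endpoint URL, token, and feature names are hypothetical placeholders, and the `dataframe_records` payload shape is one of the JSON input formats serving endpoints accept.

```python
import json
import urllib.request


def build_scoring_request(
    endpoint_url: str, token: str, records: list[dict]
) -> urllib.request.Request:
    """Build an HTTP POST request for a model serving endpoint.

    The request body wraps the input rows in a "dataframe_records"
    JSON document and authenticates with a bearer token.
    """
    payload = json.dumps({"dataframe_records": records}).encode("utf-8")
    return urllib.request.Request(
        url=endpoint_url,
        data=payload,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Hypothetical workspace URL, endpoint name, and token for illustration.
req = build_scoring_request(
    "https://<workspace-url>/serving-endpoints/my-endpoint/invocations",
    "<personal-access-token>",
    [{"feature_a": 1.0, "feature_b": 2.0}],
)
```

Sending the request (for example with `urllib.request.urlopen(req)`) returns the model's predictions as JSON.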
Batch inference
For batch and streaming inference, Databricks recommends packaging your models as MLflow models and deploying them with MLflow. For more information, see Deploy models for batch inference and prediction.
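In practice, batch scoring on Databricks is typically done by loading the MLflow model as a Spark UDF with `mlflow.pyfunc.spark_udf` and applying it to a DataFrame. The dependency-free sketch below illustrates the underlying pattern that approach relies on: scoring rows in fixed-size batches so memory use stays bounded regardless of dataset size. The `toy_predict` function is a stand-in for a real model's predict method.

```python
from typing import Callable, Iterable, Iterator


def score_in_batches(
    rows: Iterable[dict],
    predict: Callable[[list[dict]], list[float]],
    batch_size: int = 1000,
) -> Iterator[float]:
    """Apply a model's predict function over fixed-size batches of rows.

    Batching bounds memory use and mirrors how a Spark UDF scores one
    partition's rows at a time rather than the whole dataset at once.
    """
    batch: list[dict] = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            yield from predict(batch)
            batch = []
    if batch:  # flush the final, possibly partial batch
        yield from predict(batch)


# Stand-in model: "predicts" the sum of each row's feature values.
def toy_predict(batch: list[dict]) -> list[float]:
    return [sum(row.values()) for row in batch]


preds = list(
    score_in_batches(
        [{"a": 1.0, "b": 2.0}, {"a": 3.0, "b": 4.0}],
        toy_predict,
        batch_size=1,
    )
)
# preds == [3.0, 7.0]
```

With a real MLflow model, `predict` would be the loaded model's predict method, and on Spark the same per-batch behavior comes for free from `mlflow.pyfunc.spark_udf`.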