This article links to resources that help you learn how to build AI and LLM solutions natively on Databricks. Topics cover the key steps of the end-to-end AI lifecycle, from data preparation and model building to deployment, monitoring, and MLOps.
Learn how to load and process your data for AI workloads, including how to prepare data for fine-tuning LLMs and for model training.
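As a minimal sketch of one common preparation step (the record schema and file name here are hypothetical), chat-style fine-tuning data is often written as JSON Lines, with one training example per line:

```python
import json

# Hypothetical raw Q&A records to convert into chat-style fine-tuning data.
raw_records = [
    {"question": "What is Unity Catalog?",
     "answer": "A unified governance layer for data and AI assets."},
]

def to_chat_example(record):
    """Convert one Q&A record into the chat-message format many
    fine-tuning APIs expect (the roles and keys are an assumption here)."""
    return {
        "messages": [
            {"role": "user", "content": record["question"]},
            {"role": "assistant", "content": record["answer"]},
        ]
    }

# Write one JSON object per line (JSONL), a common fine-tuning input format.
with open("train.jsonl", "w") as f:
    for record in raw_records:
        f.write(json.dumps(to_chat_example(record)) + "\n")
```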
With feature engineering in Unity Catalog, learn how to create feature tables, track feature lineage, and discover features that others have already built.
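A sketch of what building a feature table can look like: the feature computation below is plain Python so it can run anywhere, while the Unity Catalog registration step (commented out) assumes a Databricks environment; the record schema and table name are hypothetical.

```python
def compute_features(orders):
    """Aggregate per-customer features (order count, total spend)
    from raw order records. The schema is hypothetical."""
    features = {}
    for order in orders:
        f = features.setdefault(order["customer_id"],
                                {"order_count": 0, "total_spend": 0.0})
        f["order_count"] += 1
        f["total_spend"] += order["amount"]
    return features

# In a Databricks notebook, the computed features would then be registered
# as a feature table in Unity Catalog, roughly:
#
# from databricks.feature_engineering import FeatureEngineeringClient
# fe = FeatureEngineeringClient()
# fe.create_table(
#     name="catalog.schema.customer_features",  # hypothetical table name
#     primary_keys=["customer_id"],
#     df=features_df,  # a Spark DataFrame built from the computed features
# )
```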
Learn how to use AutoML to efficiently train and tune your ML models, and MLflow to track your experiments.
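Experiment tracking records each run's parameters and metrics for later comparison. A minimal sketch: the metric computation below is plain Python, while the MLflow logging calls (commented out) assume an environment with MLflow installed and the parameter name is hypothetical.

```python
def accuracy(predictions, labels):
    """Fraction of predictions that match the labels."""
    correct = sum(p == y for p, y in zip(predictions, labels))
    return correct / len(labels)

acc = accuracy([1, 0, 1, 1], [1, 0, 0, 1])

# With MLflow, the run and its metrics would be recorded roughly like this:
#
# import mlflow
# with mlflow.start_run():
#     mlflow.log_param("model_type", "baseline")  # hypothetical parameter
#     mlflow.log_metric("accuracy", acc)
```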
Learn how to use model serving for real-time workloads or deploy MLflow models for offline inference.
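For real-time workloads, a served model is queried over REST. A sketch under stated assumptions: the payload helper below is runnable stdlib Python, while the actual request (commented out) assumes a Databricks workspace, and the endpoint name, workspace URL, and feature names are hypothetical.

```python
import json

def build_payload(records):
    """Wrap input records in the 'dataframe_records' format that
    Databricks model serving endpoints accept for tabular input."""
    return json.dumps({"dataframe_records": records})

payload = build_payload([{"feature_a": 1.0, "feature_b": 2.0}])

# The request would then be sent with a bearer token, roughly:
#
# import urllib.request
# req = urllib.request.Request(
#     "https://<workspace>/serving-endpoints/my-endpoint/invocations",
#     data=payload.encode(),
#     headers={"Authorization": f"Bearer {token}",
#              "Content-Type": "application/json"},
# )
# predictions = json.load(urllib.request.urlopen(req))
```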
Learn how to use Databricks Asset Bundles for efficient packaging and deployment of all data and AI assets.
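A bundle is defined in a `databricks.yml` file at the project root. A minimal sketch, where the bundle name, job name, and notebook path are hypothetical:

```yaml
# databricks.yml -- minimal bundle definition (names and paths are examples)
bundle:
  name: my_ai_project

resources:
  jobs:
    train_model:
      name: train-model-job
      tasks:
        - task_key: train
          notebook_task:
            notebook_path: ./notebooks/train.py

targets:
  dev:
    mode: development
    default: true
```

The bundle is then deployed to the selected target with `databricks bundle deploy -t dev`.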
See how you can use Databricks to combine DataOps, ModelOps, and DevOps into end-to-end ML and LLM operations for your AI application.
If the steps outlined above don't cover your needs, see the Machine Learning documentation for more information.