Feature Store workflow overview

The typical machine learning workflow using Feature Store follows this path:

  1. Write code to convert raw data into features and create a Spark DataFrame containing the desired features.
  2. Write the DataFrame as a feature table in Feature Store.
  3. Create a training set based on features from feature tables.
  4. Train a model.
  5. Log the model as an MLflow model.
  6. Perform batch inference on new data. The model automatically retrieves the features it needs from Feature Store.
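The steps above can be sketched end to end in plain Python. This is an illustrative toy, not the Databricks Feature Store API: `SimpleFeatureStore`, the table contents, and the one-coefficient "model" are all hypothetical stand-ins used only to show how features flow from creation, through a keyed feature table, into a training set and then batch inference.

```python
from dataclasses import dataclass, field

@dataclass
class SimpleFeatureStore:
    # Hypothetical in-memory stand-in for a feature store.
    tables: dict = field(default_factory=dict)

    def write_table(self, name, rows, primary_key):
        # Step 2: persist computed features, keyed by primary key.
        self.tables[name] = {row[primary_key]: row for row in rows}

    def lookup(self, name, key):
        return self.tables[name][key]

# Step 1: convert raw data into features (here, average speed per trip).
raw_trips = [
    {"trip_id": 1, "distance_km": 5.0, "duration_min": 12.0},
    {"trip_id": 2, "distance_km": 2.0, "duration_min": 10.0},
]
features = [
    {"trip_id": t["trip_id"],
     "speed_kmh": t["distance_km"] / (t["duration_min"] / 60)}
    for t in raw_trips
]

store = SimpleFeatureStore()
store.write_table("trip_features", features, primary_key="trip_id")

# Step 3: build a training set by joining labels with stored features.
labels = [{"trip_id": 1, "fare": 15.0}, {"trip_id": 2, "fare": 8.0}]
training_set = [
    {**store.lookup("trip_features", l["trip_id"]), "fare": l["fare"]}
    for l in labels
]

# Step 4: "train" a trivial model: fare is proportional to speed.
coef = (sum(r["fare"] for r in training_set)
        / sum(r["speed_kmh"] for r in training_set))

# Step 6: batch inference. The model only receives a key; features are
# looked up from the store, mirroring automatic feature retrieval.
def predict(trip_id):
    return coef * store.lookup("trip_features", trip_id)["speed_kmh"]
```

In a real Databricks workflow, these steps map onto `FeatureStoreClient` methods such as `create_table`/`write_table` (step 2), `create_training_set` with `FeatureLookup`s (step 3), `log_model` (step 5), and `score_batch` (step 6), which records the feature lineage that makes the automatic retrieval in step 6 possible.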

Example notebook

The Feature Store taxi example notebook illustrates the process of creating features, updating them, and using them for model training and batch inference.
