Troubleshooting and limitations
Troubleshooting
Error message: Database recommender_system does not exist in the Hive metastore.
A feature table is stored as a Delta table. The database is specified by the table name prefix, so a feature table recommender_system.customer_features will be stored in the recommender_system database.
To create the database, run:
%sql CREATE DATABASE IF NOT EXISTS recommender_system;
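To illustrate the naming convention described above, the following sketch creates a feature table in the new database using the Workspace Feature Store client. The DataFrame schema and the customer_id primary key are hypothetical and shown only for illustration.
from databricks.feature_store import FeatureStoreClient

fs = FeatureStoreClient()

# Hypothetical customer features with a primary key column.
customer_features_df = spark.createDataFrame(
    [(1, 120.0), (2, 35.5)],
    ["customer_id", "spend_30d"],
)

# The "recommender_system." prefix places the table in the database created above.
fs.create_table(
    name="recommender_system.customer_features",
    primary_keys=["customer_id"],
    df=customer_features_df,
    description="Example customer features",
)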
Error message: ModuleNotFoundError: No module named 'databricks.feature_engineering'
This error occurs when databricks-feature-engineering is not installed on the Databricks Runtime you are using.
databricks-feature-engineering is available on PyPI, and can be installed with:
%pip install databricks-feature-engineering
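After installing, a quick way to confirm the package is available in the notebook session is to import the client. This is only a sanity check:
# Confirm that databricks-feature-engineering is installed in the current session.
from databricks.feature_engineering import FeatureEngineeringClient

fe = FeatureEngineeringClient()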
Error message: ModuleNotFoundError: No module named 'databricks.feature_store'
This error occurs when databricks-feature-store is not installed on the Databricks Runtime you are using.
Note
For Databricks Runtime 14.3 and above, install databricks-feature-engineering instead: %pip install databricks-feature-engineering
databricks-feature-store is available on PyPI, and can be installed with:
%pip install databricks-feature-store
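Similarly, you can confirm that the legacy client is importable once the package is installed:
# Confirm that the legacy Workspace Feature Store client is importable.
from databricks.feature_store import FeatureStoreClient

fs = FeatureStoreClient()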
Error message: Invalid input. Data is not compatible with model signature. Cannot convert non-finite values...
This error can occur when using a Feature Store-packaged model in Mosaic AI Model Serving. When providing custom feature values in an input to the endpoint, you must provide a value for the feature for each row in the input, or for no rows. You cannot provide custom values for a feature for only some rows.
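For illustration, here is a sketch of two dataframe_records request bodies; the customer_id and spend_30d names are hypothetical. The first is valid because the custom feature value is supplied for every row; the second triggers the error because it is supplied for only one row.
# Valid: the custom value for "spend_30d" is provided for every row.
valid_payload = {
    "dataframe_records": [
        {"customer_id": 1, "spend_30d": 120.0},
        {"customer_id": 2, "spend_30d": 35.5},
    ]
}

# Invalid: "spend_30d" is provided for only one row, so the missing value
# becomes non-finite and fails the model signature check.
invalid_payload = {
    "dataframe_records": [
        {"customer_id": 1, "spend_30d": 120.0},
        {"customer_id": 2},
    ]
}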
Limitations
A model can use at most 50 tables and 100 functions for training.
Databricks Runtime ML clusters are not supported when using Delta Live Tables as feature tables. Instead, use a shared cluster and manually install the client with %pip install databricks-feature-engineering. You must also install any other required ML libraries.
Materialized views and streaming tables are managed by Delta Live Tables pipelines. fe.write_table() does not update them; instead, use the Delta Live Tables pipeline to update the tables.
Feature Store APIs support batch scoring of models packaged with Feature Store; a minimal batch scoring sketch appears at the end of this section. Online inference is not supported.
Databricks legacy Workspace Feature Store does not support deleting individual features from a feature table.
No online stores are supported on Databricks on Google Cloud as of this release.
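As noted in the batch scoring limitation above, here is a minimal sketch of scoring a Feature Store-packaged model in batch with score_batch. The model URI, table name, and key column are placeholders; the DataFrame only needs to contain the lookup keys the model was logged with.
from databricks.feature_engineering import FeatureEngineeringClient

fe = FeatureEngineeringClient()

# Hypothetical DataFrame of lookup keys for the rows to score.
batch_df = spark.table("recommender_system.customers_to_score").select("customer_id")

# score_batch looks up the remaining features from the feature tables recorded
# when the model was logged and returns the input DataFrame with a prediction column.
predictions = fe.score_batch(
    model_uri="models:/recommender_model/1",  # placeholder model URI
    df=batch_df,
)
display(predictions)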