When models are scored on Databricks Model Serving, the necessary features are automatically looked up (and, in private preview, computed on demand) as inputs to the models. However, customers who run model inference outside Databricks (on third-party model serving platforms, on Kubernetes, etc.) need access to these named features from Unity Catalog as REST endpoints.
With this new functionality, they can create a "feature serving" or "function serving" endpoint (powered by Python UDFs in Unity Catalog), analogous to a model serving endpoint, and call it with a REST client from outside Databricks.
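As a minimal sketch of calling such an endpoint from outside Databricks: the snippet below builds the JSON payload and shows (commented out) the REST call. The endpoint name `user-features`, the workspace URL, the bearer token, and the `user_id` key column are all hypothetical placeholders, and the `dataframe_records` payload shape is assumed to mirror the model serving invocations API.

```python
import json

def build_feature_request(lookup_keys):
    """Build the JSON payload for a feature serving endpoint query.

    `lookup_keys` is a list of dicts mapping primary-key columns of the
    feature table to the values to look up. The payload shape
    (dataframe_records) is assumed to follow the model serving
    invocations API.
    """
    return json.dumps({"dataframe_records": lookup_keys})

payload = build_feature_request([{"user_id": 123}])

# Hypothetical REST call from any client outside Databricks:
# import requests
# resp = requests.post(
#     "https://<workspace-url>/serving-endpoints/user-features/invocations",
#     headers={"Authorization": f"Bearer {token}"},
#     data=payload,
# )
# features = resp.json()

print(payload)
```

The same payload can be sent from any HTTP client (curl, a JVM service, a rules engine), which is what makes the features usable from third-party serving stacks.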
Top customer use cases
ML customers that perform feature engineering and model training on Databricks and export the model for serving on third-party providers or self-managed infrastructure (e.g., Kubernetes).
Customers that use curated features in the business logic of web services, or in automation and rule engines.
Customers that use feature and function endpoints to provide personalized structured data from the Lakehouse as context in LLM applications.
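For the LLM use case above, a minimal sketch of turning a looked-up feature record into prompt context; the feature names and the rendering format are illustrative assumptions, not a prescribed API.

```python
def features_to_context(features):
    """Render a feature record (a dict returned by a feature serving
    endpoint) as a plain-text block to prepend to an LLM prompt."""
    lines = [f"- {name}: {value}" for name, value in features.items()]
    return "User profile:\n" + "\n".join(lines)

# Hypothetical feature record fetched from a feature serving endpoint.
record = {"plan": "premium", "lifetime_value": 1240.0, "region": "EMEA"}
print(features_to_context(record))
```

The rendered block can then be injected into the system or user message of the LLM request, keeping the structured data in the Lakehouse and only the serialized context in the prompt.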