/kind feature
Why you need this feature:
At the moment, model serving (via KServe) and feature serving (via Feast, the feature store) are separate components, with no guidance on how best to serve production machine learning features and models together.
It would benefit the community to document (or directly support) how these two components fit together, so that users can launch production machine learning applications with a clear understanding of best practices. KServe does document an approach today, but it has some limitations [1]. An integration that shows users how to run the two services in a way that plays to the strengths of each component would be valuable.
[1] The limitations will be discussed more thoroughly in the RFC; in short, KServe's current approach treats the online feature store as a Transformer.
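As a rough illustration of that Transformer-based approach, the feature store sits directly in the request path as a pre-processing step in front of the predictor. The sketch below uses plain stub classes, not the real KServe or Feast APIs, and the entity and feature values are made up:

```python
# Toy sketch of KServe's documented approach: the online feature store is
# wrapped in a Transformer that enriches each request before it reaches
# the predictor. All classes here are illustrative stand-ins.

class StubOnlineStore:
    """Stand-in for an online feature store lookup (e.g. Feast)."""
    def __init__(self, table):
        self.table = table

    def get_online_features(self, entity_ids):
        return [self.table[e] for e in entity_ids]


class StubPredictor:
    """Stand-in for a deployed model endpoint."""
    def predict(self, request):
        return [sum(row) for row in request["instances"]]  # toy "model"


class FeatureTransformer:
    """Stand-in for a KServe Transformer: preprocess, then forward."""
    def __init__(self, store, predictor):
        self.store = store
        self.predictor = predictor

    def preprocess(self, request):
        # Replace raw entity ids with their online feature vectors.
        return {"instances": self.store.get_online_features(request["entity_ids"])}

    def __call__(self, request):
        return self.predictor.predict(self.preprocess(request))


store = StubOnlineStore({"driver_1": [9, 3], "driver_2": [4, 7]})
transformer = FeatureTransformer(store, StubPredictor())
print(transformer({"entity_ids": ["driver_1", "driver_2"]}))  # → [12, 11]
```

One consequence of this placement, which the RFC will weigh, is that every inference request pays an extra network hop through the Transformer service before reaching the predictor.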
Describe the solution you'd like:
I am a Feast maintainer and will be drafting a proposal in this document.
There are pros and cons to the different approaches, and I would like to solicit feedback from the community to understand which design offers the best tradeoffs.
Anything else you would like to add:
Below are some useful references from industry leaders showing how other platforms integrate a feature store with model inference. Note that they generally fetch features before executing the inference step.
SageMaker's high-level feature store and inference diagram
Databricks' high-level feature store and inference diagram
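The flow those diagrams describe can be sketched in a few lines: the application performs the feature lookup first, then calls the model endpoint with the feature-enriched payload. The function and entity names below are hypothetical stand-ins, not real SageMaker, Databricks, or Feast APIs:

```python
# Toy sketch of the pattern in the references above: fetch online
# features first, then run inference on the enriched input.

def get_online_features(entity_ids, table):
    """Stand-in for a feature-store lookup (e.g. an online store read)."""
    return [table[e] for e in entity_ids]

def invoke_endpoint(feature_rows):
    """Stand-in for a model-serving call (e.g. a predictor endpoint)."""
    return [sum(row) for row in feature_rows]  # toy "model"

table = {"user_1": [2, 5], "user_2": [1, 8]}
features = get_online_features(["user_1", "user_2"], table)  # step 1: lookup
predictions = invoke_endpoint(features)                      # step 2: inference
print(predictions)  # → [7, 9]
```

The key contrast with the Transformer approach above is where the lookup lives: here it happens before the serving request is made, rather than inside the model-serving request path.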