[ENH] Add fine tuning methods using PEFT for HFTransformersForecaster
#6435
Labels: `enhancement` (Adding new functionality), `module:forecasting` (forecasting module: forecasting, incl. probabilistic and hierarchical forecasting)
To extend PEFT fine-tuning methods in the existing Hugging Face interface, we need to wrap the model initialization with a PEFT configuration.
Hypothetically, this can be done using a piece of code like this:
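The snippet referenced here is not included in the issue text. With the actual `peft` library, such a wrap would typically be `get_peft_model(model, LoraConfig(...))`. Since the real snippet is unavailable, the following is only a minimal, dependency-light sketch of the LoRA idea behind that wrapping, written in plain NumPy: the pretrained weight `W` stays frozen, and only a low-rank update `B @ A` is trained. All names here (`lora_forward`, `rank`, etc.) are illustrative and are not `peft` or sktime API.

```python
import numpy as np

def lora_forward(x, W, A, B, scale=1.0):
    """LoRA-style forward pass: frozen base weight W plus a trainable
    low-rank update B @ A (the core trick behind PEFT's LoRA adapters)."""
    return x @ (W + scale * (B @ A)).T

rng = np.random.default_rng(0)
d_in, d_out, rank = 8, 4, 2

W = rng.normal(size=(d_out, d_in))        # pretrained weight, kept frozen
A = rng.normal(size=(rank, d_in)) * 0.01  # trainable low-rank factor
B = np.zeros((d_out, rank))               # trainable; zero-init => update starts at 0

x = rng.normal(size=(3, d_in))
y0 = lora_forward(x, W, A, B)
# With B = 0, the wrapped layer reproduces the frozen base layer exactly.
assert np.allclose(y0, x @ W.T)

# A toy "training step" touches only the adapter factors, never W.
W_before = W.copy()
B = B + 0.1 * rng.normal(size=B.shape)
y1 = lora_forward(x, W, A, B)
assert np.array_equal(W, W_before)  # base weights untouched
```

Note the parameter count: the adapter trains `rank * (d_in + d_out)` values instead of the full `d_out * d_in`, which is why PEFT methods are attractive for large pretrained models.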
Although the same technique can be used to wrap the Hugging Face model implemented in HFTransformersForecaster, fine-tuning the wrapped model results in various errors. These arise either because these time series transformers differ in nature from the typical large language models on which PEFT techniques were originally applied, or because the right parameters and configurations have not been set.
If we apply this PEFT wrapper right before the model is passed for training in HFTransformersForecaster, like this:
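This snippet is likewise not shown in the issue. In sktime's actual implementation the wrap would presumably happen on the underlying Hugging Face model inside the forecaster's fit logic, just before the `transformers` `Trainer` is constructed. The following dependency-free sketch illustrates only that ordering (wrap immediately before training); `TinyModel`, `PeftWrapper`, and `Forecaster` are hypothetical stand-ins, not sktime or `peft` classes.

```python
class TinyModel:
    """Stand-in for a pretrained HF model: a single scalar weight."""
    def __init__(self, weight=2.0):
        self.weight = weight

class PeftWrapper:
    """Hypothetical stand-in for peft.get_peft_model: freezes the base
    model and exposes only a small adapter parameter for training."""
    def __init__(self, base):
        self.base = base
        self.adapter = 0.0  # the only trainable parameter

    def predict(self, x):
        return (self.base.weight + self.adapter) * x

class Forecaster:
    """Sketch of an HFTransformersForecaster-like fit: the PEFT wrap is
    applied right before the (stubbed) training loop sees the model."""
    def __init__(self, model):
        self.model = model

    def fit(self, xs, ys, lr=0.01, steps=200):
        self.model = PeftWrapper(self.model)  # wrap just before training
        for _ in range(steps):
            for x, y in zip(xs, ys):
                err = self.model.predict(x) - y
                # Only the adapter is updated; the base weight stays frozen.
                self.model.adapter -= lr * err * x
        return self

# Base model computes y = 2x; data follows y = 3x, so the adapter
# should learn the missing +1 while the base weight stays at 2.
f = Forecaster(TinyModel(weight=2.0)).fit([1.0, 2.0], [3.0, 6.0])
```

The design point this mirrors is that the wrapping is an implementation detail of `fit`: the user still constructs and configures the forecaster as before, and only the training step sees the adapter-augmented model.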