
[ENH] Add fine tuning methods using PEFT for HFTransformersForecaster #6435

Open
geetu040 opened this issue May 16, 2024 · 3 comments · May be fixed by #6457
Labels: enhancement (Adding new functionality); module:forecasting (forecasting module: forecasting, incl probabilistic and hierarchical forecasting)

Comments

geetu040 (Contributor) commented May 16, 2024:

To support PEFT fine-tuning methods in the existing Hugging Face interface, we need to wrap the model initialization with a PEFT configuration.
Hypothetically, this can be done with a piece of code like the following:

from peft import LoraConfig, TaskType, get_peft_model

# build a LoRA configuration and wrap the already-initialized model with it
peft_config = LoraConfig(task_type=TaskType.SEQ_2_SEQ_LM, inference_mode=False, r=8, lora_alpha=32, lora_dropout=0.1)
self.model = get_peft_model(self.model, peft_config)

Although the same technique can be used to wrap the Hugging Face model inside HFTransformersForecaster, fine-tuning the wrapped model raises various errors. These are caused either by the nature of time-series transformers, which differ from the typical Large Language Models that PEFT techniques were originally designed for, or by the right parameters and configurations not being set.
For example, if we apply this PEFT wrapper right before the model is passed for training in HFTransformersForecaster, like this:

from peft import LoraConfig, TaskType, get_peft_model

peft_config = LoraConfig(task_type=TaskType.SEQ_2_SEQ_LM, target_modules=["v_proj"], inference_mode=False, r=8, lora_alpha=32, lora_dropout=0.1)
self.model = get_peft_model(self.model, peft_config)

# training the wrapped model then fails with:
# TypeError: AutoformerForPrediction.forward() got an unexpected keyword argument 'input_ids'
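A quick way to see the mismatch (a minimal sketch, not part of the forecaster) is to inspect the forward signature of the underlying model:

import inspect

from transformers import AutoformerForPrediction

# the model's forward expects time-series arguments such as past_values,
# past_time_features and past_observed_mask; there is no input_ids, which
# the error above suggests the task-specific PEFT wrapper is injecting
print(inspect.signature(AutoformerForPrediction.forward))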
geetu040 added the enhancement (Adding new functionality) label on May 16, 2024
geetu040 (Contributor, Author) commented:

FYI @benHeid, is this something you could maybe help with?

fkiraly added the module:forecasting (forecasting module: forecasting, incl probabilistic and hierarchical forecasting) label on May 16, 2024
benHeid (Contributor) commented May 17, 2024:

@geetu040 I think the main issue is that you set the task_type. If you do not set it, then it should work.
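For reference, a minimal sketch of the configuration without task_type (target_modules is an illustrative choice, since PEFT has no default target-module mapping for this architecture). Leaving task_type unset appears to make get_peft_model return a generic PeftModel that forwards the time-series keyword arguments unchanged, instead of injecting LLM-style arguments such as input_ids:

from peft import LoraConfig, get_peft_model

# no task_type here: PEFT then uses its generic wrapper, which passes
# keyword arguments straight through to the wrapped forecasting model
peft_config = LoraConfig(
    inference_mode=False,
    r=8,
    lora_alpha=32,
    lora_dropout=0.1,
    target_modules=["v_proj"],  # illustrative choice of modules to adapt
)
self.model = get_peft_model(self.model, peft_config)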

geetu040 (Contributor, Author) commented:
Yes @benHeid, that seems to solve the problem. Thanks!

geetu040 linked a pull request (#6457) on May 20, 2024 that will close this issue.