Query: kindly let me know an alternative or workaround.
Fine-tuning code (using PEFT):
from peft import prepare_model_for_kbit_training
model.gradient_checkpointing_enable()
model = prepare_model_for_kbit_training(model)
Error :
ValueError Traceback (most recent call last)
in <cell line: 3>()
1 from peft import prepare_model_for_kbit_training
2
----> 3 model.gradient_checkpointing_enable()
4 model = prepare_model_for_kbit_training(model)
/usr/local/lib/python3.10/dist-packages/transformers/modeling_utils.py in gradient_checkpointing_enable(self)
1629 """
1630 if not self.supports_gradient_checkpointing:
-> 1631         raise ValueError(f"{self.__class__.__name__} does not support gradient checkpointing.")
1632 self.apply(partial(self._set_gradient_checkpointing, value=True))
1633
ValueError: PhiForCausalLM does not support gradient checkpointing.
Kindly let me know how to solve this issue.
Thank You
Email ID - [email protected]
I am also running into this issue and do not know how to solve it.
Hello @ilovestudy and @tusharpaul .
We will proceed with integrating Phi directly into transformers, which will fix the gradient checkpointing issue.
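Until the native integration lands, one possible interim workaround is to skip the unsupported call: guard `gradient_checkpointing_enable()` behind the model's `supports_gradient_checkpointing` flag, and pass `use_gradient_checkpointing=False` to `prepare_model_for_kbit_training` so PEFT does not try to enable it internally. A minimal sketch (the helper function name is my own, not from peft or transformers):

```python
# Hypothetical sketch: avoid the ValueError for models (like the
# remote-code PhiForCausalLM) that do not yet support gradient
# checkpointing, instead of calling gradient_checkpointing_enable()
# unconditionally.

def enable_checkpointing_if_supported(model):
    """Call gradient_checkpointing_enable() only when the model supports it.

    Returns True if checkpointing was enabled, False otherwise.
    """
    if getattr(model, "supports_gradient_checkpointing", False):
        model.gradient_checkpointing_enable()
        return True
    return False

# Usage (assumes `model` was already loaded, e.g. with trust_remote_code=True):
#     from peft import prepare_model_for_kbit_training
#     enable_checkpointing_if_supported(model)
#     model = prepare_model_for_kbit_training(
#         model,
#         use_gradient_checkpointing=False,  # skip PEFT's internal enable step
#     )
```

Note that disabling gradient checkpointing trades higher GPU memory use for avoiding the error, so you may need a smaller batch size until the fix ships in transformers.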