Commit

Don't set all of the transformer to be finetuned if peft is on
AngledLuffa committed Nov 15, 2023
1 parent acfefd2 · commit 414c179
Showing 1 changed file with 3 additions and 2 deletions.
5 changes: 3 additions & 2 deletions stanza/models/coref/model.py

@@ -479,8 +479,9 @@ def _build_optimizers(self):
         self.optimizers: Dict[str, torch.optim.Optimizer] = {}
         self.schedulers: Dict[str, torch.optim.lr_scheduler.LambdaLR] = {}
 
-        for param in self.bert.parameters():
-            param.requires_grad = self.config.bert_finetune
+        if not getattr(self.config, 'lora', False):
+            for param in self.bert.parameters():
+                param.requires_grad = self.config.bert_finetune
 
         if self.config.bert_finetune:
             logger.debug("Making bert optimizer with LR of %f", self.config.bert_learning_rate)
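For context on why the guard is needed: when parameter-efficient finetuning (LoRA via the peft library) is active, the adapter library freezes the base transformer and leaves only the adapter weights trainable, so unconditionally writing requires_grad on every parameter would silently turn LoRA training back into full finetuning. The sketch below illustrates that interaction; it assumes the Hugging Face transformers and peft packages and is not the Stanza code itself.

# Sketch only: shows why flipping requires_grad on all parameters clobbers LoRA.
# Assumes Hugging Face transformers + peft; model name and LoRA settings are illustrative.
from transformers import AutoModel
from peft import LoraConfig, get_peft_model

bert = AutoModel.from_pretrained("bert-base-cased")
# get_peft_model freezes the base weights and injects trainable LoRA adapters
# into the chosen attention projections.
bert = get_peft_model(bert, LoraConfig(r=8, lora_alpha=16, target_modules=["query", "value"]))

# Only the adapter parameters are trainable at this point.
print(sum(p.numel() for p in bert.parameters() if p.requires_grad))

# The pre-commit behaviour (when bert_finetune was True) would have re-enabled
# gradients for every parameter, undoing peft's freezing:
#     for param in bert.parameters():
#         param.requires_grad = True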
