
Commit

remove gradient_checkpointing
strickvl committed Jul 24, 2024
1 parent 9324371 · commit b85a838
Showing 1 changed file with 1 addition and 1 deletion.
llm-lora-finetuning/steps/finetune.py (1 addition, 1 deletion)
@@ -134,7 +134,7 @@ def finetune(
     output_dir=output_dir,
     warmup_steps=warmup_steps,
     per_device_train_batch_size=per_device_train_batch_size,
-    gradient_checkpointing=True,
+    gradient_checkpointing=False,
     gradient_checkpointing_kwargs={'use_reentrant':False} if use_accelerate else {},
     gradient_accumulation_steps=gradient_accumulation_steps,
     max_steps=max_steps,
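For context, `gradient_checkpointing` in Hugging Face `transformers.TrainingArguments` recomputes activations during the backward pass instead of storing them, trading extra forward compute for lower peak memory; this commit turns it off, so each training step should run faster at the cost of higher activation memory. Below is a minimal sketch of how these arguments fit together, assuming a recent `transformers` release (4.35+, where `gradient_checkpointing_kwargs` exists); the parameter values are illustrative, not taken from the repository.

from transformers import TrainingArguments

# Illustrative values; the real finetune() step receives these as parameters.
training_args = TrainingArguments(
    output_dir="output",
    warmup_steps=5,
    per_device_train_batch_size=2,
    # Disabled by this commit: steps are faster, but all activations are
    # kept in memory. Set back to True to trade compute for memory on
    # larger models or smaller GPUs.
    gradient_checkpointing=False,
    # Only consulted when checkpointing is enabled; use_reentrant=False
    # selects the non-reentrant torch.utils.checkpoint implementation,
    # which cooperates better with Accelerate/DDP wrappers.
    gradient_checkpointing_kwargs={"use_reentrant": False},
    gradient_accumulation_steps=4,
    max_steps=100,
)

Note that the diff leaves the `gradient_checkpointing_kwargs` line in place; with checkpointing disabled it is inert, so it keeps the `use_accelerate` wiring ready if the flag is ever re-enabled.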
