Print only on Rank 0 (#187)

* Print only on Rank 0

When training on multiple GPUs, the settings are printed once per GPU.
This change prints them only from rank 0.

See https://github.com/tloen/alpaca-lora/issues/182#issuecomment-1485550636
for a sample output.

The same guard could be applied to a few other prints further down as well.

* Typo

* Added failsafe

So this works whether or not LOCAL_RANK is defined.
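The rank-0 guard with its failsafe can be sketched as a small standalone helper (the `is_main_process` / `print_once` names are illustrative, not part of the commit):

```python
import os


def is_main_process() -> bool:
    # LOCAL_RANK is set per process by distributed launchers such as torchrun.
    # Defaulting to 0 is the failsafe: single-GPU or non-distributed runs,
    # where the variable is undefined, still count as the main process.
    return int(os.environ.get("LOCAL_RANK", 0)) == 0


def print_once(*args, **kwargs):
    # Print only from rank 0, so settings appear once instead of once per GPU.
    if is_main_process():
        print(*args, **kwargs)
```

Under torchrun, each worker sees its own `LOCAL_RANK`, so only the rank-0 process emits output; launched directly with `python`, the default of `0` keeps printing enabled.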
Angainor Development 2023-03-30 01:25:17 +02:00 committed by GitHub
parent a48d947298
commit fcbc45e4c0

@@ -53,6 +53,7 @@ def train(
     wandb_log_model: str = "",  # options: false | true
     resume_from_checkpoint: str = None,  # either training checkpoint or final adapter
 ):
+    if int(os.environ.get("LOCAL_RANK", 0)) == 0:
     print(
         f"Training Alpaca-LoRA model with params:\n"
         f"base_model: {base_model}\n"