
Cannot disable progress bar with TrainingConfig.enable_progress_bar when using CAREamist API #389

Open
melisande-c opened this issue Feb 4, 2025 · 1 comment
Labels
bug Something isn't working

Comments

@melisande-c
Member

Describe the bug

Setting `enable_progress_bar=False` in `TrainingConfig` results in a `MisconfigurationException` from Lightning, because `ProgressBarCallback` is added to the callbacks in `CAREamist._define_callbacks` regardless of the flag.

To Reproduce

Code snippet to reproduce the behaviour:

from careamics import CAREamist
from careamics.config import create_n2v_configuration

config = create_n2v_configuration(
    experiment_name="ht_lif24",
    data_type="array",
    axes="SYXC",
    n_channels=2,
    patch_size=(64, 64),
    batch_size=64,
    num_epochs=3,
)
config.training_config.enable_progress_bar = False
careamist = CAREamist(source=config)
# train_data: the training array (S, Y, X, C)
careamist.train(train_source=train_data, val_minimum_split=5)

Error message:

---------------------------------------------------------------------------
MisconfigurationException                 Traceback (most recent call last)
Cell In[9], line 1
----> 1 careamist = CAREamist(source=config)
      2 careamist.train(train_source=train_data,val_minimum_split=5)

File /localscratch/miniforge3/envs/microSplit/lib/python3.9/site-packages/careamics/careamist.py:201, in CAREamist.__init__(self, source, work_dir, callbacks)
    198     experiment_logger = [csv_logger]
    200 # instantiate trainer
--> 201 self.trainer = Trainer(
    202     max_epochs=self.cfg.training_config.num_epochs,
    203     precision=self.cfg.training_config.precision,
    204     max_steps=self.cfg.training_config.max_steps,
    205     check_val_every_n_epoch=self.cfg.training_config.check_val_every_n_epoch,
    206     enable_progress_bar=self.cfg.training_config.enable_progress_bar,
    207     accumulate_grad_batches=self.cfg.training_config.accumulate_grad_batches,
    208     gradient_clip_val=self.cfg.training_config.gradient_clip_val,
    209     gradient_clip_algorithm=self.cfg.training_config.gradient_clip_algorithm,
    210     callbacks=self.callbacks,
    211     default_root_dir=self.work_dir,
    212     logger=experiment_logger,
    213 )
    215 # place holder for the datamodules
    216 self.train_datamodule: Optional[TrainDataModule] = None

...

File /localscratch/miniforge3/envs/microSplit/lib/python3.9/site-packages/pytorch_lightning/trainer/connectors/callback_connector.py:134, in _CallbackConnector._configure_progress_bar(self, enable_progress_bar)
    131     # otherwise the user specified a progress bar callback but also
    132    # elected to disable the progress bar with the trainer flag
    133     progress_bar_callback = progress_bars[0]
--> 134     raise MisconfigurationException(
    135         "Trainer was configured with `enable_progress_bar=False`"
    136         f" but found `{progress_bar_callback.__class__.__name__}` in callbacks list."
    137     )
    139 if enable_progress_bar:
    140     progress_bar_callback = TQDMProgressBar()

MisconfigurationException: Trainer was configured with `enable_progress_bar=False` but found `ProgressBarCallback` in callbacks list.

Suggested solution

In `CAREamist._define_callbacks`, `ProgressBarCallback` should only be added to the callbacks list if `config.training_config.enable_progress_bar` is `True`. The code that needs to change is below.

# checkpoint callback saves checkpoints during training
self.callbacks.extend(
    [
        HyperParametersCallback(self.cfg),
        ModelCheckpoint(
            dirpath=self.work_dir / Path("checkpoints"),
            filename=self.cfg.experiment_name,
            **self.cfg.training_config.checkpoint_callback.model_dump(),
        ),
        ProgressBarCallback(),
    ]
)
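A minimal sketch of the proposed fix, using stand-in callback classes so it runs without careamics or pytorch_lightning installed; the point is only that the progress bar callback is appended conditionally:

```python
# Stand-ins for the real classes in careamics / pytorch_lightning.
class HyperParametersCallback: ...
class ModelCheckpoint: ...
class ProgressBarCallback: ...


def define_callbacks(enable_progress_bar: bool) -> list:
    """Build the Trainer callback list, mirroring CAREamist._define_callbacks."""
    callbacks = [HyperParametersCallback(), ModelCheckpoint()]
    # Only add the progress bar callback when the config asks for it,
    # so it never conflicts with Trainer(enable_progress_bar=False).
    if enable_progress_bar:
        callbacks.append(ProgressBarCallback())
    return callbacks
```

With this shape, the callback list passed to `Trainer` is always consistent with the `enable_progress_bar` flag, so Lightning's `_configure_progress_bar` check no longer raises.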

@melisande-c melisande-c added the bug Something isn't working label Feb 4, 2025
@melisande-c melisande-c changed the title [BUG] Cannot disable progress bar with TrainingConfig.enable_progress_bar when using CAREamist API Feb 4, 2025
@melisande-c
Member Author

melisande-c commented Feb 4, 2025

One may also question whether the progress bar flag belongs in the training config at all, since it does not actually affect model training; it could instead be an argument of the CAREamist API.
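If that alternative were adopted, the flag might look something like this. This is a hypothetical signature, not the current API; the class and parameter names are stand-ins for illustration only:

```python
class CAREamist:
    """Simplified stand-in for the real CAREamist wrapper."""

    def __init__(self, source, work_dir=None, enable_progress_bar: bool = True):
        # Hypothetical: the flag lives on the API wrapper rather than in
        # TrainingConfig, and is forwarded to Trainer(enable_progress_bar=...).
        self.source = source
        self.work_dir = work_dir
        self.enable_progress_bar = enable_progress_bar
```

This would keep presentation concerns out of the (serialisable) training configuration.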
