LVAE losses tests #180
Conversation
I think we should stick with the original plan of only using the new algorithms (HDN, denoiSplit) with the Lightning API and not make them available through the CAREamist API.

We are very close to a 0.1.0 release, which means that we might get an influx of users, and incorporating code that does not work or might break can create a lot of issues.

If the modules are in the code base, they can be instantiated independently with the Lightning API; that will be more than enough for Ashesh's paper. Once we are confident the quality of the code (integration, testing, documentation) is high enough, we can make it available through the CAREamist.

So in the meantime, I would suggest keeping the configuration clean (removing LVAE, NoiseModels, Likelihood, and the new entries in the supported enums) and adding placeholder NotImplementedErrors in CAREamist.

For now, NoiseModel and Likelihood could be passed directly as parameters along with the algorithm configuration. We can also discuss alternatives (e.g. allowing additional parameters in the Pydantic model via ConfigDict, but that would require a lot of asserts in the LVAE-related code).
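The ConfigDict alternative could look roughly like this. This is a minimal sketch with hypothetical field names (`AlgorithmConfig`, `noise_model`), not the actual CAREamics configuration:

```python
from pydantic import BaseModel, ConfigDict


class AlgorithmConfig(BaseModel):
    # Hypothetical minimal model; the real CAREamics config has more fields.
    # extra="allow" lets callers attach fields such as a noise model without
    # declaring them in the schema.
    model_config = ConfigDict(extra="allow")

    algorithm: str
    loss: str


# Extra fields are kept on the instance but are NOT validated by Pydantic.
cfg = AlgorithmConfig(
    algorithm="musplit", loss="musplit", noise_model={"path": "nm.npz"}
)
```

The downside, as noted above, is that the extra fields bypass validation entirely, so the LVAE-related code would have to assert their presence and types itself.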
src/careamics/careamist.py
Outdated
self.cfg.algorithm_config.model.architecture
== SupportedArchitecture.LVAE
):  # TODO fuckin mypy thinks model "already defined on line 158"
self.model: VAEModule = VAEModule(
I believe what you can do about this is to define a type at the top of the module:
Module = Union[FCNModule, VAEModule]
And use that type on line 158.
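A minimal sketch of that suggestion, with stand-in classes (the real FCNModule and VAEModule are Lightning modules with more machinery):

```python
from typing import Union


class FCNModule:  # stand-in for careamics' FCNModule
    def __init__(self, algorithm_config=None):
        self.algorithm_config = algorithm_config


class VAEModule:  # stand-in for careamics' VAEModule
    def __init__(self, algorithm_config=None):
        self.algorithm_config = algorithm_config


# Module-level alias: annotate self.model (or a local) once with this union,
# so mypy does not flag a redefinition when each branch assigns a
# different concrete class.
Module = Union[FCNModule, VAEModule]


def build_model(architecture: str, config=None) -> Module:
    model: Module  # single annotation covering both branches
    if architecture == "UNet":
        model = FCNModule(algorithm_config=config)
    else:
        model = VAEModule(algorithm_config=config)
    return model
```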
But mypy isn't complaining about the type?
One needs to try, I haven't. 😄
self.model: VAEModule = VAEModule(
    algorithm_config=self.cfg.algorithm_config,
)
else:
The configuration allows custom models that are neither UNet nor LVAE (Custom, I believe). This breaks that possibility. We could potentially consider that they should be FCNModule by default. Not sure what that entails.
Let's discuss
Now that the configuration has only custom and UNet models, these statements should go. Once the LVAE is in the configuration, we will need to reassess how to deal with custom models.
src/careamics/models/lvae/lvae.py
Outdated
import ml_collections
import ml_collections  # TODO: refactor this out
I don't see it used except in the declaration of config: ml_collections, and then this config is not used (as far as I can tell). So kick it out, as we discussed, to allow the tests to pass (hopefully)!
I will agree and say let's make the minimum number of modifications possible to the rest of the code until the LVAE pipeline is more complete, maybe not even adding not-implemented-errors to
It seems there is still a
So I still think there are some changes here that are out of scope for the PR (also, the current PR title is not descriptive of the changes). I see, though, that it's pretty hard to write the Pydantic model when the LVAE class does need some refactoring to extract components that don't necessarily need to be contained within the class. Maybe the splitting of the configuration you mentioned will help.
self.model = CAREamicsModule(
    algorithm_config=self.cfg.algorithm_config,
)
if (
    self.cfg.algorithm_config.model.architecture
    == SupportedArchitecture.UNET
):
    self.model = FCNModule(
        algorithm_config=self.cfg.algorithm_config,
    )
else:
    raise NotImplementedError("Architecture not supported.")
The repetition of the inner if-else statement can be removed by separating config initialisation from model initialisation, i.e.:
if isinstance(source, Path) and source.is_file():
if source.suffix == ".yaml" or source.suffix == ".yml":
self.cfg = load_configuration(source)
self.model = None
else:
self.model, self.cfg = load_pretrained(source)
elif isinstance(source, Configuration):
self.cfg = source
self.model = None
else:
raise ValueError("appropriate message")
# instantiate model
if self.model is None:
if (
self.cfg.algorithm_config.model.architecture
== SupportedArchitecture.UNET
):
self.model = FCNModule(
algorithm_config=self.cfg.algorithm_config,
)
elif (
self.cfg.algorithm_config.model.architecture
== SupportedArchitecture.LVAE
):
raise NotImplementedError(
"LVAE is currently only supported through lightning api"
)
else:
raise NotImplementedError("Architecture not supported.")
^ Just an example; it can probably be improved further. Extracting smaller, testable sub-functions might be good too.
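For the "smaller testable sub-functions" idea, a rough sketch could look like the following. The names and return values here are illustrative stand-ins, not the real CAREamist internals (which load actual configurations and Lightning modules):

```python
from pathlib import Path


def resolve_config(source):
    """Return (cfg, model); model is None unless loaded from a checkpoint.

    Stand-in for the source-dispatch step: yaml path -> load_configuration,
    Configuration object -> use directly, anything else -> error.
    """
    if isinstance(source, Path) and source.suffix in (".yaml", ".yml"):
        return {"architecture": "UNet"}, None  # stand-in for load_configuration
    if isinstance(source, dict):  # stand-in for a Configuration instance
        return source, None
    raise ValueError(f"Unsupported source: {source!r}")


def instantiate_model(cfg):
    """Stand-in for the architecture-dispatch step."""
    if cfg["architecture"] == "UNet":
        return "FCNModule"  # stand-in for FCNModule(algorithm_config=...)
    if cfg["architecture"] == "LVAE":
        raise NotImplementedError(
            "LVAE is currently only supported through the Lightning API"
        )
    raise NotImplementedError("Architecture not supported.")
```

Each helper can then be unit-tested in isolation, which is harder when both decisions live inside one nested if-else in `__init__`.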
Those things are gone in the config refactoring PR.
# Mandatory fields
algorithm: Literal["n2v", "care", "n2n", "custom"]  # defined in SupportedAlgorithm
"""Name of the algorithm, as defined in SupportedAlgorithm."""

# defined in SupportedAlgorithm
algorithm: Literal["n2v", "care", "n2n"]
loss: Literal["n2v", "mae", "mse"]
"""Loss function to use, as defined in SupportedLoss."""

model: Union[UNetModel, VAEModel, CustomModel] = Field(discriminator="architecture")
"""Model architecture to use, defined in SupportedArchitecture."""
model: Union[UNetModel, CustomModel] = Field(discriminator="architecture")

# Optional fields
optimizer: OptimizerModel = OptimizerModel()
"""Optimizer to use, defined in SupportedOptimizer."""

lr_scheduler: LrSchedulerModel = LrSchedulerModel()
"""Learning rate scheduler to use, defined in SupportedScheduler."""
Some docstrings have been deleted, probably by mistake.
CUSTOM = (
    "Custom"  # TODO all the others tags are small letters, except the architect
    "custom"  # TODO all the others tags are small letters, except the architect
The TODO can be removed.
self.likelihood_gm = GaussianLikelihood(
    self.decoder_n_filters,
    self.target_ch,
    predict_logvar=self.predict_logvar,
    logvar_lowerbound=self.logvar_lowerbound,
    conv2d_bias=self.topdown_conv2d_bias,
)
When is self.likelihood_gm used? As far as I can see, it doesn't seem to be used by any of the LVAE class' methods. If it is only used by external functions, maybe it can be removed from the LVAE class and passed to those functions explicitly.
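A minimal sketch of that refactoring, with illustrative names (this is not the real CAREamics API, just the dependency-injection shape being suggested):

```python
class GaussianLikelihoodStub:
    """Stand-in for a Gaussian likelihood with unit variance."""

    def log_likelihood(self, pred: float, target: float) -> float:
        # Gaussian log-likelihood up to an additive constant.
        return -0.5 * (pred - target) ** 2


def reconstruction_loss(pred, target, likelihood) -> float:
    # The likelihood is an explicit argument to the loss function,
    # not hidden state stored as an attribute of the LVAE model.
    return -sum(
        likelihood.log_likelihood(p, t) for p, t in zip(pred, target)
    ) / len(pred)


# The caller constructs the likelihood next to the loss, outside the model.
loss = reconstruction_loss([0.0, 0.0], [1.0, 1.0], GaussianLikelihoodStub())
```

This keeps the model free of objects it never calls, and makes the loss function testable with any likelihood.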
assert act in activations


def test_activation_wrong_values():
Maybe rename it to test_activation_unknown_value?
I don't know what that function is even doing.
It's just testing that an error is raised if "nonlinearity" is not a supported activation, no?
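For reference, the shape of such a test, with illustrative names (the real helper and set of supported activations live in CAREamics):

```python
SUPPORTED_ACTIVATIONS = {"relu", "leaky_relu", "elu", "tanh"}  # illustrative


def get_activation(name: str) -> str:
    """Stand-in for an activation lookup that rejects unknown names."""
    if name not in SUPPORTED_ACTIVATIONS:
        raise ValueError(f"Unknown activation: {name}")
    return name


def test_activation_unknown_value() -> bool:
    # "nonlinearity" is not a supported activation, so an error must be raised.
    try:
        get_activation("nonlinearity")
    except ValueError:
        return True  # error raised as expected
    return False
```

In a pytest suite the try/except would normally be `pytest.raises(ValueError)` instead.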
I pushed a fix for the doctests as promised.
Comment https://github.com/CAREamics/careamics/pull/180/files#r1675427899 still needs to be fixed regarding the
target = aux[0]
self.loss_parameters.current_epoch = self.current_epoch
self.loss_parameters.inputs = x
self.loss_parameters.mask = ~((target == 0).reshape(len(target), -1).all(dim=1))
Please add comments about what is happening in these steps.
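For example, the mask line could be annotated like this. NumPy is used here as a stand-in for the torch code, and the interpretation (flagging samples whose target is entirely zero) is my reading of the line, not a statement from the PR:

```python
import numpy as np

# Batch of 2 targets, each 2x2 (stand-in for a torch tensor).
target = np.array(
    [
        [[0.0, 0.0], [0.0, 0.0]],  # sample 0: all zeros -> masked out
        [[1.0, 0.0], [2.0, 3.0]],  # sample 1: has signal -> kept
    ]
)

# Flatten each sample, check whether every value is zero, then invert:
# mask[i] is True when sample i contains at least one non-zero target value,
# so all-zero (e.g. padded or empty) targets are excluded from the loss.
mask = ~((target == 0).reshape(len(target), -1).all(axis=1))
```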
loss: Literal["n2v", "mae", "mse"]
"""Loss function to use, as defined in SupportedLoss."""
"""Name of the algorithm, as defined in SupportedAlgorithm."""
The algorithm docstring is used here instead of the loss docstring.
raise ValueError("VAE are currently not implemented.")
if (self.algorithm == "custom") != (self.model.architecture == "custom"):
    raise ValueError(
        "Algorithm and model architecture must be both `custom` or not."
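A minimal sketch of the check being discussed, as a standalone function with a reworded message (illustrative only; in CAREamics this lives inside a Pydantic validator):

```python
def validate_custom_consistency(algorithm: str, architecture: str) -> None:
    """Reject configs where exactly one of algorithm/architecture is "custom".

    (a != b) on two booleans is an XOR: it is True only when one side is
    "custom" and the other is not, which is the inconsistent case.
    """
    if (algorithm == "custom") != (architecture == "custom"):
        raise ValueError(
            "Algorithm and model architecture must either both be `custom` "
            "or both be non-custom."
        )


validate_custom_consistency("custom", "custom")  # consistent: no error
validate_custom_consistency("n2v", "UNet")       # consistent: no error
```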
The grammar could be clearer: "or not" what?
I guess it was meant to be "None"!
Description
Changes Made
Added:
Known problems
Please ensure your PR meets the following requirements: