
allowing parameters to be initialised from a list #14

Merged · 3 commits · Oct 7, 2024

Conversation

kathryn-baker (Contributor)

This resolves the following error when initialising parameters from a JSON file, which by default produces lists rather than tensors:

File ~/notebooks/calibration-training/training/calibration_modules/decoupled_linear.py:154, in DecoupledLinearInput.__init__(self, model, x_size, x_mask, **kwargs) ...
--> 145         value = float(value) * torch.ones(size)
    146     value_size = value.shape
    147     if value.dim() == 1 and isinstance(size, int):

TypeError: float() argument must be a string or a real number, not 'list'
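The error occurs because the initialisation code assumes `value` is a scalar and calls `float()` on it directly. A minimal sketch of the kind of fix this PR makes, with an illustrative helper name and branching that is not the repository's actual code:

```python
import torch

def as_value_tensor(value, size):
    # Lists (e.g. parsed from a JSON file) are converted element-wise;
    # scalars are broadcast to a tensor of the requested size.
    if isinstance(value, (list, tuple)):
        value = torch.tensor(value, dtype=torch.float32)
    elif not isinstance(value, torch.Tensor):
        value = float(value) * torch.ones(size)
    return value

# A float broadcasts, a JSON-style list converts directly.
print(as_value_tensor(2.0, 3))         # tensor([2., 2., 2.])
print(as_value_tensor([1.0, 2.0], 2))  # tensor([1., 2.])
```

The `isinstance` check on `list`/`tuple` runs before the scalar path, so list-valued JSON entries never reach the `float()` call that raised the `TypeError`.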

t-bz (Owner) left a comment

I think we should definitely support this. Could you rebase to fix the CI tests and add a test similar to this?

kathryn-baker (Contributor, Author)

I've added the test you suggested and pulled in your changes from the main branch, but I seem to get a failing test in test_save_and_load(), which I didn't modify. Is it possible it's a platform issue?

t-bz (Owner) left a comment

It seems like the test passed on the same platform before. I'll look into it if it keeps recurring.

t-bz merged commit 67a646b into t-bz:main on Oct 7, 2024
2 participants