
Have a trajectory learner example and reform the documentation build flow #167

Merged
chaithyagr merged 89 commits into mind-inria:master from learn on Jul 26, 2024

Conversation

chaithyagr
Member

This is the updated version of #142, rebased on #156. Ideally #156 should go in first, after which we can merge this. I opened a new PR because I somehow ended up with two branches and started working on this one :P

@chaithyagr chaithyagr changed the base branch from smaps_update to master July 25, 2024 14:21
Member

@paquiteau paquiteau left a comment

LGTM

Comment on lines +103 to 111
assert_almost_allclose(
adj_data.cpu().detach(),
adj_data_ndft.cpu().detach(),
atol=1e-1,
rtol=1e-1,
mismatch=20,
)

# Check if nufft and ndft w.r.t trajectory are close in the backprop
Member Author

I had to add this check. It turns out the NDFT was not right (see above) for SENSE.
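The check being discussed compares the NUFFT adjoint against an explicit NDFT. A minimal sketch of that kind of consistency check (a hypothetical 1D setup with made-up sizes, not the actual test in this PR) is the adjoint identity for an explicit NDFT matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1D setup: 32 image grid points, 48 non-uniform samples.
n, m = 32, 48
x = np.arange(n) - n // 2          # image grid positions
k = rng.uniform(-0.5, 0.5, m)      # non-uniform k-space locations

# Explicit NDFT matrix: A[i, j] = exp(-2j * pi * k_i * x_j)
A = np.exp(-2j * np.pi * np.outer(k, x))

u = rng.standard_normal(n) + 1j * rng.standard_normal(n)
v = rng.standard_normal(m) + 1j * rng.standard_normal(m)

# Adjoint identity: <A u, v> must equal <u, A^H v>.
lhs = np.vdot(A @ u, v)
rhs = np.vdot(u, A.conj().T @ v)
assert np.allclose(lhs, rhs)
```

If the adjoint (here `A.conj().T`) misses a conjugation somewhere, this inner-product identity is the kind of test that catches it.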

Comment on lines +118 to 124
assert_almost_allclose(
gradient_ndft_ktraj.cpu().numpy(),
gradient_nufft_ktraj.cpu().numpy(),
atol=1e-2,
rtol=1e-2,
mismatch=20,
)
Member Author

We were too lenient in the tests, which is why we missed this bug.
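To illustrate how a loose mismatch budget can hide a real disagreement, here is a sketch of an `assert_almost_allclose`-style helper. Treating `mismatch` as an element-count budget is one plausible reading of the signature shown in the diff below, not necessarily the exact upstream semantics:

```python
import numpy as np
import numpy.testing as npt

def assert_almost_allclose(a, b, rtol, atol, mismatch, equal_nan=False):
    """Pass if all elements are close, or if at most `mismatch`
    elements disagree (sketch; an assumption about the semantics)."""
    try:
        npt.assert_allclose(a, b, rtol=rtol, atol=atol, equal_nan=equal_nan)
    except AssertionError:
        close = np.isclose(a, b, rtol=rtol, atol=atol, equal_nan=equal_nan)
        if np.sum(~close) > mismatch:
            raise

a = np.zeros(100)
b = np.zeros(100)
b[:10] = 1.0  # 10 elements disagree badly

# A generous budget silently passes despite the disagreement...
assert_almost_allclose(a, b, rtol=1e-7, atol=1e-7, mismatch=20)

# ...while a tighter budget raises, exposing the bug.
try:
    assert_almost_allclose(a, b, rtol=1e-7, atol=1e-7, mismatch=5)
    raised = False
except AssertionError:
    raised = True
assert raised
```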

Comment on lines +847 to +848
if self.uses_sense:
self.smaps = self.smaps.conj()
Member Author

It turns out that with SENSE, when we call fourier_op.op with toggled plans, we still need to apply smaps.conj(). Fixed that here for all the use cases.
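Why the conjugate is needed: the adjoint of coil-wise multiplication by sensitivity maps is multiplication by the conjugated maps followed by a sum over coils. A small numpy sketch (with made-up sizes) verifying the adjoint identity:

```python
import numpy as np

rng = np.random.default_rng(1)
n_coils, n_pix = 4, 64

# Hypothetical complex coil sensitivity maps.
smaps = rng.standard_normal((n_coils, n_pix)) \
    + 1j * rng.standard_normal((n_coils, n_pix))

def sense_op(image):
    # Forward: multiply the image by each coil sensitivity.
    return smaps * image[None, :]

def sense_adj(coil_data):
    # Adjoint: multiply by conj(smaps) and sum over coils.
    return np.sum(smaps.conj() * coil_data, axis=0)

x = rng.standard_normal(n_pix) + 1j * rng.standard_normal(n_pix)
y = rng.standard_normal((n_coils, n_pix)) \
    + 1j * rng.standard_normal((n_coils, n_pix))

# Adjoint identity <S x, y> == <x, S^H y> only holds with the conj().
assert np.allclose(np.vdot(sense_op(x), y), np.vdot(x, sense_adj(y)))
```

Dropping the `conj()` in `sense_adj` breaks this identity, which is exactly the class of bug the new gradient test caught.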

@@ -40,8 +40,9 @@ def assert_almost_allclose(a, b, rtol, atol, mismatch, equal_nan=False):
     try:
         npt.assert_allclose(a, b, rtol=rtol, atol=atol, equal_nan=equal_nan)
     except AssertionError as e:
-        e.message += "\nMismatched elements: "
-        e.message += f"{np.sum(~val)} > {mismatch}(={mismatch_perc*100:.2f}%)"
+        message = getattr(e, "message", "")
Member Author

PEP562: we don't always have a message attribute :P
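The underlying issue: in Python 3, built-in exceptions no longer carry a `.message` attribute, so accessing it directly raises `AttributeError`. The `getattr` fallback in the diff sidesteps that:

```python
# Python 3 exceptions have no `.message` attribute; reading it
# directly would raise AttributeError.
try:
    raise AssertionError("arrays differ")
except AssertionError as e:
    # getattr with a default safely reads an attribute that may
    # not exist.
    message = getattr(e, "message", "")

assert message == ""
assert not hasattr(AssertionError("x"), "message")
```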

@chaithyagr
Member Author

I will ask @paquiteau to go through some "new" changes :P @alineyyy, if you have already started your review, please watch out for a lot of new changes... Can you tell me whether the code is understandable? If needed, we can add more comments for clarity.

@chaithyagr chaithyagr requested a review from paquiteau July 26, 2024 09:18
Member

@paquiteau paquiteau left a comment

LGTM! Thanks for catching all these bugs!

grad_traj = torch.transpose(
    torch.sum(grad_traj, dim=1),
    0,
    1,
)
Contributor

@alineyyy alineyyy Jul 26, 2024

Thank you for making this step easier!!
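The sum-then-transpose step above can be sketched in numpy. The shapes here are assumptions for illustration (a `(n_dims, n_batch, n_samples)` gradient, reduced over the batch axis and reshaped to match the trajectory layout):

```python
import numpy as np

# Assumed shapes for illustration only.
n_dims, n_batch, n_samples = 2, 3, 5
grad_traj = np.arange(n_dims * n_batch * n_samples, dtype=float).reshape(
    n_dims, n_batch, n_samples
)

# torch.transpose(torch.sum(grad_traj, dim=1), 0, 1) in numpy terms:
# sum over the batch axis, then swap the remaining two axes so the
# result is shaped like the trajectory itself.
reduced = grad_traj.sum(axis=1).T

assert reduced.shape == (n_samples, n_dims)
```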

@samples.setter
def samples(self, value):
    self._samples_torch = value
    self.nufft_op.samples = value.detach().cpu().numpy()

Contributor

I love this setter! It avoids redundantly moving the samples between devices in the tests!
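The pattern being praised is a property setter that keeps two representations of the trajectory in sync on every assignment. A simplified, torch-free sketch (the real class stores a torch tensor and mirrors it into the NUFFT backend; here numpy stands in for both, and `TrajectoryHolder` is a hypothetical name):

```python
import numpy as np

class TrajectoryHolder:
    """Sketch of the one-assignment-updates-both setter pattern."""

    def __init__(self, samples):
        self.samples = samples  # goes through the setter below

    @property
    def samples(self):
        return self._samples

    @samples.setter
    def samples(self, value):
        self._samples = value                  # "device-side" copy
        self._samples_np = np.asarray(value)   # backend copy, kept in sync

holder = TrajectoryHolder([[0.0, 0.1], [0.2, 0.3]])
holder.samples = [[0.5, 0.5]]  # one assignment updates both copies
assert holder._samples_np.shape == (1, 2)
```

Callers never touch the backend copy directly, so it can never drift out of sync with the tensor the optimizer updates.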

Contributor

@alineyyy alineyyy left a comment

LGTM! Thank you for making these changes; they make things simpler for users (including me XD)! The code and the comments are also clear enough for me to understand!

@chaithyagr chaithyagr merged commit 2d37d0e into mind-inria:master Jul 26, 2024
11 checks passed
@chaithyagr chaithyagr deleted the learn branch July 26, 2024 12:20
Development

Successfully merging this pull request may close these issues.

A simple example to showcase autodiff
4 participants