Reorienting rodent NIFTIs #915

Open

araikes opened this issue Feb 4, 2025 · 7 comments

@araikes

araikes commented Feb 4, 2025

Hi @neurolabusc,

I work a lot with rodent imaging and saw #761, which is insanely cool and which I'm attempting to validate. In a limited number of tests using MRtrix's dwigradcheck, it appears that the bvecs require no flipping or reordering for the NIFTI that is generated.

I have a question, though. When converted, my images come out in the following orientation:

[Image: screenshot of the converted volume's orientation]

I wholly understand that this is faithful to what is written in the DICOM, and I read #666, which says I can reorient using MRIcroGL. That works great (though, even on a Unix system, it didn't report the rotation matrix as suggested in that other issue). I have ground-truth knowledge (at least in some of my scans) about where L/R are, and they appear to be faithfully written, so it's just about rotating and flipping while preserving that.

However, I'm struggling to figure that out in a pythonic way. I've found I can match the overall behavior of MRIcroGL's output, including data dimensions and orientation (nibabel's aff2axcodes confirms that I go from ASL to RAS), using MRtrix by setting the strides to 1,2,3,4 (compels the image to RAS), reordering my axes to 0,2,1 (swaps the A/P and S/I labels), and then flipping the new axis 2 (flips the image so that the top of the brain is labeled "S").
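For reference, something like the following nibabel snippet is what I was attempting on the Python side (file names are hypothetical); it works out the permutation/flips to RAS, applies them to the voxel data, and updates the affine to match:

    import nibabel as nib
    from nibabel.orientations import (
        io_orientation, axcodes2ornt, ornt_transform,
        apply_orientation, inv_ornt_aff,
    )

    img = nib.load("dwi.nii.gz")            # hypothetical input file
    print(nib.aff2axcodes(img.affine))      # current axis codes, e.g. ('A', 'S', 'L')

    # Permutation/flips that take the current orientation to RAS
    transform = ornt_transform(io_orientation(img.affine),
                               axcodes2ornt(("R", "A", "S")))

    # Apply them to the voxel data and update the affine so world coordinates are preserved
    data = apply_orientation(img.get_fdata(), transform)
    affine = img.affine @ inv_ornt_aff(transform, img.shape)
    nib.save(nib.Nifti1Image(data, affine, img.header), "dwi_ras.nii.gz")

(nibabel's as_closest_canonical does essentially the same thing in one call; computing the transform explicitly just makes it reusable for the bvecs.)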

Functionally, I'm just trying to figure out whether this is somehow in the ballpark of what MRIcroGL is actually doing, so I know whether I've landed on a sane approach. Because I can store the bvecs with the image in MRtrix format, this keeps my bvecs aligned with the image through the rotations/flips. Happy to provide any other information I can. If it's useful at all, the MRIcroGL selections for reorientation are "Green", "Yellow", "Blue".
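For completeness, the same permutation/flips can in principle be applied to the bvec rows (a sketch only, assuming FSL-style bvecs expressed along the voxel axes and reusing the transform array from the snippet above; the FSL convention has sign subtleties, so I'd still re-check the result with dwigradcheck):

    import numpy as np

    bvecs = np.loadtxt("dwi.bvec")          # hypothetical FSL-style bvec file, shape (3, N)
    new_bvecs = np.zeros_like(bvecs)
    # Row i of `transform` sends input axis i to output axis transform[i, 0]
    # with sign transform[i, 1]
    for in_ax, (out_ax, flip) in enumerate(transform):
        new_bvecs[int(out_ax)] = flip * bvecs[in_ax]
    np.savetxt("dwi_ras.bvec", new_bvecs, fmt="%.6f")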

Thanks in advance,

See below for MRtrix mrinfo of my original image and the oriented one from MRIcroGL:

MRtrix mrinfo (original file: left; reoriented file: right):

[Image: mrinfo output for the original (left) and reoriented (right) files]

FSL's fslhd (original file: left; reoriented file: right):

[Image: fslhd output for the original (left) and reoriented (right) files]

@neurolabusc
Collaborator

If you launch MRIcroGL from the command line, it will tell you the order of the dimensions (1 2 3) and whether they must be flipped, which lets you swizzle things:

  Reorient Dimensions -3 1 2

The only challenge is getting the origin translation correct (the fourth column (Tx Ty Tz) of the 4x4 spatial transformation matrix). It is easiest to think of this as the spatial position of the first voxel with respect to the origin. I am not a Python guru, but I think you can get help from the nibabel folks, or adopt the code from a Python conform function.
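As a rough, unvetted illustration in Python (hypothetical file names; assumes the reoriented file was written as in the earlier snippet): the fourth column should equal the world position of the first voxel, and a pure permutation/flip should leave the world position of the volume centre unchanged:

    import numpy as np
    import nibabel as nib

    orig = nib.load("dwi.nii.gz")        # hypothetical original
    ras = nib.load("dwi_ras.nii.gz")     # hypothetical reoriented copy

    # Fourth column of each affine: world (mm) position of that image's voxel (0, 0, 0)
    print(orig.affine[:3, 3], ras.affine[:3, 3])

    def centre_mm(img):
        centre_vox = np.append((np.array(img.shape[:3]) - 1) / 2.0, 1.0)
        return (img.affine @ centre_vox)[:3]

    # These should match if the translation was updated correctly
    print(centre_mm(orig), centre_mm(ras))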

@araikes
Author

araikes commented Feb 4, 2025

Ah... Didn't think about launching from the command line.

Thanks for your help. I'll touch base with others so that I can figure out how to generalize this across my data.

@araikes
Author

araikes commented Feb 5, 2025

Python failed me, but that's OK because I can get MRtrix to mimic the MRIcroGL outputs (down to the s-form and origin rotations). This is especially nice with MRtrix because it means the bvecs stay aligned with the image (assuming they are valid in the first place; dwigradcheck seems to suggest that, at the very least, the orientations are sane without flips or axis reordering; see below for my n=1 test).

That said, I have one final follow-up question. The JSONs don't have the phase encoding direction. I'm assuming this information isn't available in the DICOM, but I know from the Bruker method file that my slice orientation was sagittal and the read orientation was Head-Foot. Is there any way to infer this information from the DICOM? I can provide a DICOM if useful, but I'm assuming the information isn't there or you would have found it.

RAS reoriented image with rotated bvecs to match

[Image: RAS reoriented image with bvecs rotated to match]

@neurolabusc
Collaborator

  1. This looks nice. Can you share a sample dataset with my institutional email?

  2. Siemens, Canon and UIH use private tags for phase encoding polarity; Philips does not store this in the DICOMs. I am unsure what Bruker does. Feel free to contact Bruker to see if they have specific tags. Alternatively, if you have identical DICOMs where the only difference is the polarity, use dcmdump or gdcmdump to extract the tags (e.g. dcmdump ap.dcm > ap.txt; dcmdump pa.dcm > pa.txt; diff ap.txt pa.txt) to see if the difference is preserved in the DICOM (the same idea is sketched in Python below). I do not have access to Bruker equipment, so I am unable to evaluate this.
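A pydicom-based version of the same diff idea (file names hypothetical); it prints every element whose value differs between the two otherwise identical acquisitions, which is where a phase-encoding polarity tag would show up if Bruker stores one:

    import pydicom

    ap = pydicom.dcmread("ap.dcm", stop_before_pixels=True)
    pa = pydicom.dcmread("pa.dcm", stop_before_pixels=True)

    # Compare every element present in either file
    for tag in sorted({el.tag for el in ap} | {el.tag for el in pa}):
        a, b = ap.get(tag), pa.get(tag)
        if getattr(a, "value", None) != getattr(b, "value", None):
            keyword = getattr(a, "keyword", "") or getattr(b, "keyword", "")
            print(tag, keyword, getattr(a, "value", None), getattr(b, "value", None))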

@araikes
Author

araikes commented Feb 6, 2025

I'll try to share one either today or tomorrow. We were running an ex-vivo overnight scan last night, and I appended some b0 read (LR, RostralCaudal, VentralDorsal)/phase (dictated by the other settings)/slice (Axial, Coronal, Sagittal) permutations. Additionally, Bruker has added a direct option for a "reverse phase" in one of their most recent updates to Paravision 360, so I ran that as well for all of the permutations. If the phase encoding direction information is encoded in the DICOM, it should be findable using the breadth of permutations.

@araikes
Author

araikes commented Feb 6, 2025

And now I don't know quite what to think... The acquisition we were using last night does not have the bvecs inherently aligned with the image. It's a different acquisition scheme from the preceding one, and the image itself still requires a 90-degree counterclockwise rotation, but the vectors don't appear to be oriented correctly as written.

Unclear and uncertain... but to be investigated.

@neurolabusc
Collaborator

I would suggest you follow the guidelines in the dcm2niix manual, in particular the validate vectors document. Specifically, acquire multiple series: one where the volume is orthogonal to the scanner bore and one where angulations are applied. The FSL/BIDS BVEC vectors are defined with respect to image space, while some manufacturers use world-space coordinates. Note that Bruker has historically encoded vectors in a manner that does not allow us to discriminate the polarity, which limits the ability of tools such as FSL's eddy to remove geometric distortions. That was several years ago, so perhaps modern Bruker scanners provide more details. This may be a great opportunity to work with the Bruker engineers affiliated with your site.
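To make the image-space versus world-space distinction concrete, here is a very rough Python sketch (hypothetical file names, not dcm2niix's actual logic, and it ignores FSL's extra sign conventions) of expressing world-space gradient directions along the voxel axes of an image; the result would still need to be validated as described above:

    import numpy as np
    import nibabel as nib

    img = nib.load("dwi.nii.gz")                 # hypothetical image
    world_vecs = np.loadtxt("world_space.bvec")  # hypothetical 3 x N world-space directions

    # Rotation part of the affine: columns are the voxel axes expressed in world (mm) space
    R = img.affine[:3, :3]
    R = R / np.linalg.norm(R, axis=0)            # drop voxel sizes, keep direction cosines

    # Express each world-space direction in voxel-axis coordinates
    image_vecs = np.linalg.inv(R) @ world_vecs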
