
compute operation Normalising #2126

Merged: 3 commits from compute_norm merged into main on Jun 5, 2024
Conversation

@ashmeigh (Collaborator) commented Mar 12, 2024

Issue

This issue was part of #2051

Description

This pull request addresses the ongoing effort to standardize the execution of operations by transitioning from the partial function style to a structured compute_function approach. In this update, several operations have been refactored to adhere to the new standard. The modifications ensure a clearer separation between the setup and execution phases of each operation, improving the maintainability and readability of the codebase.

Changes

Refactored multiple operations to implement the new compute_function approach.
Introduced static compute_function methods in each affected operation module to encapsulate the core logic for executing the operation on a per-slice basis.
Updated the calling code to use the compute_function method instead of the previous partial function style (see the sketch below).
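
For illustration, here is a minimal sketch of the compute_function pattern described above. The class name, the filter_func driver and the exact compute_function signature are assumptions for this example rather than the project's actual API; the point is the split between a setup phase and a static per-slice execution method:

import numpy as np


class ExampleFilter:
    """Illustrative operation written in the compute_function style."""

    @staticmethod
    def filter_func(data: np.ndarray, scale: float = 2.0) -> np.ndarray:
        # Setup phase: gather parameters into a plain dict, then run the
        # per-slice core logic (in the real code this would be dispatched to
        # a shared parallel execute helper rather than a plain loop).
        params = {'scale': scale}
        for i in range(data.shape[0]):
            ExampleFilter.compute_function(i, data, params)
        return data

    @staticmethod
    def compute_function(i: int, array: np.ndarray, params: dict) -> None:
        # Execution phase: the core logic, applied to a single slice in place.
        array[i] *= params['scale']


stack = np.ones((3, 4, 4), dtype=np.float32)
ExampleFilter.filter_func(stack, scale=2.0)
print(stack[:, 0, 0])  # [2. 2. 2.]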

@coveralls commented May 20, 2024

Coverage Status

coverage: 72.685% (-0.05%) from 72.735% when pulling 204d1f1 on compute_norm into 619bdd4 on main.

@samtygier-stfc (Collaborator) commented:

Here are the notes from our catch-up.

Neutron flux is variable, so each image has a different exposure. The aim of the algorithm is to make the mean of the air region in each slice equal across the stack, by multiplying each frame by a normalisation factor. So in both cases the final step is to have a normalisation array the length of the number of slices, and multiply each slice by the corresponding value.

For the Stack Average case, we take the average of the air region for each slice to get the normalisation array. Then this itself needs normalising to its own mean, before applying to the full image stack.
A non-parallel version would look like:

air_means = images.data[:, air_top:air_bottom, air_left:air_right].mean(axis=(1, 2))
# [50, 48, 52, 49, ...]
normed_air_means = air_means / air_means.mean()
# [1, 0.96, 1.04, ...]

for i in range(images.data.shape[0]):
    images.data[i] /= normed_air_means[i]

# so if you looked at the means again, they would all be the same value
new_means = images.data[:, air_top:air_bottom, air_left:air_right].mean(axis=(1, 2))
# [50, 50, 50, 50, ...]

For the Flat Field mode, we normalise to the flat field, e.g.

air_means = images.data[:, air_top:air_bottom, air_left:air_right].mean(axis=(1, 2))
# [50, 48, 52, 49, ...]
flat_means = flat_field.data[:, air_top:air_bottom, air_left:air_right].mean(axis=(1, 2))
# [40, 38, 42, 39, ...]
normed_air_means = air_means / flat_means.mean()  # 40

for i in range(images.data.shape[0]):
    images.data[i] /= normed_air_means[i]

# again if you looked at the means here, they'd all be equal to the mean of the region in the flat images
air_means = images.data[:, air_top:air_bottom, air_left:air_right].mean(axis=(1, 2))
# [40, 40, 40, 40, ...]

The compute function for calculating means will need to take a list of arrays: the images and a 1D array to put the means into. It will look something like:

arrays[1][i] = arrays[0][i, air_top:air_bottom, air_left:air_right].mean()
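
As a rough, non-parallel sketch of what such a compute function could look like (the function name, the params keys and the driver loop below are assumptions for illustration, not the merged implementation):

import numpy as np


def compute_air_means(i, arrays, params):
    # arrays[0] is the image stack, arrays[1] is the 1D output array of
    # per-slice means; the air-region bounds come in through a plain dict.
    top, bottom = params['air_top'], params['air_bottom']
    left, right = params['air_left'], params['air_right']
    arrays[1][i] = arrays[0][i, top:bottom, left:right].mean()


rng = np.random.default_rng(0)
images = rng.random((5, 10, 10))
air_means = np.zeros(images.shape[0])
params = {'air_top': 2, 'air_bottom': 6, 'air_left': 3, 'air_right': 8}
for i in range(images.shape[0]):
    compute_air_means(i, [images, air_means], params)
print(air_means)  # one mean per slice, ready to be normalised and applied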

Added Validation Step.
Fixed issue with normalization factor.
@samtygier-stfc (Collaborator) left a comment

Passing the tests now.

A few small tweaks. Also, once everything is working, the debug print calls can be removed.

Comment on lines 90 to 94
air_means = np.array([
    RoiNormalisationFilter._calc_mean(images.data[i], region_of_interest.left, region_of_interest.top,
                                      region_of_interest.right, region_of_interest.bottom)
    for i in range(images.data.shape[0])
])

We can do this in a single numpy call:

air_means = images.data[:, region_of_interest.top:region_of_interest.bottom, region_of_interest.left:region_of_interest.right].mean(axis=(1, 2))

Then _calc_mean() can be removed.


print("Global Params:", global_params)

return global_params

Could remove the dict and just pass back normed_air_means.

Updated the key name in the compute_function to match filter_func.
@samtygier-stfc (Collaborator) left a comment

operations_tests.py passes and the code looks good. Thanks.

@samtygier-stfc added this pull request to the merge queue Jun 5, 2024
Merged via the queue into main with commit 937bb36, Jun 5, 2024
8 checks passed
@samtygier-stfc deleted the compute_norm branch June 5, 2024 16:18