Update pre commit #607

Merged 27 commits on Feb 11, 2025

Commits
e9c7f82
[pre-commit.ci] pre-commit autoupdate
pre-commit-ci[bot] Dec 16, 2024
2d99b9e
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 16, 2024
f799ef5
Add testing and packaging files
ktactac-ornl Dec 17, 2024
a33e8c7
pre-commit - use ruff instead of black
glass-ships Dec 18, 2024
24c1257
Add notebooks to excludes in pyproject.toml
glass-ships Dec 18, 2024
2a0027f
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 18, 2024
715faa5
fix codecov step
ktactac-ornl Dec 18, 2024
b719e67
New test prep stage
ktactac-ornl Dec 18, 2024
8ee3110
change data submodule url
ktactac-ornl Dec 18, 2024
6727225
specify cov files
ktactac-ornl Dec 18, 2024
889145f
add some noqa
glass-ships Dec 19, 2024
4e137d0
Merge remote-tracking branch 'origin/ewm8190_setup_github_ci' into up…
glass-ships Dec 19, 2024
ab800eb
address lint errors
glass-ships Dec 19, 2024
33b2c1b
forgot .items() in test spice xml parser
glass-ships Dec 19, 2024
b35a152
try pip install in test preparation step
glass-ships Dec 19, 2024
2de5055
merge changes from next
glass-ships Jan 29, 2025
25319d4
Merge branch 'next' into update-pre-commit
glass-ships Feb 6, 2025
657f074
restore from next and exclude notebooks & test/examples
glass-ships Feb 6, 2025
f8a43e9
fix geometry.py
glass-ships Feb 6, 2025
2b965e4
Update .pre-commit-config.yaml
glass-ships Feb 10, 2025
f7e5830
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Feb 10, 2025
186817a
minor formatting in test_transmission
glass-ships Feb 10, 2025
1bf11f2
minor formatting in test_simulated_events
glass-ships Feb 10, 2025
6bf09bd
Merge branch 'next' into update-pre-commit
glass-ships Feb 11, 2025
6430e56
update urls to point to github
glass-ships Feb 11, 2025
5cdc372
one more update
glass-ships Feb 11, 2025
5197bb1
add type hinting to iq.py
glass-ships Feb 11, 2025
19 changes: 11 additions & 8 deletions .pre-commit-config.yaml
@@ -3,7 +3,7 @@

repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.4.0
rev: v5.0.0
hooks:
- id: trailing-whitespace
- id: check-json
@@ -19,16 +19,19 @@ repos:
- id: check-merge-conflict
- id: end-of-file-fixer
- id: sort-simple-yaml
- repo: https://github.com/charliermarsh/ruff-pre-commit
rev: v0.0.275
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.9.4
hooks:
- id: ruff
args: [--no-cache, --fix, --exit-non-zero-on-fix]
- repo: https://github.com/psf/black
rev: 23.3.0
hooks:
- id: black
exclude: |
notebooks/
tests/examples/
- id: ruff-format
exclude: |
notebooks/
tests/examples/
- repo: https://github.com/kynan/nbstripout
rev: 0.6.1
rev: 0.8.1
hooks:
- id: nbstripout
1 change: 1 addition & 0 deletions docs/conf.py
@@ -215,6 +215,7 @@ class ExecDirective(Directive):
Credit goes to:
https://stackoverflow.com/questions/27875455/displaying-dictionary-data-in-sphinx-documentation/29789910#29789910
"""

has_content = True

def run(self):
6 changes: 3 additions & 3 deletions docs/drtsans/example_1d.py
@@ -8,9 +8,9 @@
# files
config["mask"] = "/SNS/EQSANS/shared/NeXusFiles/EQSANS/2017B_mp/beamstop60_mask_4m.nxs"
config["flux"] = "/SNS/EQSANS/shared/instrument_configuration/bl6_flux_at_sample"
config[
"sensitivity_file_path"
] = "/SNS/EQSANS/shared/NeXusFiles/EQSANS/2017A_mp/Sensitivity_patched_thinPMMA_4m_79165_event.nxs"
config["sensitivity_file_path"] = (
"/SNS/EQSANS/shared/NeXusFiles/EQSANS/2017A_mp/Sensitivity_patched_thinPMMA_4m_79165_event.nxs"
)
config["dark_current"] = "/SNS/EQSANS/shared/NeXusFiles/EQSANS/2017B_mp/EQSANS_86275.nxs.h5"

# numeric values
3 changes: 3 additions & 0 deletions pyproject.toml
@@ -86,6 +86,9 @@ line-length = 119
[tool.ruff]
cache-dir = "/tmp/ruff_cache"
line-length = 119
extend-exclude = ["notebooks", "tests/examples"]

[tool.ruff.lint]
# https://beta.ruff.rs/docs/rules/
# suggestions: BLE blind exceptions, I sorts imports
# Full pylint PL = PLC, PLE, PLR (~500 issues), PLW. Enable most
3 changes: 2 additions & 1 deletion scripts/common_utils.py
@@ -1,4 +1,5 @@
""" Common utility functions for all SANS """
"""Common utility functions for all SANS"""

import os
import numpy as np
import matplotlib.pyplot as plt
1 change: 1 addition & 0 deletions scripts/examples/plot_wavelength.py
@@ -1,6 +1,7 @@
"""
Example script to plot the before and after k-correction data for a given slice and frame.
"""

import glob
import os
from pathlib import Path
1 change: 1 addition & 0 deletions scripts/generate_report
@@ -3,6 +3,7 @@
"""
Script to generate a report from an hdf5 log file
"""

import sys
import os
import numpy as np
@@ -98,7 +98,7 @@ def show_calibration_stage1(raw_flood_ws_name, database_file):
# Plot flood workspace raw and calibrated
print("#####\n\nCompare applying the calibration to flood (stage 1)")

calibrated_flood_ws_name = f'demo_calibrated1_flood_{raw_flood_ws_name.split("flood_")[1]}'
calibrated_flood_ws_name = f"demo_calibrated1_flood_{raw_flood_ws_name.split('flood_')[1]}"
apply_calibrations(
raw_flood_ws_name,
output_workspace=calibrated_flood_ws_name,
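Several hunks in this PR only swap which quote character appears inside an f-string replacement field. A minimal sketch of why the two spellings are equivalent (the workspace name below is made up for illustration):

```python
raw_flood_ws_name = "demo_raw_flood_CG3_123"  # hypothetical workspace name

# Before Python 3.12, a replacement field could not reuse the f-string's own
# quote character, so a double-quoted outer string forced single quotes inside
# (and vice versa). ruff-format normalizes the outer string to double quotes
# and swaps the inner delimiter accordingly:
old_style = f'demo_calibrated1_flood_{raw_flood_ws_name.split("flood_")[1]}'
new_style = f"demo_calibrated1_flood_{raw_flood_ws_name.split('flood_')[1]}"

assert old_style == new_style == "demo_calibrated1_flood_CG3_123"
```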
@@ -1,13 +1,14 @@
"""
SANS sensitivities preparation script
SANS sensitivities preparation script

# goal
1. implement a universal mask_beam_center(flood_ws, beam_center_mask=None, beam_center_ws=None)
for 3 types of mask
2. add option for wing/main detector for BIOSANS:w
# goal
1. implement a universal mask_beam_center(flood_ws, beam_center_mask=None, beam_center_ws=None)
for 3 types of mask
2. add option for wing/main detector for BIOSANS:w


"""

import os
import warnings
from drtsans.mono.gpsans.prepare_sensitivities_correction import SpiceRun
3 changes: 2 additions & 1 deletion scripts/prepare_sensitivities_biosans.py
@@ -1,6 +1,7 @@
"""
Sensitivities preparation script for Bio-SANS (CG3)
Sensitivities preparation script for Bio-SANS (CG3)
"""

from drtsans.mono.biosans.prepare_sensitivities_correction import PrepareSensitivityCorrection

import os
3 changes: 2 additions & 1 deletion scripts/prepare_sensitivities_gpsans.py
@@ -1,6 +1,7 @@
"""
Sensitivities preparation script for GP-SANS (CG2)
Sensitivities preparation script for GP-SANS (CG2)
"""

from drtsans.prepare_sensivities_correction import PrepareSensitivityCorrection


1 change: 1 addition & 0 deletions scripts/test_help/biosans_synthetic_sensitivity_dataset.py
@@ -9,6 +9,7 @@
the Midrange Detector. See Fixture ``biosans_synthetic_sensitivity_dataset`` for detailed use

"""

# local imports
from drtsans.load import __monitor_counts
from drtsans.mono.biosans.simulated_intensities import clone_component_intensities, insert_midrange_detector
3 changes: 2 additions & 1 deletion src/drtsans/absolute_units.py
@@ -1,8 +1,9 @@
r""" Links to Mantid algorithms
r"""Links to Mantid algorithms
DeleteWorkspace <https://docs.mantidproject.org/nightly/algorithms/DeleteWorkspace-v1.html>
Divide <https://docs.mantidproject.org/nightly/algorithms/Divide-v1.html>
Multiply <https://docs.mantidproject.org/nightly/algorithms/Multiply-v1.html>
"""

from mantid.simpleapi import DeleteWorkspace, Divide, Multiply, mtd
from mantid.dataobjects import WorkspaceSingleValue

9 changes: 4 additions & 5 deletions src/drtsans/auto_wedge.py
@@ -244,9 +244,9 @@ def _export_to_h5(iq2d, rings, azimuthal_delta, peak_fit_dict, output_dir):
function_data_set[0] = peak_fit_dict[index]["fit_function"]

# add peak fitting result
for param_name in func_param_dict:
for param_name, param_value in func_param_dict.items():
# form data set
data_set = np.array(func_param_dict[param_name])
data_set = np.array(param_value)
fit_group.create_dataset(param_name, data=data_set)

# close
@@ -501,8 +501,7 @@ def _estimatePeakParameters(intensity, azimuthal, azimuthal_start, window_half_w
break
# output
print(
f"[WEDGE FIT] azimuthal: {azimuthal_new}, {azimuthal_last} with "
f"left and right as {left_index}, {right_index}"
f"[WEDGE FIT] azimuthal: {azimuthal_new}, {azimuthal_last} with left and right as {left_index}, {right_index}"
)

# now use the first two moments of the data within the window to give an improved center position (first moment)
@@ -837,7 +836,7 @@ def _fitQAndAzimuthal(
fit_result_dict[index]["error"] = error_reason
continue
else:
fitted_peaks_message += f"spectrum {index-1}: Fitted peaks: {newlyFittedPeaks}\n"
fitted_peaks_message += f"spectrum {index - 1}: Fitted peaks: {newlyFittedPeaks}\n"
for i in range(len(peakResults)):
peakResults[i].append(newlyFittedPeaks[i])
q_centers_used.append(q_center)
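The auto_wedge.py change replaces key-only iteration with `dict.items()`, the form ruff's pylint-derived rules suggest when both the key and its value are needed. A small sketch with a made-up parameter dict:

```python
func_param_dict = {"center": [1.0, 2.0], "width": [0.1, 0.2]}  # hypothetical fit results

# Old form: iterate over keys, then index back into the dict for each value.
old_sets = {name: list(func_param_dict[name]) for name in func_param_dict}

# New form: .items() yields (key, value) pairs, avoiding the second lookup.
new_sets = {name: list(value) for name, value in func_param_dict.items()}

assert old_sets == new_sets
```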
4 changes: 2 additions & 2 deletions src/drtsans/dataobjects.py
@@ -619,10 +619,10 @@ def __new__(cls, intensity, error, qx, qy, delta_qx=None, delta_qy=None, wavelen

# Sanity check
assert qx.shape == intensity.shape, (
f"qx and intensity must have same shapes. " f"It is not now: {qx.shape} vs {intensity.shape}"
f"qx and intensity must have same shapes. It is not now: {qx.shape} vs {intensity.shape}"
)
assert qy.shape == intensity.shape, (
f"qy and intensity must have same shapes. " f"It is not now: {qy.shape} vs {intensity.shape}"
f"qy and intensity must have same shapes. It is not now: {qy.shape} vs {intensity.shape}"
)

# pass everything to namedtuple
4 changes: 2 additions & 2 deletions src/drtsans/detector.py
@@ -228,7 +228,7 @@ def _detector_first_ws_index(self, first_det_id):
self.first_index = ws_index
break
else:
raise ValueError("Iterared WS and did not find first det id = " "{}".format(first_det_id))
raise ValueError("Iterared WS and did not find first det id = {}".format(first_det_id))

def masked_ws_indices(self):
"""
@@ -256,7 +256,7 @@ def monitor_indices(self):
return np.array([])

def __str__(self):
return "Component: {} with {} pixels (dim x={}, dim y={})." " First index = {}.".format(
return "Component: {} with {} pixels (dim x={}, dim y={}). First index = {}.".format(
self._component_name,
self.dims,
self.dim_x,
6 changes: 2 additions & 4 deletions src/drtsans/determine_bins.py
@@ -31,7 +31,7 @@ def determine_1d_linear_bins(x_min, x_max, bins):
# Check input x min and x max
if x_min is None or x_max is None or x_min >= x_max:
raise RuntimeError(
"x min {} and x max {} must not be None and x min shall be less than x max" "".format(x_min, x_max)
"x min {} and x max {} must not be None and x min shall be less than x max".format(x_min, x_max)
)
# force the number of bins to be an integer and error check it
bins = int(bins)
@@ -104,9 +104,7 @@ def determine_1d_log_bins(x_min, x_max, decade_on_center, n_bins_per_decade=None

# case that is not supported
if decade_on_center:
assert n_bins_per_decade is not None, (
"For option decade_on_center, number of bins per decade " "is required"
)
assert n_bins_per_decade is not None, "For option decade_on_center, number of bins per decade is required"
x_ref = x_min

# calculate bin step size
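Many of the string fixes in this PR drop an implicit concatenation with an empty literal (`"..." ""`). Adjacent string literals are joined at parse time, so the two spellings produce identical messages; ruff's implicit-string-concatenation checks flag the old form as error-prone. A sketch:

```python
# Adjacent literals concatenate before .format() is applied, so the old
# spelling with a stray "" fragment built the same message as one literal:
implicit = "x min {} and x max {} must not be None and x min shall be less than x max" "".format(0.1, 0.5)
single = "x min {} and x max {} must not be None and x min shall be less than x max".format(0.1, 0.5)

assert implicit == single
```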
12 changes: 5 additions & 7 deletions src/drtsans/files/hdf5_rw.py
@@ -79,7 +79,7 @@ def match(self, other_node):
# compare class type
if not isinstance(other_node, type(self)):
raise TypeError(
"Try to match instance of class {} (other) to {} (self)" "".format(type(other_node), type(self))
"Try to match instance of class {} (other) to {} (self)".format(type(other_node), type(self))
)

# compare name
@@ -89,16 +89,14 @@ def match(self, other_node):
# compare attributes
if set(self._attributes.keys()) != set(other_node.attributes.keys()):
print(
"Data node {} Attributes are not same:\nself - other = {}]\nother - self = {}"
"".format(
"Data node {} Attributes are not same:\nself - other = {}]\nother - self = {}".format(
self.name,
set(self._attributes.keys()) - set(other_node.attributes.keys()),
set(other_node.attributes.keys()) - set(self._attributes.keys()),
)
)
raise KeyError(
"Data node {} Attributes are not same:\nself - other = {}]\nother - self = {}"
"".format(
"Data node {} Attributes are not same:\nself - other = {}]\nother - self = {}".format(
self.name,
set(self._attributes.keys()) - set(other_node.attributes.keys()),
set(other_node.attributes.keys()) - set(self._attributes.keys()),
@@ -109,7 +107,7 @@ def match(self, other_node):
error_msg = ""
for attr_name in self._attributes.keys():
if self._attributes[attr_name] != other_node.attributes[attr_name]:
error_msg += "Mismatch attribute {} value: self = {}, other = {}" "".format(
error_msg += "Mismatch attribute {} value: self = {}, other = {}".format(
attr_name,
self._attributes[attr_name],
other_node.attributes[attr_name],
@@ -188,7 +186,7 @@ def write_attributes(self, curr_entry):
except TypeError as type_error:
print(f"[ERROR] {self._name}-node attribute {attr_name} is of type {type(attr_name)}")
raise TypeError(
f"[ERROR] {self._name}-node attribute {attr_name} is of type " f"{type(attr_name)}: {type_error}"
f"[ERROR] {self._name}-node attribute {attr_name} is of type {type(attr_name)}: {type_error}"
)


4 changes: 2 additions & 2 deletions src/drtsans/files/log_h5_reader.py
@@ -76,7 +76,7 @@ def compare_reduced_iq(test_log_file, gold_log_file, title: str, prefix: str):

# Output error message
if test_exception:
base_name = f'{prefix}{os.path.basename(test_log_file).split(".")[0]}'
base_name = f"{prefix}{os.path.basename(test_log_file).split('.')[0]}"
report_difference(
(test_q_vec, test_intensity_vec),
(gold_q_vec, gold_intensity_vec),
@@ -211,7 +211,7 @@ def verify_cg2_reduction_results(sample_names, output_dir, gold_path, title, pre
try:
compare_reduced_iq(output_log_file, gold_log_file, title_i, prefix)
except AssertionError as unmatched_error:
unmatched_errors = "Testing output {} is different from gold result {}:\n{}" "".format(
unmatched_errors = "Testing output {} is different from gold result {}:\n{}".format(
output_log_file, gold_log_file, unmatched_error
)
# END-FOR
1 change: 1 addition & 0 deletions src/drtsans/frame_mode.py
@@ -5,5 +5,6 @@ class FrameMode(Enum):
r"""
Selects if instrument operating in frame-skipping mode
"""

not_skip = 0
skip = 1
9 changes: 5 additions & 4 deletions src/drtsans/geometry.py
@@ -131,7 +131,7 @@ def bank_workspace_index_range(input_workspace, component=""):
for i in range(input_workspace.getNumberHistograms()):
ids = input_workspace.getSpectrum(i).getDetectorIDs()
if len(ids) > 1:
raise RuntimeError("do not know how to work with more than one " "detector per spectrum ({})".format(ids))
raise RuntimeError("do not know how to work with more than one detector per spectrum ({})".format(ids))
if ids[0] == detector_id_first:
first = i
break
@@ -786,7 +786,7 @@ def translate_sample_by_z(workspace, z):
)
workspace = mtd[ws_name]
logger.debug(
"Instrument sample position is moved to {}" "".format(workspace.getInstrument().getSample().getPos())
"Instrument sample position is moved to {}".format(workspace.getInstrument().getSample().getPos())
)

# update the appropriate log
@@ -836,8 +836,9 @@ def translate_detector_by_z(input_workspace, z=None, relative=True):
update_log = True
if (not relative) or (z != 0.0):
logger.debug(
"Moving detector along Z = {} is relative = {} to component {}"
"".format(z, relative, main_detector_name(input_workspace))
"Moving detector along Z = {} is relative = {} to component {}".format(
z, relative, main_detector_name(input_workspace)
)
)

MoveInstrumentComponent(
5 changes: 3 additions & 2 deletions src/drtsans/instruments.py
@@ -224,7 +224,7 @@ def _empty_download(filepath):

idf = os.path.join(str(output_directory), idf_xml)
url = f"https://raw.githubusercontent.com/mantidproject/mantid/main/instrument/{idf_xml}"
result = subprocess.run(f"curl -o {idf} {url}", shell=True, capture_output=True, text=True)
result = subprocess.run(f"curl -o {idf} {url}", shell=True, capture_output=True, text=True, check=False)
if result.returncode == 0 and not _empty_download(idf):
return idf
else:
@@ -328,7 +328,8 @@ def copy_to_newest_instrument(
target.getAxis(0).setUnit(origin_unit)
target.setYUnit(origin.YUnit())
MergeRuns(
InputWorkspaces=[target_workspace, input_workspace], OutputWorkspace=target_workspace # order is necessary
InputWorkspaces=[target_workspace, input_workspace],
OutputWorkspace=target_workspace, # order is necessary
)
# Move components to the positions they have in input_workspace by reading their positions
# in the logs. This is implicitly done when invoking algorithm LoadInstrument.
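The instruments.py hunk adds an explicit `check=False` to `subprocess.run`, which satisfies ruff's subprocess-run-without-check rule without changing behavior: a non-zero exit status is reported via `returncode` instead of raising `CalledProcessError`, matching how the curl download above branches on the result. A self-contained sketch:

```python
import subprocess
import sys

# With check=False, a failing command does not raise CalledProcessError;
# the caller inspects returncode instead.
result = subprocess.run(
    [sys.executable, "-c", "raise SystemExit(3)"],
    capture_output=True,
    text=True,
    check=False,
)

assert result.returncode == 3
```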
10 changes: 6 additions & 4 deletions src/drtsans/iq.py
@@ -117,14 +117,16 @@ def valid_wedge(min_angle, max_angle) -> List[Tuple[float, float]]:
if diff < 180.0:
return [(min_angle, max_angle)]
raise ValueError(
"wedge angle is greater than 180 degrees: {:.1f} - {:.1f} = {:.1f} < 180"
"".format(max_angle, min_angle, diff)
"wedge angle is greater than 180 degrees: {:.1f} - {:.1f} = {:.1f} < 180".format(
max_angle, min_angle, diff
)
)
diff = min_angle - max_angle
if diff <= 180:
raise ValueError(
"wedge angle is greater than 180 degrees: {:.1f} - {:.1f} = {:.1f} <= 180"
"".format(min_angle, max_angle, diff)
"wedge angle is greater than 180 degrees: {:.1f} - {:.1f} = {:.1f} <= 180".format(
min_angle, max_angle, diff
)
)
return [(min_angle, 270.1), (-90.1, max_angle)]

4 changes: 1 addition & 3 deletions src/drtsans/mask_utils.py
@@ -258,9 +258,7 @@ def circular_mask_from_beam_center(input_workspace, radius, unit="mm"):
<radius val="{}" />
</infinite-cylinder>
<algebra val="shape" />
""".format(
r
)
""".format(r)
det_ids = FindDetectorsInShape(Workspace=input_workspace, ShapeXML=cylinder)
return det_ids
