Update pre commit (#607)
* [pre-commit.ci] pre-commit autoupdate

updates:
- [github.com/pre-commit/pre-commit-hooks: v4.4.0 → v5.0.0](pre-commit/pre-commit-hooks@v4.4.0...v5.0.0)
- github.com/charliermarsh/ruff-pre-commit → github.com/astral-sh/ruff-pre-commit (repository moved)
- [github.com/astral-sh/ruff-pre-commit: v0.0.275 → v0.8.3](astral-sh/ruff-pre-commit@v0.0.275...v0.8.3)
- [github.com/psf/black: 23.3.0 → 24.10.0](psf/black@23.3.0...24.10.0)
- [github.com/kynan/nbstripout: 0.6.1 → 0.8.1](kynan/nbstripout@0.6.1...0.8.1)

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Add testing and packaging files

* pre-commit - use ruff instead of black

* Add notebooks to excludes in pyproject.toml

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* fix codecov step

* New test prep stage

* change data submodule url

* specify cov files

* add some noqa

* address lint errors

* forgot .items() in test spice xml parser

* try pip install in test preparation step

* restore from next and exclude notebooks & test/examples

* fix geometry.py

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Kevin A. Tactac <[email protected]>
Co-authored-by: glass-ships <[email protected]>
4 people authored Feb 11, 2025
1 parent a591d44 commit d88ea02
Showing 134 changed files with 500 additions and 510 deletions.
4 changes: 2 additions & 2 deletions .github/pull_request_template.md
@@ -1,7 +1,7 @@
## Description of work:

Check all that apply:
- [ ] added [release notes](https://code.ornl.gov/sns-hfir-scse/sans/sans-backend/-/blob/next/docs/release_notes.rst?ref_type=heads) (if not, provide an explanation in the work description)
- [ ] added [release notes](https://github.com/neutrons/drtsans/blob/next/docs/release_notes.rst) (if not, provide an explanation in the work description)
- [ ] updated documentation
- [ ] Source added/refactored
- [ ] Added unit tests
@@ -16,7 +16,7 @@ Check all that apply:
<!-- Instructions for testing here. -->

## Check list for the reviewer
- [ ] [release notes](https://code.ornl.gov/sns-hfir-scse/sans/sans-backend/-/blob/next/docs/release_notes.rst?ref_type=heads) updated, or an explanation is provided as to why release notes are unnecessary
- [ ] [release notes](https://github.com/neutrons/drtsans/blob/next/docs/release_notes.rst) updated, or an explanation is provided as to why release notes are unnecessary
- [ ] best software practices
+ [ ] clearly named variables (better to be verbose in variable names)
+ [ ] code comments explaining the intent of code blocks
19 changes: 11 additions & 8 deletions .pre-commit-config.yaml
@@ -3,7 +3,7 @@

repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.4.0
rev: v5.0.0
hooks:
- id: trailing-whitespace
- id: check-json
@@ -19,16 +19,19 @@ repos:
- id: check-merge-conflict
- id: end-of-file-fixer
- id: sort-simple-yaml
- repo: https://github.com/charliermarsh/ruff-pre-commit
rev: v0.0.275
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.9.4
hooks:
- id: ruff
args: [--no-cache, --fix, --exit-non-zero-on-fix]
- repo: https://github.com/psf/black
rev: 23.3.0
hooks:
- id: black
exclude: |
notebooks/
tests/examples/
- id: ruff-format
exclude: |
notebooks/
tests/examples/
- repo: https://github.com/kynan/nbstripout
rev: 0.6.1
rev: 0.8.1
hooks:
- id: nbstripout
6 changes: 3 additions & 3 deletions OnboardingChecklist.rst
@@ -86,7 +86,7 @@ or
Development procedure
#####################

How to develop codes in drtSANS shall follow the instruction in `CONTRIBUTION <https://code.ornl.gov/sns-hfir-scse/sans/sans-backend/-/blob/next/CONTRIBUTING.rst>`_.
How to develop codes in drtSANS shall follow the instruction in `CONTRIBUTION <https://github.com/neutrons/drtsans/blob/next/CONTRIBUTING.rst>`_.

..
1. A developer is assigned with a task during neutron status meeting and changes the task's status to **In Progress**.
@@ -113,8 +113,8 @@ Test Driven Development (TDD)

Examples:

* `drtsans/resolution.py <https://code.ornl.gov/sns-hfir-scse/sans/sans-backend/-/blob/next/drtsans/resolution.py>`_ and `tests/unit/new/drtsans/test_resolution.py <https://code.ornl.gov/sns-hfir-scse/sans/sans-backend/-/blob/next/tests/unit/new/drtsans/test_resolution.py>`_.
* `drtsans/tof/eqsans/incoherence_correction_1d.py <https://code.ornl.gov/sns-hfir-scse/sans/sans-backend/-/blob/next/drtsans/tof/eqsans/incoherence_correction_1d.py>`_ and `tests/unit/new/drtsans/tof/eqsans/test_incoherence_correction_1d.py <https://code.ornl.gov/sns-hfir-scse/sans/sans-backend/-/blob/next/tests/unit/new/drtsans/tof/eqsans/test_incoherence_correction_1d.py>`_.
* `drtsans/resolution.py <https://github.com/neutrons/drtsans/blob/next/src/drtsans/resolution.py>`_ and `tests/unit/drtsans/test_resolution.py <https://github.com/neutrons/drtsans/blob/next/tests/unit/drtsans/test_resolution.py>`_.
* `drtsans/tof/eqsans/incoherence_correction_1d.py <https://github.com/neutrons/drtsans/blob/next/src/drtsans/tof/eqsans/incoherence_correction_1d.py>`_ and `tests/unit/drtsans/tof/eqsans/test_incoherence_correction_1d.py <https://github.com/neutrons/drtsans/blob/next/tests/unit/drtsans/tof/eqsans/test_incoherence_correction_1d.py>`_.

* Integration test

6 changes: 3 additions & 3 deletions README_developer.rst
@@ -200,13 +200,13 @@ Test Driven Development (TDD)
* Unit test

All methods and modules shall have unit tests implemented.
Unit tests are located in `repo/tests/unit/new <https://code.ornl.gov/sns-hfir-scse/sans/sans-backend/-/tree/next/tests/unit/new>`_.
Unit tests are located in `repo/tests/unit/new <https://github.com/neutrons/drtsans/blob/next/tests/unit/>`_.
A unit test shall be created in the corresponding directory to the method or module that it tests against.

Examples:

* `drtsans/resolution.py <https://code.ornl.gov/sns-hfir-scse/sans/sans-backend/-/blob/next/drtsans/resolution.py>`_ and `tests/unit/new/drtsans/test_resolution.py <https://code.ornl.gov/sns-hfir-scse/sans/sans-backend/-/blob/next/tests/unit/new/drtsans/test_resolution.py>`_.
* `drtsans/tof/eqsans/incoherence_correction_1d.py <https://code.ornl.gov/sns-hfir-scse/sans/sans-backend/-/blob/next/drtsans/tof/eqsans/incoherence_correction_1d.py>`_ and `tests/unit/new/drtsans/tof/eqsans/test_incoherence_correction_1d.py <https://code.ornl.gov/sns-hfir-scse/sans/sans-backend/-/blob/next/tests/unit/new/drtsans/tof/eqsans/test_incoherence_correction_1d.py>`_.
* `drtsans/resolution.py <https://github.com/neutrons/drtsans/blob/next/src/drtsans/resolution.py>`_ and `tests/unit/drtsans/test_resolution.py <https://github.com/neutrons/drtsans/blob/next/tests/unit/drtsans/test_resolution.py>`_.
* `drtsans/tof/eqsans/incoherence_correction_1d.py <https://github.com/neutrons/drtsans/blob/next/src/drtsans/tof/eqsans/incoherence_correction_1d.py>`_ and `tests/unit/drtsans/tof/eqsans/test_incoherence_correction_1d.py <https://github.com/neutrons/drtsans/blob/next/tests/unit/drtsans/tof/eqsans/test_incoherence_correction_1d.py>`_.

* Integration test

1 change: 1 addition & 0 deletions docs/conf.py
@@ -215,6 +215,7 @@ class ExecDirective(Directive):
Credit goes to:
https://stackoverflow.com/questions/27875455/displaying-dictionary-data-in-sphinx-documentation/29789910#29789910
"""

has_content = True

def run(self):
6 changes: 3 additions & 3 deletions docs/drtsans/example_1d.py
@@ -8,9 +8,9 @@
# files
config["mask"] = "/SNS/EQSANS/shared/NeXusFiles/EQSANS/2017B_mp/beamstop60_mask_4m.nxs"
config["flux"] = "/SNS/EQSANS/shared/instrument_configuration/bl6_flux_at_sample"
config[
"sensitivity_file_path"
] = "/SNS/EQSANS/shared/NeXusFiles/EQSANS/2017A_mp/Sensitivity_patched_thinPMMA_4m_79165_event.nxs"
config["sensitivity_file_path"] = (
"/SNS/EQSANS/shared/NeXusFiles/EQSANS/2017A_mp/Sensitivity_patched_thinPMMA_4m_79165_event.nxs"
)
config["dark_current"] = "/SNS/EQSANS/shared/NeXusFiles/EQSANS/2017B_mp/EQSANS_86275.nxs.h5"

# numeric values
6 changes: 3 additions & 3 deletions docs/drtsans/reduction_scripts.rst
@@ -7,9 +7,9 @@ Reduction Scripts
The following python scripts can be used as the entry points for reduction of SANS data
for each instrument

- `scripts/biosans_reduction.py <https://code.ornl.gov/sns-hfir-scse/sans/sans-backend/-/blob/next/scripts/biosans_reduction.py>`_
- `scripts/eqsans_reduction.py <https://code.ornl.gov/sns-hfir-scse/sans/sans-backend/-/blob/next/scripts/eqsans_reduction.py>`_
- `scripts/gpsans_reduction.py <https://code.ornl.gov/sns-hfir-scse/sans/sans-backend/-/blob/next/scripts/gpsans_reduction.py>`_
- `scripts/biosans_reduction.py <https://github.com/neutrons/drtsans/blob/next/scripts/biosans_reduction.py>`_
- `scripts/eqsans_reduction.py <https://github.com/neutrons/drtsans/blob/next/scripts/eqsans_reduction.py>`_
- `scripts/gpsans_reduction.py <https://github.com/neutrons/drtsans/blob/next/scripts/gpsans_reduction.py>`_

These scripts receive as argument the path to a `*.json` file containing all necessary reduction parameters. In the
active `sans` conda environment, and assuming we are at the root of the drtsans repository:
2 changes: 1 addition & 1 deletion docs/user/reduction_output.rst
@@ -20,7 +20,7 @@ Customized reports could be generated form reduction hdf log files. The script `
generate_report /path/to/hdf/log/file.hdf [/path/to/my/generate_report.yaml]
The last parameter here is a path to a users' YAML file with report parameters. If not provided, the [default parameters](https://code.ornl.gov/sns-hfir-scse/sans/sans-backend/-/blob/next/scripts/generate_report.yaml) will be used to create the report.
The last parameter here is a path to a users' YAML file with report parameters. If not provided, the [default parameters](https://github.com/neutrons/drtsans/blob/next/scripts/generate_report.yaml) will be used to create the report.

The yaml file contains the keys to extract from the hdf log file and short aliases to be used in the report

2 changes: 1 addition & 1 deletion notebooks/barscan/barscan.ipynb
@@ -100,7 +100,7 @@
"source": [
"<h3>Plotting the calibration</h3>\n",
"\n",
"We can create a **2D intensity plot** for the pixel positions and heights using the `as_intensities` method of the calibration object (the calibration is a table of type [Table](https://code.ornl.gov/sns-hfir-scse/sans/sans-backend/blob/next/drtsans%2Fpixel_calibration.py#L90))\n",
"We can create a **2D intensity plot** for the pixel positions and heights using the `as_intensities` method of the calibration object (the calibration is a table of type [Table](https://github.com/neutrons/drtsans/blob/next/src/drtsans/pixel_calibration.py#L285))\n",
"\n",
"**Functions and Algorithms used:**\n",
"- [plot_detector](http://docs.drt-sans.ornl.gov/drtsans/plots.html#drtsans.plots.plot_detector)"
3 changes: 3 additions & 0 deletions pyproject.toml
@@ -86,6 +86,9 @@ line-length = 119
[tool.ruff]
cache-dir = "/tmp/ruff_cache"
line-length = 119
extend-exclude = ["notebooks", "tests/examples"]

[tool.ruff.lint]
# https://beta.ruff.rs/docs/rules/
# suggestions: BLE blind exceptions, I sorts imports
# Full pylint PL = PLC, PLE, PLR (~500 issues), PLW. Enable most
3 changes: 2 additions & 1 deletion scripts/common_utils.py
@@ -1,4 +1,5 @@
""" Common utility functions for all SANS """
"""Common utility functions for all SANS"""

import os
import numpy as np
import matplotlib.pyplot as plt
1 change: 1 addition & 0 deletions scripts/examples/plot_wavelength.py
@@ -1,6 +1,7 @@
"""
Example script to plot the before and after k-correction data for a given slice and frame.
"""

import glob
import os
from pathlib import Path
1 change: 1 addition & 0 deletions scripts/generate_report
@@ -3,6 +3,7 @@
"""
Script to generate a report from an hdf5 log file
"""

import sys
import os
import numpy as np
@@ -98,7 +98,7 @@ def show_calibration_stage1(raw_flood_ws_name, database_file):
# Plot flood workspace raw and calibrated
print("#####\n\nCompare applying the calibration to flood (stage 1)")

calibrated_flood_ws_name = f'demo_calibrated1_flood_{raw_flood_ws_name.split("flood_")[1]}'
calibrated_flood_ws_name = f"demo_calibrated1_flood_{raw_flood_ws_name.split('flood_')[1]}"
apply_calibrations(
raw_flood_ws_name,
output_workspace=calibrated_flood_ws_name,
@@ -1,13 +1,14 @@
"""
SANS sensitivities preparation script
SANS sensitivities preparation script
# goal
1. implement a universal mask_beam_center(flood_ws, beam_center_mask=None, beam_center_ws=None)
for 3 types of mask
2. add option for wing/main detector for BIOSANS:w
# goal
1. implement a universal mask_beam_center(flood_ws, beam_center_mask=None, beam_center_ws=None)
for 3 types of mask
2. add option for wing/main detector for BIOSANS:w
"""

import os
import warnings
from drtsans.mono.gpsans.prepare_sensitivities_correction import SpiceRun
3 changes: 2 additions & 1 deletion scripts/prepare_sensitivities_biosans.py
@@ -1,6 +1,7 @@
"""
Sensitivities preparation script for Bio-SANS (CG3)
Sensitivities preparation script for Bio-SANS (CG3)
"""

from drtsans.mono.biosans.prepare_sensitivities_correction import PrepareSensitivityCorrection

import os
3 changes: 2 additions & 1 deletion scripts/prepare_sensitivities_gpsans.py
@@ -1,6 +1,7 @@
"""
Sensitivities preparation script for GP-SANS (CG2)
Sensitivities preparation script for GP-SANS (CG2)
"""

from drtsans.prepare_sensivities_correction import PrepareSensitivityCorrection


1 change: 1 addition & 0 deletions scripts/test_help/biosans_synthetic_sensitivity_dataset.py
@@ -9,6 +9,7 @@
the Midrange Detector. See Fixture ``biosans_synthetic_sensitivity_dataset`` for detailed use
"""

# local imports
from drtsans.load import __monitor_counts
from drtsans.mono.biosans.simulated_intensities import clone_component_intensities, insert_midrange_detector
3 changes: 2 additions & 1 deletion src/drtsans/absolute_units.py
@@ -1,8 +1,9 @@
r""" Links to Mantid algorithms
r"""Links to Mantid algorithms
DeleteWorkspace <https://docs.mantidproject.org/nightly/algorithms/DeleteWorkspace-v1.html>
Divide <https://docs.mantidproject.org/nightly/algorithms/Divide-v1.html>
Multiply <https://docs.mantidproject.org/nightly/algorithms/Multiply-v1.html>
"""

from mantid.simpleapi import DeleteWorkspace, Divide, Multiply, mtd
from mantid.dataobjects import WorkspaceSingleValue

9 changes: 4 additions & 5 deletions src/drtsans/auto_wedge.py
@@ -244,9 +244,9 @@ def _export_to_h5(iq2d, rings, azimuthal_delta, peak_fit_dict, output_dir):
function_data_set[0] = peak_fit_dict[index]["fit_function"]

# add peak fitting result
for param_name in func_param_dict:
for param_name, param_value in func_param_dict.items():
# form data set
data_set = np.array(func_param_dict[param_name])
data_set = np.array(param_value)
fit_group.create_dataset(param_name, data=data_set)

# close
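The switch from key iteration to `.items()` above avoids a second dictionary lookup per parameter. A minimal sketch of the two equivalent patterns, using a hypothetical parameter dict in place of `func_param_dict`:

```python
# Hypothetical fit-parameter dict standing in for func_param_dict
func_param_dict = {"center": 1.5, "width": 0.2, "height": 10.0}

# Before: iterate keys, then index back into the dict for each value
values_by_key = [func_param_dict[name] for name in func_param_dict]

# After: unpack key/value pairs in a single pass
values_by_items = [value for _name, value in func_param_dict.items()]

assert values_by_key == values_by_items  # same values, one lookup fewer per key
```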
@@ -501,8 +501,7 @@ def _estimatePeakParameters(intensity, azimuthal, azimuthal_start, window_half_w
break
# output
print(
f"[WEDGE FIT] azimuthal: {azimuthal_new}, {azimuthal_last} with "
f"left and right as {left_index}, {right_index}"
f"[WEDGE FIT] azimuthal: {azimuthal_new}, {azimuthal_last} with left and right as {left_index}, {right_index}"
)

# now use the first two moments of the data within the window to give an improved center position (first moment)
@@ -837,7 +836,7 @@ def _fitQAndAzimuthal(
fit_result_dict[index]["error"] = error_reason
continue
else:
fitted_peaks_message += f"spectrum {index-1}: Fitted peaks: {newlyFittedPeaks}\n"
fitted_peaks_message += f"spectrum {index - 1}: Fitted peaks: {newlyFittedPeaks}\n"
for i in range(len(peakResults)):
peakResults[i].append(newlyFittedPeaks[i])
q_centers_used.append(q_center)
4 changes: 2 additions & 2 deletions src/drtsans/dark_current.py
@@ -11,8 +11,8 @@

r"""
Hyperlinks to drtsans functions
namedtuplefy <https://code.ornl.gov/sns-hfir-scse/sans/sans-backend/blob/next/drtsans/settings.py>
SampleLogs <https://code.ornl.gov/sns-hfir-scse/sans/sans-backend/blob/next/drtsans/samplelogs.py>
namedtuplefy <https://github.com/neutrons/drtsans/blob/next/src/drtsans/settings.py>
SampleLogs <https://github.com/neutrons/drtsans/blob/next/src/drtsans/samplelogs.py>
""" # noqa: E501
from drtsans.settings import namedtuplefy
from drtsans.samplelogs import SampleLogs
4 changes: 2 additions & 2 deletions src/drtsans/dataobjects.py
@@ -619,10 +619,10 @@ def __new__(cls, intensity, error, qx, qy, delta_qx=None, delta_qy=None, wavelen

# Sanity check
assert qx.shape == intensity.shape, (
f"qx and intensity must have same shapes. " f"It is not now: {qx.shape} vs {intensity.shape}"
f"qx and intensity must have same shapes. It is not now: {qx.shape} vs {intensity.shape}"
)
assert qy.shape == intensity.shape, (
f"qy and intensity must have same shapes. " f"It is not now: {qy.shape} vs {intensity.shape}"
f"qy and intensity must have same shapes. It is not now: {qy.shape} vs {intensity.shape}"
)

# pass everything to namedtuple
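The message changes above rely on Python's implicit concatenation of adjacent string literals: the split f-string and the merged single literal produce identical text, so the formatter can join them without changing behavior. A small sketch (the shapes are made up for illustration):

```python
qx_shape, intensity_shape = (3, 4), (3, 5)  # hypothetical mismatched shapes

# Adjacent literals are concatenated at compile time...
split_form = f"qx and intensity must have same shapes. " f"It is not now: {qx_shape} vs {intensity_shape}"
# ...so the single-literal form yields exactly the same message
merged_form = f"qx and intensity must have same shapes. It is not now: {qx_shape} vs {intensity_shape}"

assert split_form == merged_form
```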
4 changes: 2 additions & 2 deletions src/drtsans/detector.py
@@ -228,7 +228,7 @@ def _detector_first_ws_index(self, first_det_id):
self.first_index = ws_index
break
else:
raise ValueError("Iterared WS and did not find first det id = " "{}".format(first_det_id))
raise ValueError("Iterared WS and did not find first det id = {}".format(first_det_id))

def masked_ws_indices(self):
"""
@@ -256,7 +256,7 @@ def monitor_indices(self):
return np.array([])

def __str__(self):
return "Component: {} with {} pixels (dim x={}, dim y={})." " First index = {}.".format(
return "Component: {} with {} pixels (dim x={}, dim y={}). First index = {}.".format(
self._component_name,
self.dims,
self.dim_x,
6 changes: 2 additions & 4 deletions src/drtsans/determine_bins.py
@@ -31,7 +31,7 @@ def determine_1d_linear_bins(x_min, x_max, bins):
# Check input x min and x max
if x_min is None or x_max is None or x_min >= x_max:
raise RuntimeError(
"x min {} and x max {} must not be None and x min shall be less than x max" "".format(x_min, x_max)
"x min {} and x max {} must not be None and x min shall be less than x max".format(x_min, x_max)
)
# force the number of bins to be an integer and error check it
bins = int(bins)
@@ -104,9 +104,7 @@ def determine_1d_log_bins(x_min, x_max, decade_on_center, n_bins_per_decade=None

# case that is not supported
if decade_on_center:
assert n_bins_per_decade is not None, (
"For option decade_on_center, number of bins per decade " "is required"
)
assert n_bins_per_decade is not None, "For option decade_on_center, number of bins per decade is required"
x_ref = x_min

# calculate bin step size
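For context on the `n_bins_per_decade` assertion above: logarithmic binning with a fixed number of bins per decade can be sketched as below. This is a generic illustration only, not drtsans's actual `determine_1d_log_bins` implementation.

```python
import math

def log_bin_edges(x_min, x_max, n_bins_per_decade):
    # Number of decades spanned, e.g. 0.001 -> 1.0 spans 3 decades
    n_decades = math.log10(x_max / x_min)
    n_bins = math.ceil(n_bins_per_decade * n_decades)
    step = n_decades / n_bins  # fraction of a decade per bin
    # Edges equally spaced in log10 space
    return [x_min * 10 ** (i * step) for i in range(n_bins + 1)]

edges = log_bin_edges(0.001, 1.0, n_bins_per_decade=10)  # 30 bins, 31 edges
```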
12 changes: 5 additions & 7 deletions src/drtsans/files/hdf5_rw.py
@@ -79,7 +79,7 @@ def match(self, other_node):
# compare class type
if not isinstance(other_node, type(self)):
raise TypeError(
"Try to match instance of class {} (other) to {} (self)" "".format(type(other_node), type(self))
"Try to match instance of class {} (other) to {} (self)".format(type(other_node), type(self))
)

# compare name
@@ -89,16 +89,14 @@ def match(self, other_node):
# compare attributes
if set(self._attributes.keys()) != set(other_node.attributes.keys()):
print(
"Data node {} Attributes are not same:\nself - other = {}]\nother - self = {}"
"".format(
"Data node {} Attributes are not same:\nself - other = {}]\nother - self = {}".format(
self.name,
set(self._attributes.keys()) - set(other_node.attributes.keys()),
set(other_node.attributes.keys()) - set(self._attributes.keys()),
)
)
raise KeyError(
"Data node {} Attributes are not same:\nself - other = {}]\nother - self = {}"
"".format(
"Data node {} Attributes are not same:\nself - other = {}]\nother - self = {}".format(
self.name,
set(self._attributes.keys()) - set(other_node.attributes.keys()),
set(other_node.attributes.keys()) - set(self._attributes.keys()),
Expand All @@ -109,7 +107,7 @@ def match(self, other_node):
error_msg = ""
for attr_name in self._attributes.keys():
if self._attributes[attr_name] != other_node.attributes[attr_name]:
error_msg += "Mismatch attribute {} value: self = {}, other = {}" "".format(
error_msg += "Mismatch attribute {} value: self = {}, other = {}".format(
attr_name,
self._attributes[attr_name],
other_node.attributes[attr_name],
@@ -188,7 +186,7 @@ def write_attributes(self, curr_entry):
except TypeError as type_error:
print(f"[ERROR] {self._name}-node attribute {attr_name} is of type {type(attr_name)}")
raise TypeError(
f"[ERROR] {self._name}-node attribute {attr_name} is of type " f"{type(attr_name)}: {type_error}"
f"[ERROR] {self._name}-node attribute {attr_name} is of type {type(attr_name)}: {type_error}"
)
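The `match` error paths above compare attribute key sets with set differences; a minimal sketch of that check, with hypothetical attribute dicts standing in for the two nodes:

```python
self_attrs = {"units": "angstrom", "name": "detector1"}
other_attrs = {"units": "angstrom", "offset": 0}

# Keys present on one node but not the other, as reported in the error message
only_in_self = set(self_attrs) - set(other_attrs)
only_in_other = set(other_attrs) - set(self_attrs)

attributes_differ = set(self_attrs) != set(other_attrs)
assert attributes_differ and only_in_self == {"name"} and only_in_other == {"offset"}
```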

