Data import/export functionality #78
-
It would be great to have robust and standardized tooling for data import/export. For example, it would be useful to be able to cache to disk. Pickle files aren't very robust (or secure). Perhaps HDF5?
-
I have been using JSON for this: https://github.com/invrs-io/totypes/blob/main/src/totypes/json_utils.py
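The idea behind that kind of JSON utility can be sketched as follows. This is my own minimal illustration, not the actual totypes API: JSON has no native complex or ndarray type, so the array is tagged with its dtype and shape and split into real/imaginary parts.

```python
import json

import numpy as np


def encode(arr: np.ndarray) -> str:
    """Serialize a (possibly complex) array to a JSON string."""
    payload = {
        "dtype": str(arr.dtype),
        "shape": list(arr.shape),
        # JSON has no complex type, so store real and imaginary parts separately.
        "real": np.real(arr).ravel().tolist(),
        "imag": np.imag(arr).ravel().tolist(),
    }
    return json.dumps(payload)


def decode(text: str) -> np.ndarray:
    """Reconstruct the array from its JSON representation."""
    payload = json.loads(text)
    flat = np.asarray(payload["real"]) + 1j * np.asarray(payload["imag"])
    return flat.reshape(payload["shape"]).astype(payload["dtype"])


s_matrix = np.array([[0.1 + 0.2j, 0.9j], [0.9j, 0.1 - 0.2j]])
assert np.array_equal(decode(encode(s_matrix)), s_matrix)
```

The upside over pickle is that the file is plain text (inspectable, diffable) and carries no code-execution risk on load.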
-
Oh interesting... how big do these get? JSON isn't compressed or binary-serialized in any way.
-
They are about 1.4x the size of the object in memory. (I actually test for it: https://github.com/invrs-io/totypes/blob/main/tests/test_json_utils.py#L93) Not a huge overhead, and the benefit is that the file can almost be visually inspected.
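The overhead is easy to check yourself. A rough sketch of such a size comparison (my own, not the linked totypes test): serialize an array to JSON text and compare against its in-memory footprint. Note the exact ratio depends on the encoding; naive decimal text is bulkier than a more compact encoding can achieve.

```python
import json

import numpy as np

rng = np.random.default_rng(0)
arr = rng.standard_normal((128, 128))  # float64: 128 KiB in memory
text = json.dumps(arr.tolist())        # human-inspectable, but verbose
ratio = len(text.encode("utf-8")) / arr.nbytes
print(f"JSON text is {ratio:.1f}x the in-memory size")
```

Python's `json` writes floats with `repr`, which round-trips exactly, so no precision is lost despite the size overhead.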
-
@mfschubert suppose I want to cache a cascaded scattering matrix to disk (e.g. the result of cascading several discretized slabs of a microlens). Naively, I might just dump the […]. So maybe I get creative, and I just dump the scattering parameters themselves... after all, I can manipulate those with amplitudes rather easily. But things get tricky if I want to dynamically cascade that scattering matrix with a new scattering matrix I compute on the fly, because all the relevant […]. In theory, we should be able to just Redheffer-star my cached S-matrix with my new S-matrix, and ignore my […] (Line 281 in 8c7491a). Is there more nuance involved (e.g. matching boundary conditions)? i.e. here's the logic I'm thinking:
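The cascade step I have in mind can be sketched generically like this. Assumptions: the S-matrix is stored as blocks `(s11, s12, s21, s22)` with `s11`/`s22` reflection and `s12`/`s21` transmission blocks; this is a plain Redheffer star product, not the library's actual implementation or block convention, and it presumes the cached and fresh matrices share the same basis (same Fourier orders / eigenmodes), which is where the boundary-matching question comes in.

```python
import numpy as np


def redheffer_star(a, b):
    """Cascade scattering matrix `a` (encountered first) with `b`.

    Each argument is a tuple of blocks (s11, s12, s21, s22).
    """
    a11, a12, a21, a22 = a
    b11, b12, b21, b22 = b
    eye = np.eye(a11.shape[0])
    # Interaction terms: geometric series of multiple reflections
    # bouncing between the two stacks.
    inv_ba = np.linalg.inv(eye - b11 @ a22)
    inv_ab = np.linalg.inv(eye - a22 @ b11)
    s11 = a11 + a12 @ inv_ba @ b11 @ a21
    s12 = a12 @ inv_ba @ b12
    s21 = b21 @ inv_ab @ a21
    s22 = b22 + b21 @ inv_ab @ a22 @ b12
    return s11, s12, s21, s22
```

A quick sanity check: starring a cached S-matrix with a pure pass-through slab (zero reflection, identity transmission) should return the cached matrix unchanged.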