Inspired by work by @kabilar, and by the ongoing work on "vendorization" of the DANDI archive codebase to support multiple instances of DANDI.
At the moment, both `dandi-cli` and `dandi-archive` reuse the `dandischema` Python library for manipulating and validating metadata records. As a result, any upgrade of `dandischema` on the server requires an upgrade of `dandischema` in the `dandi-cli` installation, which might in turn require an upgrade of `dandi-cli` itself.
With multiple archive instances in play (DANDI and LINC already, with EMBER coming), synchronizing all of them to align with the most recent `dandischema` would become close to infeasible. So I think we should distill a solution that would allow `dandi-cli` to handle different/needed versions of `dandischema`.
Possible roadmap to achieve that:

- migrate the pydantic models to LinkML (effort ongoing by @candleindark), plus potentially additional "instance specific" restrictions (TODO: link issues in dandi-schema)
- have the server report its `dandischema_schema_version`, as long as it is "compatible" in terms of its formulation with `dandi-cli`. In a similar situation, BIDS has `BIDS_VERSION` and its `SCHEMA_VERSION`; see https://github.com/bids-standard/bids-specification/tree/master/src/schema#version-of-the-schema
- in `dandi-cli`, implement loading of models from the LinkML models for the version desired by the server (a rough sketch follows this list):
  - as long as that LinkML model is described in a compatible version of the schema (based on the minimal `dandischema_schema_version` supported by `dandi-cli`), we should be good
  - potentially extend the model with instance-specific models where needed for validation (most likely the base model would be sufficient to describe); also sketched below
- not yet sure what we should do about the pydantic models currently used in the Python interfaces; we might need to populate classes for different versions. That issue is similar to what we discussed with @rly and other NWB developers about NWB versions.
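
To make the middle steps of this roadmap more concrete, here is a rough sketch of how `dandi-cli` could negotiate the schema version with a server and generate pydantic models from a versioned LinkML schema at runtime. The info endpoint path, the `schema_version` field name, the `LINKML_SCHEMA_URL` template, and the reliance on linkml's `PydanticGenerator` are all assumptions for illustration, not existing `dandi-cli`/`dandi-archive` behavior.

```python
"""
Rough sketch (not existing dandi-cli/dandi-archive behavior): ask the server which
dandischema version it speaks, refuse to talk to servers older than what this CLI
build can formulate, then fetch the LinkML schema for that version and generate
pydantic models from it at runtime.
"""
import tempfile
import types
from pathlib import Path

import requests
from packaging.version import Version

# Minimal dandischema_schema_version this CLI build knows how to formulate (made-up value)
MINIMAL_SUPPORTED_SCHEMA = Version("0.6.0")

# Hypothetical location of per-version LinkML sources for dandischema
LINKML_SCHEMA_URL = "https://example.org/dandischema/linkml/{version}/dandi.yaml"

# One generated module of pydantic classes per schema version, so that several
# instances with different schema versions can be talked to in one session
_model_cache: dict[str, types.ModuleType] = {}


def get_server_schema_version(api_url: str) -> str:
    # Assumes the archive API reports its schema version via an info endpoint;
    # the exact path and field name are assumptions here.
    info = requests.get(f"{api_url.rstrip('/')}/info/", timeout=10).json()
    return info["schema_version"]


def load_models_for(schema_version: str) -> types.ModuleType:
    """Return a module of pydantic classes generated from the LinkML schema
    for ``schema_version``, generating and caching it on first use."""
    if schema_version in _model_cache:
        return _model_cache[schema_version]
    if Version(schema_version) < MINIMAL_SUPPORTED_SCHEMA:
        raise RuntimeError(
            f"server schema {schema_version} is older than the minimal supported "
            f"{MINIMAL_SUPPORTED_SCHEMA}; a different dandi-cli version is needed"
        )
    yaml_text = requests.get(
        LINKML_SCHEMA_URL.format(version=schema_version), timeout=30
    ).text
    with tempfile.TemporaryDirectory() as tmpdir:
        schema_path = Path(tmpdir) / "dandi.yaml"
        schema_path.write_text(yaml_text)
        # linkml's PydanticGenerator emits Python source defining pydantic models
        from linkml.generators.pydanticgen import PydanticGenerator

        source = PydanticGenerator(str(schema_path)).serialize()
    module = types.ModuleType(f"dandischema_models_{schema_version.replace('.', '_')}")
    exec(compile(source, module.__name__, "exec"), module.__dict__)
    _model_cache[schema_version] = module
    return module
```

Validation against a given server's schema would then be something like `load_models_for(get_server_schema_version(api_url)).Dandiset(**raw_metadata)`, assuming the generated module exposes a `Dandiset` class; caching one generated module per version is what would keep this workable when talking to several instances.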
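
And a minimal sketch of the "instance specific" extension idea, assuming the base class came out of a generation step like the one above; the `Dandiset` stand-in and the single-license rule below are made up purely for illustration.

```python
from typing import List

from pydantic import BaseModel, field_validator


class Dandiset(BaseModel):
    """Stand-in for the class generated from the base LinkML schema."""

    name: str
    license: List[str] = []


class LincDandiset(Dandiset):
    """Hypothetical instance-specific subclass adding one extra (made-up)
    restriction on top of the base model, which remains the source of truth
    for everything else."""

    @field_validator("license")
    @classmethod
    def require_single_license(cls, v: List[str]) -> List[str]:
        if len(v) != 1:
            raise ValueError("this instance requires exactly one license entry")
        return v
```

Most instances would presumably just use the base class; only where an instance genuinely needs extra validation would such a subclass come into play.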
Also attn @jwodder @satra @bendichter @dandi/dandiarchive to potentially think about enhancements to the plan above or to suggest alternative approaches.
the cli has to be compatible not just with schemas but also with api processes (e.g. upload flows or parameters) across instances. hence there is a more critical notion of compatibility of cli and server that includes both api version and schema version.
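
For illustration only, a combined check along those lines could look like the sketch below; the endpoint path, field names, and supported ranges are assumptions, not the actual API.

```python
from dataclasses import dataclass

import requests
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# Ranges this hypothetical dandi-cli build claims to support (made-up values)
SUPPORTED_API = SpecifierSet(">=0.1,<1.0")
SUPPORTED_SCHEMA = SpecifierSet(">=0.6.0,<0.7")


@dataclass
class ServerInfo:
    api_version: str
    schema_version: str


def fetch_server_info(api_url: str) -> ServerInfo:
    # Assumes an info endpoint reporting both the API/server version and the schema version
    data = requests.get(f"{api_url.rstrip('/')}/info/", timeout=10).json()
    return ServerInfo(api_version=data["version"], schema_version=data["schema_version"])


def check_compatibility(info: ServerInfo) -> None:
    problems = []
    if Version(info.api_version) not in SUPPORTED_API:
        problems.append(f"API version {info.api_version} outside {SUPPORTED_API}")
    if Version(info.schema_version) not in SUPPORTED_SCHEMA:
        problems.append(f"schema version {info.schema_version} outside {SUPPORTED_SCHEMA}")
    if problems:
        raise RuntimeError("incompatible server: " + "; ".join(problems))
```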