**Is your feature request related to a problem? Please describe.**
Handling variable-size observations, such as those used with permutation-invariant embedding networks, RNNs, or Transformers, currently requires padding inputs (e.g., with NaNs) to a fixed size. While padding is convenient for batching during training, at test time it would be preferable to also support tensors of varying lengths directly.
Unfortunately, the current `input_shape` checks prevent this, even when the underlying methods could handle variable-length inputs without issue.
As a workaround, it's necessary to manually override the inferred shapes to bypass these checks:
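For illustration, a minimal self-contained sketch of this kind of workaround; the `Posterior` class, the `_x_shape` attribute, and the mean-pooling embedding net below are hypothetical stand-ins for wherever the inferred shape is actually cached:

```python
import torch


class SetEmbeddingNet(torch.nn.Module):
    """Toy permutation-invariant embedding: mean-pool over the set axis."""

    def __init__(self, obs_dim: int, out_dim: int):
        super().__init__()
        self.fc = torch.nn.Linear(obs_dim, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # (batch, n_obs, obs_dim) -> (batch, out_dim); n_obs may vary.
        return self.fc(x).mean(dim=1)


class Posterior:
    """Stand-in for an inference object that caches the training-time
    observation shape and enforces it at test time."""

    def __init__(self, net: torch.nn.Module, x_shape: torch.Size):
        self.net = net
        self._x_shape = x_shape  # hypothetical attribute name

    def embed(self, x: torch.Tensor) -> torch.Tensor:
        if x.shape != self._x_shape:
            raise ValueError(f"Expected shape {self._x_shape}, got {x.shape}.")
        return self.net(x)


net = SetEmbeddingNet(obs_dim=2, out_dim=4)
# Trained with observations NaN-padded to a fixed length of 10.
posterior = Posterior(net, x_shape=torch.Size([1, 10, 2]))

x_short = torch.randn(1, 3, 2)  # only 3 observations at test time
# Workaround: overwrite the inferred shape so the check passes; the
# embedding net itself handles the shorter input without issue.
posterior._x_shape = x_short.shape
print(posterior.embed(x_short).shape)  # torch.Size([1, 4])
```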
**Describe the solution you'd like**
Shape checks should only be enforced where a static shape is truly necessary.
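For illustration, a minimal sketch of what such a relaxed check could look like; the function name `check_x_shape` and the convention of marking variable axes with `-1` are assumptions for this sketch, not the library's actual API:

```python
import torch


def check_x_shape(x: torch.Tensor, expected: tuple) -> None:
    """Validate only the dimensions that truly need to be static.

    `expected` may contain -1 for axes that are allowed to vary, e.g.
    the set axis of a permutation-invariant embedding net or the
    sequence axis of an RNN/Transformer.
    """
    if x.ndim != len(expected):
        raise ValueError(f"Expected {len(expected)} dims, got {x.ndim}.")
    for axis, (got, want) in enumerate(zip(x.shape, expected)):
        if want != -1 and got != want:
            raise ValueError(f"Axis {axis}: expected size {want}, got {got}.")


# The trailing feature dimension stays fixed; the set axis may vary.
check_x_shape(torch.randn(1, 3, 2), expected=(1, -1, 2))   # ok
check_x_shape(torch.randn(1, 50, 2), expected=(1, -1, 2))  # ok
```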