Hi, I noticed that you have custom implementations of matmul (complexPyTorch/complexPyTorch/complexFunctions.py, lines 11 to 19 and lines 52 to 56 in a4e752c), which are actually unnecessary: these functions have supported complex numbers for the last couple of PyTorch releases, and the native versions would be much faster too (since matmul, for example, calls into a BLAS operation).
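For illustration, here is a minimal sketch (not taken from the repository) showing native complex matmul and checking it against the explicit real/imaginary expansion that a custom implementation would otherwise compute:

```python
import torch

# Complex tensors can be created directly with a complex dtype.
a = torch.randn(3, 4, dtype=torch.cfloat)
b = torch.randn(4, 5, dtype=torch.cfloat)

# Native complex matmul; dispatches to optimized BLAS kernels.
c = torch.matmul(a, b)

# Equivalent to the explicit real/imaginary expansion:
expected = (a.real @ b.real - a.imag @ b.imag) \
    + 1j * (a.real @ b.imag + a.imag @ b.real)
assert torch.allclose(c, expected, atol=1e-5)
print(c.shape, c.dtype)  # torch.Size([3, 5]) torch.complex64
```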
Hi,
Before version 1.7, complex tensors were not implemented at all. Since then, various updates have introduced complex tensors, and by now almost all operations are implemented; very recently, and very importantly, autograd also works for complex tensors.
So basically, complexPyTorch is now obsolete: you can use PyTorch out of the box with complex tensors, just as you use real tensors, and everything should work. I keep this repository for backward compatibility with old code, or in case someone needs to use an old version of PyTorch for some reason.
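As a minimal sketch of this out-of-the-box usage (assuming a recent PyTorch release, roughly 1.8 or later):

```python
import torch

# Complex tensors participate in autograd like real ones.
x = torch.randn(4, dtype=torch.cfloat, requires_grad=True)
w = torch.randn(4, dtype=torch.cfloat)

# Ordinary operations work directly on complex tensors.
y = (w * x).sum()

# Autograd needs a real-valued scalar loss; abs() gives one.
loss = y.abs() ** 2
loss.backward()
print(x.grad)  # complex (Wirtinger-style) gradient of the real loss
```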
Could you elaborate a bit more on how to use PyTorch out of the box with complex tensors? For instance, in the Keras CVNN library (similar to yours, but built on Keras/TensorFlow), the activation functions are well documented and explained. With PyTorch, however, I can't easily find the code to check which implementations of the layers are available (#47052). For instance, for ReLU and tanh there are different implementations available; how do I differentiate between them?
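To make the question concrete, here is a sketch of the two conventions I mean (the helper name split_relu is my own; which convention a given library uses has to be checked in its source):

```python
import torch
import torch.nn.functional as F

z = torch.randn(5, dtype=torch.cfloat)

# 1) "Split" activation: apply the real-valued function to the real
#    and imaginary parts independently. ReLU is not defined for a
#    complex argument, so this is one common workaround.
def split_relu(z: torch.Tensor) -> torch.Tensor:
    return torch.complex(F.relu(z.real), F.relu(z.imag))

# 2) Fully complex activation: tanh extends to complex arguments,
#    and torch.tanh accepts complex input directly.
print(split_relu(z))
print(torch.tanh(z))
```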