Use native PyTorch operations for complex numbers #19

Open
anjali411 opened this issue Apr 3, 2022 · 2 comments

anjali411 commented Apr 3, 2022

Hi, I noticed that you have a custom matmul:

    def complex_matmul(A, B):
        '''
        Performs the matrix product between two complex matrices.
        '''
        outp_real = torch.matmul(A.real, B.real) - torch.matmul(A.imag, B.imag)
        outp_imag = torch.matmul(A.real, B.imag) + torch.matmul(A.imag, B.real)
        return outp_real.type(torch.complex64) + 1j * outp_imag.type(torch.complex64)

as well as tanh and negation functions:

    def complex_tanh(input):
        return tanh(input.real).type(torch.complex64) + 1j * tanh(input.imag).type(torch.complex64)

    def complex_opposite(input):
        return -(input.real).type(torch.complex64) + 1j * (-(input.imag).type(torch.complex64))

These are actually unnecessary: matmul, tanh, and negation have been supported for complex tensors for the last couple of PyTorch releases, and the native operations are much faster too (matmul, for example, calls into BLAS).
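For reference, a minimal sketch of the native equivalents, assuming a recent PyTorch (>= 1.9); note that torch.tanh on a complex tensor computes the true complex tanh, which is not the same as applying tanh to the real and imaginary parts separately as complex_tanh above does:

    import torch

    A = torch.randn(3, 3, dtype=torch.complex64)
    B = torch.randn(3, 3, dtype=torch.complex64)

    out = torch.matmul(A, B)   # native complex matmul (dispatches to BLAS)
    neg = -A                   # native negation of a complex tensor
    t = torch.tanh(A)          # true complex tanh, not the component-wise version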

wavefrontshaping (Owner) commented Apr 5, 2022

Hi,
Before version 1.7, complex tensors were not implemented at all. Since then, successive releases have added complex tensor support; by now almost all operations are implemented and, very recently and very importantly, autograd works for complex tensors.

So basically, complexPyTorch is now obsolete: you can use PyTorch out of the box with complex tensors just as you use real tensors, and everything should work. I keep this repository for backward compatibility with old code, or for anyone who needs to use an old version of PyTorch for some reason.
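For example, a minimal sketch (assuming PyTorch >= 1.9) of complex tensors flowing through native ops with autograd out of the box:

    import torch

    # Complex parameters work directly with native operations and autograd.
    A = torch.randn(4, 4, dtype=torch.complex64, requires_grad=True)
    x = torch.randn(4, 1, dtype=torch.complex64)

    y = torch.tanh(torch.matmul(A, x))   # native complex matmul and tanh
    loss = y.abs().pow(2).sum()          # real-valued loss for backprop
    loss.backward()                      # autograd handles the complex parameter

    print(A.grad.dtype)                  # torch.complex64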


SantaTitular commented Jul 26, 2022

Hi @wavefrontshaping @anjali411,

Could you elaborate a bit more on how to use PyTorch out of the box with complex tensors? For instance, in the Keras CVNN library (similar to yours, but built on Keras/TensorFlow), the activation functions are well documented and explained. With PyTorch, however, I can't seem to find the code that shows which implementations of the layers are available (#47052). For instance, with ReLU and tanh there are different implementations available; how do I differentiate between them?
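Not an authoritative answer, but a sketch of the distinction usually drawn: activations with no natural complex definition (such as ReLU) are typically applied in a "split" fashion, i.e. to the real and imaginary parts separately (the same approach complex_tanh above uses), whereas functions such as tanh also have a native complex implementation in PyTorch. The split_relu helper below is hypothetical, written for illustration:

    import torch

    def split_relu(z):
        # 'Split' activation: apply ReLU to the real and imaginary parts
        # separately, then recombine them into a complex tensor.
        return torch.complex(torch.relu(z.real), torch.relu(z.imag))

    z = torch.randn(5, dtype=torch.complex64)
    print(split_relu(z))   # split (component-wise) complex ReLU
    print(torch.tanh(z))   # native complex tanh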
