Doubt about the cosine_angular_loss #47

Open

ireneMsm2020 opened this issue May 26, 2023 · 2 comments

@ireneMsm2020
Hello,
I have a doubt about cosine_angular_loss. After the permute operation, the channel dimension is dim 3, but in the normalize operation you set dim=1. Is this a mistake? Could you please explain?


import torch


def masked_cosine_angular_loss(preds, target, mask_valid):
    # Map predictions and targets from [0, 1] to [-1, 1].
    preds = (2 * preds - 1).clamp(-1, 1)
    target = (2 * target - 1).clamp(-1, 1)
    # mask_valid: [B, C, H, W] -> boolean mask of shape [B, H, W].
    mask_valid = mask_valid[:, 0, :, :].bool().squeeze(1)
    # Permute to [B, H, W, C]; boolean indexing with a [B, H, W] mask
    # flattens the valid pixels into an [N, C] tensor.
    preds = preds.permute(0, 2, 3, 1)[mask_valid, :]
    target = target.permute(0, 2, 3, 1)[mask_valid, :]
    # After the flattening above, dim=1 is the channel dimension.
    preds_norm = torch.nn.functional.normalize(preds, p=2, dim=1)
    target_norm = torch.nn.functional.normalize(target, p=2, dim=1)
    # Negative mean cosine similarity between predicted and target normals.
    loss = torch.mean(-torch.sum(preds_norm * target_norm, dim=1))
    return loss

@alexsax (Collaborator) commented May 26, 2023

Hi, your concern looks right to me, but the models are quite good, so I think the problem might just be in the released code (and in the evaluation code, which we borrowed directly from the OASIS paper). @Ainaz99, what do you think?

@limacv commented Sep 5, 2023

Doesn't the mask operation flatten the two tensors into an [N, 3] shape? In that case dim=1 should be correct.
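A quick shape trace supports this (a minimal sketch with made-up sizes; the dimensions here are illustrative, not from the repo):

import torch

# Hypothetical sizes: batch 2, 3 channels, 4x4 images.
preds = torch.rand(2, 3, 4, 4)
mask_valid = torch.ones(2, 1, 4, 4)

mask = mask_valid[:, 0, :, :].bool()       # shape [2, 4, 4]
flat = preds.permute(0, 2, 3, 1)[mask, :]  # boolean indexing flattens to [N, 3]
print(flat.shape)                          # torch.Size([32, 3]) when all pixels are valid
normed = torch.nn.functional.normalize(flat, p=2, dim=1)  # dim=1 is the channel dim

So normalize(..., dim=1) operates over the 3 channel values of each valid pixel, as intended.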
