
Method and Results #2

Open
xnejed07 opened this issue Sep 6, 2018 · 3 comments
Comments

xnejed07 commented Sep 6, 2018

Hi, I have read your paper with the method description and found it really interesting; I have several theoretical questions. First, since the noise matrix is unconstrained (in our case it usually has negative values), how do you extract the normalized (0, 1) values shown in the figures? Do you apply softmax to each row? Second, how does your model behave when applied to the correct labels, without noise?

ijindal (Owner) commented Sep 17, 2018

Hi, thanks for reading this work.

Yes, we apply softmax to each row to normalize the learned noise matrix.
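For illustration, the row-wise softmax normalization described above can be sketched as follows (the matrix values are hypothetical, not from the paper):

```python
import numpy as np

# Hypothetical learned noise matrix W: unconstrained, may contain
# negative entries, as discussed in the question above.
W = np.array([[ 2.0, -1.0,  0.5],
              [-0.3,  1.8, -0.7],
              [ 0.1, -0.5,  2.2]])

def row_softmax(W):
    """Normalize each row of W into a probability distribution via softmax."""
    # Subtract the row-wise max before exponentiating for numerical stability.
    e = np.exp(W - W.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

P = row_softmax(W)
# Every entry of P lies in (0, 1) and each row sums to 1, which is the
# normalized form suitable for the figures.
```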

With correct labels (0% noise), this model learns a very pessimistic noise model (an aggressive dropout), so we find it does not perform well on correctly labeled datasets. We have developed a new model to handle all kinds of label noise; it is currently under review.

Hope this helps.

@CompareSan

In the paper you don't say that you apply the softmax to every row; you say quite the opposite. Quote:
"the matrix W is unconstrained during optimization. Because the softmax layer implicitly normalizes the resulting conditional probabilities, there is no need to normalize W or force its entries to be nonnegative. This simplifies the optimization process by eliminating the normalization step described above."

@CompareSan

You don't even say anything about the initialization of the weight matrix W. If you initialize it randomly, good luck with the convergence.
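The initialization concern can be made concrete with a small sketch. One common choice in related noise-adaptation-layer work (not stated in this paper; the scale value here is purely illustrative) is to start W at a scaled identity, so that the row-wise softmax of W begins close to the identity matrix, i.e. a "no noise" channel:

```python
import numpy as np

def init_noise_matrix(num_classes, scale=6.0):
    # Start W at a scaled identity so that the softmax over each row is
    # strongly diagonal: the noise channel initially passes labels through
    # almost unchanged. (Hypothetical sketch; initialization is not
    # specified in the paper.)
    return scale * np.eye(num_classes)

W0 = init_noise_matrix(4)
e = np.exp(W0 - W0.max(axis=1, keepdims=True))
P0 = e / e.sum(axis=1, keepdims=True)
# Each row of P0 sums to 1, with the diagonal entries dominating.
```

Starting near the identity avoids the convergence problem raised above: a randomly initialized W would map clean predictions through an arbitrary label-mixing channel at the start of training.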
