
A decline in the final baseline accuracy with the alternate channel_weights #3

Open
GuokaiLiu opened this issue Feb 11, 2018 · 1 comment


@GuokaiLiu

Hello Udibr,
Here is another question I'd like to ask for your help with.
I replaced the following code, as suggested in mnist-simple.ipynb:

Before with baseline accuracy=0.98:

channel_weights = baseline_confusion.copy()
channel_weights /= channel_weights.sum(axis=1, keepdims=True)
# perm_bias_weights[prediction,noisy_label] = log(P(noisy_label|prediction))
channel_weights = np.log(channel_weights + 1e-8)
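(For reference, a minimal runnable sketch of what this block computes, using a made-up 3-class confusion matrix; the values are illustrative, not from the notebook:)

```python
import numpy as np

# Hypothetical 3-class confusion matrix: rows = predictions, cols = noisy labels.
baseline_confusion = np.array([[90., 5., 5.],
                               [4., 92., 4.],
                               [6., 6., 88.]])

channel_weights = baseline_confusion.copy()
channel_weights /= channel_weights.sum(axis=1, keepdims=True)  # each row sums to 1
channel_weights = np.log(channel_weights + 1e-8)  # log-probs; epsilon avoids log(0)

# Each row now holds log P(noisy_label | prediction); exponentiating a row
# recovers a valid probability distribution.
print(np.exp(channel_weights).sum(axis=1))
```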

After with baseline accuracy=0.78:

# If you don't have a pre-trained baseline model, use this instead
channel_weights = (
    np.array([[np.log(1. - NOISE_LEVEL) if i == j
               else np.log(0.46 / (nb_classes - 1.))
               for j in range(nb_classes)]
              for i in range(nb_classes)])
    + 0.01 * np.random.random((nb_classes, nb_classes)))
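(As a side note, my reading of this snippet, assuming the hard-coded 0.46 stands for a NOISE_LEVEL of 0.46: without the random term, each row of exp(channel_weights) is a valid probability distribution that keeps the predicted class with probability 1 - NOISE_LEVEL and spreads NOISE_LEVEL uniformly over the other classes. A self-contained sketch:)

```python
import numpy as np

NOISE_LEVEL = 0.46  # assumption: the 0.46 in the snippet above is this constant
nb_classes = 10

# Log-probabilities of a symmetric noise channel: keep the label with
# probability 1 - NOISE_LEVEL, flip uniformly to any other class otherwise.
base = np.array([[np.log(1. - NOISE_LEVEL) if i == j
                  else np.log(NOISE_LEVEL / (nb_classes - 1.))
                  for j in range(nb_classes)]
                 for i in range(nb_classes)])

# Each row of exp(base) sums to (1 - NOISE_LEVEL) + NOISE_LEVEL = 1.
print(np.exp(base).sum(axis=1))

# The 0.01 * random term perturbs this initialization slightly.
channel_weights = base + 0.01 * np.random.random((nb_classes, nb_classes))
```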

#1
There is a significant decline in the final accuracy and I can't find the reason.
Do you have any suggestions? Here is my jupyter notebook result
#2
I also don't understand why the term 0.01 * np.random.random((nb_classes, nb_classes)) appears in the expression above.

Thanks for your help:)

@Billy1900

Billy1900 commented Apr 18, 2021

  1. It seems like you have solved your problem?
  2. Maybe it is a way to initialize the weights?
  3. I have also published a PyTorch version; you are welcome to point out any issues: https://github.com/Billy1900/Noise-Adaption-Layer
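(A possible illustration of point 2, purely a guess rather than anything confirmed in the repo: the small random term breaks the symmetry of the otherwise identical off-diagonal entries, so gradient descent can update the per-class noise channels differently from the start:)

```python
import numpy as np

nb_classes = 3
NOISE_LEVEL = 0.46  # hypothetical value for illustration

# Symmetric initialization: every off-diagonal entry is identical.
base = np.array([[np.log(1. - NOISE_LEVEL) if i == j
                  else np.log(NOISE_LEVEL / (nb_classes - 1.))
                  for j in range(nb_classes)]
                 for i in range(nb_classes)])

rng = np.random.default_rng(0)
perturbed = base + 0.01 * rng.random((nb_classes, nb_classes))

# Before the perturbation all off-diagonal weights are equal;
# afterwards they differ, which breaks the symmetry.
off_diag = base[~np.eye(nb_classes, dtype=bool)]
print(np.unique(off_diag).size)      # 1 distinct value
off_diag_p = perturbed[~np.eye(nb_classes, dtype=bool)]
print(np.unique(off_diag_p).size)    # 6 distinct values with this seed
```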
