Hello~
Thank you very much for sharing the code!
I tried to use my own dataset (with the same shape as MNIST) with this code. After some iterations, the training loss becomes NaN. After carefully checking the code, I found that the following line may trigger the NaN loss:
In TensorFlow-Examples/examples/2_BasicModels/logistic_regression.py:
If `pred` contains a 0 (it is the output of softmax, which can underflow), `tf.log(pred)` evaluates to `-inf` because log(0) is undefined, and multiplying that `-inf` by a zero entry of the one-hot label gives NaN in the loss. It could be fixed by making the following changes:
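As a minimal sketch of the failure and of the clipping fix (using NumPy here to mirror the TensorFlow ops; the epsilon value `1e-10` is my own choice, analogous to `tf.clip_by_value(pred, 1e-10, 1.0)`):

```python
import numpy as np

# One-hot label and a softmax output in which one class probability
# has underflowed to exactly 0.
y = np.array([[0.0, 1.0]])
pred = np.array([[0.0, 1.0]])

# Original cost: log(0) = -inf, and the label entry 0 times -inf is NaN.
with np.errstate(divide="ignore", invalid="ignore"):
    naive = np.mean(-np.sum(y * np.log(pred), axis=1))

# Fix: clip the predictions away from 0 before taking the log.
eps = 1e-10
safe = np.mean(-np.sum(y * np.log(np.clip(pred, eps, 1.0)), axis=1))

print(np.isnan(naive))   # True
print(np.isfinite(safe)) # True
```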
or
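Alternatively, compute the cross-entropy from the logits rather than from the softmax output, which is what `tf.nn.softmax_cross_entropy_with_logits` does internally via the log-sum-exp trick. A NumPy sketch of that idea (the function name and test values here are mine, not from the repo):

```python
import numpy as np

def cross_entropy_from_logits(logits, labels):
    # log(softmax(z)) computed stably: shift by the row max, then use
    # log-sum-exp instead of ever materializing softmax probabilities.
    z = logits - logits.max(axis=1, keepdims=True)
    log_softmax = z - np.log(np.sum(np.exp(z), axis=1, keepdims=True))
    return np.mean(-np.sum(labels * log_softmax, axis=1))

# Extreme logits would make a naive softmax underflow to exactly 0,
# yet the loss stays finite here.
logits = np.array([[1000.0, -1000.0]])
labels = np.array([[0.0, 1.0]])
loss = cross_entropy_from_logits(logits, labels)
print(np.isfinite(loss))  # True
```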
Hope to hear from you ~
Thanks in advance! : )