Disable XLA with TensorFlow determinism #20315
Codecov Report
Attention: Patch coverage is
Additional details and impacted files
@@ Coverage Diff @@
## master #20315 +/- ##
==========================================
- Coverage 78.83% 74.45% -4.38%
==========================================
Files 512 512
Lines 48995 49062 +67
Branches 9021 9035 +14
==========================================
- Hits 38624 36529 -2095
- Misses 8509 10728 +2219
+ Partials 1862 1805 -57
Flags with carried forward coverage won't be shown. View full report in Codecov by Sentry.
Thanks for the PR! Yes, this seems like a reasonable change. Please take a look at the test failure:
FAILED keras/src/trainers/trainer_test.py::TrainerDistributeTest::test_end_to_end_tf_distribute - RuntimeError: Random ops require a seed to be set when determinism is enabled. Please set a seed before running the op, e.g. by calling tf.random.set_seed(1).
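The failure above comes from TensorFlow's op-determinism check. A minimal sketch of the behavior, assuming TensorFlow 2.9+ (where `tf.config.experimental.enable_op_determinism()` is the public entry point):

```python
# Sketch: with op determinism enabled, random ops raise the RuntimeError
# above unless a global seed has been set first.
import tensorflow as tf

tf.config.experimental.enable_op_determinism()
tf.random.set_seed(1)  # required; omitting this reproduces the failure

a = tf.random.normal([3])
tf.random.set_seed(1)  # resetting the global seed replays the same sequence
b = tf.random.normal([3])
print(bool(tf.reduce_all(a == b)))  # identical draws under the same seed
```

This is why the skipped-over test needs an explicit `tf.random.set_seed(...)` once determinism is enabled for it.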
…al state of tensorflow across the tests
Hi @fchollet, I have no idea why the jax backend test failed, since I specifically decorated the test for the tensorflow backend: keras/keras/src/trainers/trainer_test.py Lines 1721 to 1741 in f24c863
If I remove
keras/src/trainers/trainer_test.py
Outdated
@@ -1718,6 +1718,28 @@ def call(self, x, training=None):
        for v in model._compile_loss.variables:
            self.assertAllClose(v, 0.0)

    pytest.mark.skipif(
You forgot the `@` for the decorator
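For context on why this matters: a bare `pytest.mark.skipif(...)` call constructs a mark but never attaches it to the following function, so the skip condition is silently ignored. A small illustration (the test names are made up):

```python
# A bare call creates a MarkDecorator and immediately discards it;
# only the @ syntax actually applies the mark to the test function.
import pytest

# Bug: mark constructed, then thrown away -- this test is NOT skipped.
pytest.mark.skipif(True, reason="never applied")
def test_runs_anyway():
    pass

# Fix: with @, pytest records the mark and honors the skip condition.
@pytest.mark.skipif(True, reason="applied as intended")
def test_is_skipped():
    pass

print(hasattr(test_runs_anyway, "pytestmark"))  # mark missing on the bare call
print(hasattr(test_is_skipped, "pytestmark"))   # mark present when decorated
```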
:-) it happens...
Currently, if someone runs https://keras.io/examples/keras_recipes/reproducibility_recipes/, they get an error:
(PR for fixing recipe keras-team/keras-io#1941)
Considering that Keras 2 had `jit_compile = None` (i.e. XLA not enabled by default), wouldn't it be better not to use XLA when TF determinism is enabled? (I can check the failed tests if the team is willing to merge this PR.)
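A hypothetical sketch of the logic this PR argues for (`resolve_jit_compile` and its signature are illustrative, not Keras's actual implementation): when `jit_compile` is left at its `"auto"` default, fall back to non-XLA execution if TF determinism is enabled, while still respecting an explicit user choice.

```python
# Hypothetical helper mirroring the PR's intent: XLA is not guaranteed
# to be deterministic, so the "auto" default should not pick it when
# determinism is enabled. An explicit jit_compile value is respected.
def resolve_jit_compile(jit_compile, determinism_enabled):
    if jit_compile == "auto":
        return not determinism_enabled  # XLA off under determinism
    return bool(jit_compile)  # explicit user choice wins

print(resolve_jit_compile("auto", determinism_enabled=True))   # -> False
print(resolve_jit_compile("auto", determinism_enabled=False))  # -> True
print(resolve_jit_compile(True, determinism_enabled=True))     # -> True
```

This matches the Keras 2 behavior described above, where XLA was simply never switched on implicitly.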