Coverage for nondeterministic_seeded, respect it in constant prop (#83650)
- nondeterministic_seeded was not applied to enough functions. I added
  some heuristics to codegen to identify functions that are likely to
  be random, and tagged a number of functions accordingly. I'm not sure
  I got all of them.
- Don't constant propagate through nondeterministic functions in FX
tracing.
It would be better to test for the tag directly, but that would be quite an effort.
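
The second point can be illustrated with a minimal, self-contained sketch (all names here are hypothetical, not the actual FX internals): a tracer that constant-folds a call must refuse when the op carries a nondeterministic tag, otherwise one random draw gets baked into the graph as a constant.

```python
import random

# Hypothetical stand-in for ops carrying the nondeterministic_seeded tag.
NONDETERMINISTIC = {"normal_"}

def try_constant_fold(op_name, fn):
    """Fold fn() to a compile-time constant only if op_name is deterministic.

    Folding a nondeterministic op would freeze a single random draw into
    the traced graph, so we raise instead (mirroring the data-dependent
    error the real tracer produces).
    """
    if op_name in NONDETERMINISTIC:
        raise RuntimeError(
            f"cannot extract data-dependent value from nondeterministic op {op_name!r}"
        )
    return fn()

# A deterministic op folds fine:
print(try_constant_fold("add", lambda: 1 + 1))

# A random op must not be folded:
try:
    try_constant_fold("normal_", lambda: random.gauss(0.0, 1.0))
except RuntimeError as e:
    print("refused:", e)
```

This is only a sketch of the invariant the PR enforces; the real change consults the operator's tags inside FX's constant-propagation path.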
Signed-off-by: Edward Z. Yang <[email protected]>
Pull Request resolved: https://github.com/pytorch/pytorch/pull/83650
Approved by: https://github.com/bdhirsh, https://github.com/eellison
diff --git a/test/test_proxy_tensor.py b/test/test_proxy_tensor.py
index c3bda7e..0dc46da 100644
--- a/test/test_proxy_tensor.py
+++ b/test/test_proxy_tensor.py
@@ -454,6 +454,17 @@
lambda: make_fx(f, tracing_mode=self.tracing_mode)()
)
+ def test_constant_random(self):
+ def f():
+ val = torch.tensor([2.0])
+ val.normal_()
+ return val.item()
+
+ self.assertRaisesRegex(
+ RuntimeError, "data-dependent",
+ lambda: make_fx(f, tracing_mode=self.tracing_mode)()
+ )
+
def test_decomposition_interpreter(self):
def fn(x):
return torch.nn.functional.silu(x)