Force-pushed from df10415 to 8ffd190
Force-pushed from 4ec1d74 to 054bb36
```python
(t0, d0), (t1, d1) = self, other
if t0 != t1:
    return False
return (d0 == d1) or (np.isnan(d0) and np.isnan(d1))
```
Are there any other corner cases with `np.inf`? I don't think so, just checking.
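For what it's worth, `np.inf` shouldn't need its own branch: `inf == inf` is already `True`, and `np.isnan(np.inf)` is `False`, so the NaN fallback is never taken for infinities. A minimal standalone sketch of the comparison above (with the tuple unpacking reduced to a plain function):

```python
import numpy as np


def data_equal(d0, d1):
    # Mirrors the comparison in the diff: plain equality, with NaN
    # treated as equal to itself (since `nan == nan` is False).
    return (d0 == d1) or (np.isnan(d0) and np.isnan(d1))


# inf compares via the == branch; NaN falls through to the isnan branch.
print(data_equal(np.inf, np.inf))   # True
print(data_equal(np.inf, -np.inf))  # False
print(data_equal(np.nan, np.nan))   # True
```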
```python
    cache hit overwrites memo[c2]=c1, c2 is evicted from variables
    while the cache-miss node still references it.
    """
    from pytensor.graph.fg import FrozenFunctionGraph
```
We should consider adding the ruff rule that requires all imports to be at the top of the file; I'm embarrassed I checked this in.
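If we do that, the relevant rule should be `PLC0415` (pylint's `import-outside-top-level`), which ruff implements and which flags function-local imports like the one above. A possible `pyproject.toml` fragment (assuming the project configures ruff there):

```toml
[tool.ruff.lint]
# PLC0415: flag imports that are not at the top level of the module,
# e.g. imports inside a function body.
extend-select = ["PLC0415"]
```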
```python
def test_constant_output_equality(self):
    """FFGs with distinct but equal constant outputs should be equal."""
    c1 = ScalarConstant(float64, 3.14)
```
There are other circle constants you know: https://www.youtube.com/watch?v=dUGFb_HgG1c
```python
except KeyError:
    if not isinstance(o, AtomicVariable):
        raise ValueError(
            f"Output variable {o} could not be mapped to a frozen graph variable. "
```
Is it really possible for a user to reach this? I ask because the error isn't actionable, so it becomes yet another cryptic pytensor-special error if you do end up here somehow.
I had this doubt myself before. I don't think so; it would trigger in the first loop. Want to take it out?
One edge case is a NominalVariable output that is not specified as an input. That should fail, which means this branch should really only accept Constants (not general AtomicVariables).
That case may only be detectable here, but I would need to check.
We can make it an assert if we just want it for safety, but I don't think it should ever be hit (keeping the message).
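A hypothetical sketch of that suggestion, with `Constant` and `map_output` as stand-ins for the real pytensor classes and lookup loop (not the actual implementation): the believed-unreachable branch becomes an `assert` that keeps the message, rather than a user-facing `ValueError`.

```python
class Constant:
    """Stand-in for pytensor's Constant."""


class FloatConstant(Constant):
    def __init__(self, value):
        self.value = value


def map_output(o, memo):
    """Map an output variable to its frozen-graph counterpart via `memo`."""
    try:
        return memo[id(o)]
    except KeyError:
        # Believed unreachable for well-formed graphs: anything other
        # than a Constant should have been mapped in the first loop.
        assert isinstance(o, Constant), (
            f"Output variable {o} could not be mapped to a frozen graph variable."
        )
        return o


c = FloatConstant(3.14)
print(map_output(c, {}) is c)  # True: unmapped Constants pass through
```

One caveat with this route: `assert` is stripped under `python -O`, which is fine for a pure safety check but not if we ever want the message to reach users.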
Related to #2033