@xyhhx @dean @foone There's a single deterministic computation producing the data you read back, independent of what hardware you're running on.
In principle you should be able to make GPU rendering deterministic/bit-exact, but there are so many layers where things can diverge (drivers, shader compilers, hardware floating-point behavior) that in practice the readback always ends up being a fingerprint.
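A minimal sketch of the mechanism (not actual GPU code, and the "device" orders are hypothetical): different hardware/drivers may accumulate floating-point values in different orders, so two mathematically equivalent renders can differ in their low bits, and hashing the readback turns that difference into a fingerprint.

```python
import hashlib
import struct

def render_sum(values):
    """Toy stand-in for a GPU render pass: a floating-point reduction
    whose accumulation order is decided by the 'hardware'."""
    acc = 0.0
    for v in values:
        acc += v
    return acc

# Inputs chosen so rounding order visibly matters; the exact sum is 2.0.
pixel_inputs = [1.0, 1e100, 1.0, -1e100]

a = render_sum(pixel_inputs)            # "device A": sums left to right -> 0.0
b = render_sum(reversed(pixel_inputs))  # "device B": sums right to left -> 1.0

# Hashing the raw bits of the readback is what makes this a fingerprint:
# same scene, same math, different hash per "device".
fp_a = hashlib.sha256(struct.pack("<d", a)).hexdigest()[:16]
fp_b = hashlib.sha256(struct.pack("<d", b)).hexdigest()[:16]
```

Here both "devices" compute the same mathematical sum, yet neither gets 2.0 and they disagree with each other, so `fp_a != fp_b`; real WebGL/canvas fingerprinting exploits the same effect across GPUs, drivers, and OSes.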