@0xabad1dea but have you seen timestamps (expressed as seconds since the Epoch) declared as a 32-bit float?
Because I did. In production code. In a BIG company.
The result was that said timestamps had only ~2-minute precision, which triggered interesting bugs elsewhere.
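A minimal sketch of why (C is assumed here, not the original code): a 32-bit float has a 24-bit significand, so for epoch values between 2^30 and 2^31 seconds (roughly 2004 through 2038) the spacing between representable floats is 2^7 = 128 seconds, i.e. about two minutes.

```c
/* Hypothetical illustration: storing a Unix timestamp in a float
 * rounds it to the nearest 128 seconds for present-day values. */
#include <stdio.h>
#include <time.h>

int main(void) {
    time_t now = time(NULL);       /* exact seconds since the Epoch */
    float  as_float = (float)now;  /* only 24 significand bits fit  */

    double error = (double)as_float - (double)now;
    printf("exact:    %lld\n", (long long)now);
    printf("float32:  %.0f\n", (double)as_float);
    printf("rounding: %.0f seconds (ULP here is 128 s)\n", error);
    return 0;
}
```

Running this today prints a rounding error of up to ±64 seconds, which is exactly the ~2-minute granularity described above.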