@Di4na @clacke @makdaam @hendric That doesn’t seem to be the case at all to me. The Therac-25 report had quite a few big lessons.
• Data races can exist anywhere shared mutable state exists. This was poorly understood at the time. Language people have taken this to heart with copy-on-write data structures, static analysis of control flow, and more recently with compile-time data-race checking, as in Swift 6’s strict concurrency model (see the first sketch after this list). This kind of issue is why those capabilities exist, and why you shouldn’t just turn them off to silence warnings.
• Software interlocks are strictly worse than hardware interlocks: they have more opportunities to fail in non-obvious ways. (The Therac-25 dropped the hardware interlocks its predecessors had and relied on software alone.)
• Safety-critical software has become a much more formalized discipline, finally matching the rigor of real engineering. For example, techniques were developed to prove a program is free of whole classes of bugs by proving it exactly matches the behaviors defined by its formal specification: no undefined behaviors, and no missing behaviors (a toy illustration follows this list).
• Reported issues should be treated as real until you can prove what happened. Part of the reason the Therac-25 hurt so many people is that the company brushed off the early issue reports.
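To make the Swift 6 point concrete, here’s a toy sketch of mine (nothing from the report), assuming the Swift 6 language mode: the compiler refuses to build code where unsynchronized shared mutable state crosses a concurrency boundary.

```swift
// Not Sendable: a plain class with unprotected mutable state.
final class Counter {
    var value = 0
}

func racy() {
    let counter = Counter()
    for _ in 0..<2 {
        Task.detached {
            // Swift 6 error: capture of 'counter' with non-Sendable
            // type 'Counter' in a `@Sendable` closure. The data race
            // is caught before the program ever runs.
            counter.value += 1
        }
    }
}

// The fix the compiler steers you toward: an actor, which
// serializes all access to its mutable state.
actor SafeCounter {
    private var value = 0
    func increment() { value += 1 }
}
```

Suppressing that error (say, by staying in the Swift 5 language mode) is exactly the “turn it off to silence warnings” move the first bullet warns against.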
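And for the formal-specification point, a toy illustration in Lean 4 (again my example, not the actual techniques used on any particular device): the spec and the implementation are written separately, and the theorem is machine-checked to hold for every input, not just the ones you happened to test.

```lean
-- Spec: the behavior the requirements define.
def spec (n : Nat) : Nat := 2 * n

-- Implementation: the program as written.
def impl (n : Nat) : Nat := n + n

-- Machine-checked proof that the implementation exactly matches
-- the spec: no missing behaviors, no extra ones.
theorem impl_meets_spec (n : Nat) : impl n = spec n := by
  unfold impl spec  -- reduce both sides to plain arithmetic
  omega             -- linear-arithmetic decision procedure finishes it
```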
A lot of the company-culture problems the incidents exposed are still major issues today:
• The company thought their software was perfect, and they didn’t include it in their analysis of potential failure modes.
• They didn’t have any independent review of their code.
• They shipped straight to production: the hardware and software were never tested together outside customer installations.
• They didn’t document error codes, and didn’t differentiate between minor errors and safety-critical errors.