"... In a discussion of the earthquake and tsunami that produced the 2011 Fukushima nuclear disaster in Japan, Taleb writes: “Not seeing a tsunami or an economic event coming is excusable; building something fragile to them is not.” And in the case of the Fukushima disaster, authorities seem to be responding appropriately: not by developing better predictive models, but by building smaller and less vulnerable reactors."
source: Nassim Nicholas Taleb on Accepting Uncertainty, Embracing Volatility
"4. Do not let someone making an “incentive” bonus manage a nuclear plant – or your financial risks. Odds are he would cut every corner on safety to show “profits” while claiming to be “conservative”. Bonuses do not accommodate the hidden risks of blow-ups. It is the asymmetry of the bonus system that got us here. No incentives without disincentives: capitalism is about rewards and punishments, not just rewards."
The Tokyo District Public Prosecutors Office, working with tsunami and earthquake experts, investigated for years and concluded that the disaster was not predictable. There may be differing opinions, but so far no clear evidence shows that it was predictable.
I don't think this post was limited to side projects. Many people reading this are working, or will work, in environments where safety and correctness are important, if not critical.
Very few software projects have potential consequences on the scale of a nuclear reactor disaster. If your project does, then "release early" and many other popular software practices will probably not apply.
Relatively few, but they're everywhere: autopilot on airplanes, ABS brakes, self-driving cars.
"Releasing early", in the sense of getting these things into the real world as early as possible so they can be iterated on with real-world data is actually pretty crucial if you think about it.
The key here is figuring out how to release gently. The greater the risk, the more iteratively and gently you should be releasing. If you were building landing gear controllers for passenger planes and your release plan was to spend a very long time planning carefully, test like crazy in non-real-world conditions, and then roll out worldwide across all commercial flights at once, I'd be likely to pass on air travel for a while (and keep a lookout overhead).
When people talk about releasing early, they're not talking about big-bang releases at all. There are alphas, betas, feature toggles, a/b tests, and dark launches for a reason.
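As a concrete illustration of the feature-toggle idea above, here is a minimal sketch (all names are hypothetical, not from any particular library) of a percentage-based rollout: each user is hashed into a stable bucket so the same user consistently sees the same variant while the rollout percentage is gradually ramped up.

```python
import hashlib

def is_enabled(feature: str, user_id: str, rollout_percent: float) -> bool:
    """Return True if `feature` is on for this user at the given rollout.

    The hash of (feature, user) gives a stable bucket in [0, 100), so a
    user who gets the feature at 10% keeps it as the rollout grows.
    """
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < rollout_percent

# At 0% nobody gets the feature; at 100% everyone does; in between,
# roughly that fraction of users do, and the assignment is deterministic.
assert not is_enabled("new-checkout", "alice", 0)
assert is_enabled("new-checkout", "alice", 100)
```

A real system would read the percentage from a config service so it can be dialed up (or rolled back) without redeploying, which is exactly what makes a gentle release possible.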
I think that's actually kind of debatable. There are plenty of things that happen online that can, for example, expose kids to pedophiles, give channels for illegal and/or immoral activities, etc.
We are currently wrestling with questions that can become personal catastrophes because of personal tech, such as teens sexting each other and then being charged with promoting child pornography, or crap like that. If the phones and apps did not exist, they would probably be showing each other their bits in person instead, and in many cases that would be perfectly legal. Get a phone involved, and suddenly one or both of them can potentially go to jail.
But you just keep believing that software development is harmless, good clean fun. Yeah.
Often the dangerous failures that happen are unknown unknowns. Fukushima is just one example of this.