The weather was dreadful today, "killer hot" and no hype about it. So I stayed in and binge-watched HBO's Chernobyl.
It's well told and gives a sobering look at a terrifying near-miss; the outcome could have been so much worse. And it's a story of the dangers of politically driven engineering, a story of overconfidence and of what happens when you forget that failure is an option and Murphy never sleeps -- especially not when hazards go uncommunicated.
This is the story of many engineering disasters: when engineering bumps up against politics, politics often triumphs; the dam must hold, the lower temperature limit for Shuttle launches is just a guideline, the power grid is sufficiently robust.... Political numbers are often nudged, adjusted, massaged; what's a few billion dollars or rubles here or there? Engineering numbers don't work that way; even the "wiggle room" is part of the calculation.
...You can't fudge it; you can't cajole it. If you built mistakes into the technology, hiding them doesn't make them go away.
Chernobyl is an engrossing docudrama. At the very end, there's a short segment explaining the simplifications used to tell the story, the large science staff distilled to just two people, one real, one composite. There's no similar segment directly calling out the dangers of misinformation, of concealing information from the people in a position to understand and apply it.
All around you, every day, there are plenty of potential small-scale engineering disasters. And plenty of politicians who don't know how that stuff works are making decisions about it. Below them sit plenty of non-technical managers who can't tell genuine cause for concern from under- or over-reaction. And an Internet full of bad information.
None of us can know everything. Each of us knows a few things -- some of which are true. Be certain of what you know; be wary of both complacency and panic.