
There Is No 100% Safe System

That seems to be the conclusion of the (fascinating) article below, and it makes frightening sense. The reasons:

1. We can identify and ameliorate a lot of single-point failures, but in doing so we make systems more complex and more prone to multi-point failures.

2. Pilots are human. Humans make mistakes, including mistaken assumptions about which failures are and aren't possible — to the point where, under stress, they ignore warnings about failures they "know" can't happen.

3. The world's a complicated place.

Where this gets interesting is when you start talking about very dangerous technologies, where you can minimize the likelihood of a catastrophe but where the cost of one would be huge. Nuclear plants are an obvious example: there's a lot we can do to make them extremely safe. But … #ddtb

Embedded Link

Disaster book club: What you need to read to understand the crash of Air France 447 – Boing Boing

