System accident

A system accident (or normal accident) is an "unanticipated interaction of multiple failures" in a complex system. This complexity can be of technology or of human organizations, and is frequently both. A system accident can be easy to see in hindsight but extremely difficult to foresee, because there are simply too many action pathways to seriously consider all of them. Charles Perrow first developed these ideas in the mid-1980s. Safety systems themselves are sometimes the added complexity that leads to this type of accident.

Pilot and author William Langewiesche used Perrow's concept in his analysis of the factors at play in a 1996 aviation disaster. He wrote in The Atlantic in 1998: "the control and operation of some of the riskiest technologies require organizations so complex that serious failures are virtually guaranteed to occur."
Characteristics and overview

In 2012 Charles Perrow wrote, "A normal accident is where everyone tries very hard to play safe, but unexpected interaction of two or more failures (because of interactive complexity) causes a cascade of failures (because of tight coupling)." Perrow uses the term normal accident to emphasize that, given the current level of technology, such accidents are highly likely over a number of years or decades. James Reason extended this approach with human reliability and the Swiss cheese model, now widely accepted in aviation safety and healthcare.
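Perrow's two ingredients, interactive complexity and tight coupling, lend themselves to a toy illustration. The Python sketch below is not from Perrow or any source cited in this article: the component count, the per-component fault probability, the nearest-neighbour coupling rule, and the five-component threshold for calling a run a "system accident" are all invented for illustration. It shows only how the same rate of small, independent faults stays local under loose coupling but cascades under tight coupling.

    import random

    def simulate(n_components=50, p_fail=0.02, coupling=0.1, trials=10_000):
        """Fraction of trials in which independent faults cascade into a
        system-wide accident (here, arbitrarily: >= 5 failed components).

        `coupling` is the chance that a failed component also knocks out
        each component directly depending on it."""
        accidents = 0
        for _ in range(trials):
            # Small, independent initial faults.
            failed = {i for i in range(n_components) if random.random() < p_fail}
            frontier = list(failed)
            while frontier:                     # propagate through couplings
                src = frontier.pop()
                for dst in (src - 1, src + 1):  # each unit couples to its neighbours
                    if 0 <= dst < n_components and dst not in failed \
                            and random.random() < coupling:
                        failed.add(dst)
                        frontier.append(dst)
            if len(failed) >= 5:
                accidents += 1
        return accidents / trials

    print("loosely coupled:", simulate(coupling=0.1))  # rare cascades
    print("tightly coupled:", simulate(coupling=0.9))  # frequent cascades

Under these made-up numbers the expected number of initial faults is the same in both runs (about one in fifty components); only the propagation differs. That mirrors Perrow's point that coupling, rather than component unreliability, is what turns small failures into system accidents.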
These accidents often resemble Rube Goldberg devices in the way that small errors of judgment, flaws in technology, and insignificant damage combine to form an emergent disaster. Langewiesche writes about "an entire pretend reality that includes unworkable chains of command, unlearnable training programs, unreadable manuals, and the fiction of regulations, checks, and controls." Greater formality and effort to get things exactly right can, at times, actually make failure more likely. For example, employees are more likely to delay reporting changes, problems, and unexpected conditions when the organizational procedures for adjusting to changing conditions are complex, difficult, or laborious.

A contrasting idea is that of the high reliability organization. In his assessment of the vulnerabilities of complex systems, Scott Sagan, for example, discusses in multiple publications their robust reliability, especially regarding nuclear weapons. The Limits of Safety (1993) provided an extensive review of close calls during the Cold War that could have resulted in a nuclear war by accident.
System accident examples

Apollo 13

The Apollo 13 Review Board stated in the introduction to chapter five of their report:

... It was found that the accident was not the result of a chance malfunction in a statistical sense, but rather resulted from an unusual combination of mistakes, coupled with a somewhat deficient and unforgiving design ...

(g): In reviewing these procedures before the flight, officials of NASA, ER, and Beech did not recognize the possibility of damage due to overheating. Many of these officials were not aware of the extended heater operation. In any event, adequate thermostatic switches might have been expected to protect the tank.

Three Mile Island

Perrow considered the Three Mile Island accident normal:

It resembled other accidents in nuclear plants and in other high risk, complex and highly interdependent operator-machine systems; none of the accidents were caused by management or operator ineptness or by poor government regulation, though these characteristics existed and should have been expected. I maintained that the accident was normal, because in complex systems there are bound to be multiple faults that cannot be avoided by planning and that operators cannot immediately comprehend.
ValuJet Flight 592

On May 11, 1996, ValuJet Flight 592, a regularly scheduled ValuJet Airlines flight from Miami International to Hartsfield–Jackson Atlanta, crashed about 10 minutes after taking off as a result of a fire in the cargo compartment caused by improperly stored and labeled hazardous cargo. All 110 people on board died. The airline had a poor safety record before the crash. The accident brought widespread attention to the airline's management problems, including inadequate training of employees in the proper handling of hazardous materials. The maintenance manual for the MD-80 aircraft documented the necessary procedures and was "correct" in a sense; however, it was so huge that it was neither helpful nor informative.

Financial crises and investment losses

In a 2014 monograph, economist Alan Blinder stated that complicated financial instruments made it hard for potential investors to judge whether the price was reasonable. In a section entitled "Lesson # 6: Excessive complexity is not just anti-competitive, it's dangerous", he further stated: "But the greater hazard may come from opacity. When investors don't understand the risks that inhere in the securities they buy (examples: the mezzanine tranche of a CDO-Squared; a CDS on a synthetic CDO ...), big mistakes can be made – especially if rating agencies tell you they are triple-A, to wit, safe enough for grandma. When the crash comes, losses may therefore be much larger than investors dreamed imaginable. Markets may dry up as no one knows what these securities are really worth. Panic may set in. Thus complexity per se is a source of risk."
Continuing challenges

Air transport safety

Despite a significant increase in airplane safety since the 1980s, there is concern that automated flight systems have become so complex that they both add to the risks that arise from overcomplication and are incomprehensible to the crews who must work with them. As an example, professionals in the aviation industry note that such systems sometimes switch or engage on their own; crew in the cockpit are not necessarily privy to the rationale for their auto-engagement, causing perplexity. Langewiesche cites industrial engineer Nadine Sarter, who writes about "automation surprises", often related to system modes the pilot does not fully understand or that the system switches to on its own. In fact, one of the more common questions asked in cockpits today is, "What's it doing now?" In response to this, Langewiesche points to the fivefold increase in aviation safety and writes, "No one can rationally advocate a return to the glamour of the past."

In an article entitled "The Human Factor", Langewiesche discusses the 2009 crash of Air France Flight 447 over the mid-Atlantic. He points out that, since the 1980s, when the transition to automated cockpit systems began, safety has improved fivefold. Langewiesche writes, "In the privacy of the cockpit and beyond public view, pilots have been relegated to mundane roles as system managers." He quotes engineer Earl Wiener, who takes the humorous statement attributed to the Duchess of Windsor that one can never be too rich or too thin and adds, "or too careful about what you put into a digital flight-guidance system." Wiener says that the effect of automation is typically to reduce the workload when it is light, but to increase it when it is heavy.

Boeing engineer Delmar Fadden said that once capacities are added to flight management systems, they become impossibly expensive to remove because of certification requirements; if unused, they may in a sense lurk in the depths unseen.
Theory and practice interplay

Human factors in the implementation of safety procedures play a role in the overall effectiveness of safety systems. Maintenance problems are common with redundant systems: maintenance crews can fail to restore a redundant system to active status, whether because they are overworked or because maintenance is deferred due to budget cuts, since managers know that the system will continue to operate without the backup being fixed. Steps in procedures may also be changed and adapted in practice, away from the formal safety rules, often in ways that seem appropriate and rational and that may be essential in meeting time constraints and work demands. In a 2004 Safety Science article, reporting on research partially supported by the National Science Foundation and NASA, Nancy Leveson writes:

However, instructions and written procedures are almost never followed exactly as operators strive to become more efficient and productive and to deal with time pressures ... even in such highly constrained and high-risk environments as nuclear power plants, modification of instructions is repeatedly found and the violation of rules appears to be quite rational, given the actual workload and timing constraints under which the operators must do their job. In these situations, a basic conflict exists between error as seen as a deviation from the normative procedure and error as seen as a deviation from the rational and normally used effective procedure.
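The maintenance observation above has a simple probabilistic reading, sketched below with invented numbers (none of these failure rates come from the sources cited in this article): a backup multiplies safety only while it is actually in service, so a redundant channel silently left offline after maintenance returns the system to near single-channel risk.

    # Illustrative reliability arithmetic for a primary system with one backup.
    # All probabilities are hypothetical, chosen only to show the effect.

    p_primary_fails = 0.01       # chance the primary fails on demand
    p_backup_fails = 0.01        # chance a *working* backup fails on demand
    p_backup_left_offline = 0.1  # chance maintenance never restored the backup

    # Naive redundancy: both independent channels must fail.
    p_ideal = p_primary_fails * p_backup_fails

    # Real redundancy: the backup helps only if it was restored to service.
    p_real = p_primary_fails * (
        p_backup_left_offline * 1.0                     # backup silently absent
        + (1 - p_backup_left_offline) * p_backup_fails  # backup present and working
    )

    print(f"naive redundant system: {p_ideal:.6f}")      # 0.000100
    print(f"with latent maintenance gap: {p_real:.6f}")  # 0.001090

With these illustrative figures the latent maintenance gap dominates: the "redundant" system fails on demand roughly ten times as often as the naive multiplication suggests, which is one way of seeing why safety systems themselves are sometimes the added complexity that leads to a system accident.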
See also

Unintended consequences

Notes

William Langewiesche (March 1998), "The Lessons of ValuJet 592", p. 23. In the same article, Langewiesche continued: "Charles Perrow's thinking is more difficult for pilots like me to accept. Perrow came unintentionally to his theory about normal accidents after studying the failings of large organizations. His point is not that some technologies are riskier than others, which is obvious, but that the control and operation of some of the riskiest technologies require organizations so complex that serious failures are virtually guaranteed to occur. Those failures will occasionally combine in unforeseeable ways, and if they induce further failures in an operating environment of tightly interrelated processes, the failures will spin out of control, defeating all interventions." See especially the last three paragraphs of this 30-plus-page article: "... Understanding why might keep us from making the system even more complex, and therefore perhaps more dangerous, too."

Helmreich (1994) states: "... A system accident is one that requires many things to go wrong in a cascade. Change any element of the cascade and the accident may well not occur, but every element shares the blame."

Langewiesche (2014) writes: "... Since the 1980s, when the shift began, the safety record has improved fivefold, to the current one fatal accident for every five million departures. No one can rationally advocate a return to the glamour of the past."

Leveson (2004) notes: "... In fact, a common way for workers to apply pressure to management without actually going out on strike is to 'work to rule,' which can lead to a breakdown in productivity and even chaos."

Roush (1994) writes that Normal Accidents "is essential reading today for industrial managers, organizational sociologists, historians of technology, and interested lay people alike, because it shows that a major strategy engineers have used in this century to keep hazardous technologies under control—multiple layers of 'fail-safe' backup devices—often adds a dangerous level of unpredictability to the system as a whole".

Accidents at both Chernobyl and Three Mile Island were set off by failed safety systems.

References

Beckham, J. Daniel (January 1999). "The Crash of ValuJet 592". The Beckham Company. Article from a health care consulting company; archived from the original on March 4, 2016.

Blinder, Alan S. (November 2014). "What Did We Learn from the Financial Crisis, the Great Recession, and the Pathetic Recovery?". Griswold Center for Economic Policy Studies Working Papers. No. 243. Princeton University.

Christianson, Marlys K.; Sutcliffe, Kathleen M.; Miller, Melissa A.; Iwashyna, Theodore J. (2011). "Becoming a high reliability organization". Critical Care. 15 (6): 314. doi:10.1186/cc10360. PMC 3388695. PMID 22188677.

Cortright, Edgar M. (chair). "Chapter 5: Findings, Determinations, and Recommendations". Report of Apollo 13 Review Board ("Cortright Report").

Helmreich, Robert L. (1994). "Anatomy of a system accident: The crash of Avianca Flight 052". International Journal of Aviation Psychology. 4 (3): 265–284. doi:10.1207/s15327108ijap0403_4. PMID 11539174.

Hopkins, Andrew (June 2001). "Was Three Mile Island a Normal Accident?". Journal of Contingencies and Crisis Management. 9 (2): 65–72. doi:10.1111/1468-5973.00155.

Langewiesche, William (March 1, 1998). "The Lessons of ValuJet 592". The Atlantic.

Langewiesche, William (September 17, 2014). "The Human Factor: Should Airplanes Be Flying Themselves?". Vanity Fair.

Leveson, Nancy (April 2004). "A New Accident Model for Engineering Safer Systems". Safety Science. 42 (4): 237–270. doi:10.1016/S0925-7535(03)00047-X.

Perrow, Charles (1982). "The President's Commission and the Normal Accident". In Sills, David L.; Wolf, C. P.; Shelanski, Vivien B. (eds.). Accident at Three Mile Island: The Human Dimensions. Boulder, Colorado: Westview Press. pp. 173–184. ISBN 978-0-86531-165-7.

Perrow, Charles (May 29, 2000). Organizationally Induced Catastrophes. University Corporation for Atmospheric Research, Institute for the Study of Society and Environment.

Perrow, Charles (December 2012). "Getting to Catastrophe: Concentrations, Complexity and Coupling". The Montréal Review.

Rasmussen, Jens; Pejtersen, Annelise Mark; Goodstein, L. P. (1994). Cognitive Systems Engineering. New York: Wiley. ISBN 978-0-471-01198-9.

Reason, James (1990). Human Error. Cambridge University Press. ISBN 0-521-31419-4.

Roush, Wade Edmund (1994). Catastrophe and Control: How Technological Disasters Enhance Democracy (PhD dissertation). Massachusetts Institute of Technology. p. 15.

Sagan, Scott D. (1993). The Limits of Safety: Organizations, Accidents, and Nuclear Weapons. Princeton University Press. ISBN 0-691-02101-5.

"Test shows oxygen canisters sparking intense fire". CNN. November 19, 1996.

Sources

Perrow, Charles (1984). Normal Accidents: Living with High-Risk Technologies. ISBN 0-465-05143-X.

Perrow, Charles (1999). Normal Accidents: Living with High-Risk Technologies, with a New Afterword and a Postscript on the Y2K Problem. Princeton University Press. ISBN 0-691-00412-9.

Further reading

Cooper, Alan (2004). The Inmates Are Running the Asylum: Why High Tech Products Drive Us Crazy and How to Restore the Sanity. Indianapolis: Sams; Pearson Education. ISBN 0-672-31649-8.

Gross, Michael Joseph (May 29, 2015). "Life and Death at Cirque du Soleil". Vanity Fair.

La Porte, Todd; Roberts, Karlene; Rochlin, Gene (1997). Beyond Engineering: A New Way of Thinking About Technology. Oxford University Press. This book provides counter-examples of complex systems which have good safety records.

Pidgeon, Nick (September 22, 2011). "In retrospect: Normal accidents". Nature.

Wallace, Brendan (March 5, 2009). Beyond Human Error. Florida: CRC Press. ISBN 978-0-8493-2718-6.
