
Talk:Preparedness paradox


I'm not sure I see how this is an example of the preparedness paradox, as this article defines it. There might be something to be said for Three Mile Island's "relatively small release of radiation" leading later (pre-Chernobyl!) regulators to mistakenly believe that nuclear meltdowns weren't that big a deal, but this isn't being said in the source or the article. --
There are a few relatively clear examples. For instance, the Y2K bug not being as bad as advertised is likely to have been an instance of this paradox. But like its article says, not everyone agrees that all the preparation for it was really what made it not a big deal. And I think that's going to be a pretty big theme across examples: because the key feature of the paradox is something not happening, it's hard to be sure if it didn't happen because of the preparation or if it just was never going to happen in the first place.
Even though the Onagawa Nuclear Power Plant was closer to the epicenter of the earthquake, it withstood the cataclysm because of the preparations made by the facility's owners. On the other hand, the Fukushima Daiichi Nuclear Power Plant suffered heavy damage because of a lack of preparation due to the perception of less risk.
Feedback taken is that Chernobyl is not an example; the levee example is poorly stated; the third may be the case for Onagawa given more information (which is not present in the article right now; its preparation is praised as successful), but is not the case for Fukushima. Some of this had already been removed from the article during the RfC.
It is clear that the two major nuclear accidents before Fukushima—Chernobyl in 1986 and Three Mile Island in 1979 (which involved extensive damage to nuclear fuel but a relatively small release of radiation)—were preventable ... There is a growing body of evidence that suggests the accident was the result of failures in regulation and nuclear plant design and that both were lagging behind international best practices and standards. Had these been heeded and applied, the risks to the Fukushima Daiichi Nuclear Power Station would likely have been recognized and effective steps to prevent a major accident could have been taken. In this sense, we believe the Fukushima accident—like its predecessors—was preventable.
— Acton, James M.; Hibbs, Mark (2012-03-06). "Why Fukushima Was Preventable". Carnegie Endowment for International Peace.
I agree, this does seem to be describing (and to be a good example of!) risk compensation rather than the preparedness paradox, at least the way that this article defines that paradox. The "Background" section looks similarly off-topic, drawn from sources that happen to use the phrase "preparedness paradox", but don't seem to be actually talking about the subject of the article, as it is set out here. --
I have to agree with Loki's third point and want to bring up another — that because this idea is inherently non-defining to the potential examples we could include, there will always be an element of personal opinion as to whether a particular event qualifies as an example of this paradox or its consequences. It might be worth more to look at examples of things that have not happened rather than incidents that did occur. For example, there have been plenty of nuclear reactor accidents, but the number of functional reactors that were built in a routine manner and decommissioned just as quietly, without anything of particular interest happening along the way, vastly outweighs the number that have gone catastrophically wrong. We would need a reliable, opinionated source saying that this could create the illusion that the risks of a nuclear meltdown are not as severe as they truly are, for the very reason that those risks have so seldom been made a reality (if, that is, I am understanding the topic of this article correctly). If we're going to cite specific incidents like the above, we would need to say, "XYZ of ABC Publication argues that DEF incident qualifies as an example of the paradox in some way," because, as I said, that's really all they're ever going to be — examples in the opinion of so-and-so, impossible to treat as defining to the nature of the topic in question.
See what I said in the RFC above. In the levee paradox, preparation makes flooding appear to not be a concern, when actually flooding is being stopped by the preparation. That's classic preparedness paradox. This reaction then itself makes future disasters worse, but that's a consequence of the paradox, not the paradox itself.
I'm unable to find any sources talking about Chernobyl as an example of the preparedness paradox, and don't personally see how it applies: if the operators of Fukushima recalled the Chernobyl disaster but failed to prepare due to budgetary constraints, that doesn't sound like an example of the preparedness paradox, and isn't a good reason to draw the reader in by talking about how Chernobyl could have been (but wasn't) prepared for. So my fix on current sourcing and understanding would be to remove the mentions of Chernobyl. --
I've also removed the photo of Chernobyl (which is not mentioned anywhere in the article text, and whose caption doesn't have anything to say about the preparedness paradox) and added one of the Fukushima Daiichi plant (which is mentioned in the text).
Chernobyl would only be an example of the paradox if it had been a relatively minor accident, prevented by adequate preparation, and which caused people to question the need for the preparation. But Chernobyl was the worst nuclear accident in history.
I agree with WP Ludicer that the example is clear and works, but it just isn't an example of the preparedness paradox: to "make this work" it should be moved from this article to the risk compensation article. Ludicer's first comment in this thread is convincing, and your argument for inclusion ("The underestimation of the risk may be because they thought they were sufficiently prepared.") doesn't seem to match the given definition of the paradox.
Read the lead. Then read the paragraph I deleted. The latter does not follow logically from the former. If someone wants to expand the lead to explain that the paradox has multiple meanings, by all means – go ahead. But as it stands, the levee stuff does not belong here. Period. If you are really determined to add it back, please explain how preparedness making the disaster worse accords with this definition: "The preparedness paradox is the proposition that if a society or individual acts effectively to mitigate a potential disaster such as a pandemic, natural disaster or other catastrophe so that it causes less harm, the avoided danger will be perceived as having been much less serious because of the limited damage actually caused."
If you want to swap the photo at the top with another example, go for it. If my language is imprecise, go ahead and fix it. The point is, given the disaster at Chernobyl, people should have been prepared. They didn't prepare because of the cost.
Please respond to the open threads at Talk:Preparedness paradox saying why the Chernobyl and levee content belongs in that article. Creating a new article and asking other editors who have concerns to "go ahead and fix it", so long as they do not fully remove any of the content, is not a constructive way to write articles. As the sole editor arguing for inclusion, the onus is on you to see that a consensus is reached on whether verifiably true statements such as "damages attributed to the Chernobyl nuclear power plant disaster could have been easily reduced if the reactor and sarcophagus had been re-ensured with additional concrete and steel bars prior to the accident" or "construction leads to a reduced awareness of and preparation for floods or breaches" are examples of the preparedness paradox. --
No, it's not appropriate to inform other editors that they will only be able to change, but not remove, the paragraphs and images you have added. I don't have a better lead image in mind, but that's not a reason to keep the current one.
563:"Knowledge gained from the tragedy at Chernobyl is helping other people and communities around the world to protect themselves and to recover from trauma, including during the Fukushima Daiichi nuclear emergency of 2011." 833:...damages attributed to the Chernobyl nuclear power plant disaster could have been easily reduced if the reactor and sarcophagus had been re-ensured with additional concrete and steel bars prior to the accident. 450:
The underestimation of the risk may be because they thought they were sufficiently prepared. It also helps set up Fukushima - where they should have prepared - but didn't because they thought they had prepared.
Levees are structures which run parallel to rivers and are meant to offer protection from flooding. Paradoxically, their construction leads to a reduced awareness of and preparation for floods or breaches.
The "levee paradox" seems to have absolutely nothing to do with the preparedness paradox as described in the article. It instead is an example of risk compensation. The hypothetical scenario that would fall under the "preparedness paradox" would be if, after the levees were constructed at great expense, and the feared scenario never materialized (the levees were effective in reducing damage), citizens or officials were then to declare the expense was unnecessary, as the predicted disaster did not occur, and would be less likely to support measures in the future due to a perceived lack of danger – we shouldn't take warnings seriously because the risk is overstated by scientists or public officials. Or, along the same vein, if the damage was significant, but was still mitigated thanks to levees, but the levees were dismissed as unnecessary because they didn't prevent all major damage, and would be less likely to support preparedness measures in the future due to the perceived lack of benefit – we shouldn't take warnings seriously because the measures proposed will provide no protection anyway.
New section for this: Evrik has added the text that Chernobyl "should have been a warning" to operators at Fukushima, but that isn't what Ban Ki-moon is quoted as saying ("Knowledge gained from the tragedy at Chernobyl is helping other people and communities around the world to protect themselves and to recover from trauma, including during the Fukushima Daiichi nuclear emergency of 2011.") and neither Evrik's interpretation nor Ban's statement seems to be an example of the preparedness paradox. Did they mean to add a different source?
I think it will be hard coming up with clear examples for this article, because the fundamental property of the topic of the article is something not happening.
Since the sarcophagus was built after the accident it doesn't make sense to have reinforced it before the accident, and thus before it was built and needed.
Fixing it does not mean removal of the content. I think that it is appropriate. The levee section gives content. Want a new lead image? Okay, what? --
As for the levee paradox, if the example is not clear, please edit it to make it clearer. The citation for the levee section is actually:
I think that the example fits, but do encourage you both to edit the page to make this work, or show how these aren't really examples. --
Maybe I'll look into that some time, but this section is about the use of the Chernobyl lead image and Chernobyl/Fukushima example. --
"Preparedness paradox" seems to have been used occasionally, but with different meanings. I've added a 2014 use in the context of CBRN terrorism, and a 2017 use re aerospace and defence. Is there a source for the definition in the introduction? Has the term been used before 2014? Best wishes,
(I posted the above to Evrik's talk page, they decided to delete it unarchived from there and post it here instead.)
https://knowledge.aidr.org.au/resources/ajem-jul-2018-flood-levee-influences-on-community-preparedness-a-paradox/
3. Maybe. The example of the Onagawa Plant could be one, given more information, but the example of Fukushima is not.
772: 106: 88: 238: 463:
paradox", but don't seem to be actually talking about the subject of the article, as it is set out here. --
1. No. Any example of a disaster being worse because it was not prepared for does not go in this article.
Thanks. A Google search finds nothing pre-2014; I hadn't checked the article carefully... Best wishes,
Are the following three situations examples of the preparedness paradox, as the article defines it?
246: 153: 509:
Checking the text while wondering whether Fukushima would be a good lead image, I'm not sure that it's an example of the preparedness paradox either. That it "suffered heavy damage because of a lack of preparation due to the perception of less risk" depends on where that perception came from: was it from a power station in a similar location which survived (through preparation) a similar disaster, creating the illusion that its preparation wasn't necessary? Or did they measure the risk in isolation and decide it was small enough to not invest in the necessary preparation?
It doesn't need to be edited to be made clear; it's clear. It needs to be excised because it is not an example of the preparedness paradox as defined in this article. It would, however, fit with risk compensation, like I said. The only connection to "preparedness paradox" in the cited article is the use of the words "preparedness" and "paradox". I honestly don't know how to be any clearer on that. The Chernobyl example is also completely irrelevant, as discussed below. Not wanting to pay for additional safety measures because of a perceived lack of risk is simply underestimating the risk. There's no paradox.
1028: 1011:
The idea of a "levee paradox" is certainly a related phenomenon, but it's not the same one.
941: 882: 325: 311: 285: 225: 713:
is on you to see that a consensus is reached on whether verifiably true statements such as
1012: 437: 360: 515:
suffered heavy damage because of a lack of preparation due to the perception of less risk
Evrik's "where they should have prepared - but didn't because they thought they had prepared" comment about Fukushima above sounds again like risk compensation rather than the preparedness paradox. --
On a partially-related note: I think the "levee paradox" mentioned in the article is an example of the paradox, it's just poorly stated in the article as written. A levee (preparation) makes flooding appear to not be a concern (nothing happening), when actually flooding is being stopped by the levee (the paradox). This then causes people to not properly prepare for floods, causing worse disasters, but those are a consequence of the paradox, not the paradox itself.
... and the interview of Juliette Kayyem defines the term as in the introduction.
does not mention levees or rivers in any way, I have removed it as an example.
375: 296: 140: 119: 237:] The anchor (#Pandemic preparedness) is no longer available because it was 987:
The preparedness paradox is the proposition that if a society or individual
31: 928: 233:
This article links to one or more target anchors that no longer exist.
374:
Maybe the language could be refined. I encourage you to upgrade it. --
295:
If you read into the article, you'll see that it was used in 1949. --
705:
saying why the Chernobyl and levee content belongs in that article.
648:
big deal, but this isn't being said in the source or the article. --
50:... that if a disaster is avoided through planning and vigilance, 624:
I just swapped out the Ban-ki Moon citation above with this one:
277: 338:
Levee paradox – doesn't follow the definition given in the lead
207: 82: 15: 875:
prepared for. It is not about a disaster being worse at all.
70: 966:
Removing levee content again because it is not an example
871:. It is also not about a disaster being worse because it 432:
Somehow, the text about Chernobyl got stripped out. I added it back in.
606:preparedness paradox, and isn't a good reason to 525:comment about Fukushima above sounds again like 166:Knowledge (XXG):WikiProject Disaster management 813:been removed from the article during the RfC. 630:. Carnegie Endowment for International Peace. 979:explain how preparedness making the disaster 807:A summary of the conclusions reached follows. 430:is not an example of the preparedness paradox 264:Did you know nominations/Preparedness paradox 8: 721:are examples of the preparedness paradox. -- 1062:Low-importance Disaster management articles 626:Acton, James M.; Hibbs, Mark (2012-03-06). 86: 114: 61:Knowledge (XXG):Recent additions/2022/May 44:). The text of the entry was as follows: 1057:Start-Class Disaster management articles 529:rather than the preparedness paradox. -- 169:Template:WikiProject Disaster management 116: 71: 842: 837: 832: 718: 714: 701:Please respond to the open threads at 522: 514: 1052:Knowledge (XXG) Did you know articles 59:A record of the entry may be seen at 7: 146:This article is within the scope of 105:It is of interest to the following 56:that the preparation was necessary? 342:The "levee paradox" seems to have 14: 1027:paradox, not the paradox itself. 999:will be perceived as having been 957:The discussion above is closed. 211: 139: 118: 87: 19: 628:"Why Fukushima Was Preventable" 186:This article has been rated as 149:WikiProject Disaster management 53:people will paradoxically doubt 983:accords with this definition: 823:15:42, 14 September 2022 (UTC) 389:source cited for the paragraph 30:appeared on Knowledge (XXG)'s 1: 160:and see a list of open tasks. 989:acts effectively to mitigate 844:the perception of less risk. 697:Preparedness paradox reverts 662:Then go ahead and add it. -- 172:Disaster management articles 1021:20:29, 4 January 2023 (UTC) 867:2. No. This article is not 781:14:04, 15 August 2022 (UTC) 1078: 518:the necessary preparation? 192:project's importance scale 950:17:39, 27 July 2022 (UTC) 919:16:49, 27 July 2022 (UTC) 891:14:50, 27 July 2022 (UTC) 859:09:43, 27 July 2022 (UTC) 765:18:18, 19 July 2022 (UTC) 741:18:04, 19 July 2022 (UTC) 731:17:47, 19 July 2022 (UTC) 703:Talk:Preparedness paradox 681:20:14, 23 June 2022 (UTC) 667:20:03, 23 June 2022 (UTC) 658:11:14, 17 June 2022 (UTC) 643:19:37, 13 June 2022 (UTC) 620:17:15, 13 June 2022 (UTC) 597:16:42, 13 June 2022 (UTC) 579:16:31, 13 June 2022 (UTC) 539:18:24, 19 July 2022 (UTC) 505:20:42, 23 June 2022 (UTC) 482:20:02, 23 June 2022 (UTC) 473:16:16, 23 June 2022 (UTC) 456:20:02, 23 June 2022 (UTC) 446:14:02, 23 June 2022 (UTC) 424:16:14, 13 June 2022 (UTC) 405:13:40, 13 June 2022 (UTC) 185: 134: 113: 1037:23:35, 16 May 2023 (UTC) 959:Please do not modify it. 801:Please do not modify it. 379:16:16, 26 May 2022 (UTC) 369:03:45, 26 May 2022 (UTC) 330:18:15, 3 May 2022 (UTC) 316:17:55, 3 May 2022 (UTC) 300:17:37, 3 May 2022 (UTC) 290:17:32, 3 May 2022 (UTC) 256:Did you know nomination 95:This article is rated 76: 40:column on 3 May 2022 ( 99:on Knowledge (XXG)'s 74: 973:does not belong here 773:Julian Silden Langlo 28:Preparedness paradox 993:it causes less harm 796:request for comment 410:i added it back in. 272:Different meanings? 163:Disaster management 154:Disaster management 126:Disaster management 608:draw the reader in 344:absolutely nothing 101:content assessment 77: 1001:much less serious 917: 914:(Follow my trail) 751: 527:risk compensation 492:risk compensation 434:risk compensation 348:risk compensation 253: 252: 239:deleted by a user 228:in most browsers. 
206: 205: 202: 201: 198: 197: 81: 80: 1069: 1007:actually caused. 911: 909: 903: 803: 749: 635: 590: 268: 262: 247:Reporting errors 215: 214: 208: 174: 173: 170: 167: 164: 143: 136: 135: 130: 122: 115: 98: 92: 91: 83: 73: 23: 16: 1077: 1076: 1072: 1071: 1070: 1068: 1067: 1066: 1042: 1041: 1003:because of the 968: 963: 962: 916: 907: 899: 825: 799: 789: 787:RfC on examples 699: 625: 584: 559: 387:Given that the 340: 274: 266: 260: 258: 249: 231: 230: 229: 212: 171: 168: 165: 162: 161: 128: 96: 75:Knowledge (XXG) 12: 11: 5: 1075: 1073: 1065: 1064: 1059: 1054: 1044: 1043: 1040: 1039: 1009: 1008: 1005:limited damage 997:avoided danger 967: 964: 956: 955: 954: 953: 952: 933: 912: 894: 893: 879: 876: 865: 847: 846: 840: 835: 826: 811: 810: 809: 790: 788: 785: 784: 783: 769: 768: 767: 752: 744: 743: 698: 695: 694: 693: 692: 691: 690: 689: 688: 687: 686: 685: 684: 683: 636: 600: 599: 558: 555: 554: 553: 552: 551: 550: 549: 548: 547: 546: 545: 544: 543: 542: 541: 519: 507: 485: 484: 459: 458: 412: 411: 392: 382: 381: 339: 336: 335: 334: 333: 332: 303: 302: 273: 270: 257: 254: 251: 250: 244: 243: 242: 226:case-sensitive 220: 219: 218: 216: 204: 203: 200: 199: 196: 195: 188:Low-importance 184: 178: 177: 175: 158:the discussion 144: 132: 131: 129:Low‑importance 123: 111: 110: 104: 93: 79: 78: 68: 58: 57: 24: 13: 10: 9: 6: 4: 3: 2: 1074: 1063: 1060: 1058: 1055: 1053: 1050: 1049: 1047: 1038: 1034: 1030: 1025: 1024: 1023: 1022: 1018: 1014: 1006: 1002: 998: 994: 990: 986: 985: 984: 982: 978: 974: 965: 960: 951: 947: 943: 938: 934: 930: 926: 922: 921: 920: 915: 910: 902: 896: 895: 892: 888: 884: 880: 877: 874: 870: 866: 863: 862: 861: 860: 856: 852: 845: 841: 839: 836: 834: 831: 830: 829: 824: 820: 816: 808: 805: 802: 797: 792: 791: 786: 782: 778: 774: 770: 766: 762: 758: 753: 748: 747: 746: 745: 742: 739: 735: 734: 733: 732: 728: 724: 720: 716: 712: 706: 704: 696: 682: 678: 674: 670: 669: 668: 665: 661: 660: 659: 655: 651: 646: 645: 644: 641: 637: 634: 629: 623: 622: 621: 617: 613: 609: 604: 603: 602: 601: 598: 595: 588: 583: 582: 581: 580: 576: 572: 566: 564: 556: 540: 536: 532: 528: 524: 520: 516: 512: 508: 506: 502: 498: 493: 489: 488: 487: 486: 483: 480: 476: 475: 474: 470: 466: 461: 460: 457: 454: 449: 448: 447: 443: 439: 435: 431: 427: 426: 425: 422: 418: 414: 413: 408: 407: 406: 402: 398: 393: 390: 386: 385: 384: 383: 380: 377: 373: 372: 371: 370: 366: 362: 358: 353: 349: 345: 331: 327: 323: 319: 318: 317: 313: 309: 305: 304: 301: 298: 294: 293: 292: 291: 287: 283: 279: 271: 269: 265: 255: 248: 240: 236: 235: 234: 227: 223: 217: 210: 209: 193: 189: 183: 180: 179: 176: 159: 155: 151: 150: 145: 142: 138: 137: 133: 127: 124: 121: 117: 112: 108: 102: 94: 90: 85: 84: 69: 66: 62: 55: 54: 49: 46: 45: 43: 39: 38: 33: 29: 25: 22: 18: 17: 1010: 1004: 1000: 996: 992: 988: 980: 976: 972: 969: 958: 936: 924: 905: 900: 872: 869:moral hazard 851:Lord Belbury 848: 827: 815:Lord Belbury 806: 800: 793: 757:Lord Belbury 723:Lord Belbury 707: 702: 700: 673:Lord Belbury 650:Lord Belbury 631: 612:Lord Belbury 587:Lord Belbury 571:Lord Belbury 567: 562: 560: 531:Lord Belbury 510: 497:Lord Belbury 465:Lord Belbury 429: 397:Lord Belbury 356: 351: 343: 341: 275: 259: 232: 224:Anchors are 221: 187: 147: 107:WikiProjects 51: 48:Did you know 47: 37:Did you know 35: 27: 26:A fact from 97:Start-class 42:check views 1046:Categories 1013:WP Ludicer 438:WP Ludicer 361:WP Ludicer 908:Horrorist 557:Chernobyl 32:Main Page 521:Evrik's 929:Y2K bug 711:WP:ONUS 357:benefit 352:because 241:before. 
190:on the 34:in the 995:, the 977:please 923:There 904:, the 511:that's 322:Pol098 308:Pol098 282:Pol098 103:scale. 981:worse 738:evrik 664:evrik 640:evrik 594:evrik 479:evrik 453:evrik 421:evrik 376:evrik 297:evrik 1033:talk 1029:Loki 1017:talk 946:talk 942:Loki 906:Mad 901:Zeke 887:talk 883:Loki 855:talk 819:talk 777:talk 761:talk 727:talk 677:talk 654:talk 616:talk 575:talk 535:talk 501:talk 469:talk 442:talk 401:talk 365:talk 326:talk 312:talk 286:talk 278:CBRN 222:Tip: 925:are 873:was 849:-- 717:or 182:Low 1048:: 1035:) 1019:) 948:) 937:is 889:) 857:) 821:) 798:. 779:) 763:) 755:-- 729:) 679:) 656:) 638:-- 618:) 592:-- 577:) 569:-- 537:) 503:) 495:-- 471:) 451:-- 444:) 419:-- 403:) 395:-- 367:) 328:) 314:) 288:) 267:}} 261:{{ 1031:( 1015:( 944:( 885:( 853:( 817:( 775:( 759:( 725:( 675:( 652:( 614:( 589:: 585:@ 573:( 533:( 499:( 467:( 440:( 399:( 363:( 324:( 310:( 284:( 194:. 109:: 67:.

Index


Main Page
Did you know
check views
people will paradoxically doubt
Knowledge (XXG):Recent additions/2022/May
Template:Did you know nominations/Preparedness paradox

content assessment
WikiProjects
WikiProject icon
Disaster management
WikiProject icon
WikiProject Disaster management
Disaster management
the discussion
Low
project's importance scale
case-sensitive
deleted by a user
Reporting errors
Did you know nominations/Preparedness paradox
CBRN
Pol098
talk
17:32, 3 May 2022 (UTC)
evrik
17:37, 3 May 2022 (UTC)
Pol098
talk

Text is available under the Creative Commons Attribution-ShareAlike License. Additional terms may apply.

↑