Why the Future Doesn't Need Us

"Why the Future Doesn't Need Us" is an article written by Bill Joy (then Chief Scientist at Sun Microsystems) in the April 2000 issue of Wired magazine. In the article, he argues that "Our most powerful 21st-century technologies—robotics, genetic engineering, and nanotech—are threatening to make humans an endangered species." Joy warns:

The experiences of the atomic scientists clearly show the need to take personal responsibility, the danger that things will move too fast, and the way in which a process can take on a life of its own. We can, as they did, create insurmountable problems in almost no time flat. We must do more thinking up front if we are not to be similarly surprised and shocked by the consequences of our inventions.

While some critics have characterized Joy's stance as obscurantism or neo-Luddism, others share his concerns about the consequences of rapidly expanding technology.

Summary

Joy argues that developing technologies pose a much greater danger to humanity than any technology before them. In particular, he focuses on genetic engineering, nanotechnology, and robotics. He argues that 20th-century technologies of destruction, such as the nuclear bomb, were limited to large governments due to the complexity and cost of such devices, as well as the difficulty in acquiring the required materials. He uses the novel The White Plague as a potential nightmare scenario, in which a mad scientist creates a virus capable of wiping out humanity.

Joy also voices concerns about increasing computer power. His worry is that computers will eventually become more intelligent than we are, leading to such dystopian scenarios as robot rebellion. He quotes Ted Kaczynski (the Unabomber) on this point.

Joy expresses concern that eventually the rich will be the only ones with the power to control the future robots that are built, and that these people could also decide to take life into their own hands and control how humans continue to populate and reproduce. He did further research into robotics and sought out the opinions of people who specialize in the field. Rodney Brooks, a specialist in robotics, believes that in the future there will be a merging of humans and robots. Joy also mentions Hans Moravec's book Robot: Mere Machine to Transcendent Mind, in which Moravec predicts a future shift in which robots take over normal human activities, but argues that with time humans will become comfortable living that way.

Criticism

In The Singularity Is Near, Ray Kurzweil questioned the regulation of potentially dangerous technology, asking "Should we tell the millions of people afflicted with cancer and other devastating conditions that we are canceling the development of all bioengineered treatments because there is a risk that these same technologies may someday be used for malevolent purposes?" However, John Zerzan and Chellis Glendinning believe that modern technologies are bad for both freedom and the problem of cancer, and that the two issues are connected.

In the AAAS Science and Technology Policy Yearbook 2001 article "A Response to Bill Joy and the Doom-and-Gloom Technofuturists", John Seely Brown and Paul Duguid criticized Joy for having technological tunnel vision in his prediction by failing to consider social factors.

John McGinnis argues that Joy's proposal for "relinquishment" of technologies that might lead to artificial general intelligence (AGI) would fail because "prohibitions, at least under current technology and current geopolitics, are certain to be ineffective". Verification of AGI-limitation agreements would be difficult due to AGI's dual-use nature and ease of being hidden. Similarly, he feels that Joy's "Hippocratic oath" proposal of voluntary abstention by scientists from harmful research would not be effective either, because scientists might be pressured by governments, tempted by profits, uncertain which technologies would lead to harm down the road, or opposed to Joy's premise in the first place. Rather than relinquishment of AGI, McGinnis argues for a kind of differential technological development in which friendly artificial intelligence is advanced faster than other kinds.

Extropian futurist Max More shares Kurzweil's view of the impractical and ineffective nature of "technological relinquishment", but adds a larger moral and philosophical component to the argument, contending that the perfection and evolution of humanity is not "losing our humanity" and that voluntarily sought increased capacity in any domain does not even represent "a loss" of any kind.

Aftermath

After the publication of the article, Bill Joy suggested assessing technologies to gauge their implicit dangers, as well as having scientists refuse to work on technologies that have the potential to cause harm.

In Zac Goldsmith's article about his interview with Bill Joy, Joy is quoted as saying that some concerns about newly developing technologies are actually more dangerous than he expressed in the article, because, Goldsmith claims, the developers of these machines are giving them too much power. Goldsmith also states his belief that scientists do not think through much of what can go wrong when they start making inventions, because doing so would lead to less funding.

In Sophie Tysom's review of Bill Joy's article, she says Joy should not be single-minded about newer technology and should recognize that a "compromise" could be reached between him and those new technologies. She agrees that he has a point in worrying about what will happen in the long run, but does not think these technologies will try to control us in the future. Joy responded that he was glad people were starting to respond to his article, because it gave them a way to offer input on the subject.

In the 15th Anniversary issue of Wired in 2008, Lucas Graves's article reported that the genetics, nanotechnology, and robotics technologies had not reached the level that would make Bill Joy's scenario come true.

Noted conspiracy theorist Alex Jones cited the article during a discussion on the implications of transhumanism with podcasters Joe Rogan and Tim Dillon on the October 27, 2020 episode of The Joe Rogan Experience.

References

Joy, Bill (1 April 2000). "Why the Future Doesn't Need Us". Wired.
Khushf, George (2004). "The Ethics of Nanotechnology: Vision and Values for a New Generation of Science and Engineering", Emerging Technologies and Ethical Issues in Engineering, National Academy of Engineering, pp. 31–32. Washington, DC: The National Academies Press.
Messerly, John (17 February 2016). "Critique of Bill Joy's 'Why the future doesn't need us'". Reason and Meaning.
Moravec, Hans (1998). Robot: Mere Machine to Transcendent Mind. Oxford University Press.
Zerzan, John (31 October 2002). "What Ails Us?". Green Anarchy (10). Federated Anarchy Inc.
Cohen, Mark Nathan (1991). Health and the Rise of Civilization. Yale University Press; excerpt at Primitivism.com.
"Age of Grief". Primitivism.com.
Brown, John Seely; Duguid, Paul (2001). "A Response to Bill Joy and the Doom-and-Gloom Technofuturists". Science and Technology Policy Yearbook. American Association for the Advancement of Science.
McGinnis, John O. (Summer 2010). "Accelerating AI". Northwestern University Law Review 104 (3): 1253–1270.
More, Max (May 7, 2000). "Embrace, Don't Relinquish the Future". Extropy.
Goldsmith, Zac (October 2000). "Discomfort and Joy: Bill Joy Interview". Ecologist 30.
Tysom, Sophie (January 2001). "Technological Utopias or Dystopias: Is There a Third Way?". 20 (1): 15–16.
Joy, Bill (15 September 2000). "The dark side of technology". Vital Speeches of the Day 66 (23): 706–709.
Graves, Lucas (24 March 2008). "15th Anniversary: Why the Future Still Needs Us a While Longer". Wired.
"Joe Rogan Experience #1555 - Alex Jones & Tim Dillon". YouTube. Archived from the original on 2021-12-11.

Further reading

Messerly, John G. "I'm glad the future doesn't need us: a critique of Joy's pessimistic futurism". ACM SIGCAS Computers and Society 33 (2), June 2003.

External links

"Why the future doesn't need us", Wired, April 2000
Rants & Raves: "Why the Future Doesn't Need Us"
The Center for the Study of Technology and Society: Special Focus on Bill Joy's Hi-Tech Warning
Bill Joy Hopes Reason Prevails
Bill Joy – Nanotech, and Genetics, and Robots, Oh My!

