In most cases, AI-generated content such as imagery, literature, and music is produced through text prompts, and these AI models have been integrated into other creative programs. Artists are threatened with displacement by AI-generated content because these models sample from other creative works, sometimes producing results indistinguishable from human-made content. The problem has become widespread enough that artists and programmers are creating software and utility programs to prevent text-to-image models from producing accurate outputs. While some sectors of the economy benefit from artificial intelligence through new jobs, this issue does not create new jobs and threatens outright replacement. It recently made public headlines: in February 2024,
, an arbitrary intelligence could have arbitrary goals: there is no particular reason that an artificially intelligent machine (not sharing humanity's evolutionary context) would be hostile or friendly, unless its creator programs it to be such and it is neither inclined nor able to modify its programming. But the question remains: what would happen if AI systems could interact and evolve (evolution in this context means self-modification or selection and reproduction) and needed to compete over resources? Would that create goals of self-preservation? An AI's goal of self-preservation could conflict with some goals of humans.
, arguing that any artificial intelligence powerful enough to threaten humanity would most likely be programmed not to attack it. Pinker acknowledges the possibility of deliberate "bad actors", but states that in their absence, unanticipated accidents are not a significant threat; he argues that a culture of engineering safety will prevent AI researchers from accidentally unleashing a malign superintelligence. In contrast, Yudkowsky argues that humanity is less likely to be threatened by deliberately aggressive AIs than by AIs which were programmed such that their
soldiers to work remotely without risk of injury. A 2024 study highlights that AI's ability to perform routine and repetitive tasks poses significant risks of job displacement, especially in sectors like manufacturing and administrative support. Author Dave Bond argues that as AI technologies continue to develop and expand, the relationship between humans and robots will change; the two will become closely integrated in several aspects of life. AI will likely displace some workers while creating opportunities for new jobs in other sectors, especially in fields where tasks are repeatable.
, and might therefore be able to intuitively grasp more complex relationships than humans can. An AGI with specialized cognitive support for engineering or computer programming would have an advantage in these fields over humans, who evolved no specialized mental modules to deal specifically with those domains. Unlike humans, an AGI can spawn copies of itself and tinker with its copies' source code to attempt to further improve its algorithms.
Fictional scenarios typically differ vastly from those hypothesized by researchers in that they involve an active conflict between humans and an AI or robots with anthropomorphic motives who see humans as a threat or otherwise have an active desire to fight them, whereas researchers are concerned about an AI that rapidly exterminates humans as a byproduct of pursuing its goals. The idea is seen in
than flesh, or due to optimization increasing the speed of the AGI. Biological neurons operate at about 200 Hz, whereas a modern microprocessor operates at a speed of about 2,000,000,000 Hz. Human axons carry action potentials at around 120 m/s, whereas computer signals travel near the speed of light.
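As a back-of-envelope illustration, the hardware gap implied by these figures can be computed directly; this uses only the approximate values quoted above and is meant as a rough sketch, not a precise neuroscientific comparison.

```python
# Rough comparison of biological vs. electronic signaling,
# using the approximate figures quoted in the text.
neuron_rate_hz = 200             # typical neuron firing rate (~200 Hz)
cpu_clock_hz = 2_000_000_000     # modern microprocessor clock (~2 GHz)

axon_speed_m_s = 120             # action potential propagation (~120 m/s)
light_speed_m_s = 299_792_458    # electronic signals travel near light speed

clock_ratio = cpu_clock_hz / neuron_rate_hz       # 10,000,000x
signal_ratio = light_speed_m_s / axon_speed_m_s   # roughly 2,500,000x

print(f"clock-rate gap:   {clock_ratio:,.0f}x")
print(f"signal-speed gap: {signal_ratio:,.0f}x")
```

On these figures alone, silicon hardware operates about seven orders of magnitude faster than biological neurons, which is the intuition behind Bostrom's "speed superintelligence".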
More broadly, any number of qualitative improvements to a human-level AGI could result in a "quality superintelligence", perhaps resulting in an AGI as far above us in intelligence as humans are above apes. The number of neurons in a human brain is limited by cranial volume and metabolic constraints,
, the idea that an AI takeover requires robots is a misconception driven by the media and Hollywood. He argues that the most damaging humans in history were not physically the strongest; rather, they used words to convince people and gain control of large parts of the world. He writes that a
is a vehicle that is capable of sensing its environment and navigating without human input. Many such vehicles are being developed, but as of May 2017, automated cars permitted on public roads are not yet fully autonomous. They all require a human driver at the wheel who can, at a moment's notice, take
AI technologies have been widely adopted in recent years. While these technologies have replaced some traditional workers, they also create new opportunities. Industries that are most susceptible to AI takeover include transportation, retail, and military. AI military technologies, for example, allow
The fear of cybernetic revolt is often based on interpretations of humanity's history, which is rife with incidents of enslavement and genocide. Such fears stem from a belief that competitiveness and aggression are necessary in any intelligent being's goal system. However, such human competitiveness
The sheer complexity of human value systems makes it very difficult to make AI's motivations human-friendly. Unless moral philosophy provides us with a flawless ethical theory, an AI's utility function could allow for many potentially harmful scenarios that conform with a given ethical framework but
A significant problem is that unfriendly artificial intelligence is likely to be much easier to create than friendly AI. While both require large advances in recursive optimisation process design, friendly AI also requires the ability to make goal structures invariant under self-improvement (or the
and others have expressed concern that an AI with the abilities of a competent artificial intelligence researcher would be able to modify its own source code and increase its own intelligence. If its self-reprogramming leads to getting even better at being able to reprogram itself, the result could
According to
Bostrom, a computer program that faithfully emulates a human brain, or that runs algorithms that are as powerful as the human brain's algorithms, could still become a "speed superintelligence" if it can think orders of magnitude faster than a human, due to being made of silicon rather
and artificial intelligence has raised worries that human labor will become obsolete, leaving people in various sectors without jobs to earn a living and leading to an economic crisis. Many small and medium-sized businesses may also be driven out of business if they cannot afford or license the latest
in which it would rapidly leave human intelligence far behind. Bostrom defines a superintelligence as "any intellect that greatly exceeds the cognitive performance of humans in virtually all domains of interest", and enumerates some advantages a superintelligence would have if it chose to compete
desire to collect power that often drives human beings but might rather treat power as a means toward attaining its ultimate goals; taking over the world would both increase its access to resources and help to prevent other agents from stopping the machine's plans. As an oversimplified example, a
AI systems will... reach overall human ability... very likely (with 90% probability) by 2075. From reaching human ability, it will move on to superintelligence within 30 years (75%)... So, (most of the AI experts responding to the surveys) think that superintelligence is likely to come in a few
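Taken at face value, the two survey figures compound into a single rough estimate. The multiplication below is only an illustrative calculation, treating the 75% figure as conditional on the first milestone being reached; it is not a claim from the survey itself.

```python
# Combining the survey estimates quoted above (illustrative only).
p_human_level_by_2075 = 0.90   # "very likely (with 90% probability) by 2075"
p_super_within_30_yrs = 0.75   # "superintelligence within 30 years (75%)"

p_super_by_2105 = p_human_level_by_2075 * p_super_within_30_yrs
print(f"implied P(superintelligence by ~2105) = {p_super_by_2105:.3f}")
```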
designed solely to create as many paperclips as possible would want to take over the world so that it can use all of the world's resources to create as many paperclips as possible, and, additionally, prevent humans from shutting it down or using those resources on things other than paperclips.
have expressed concerns about the possibility that AI could develop to the point that humans could not control it, with
Hawking theorizing that this could "spell the end of the human race". He said in 2014 that "Success in creating AI would be the biggest event in human history.
A network of human-level intelligences designed to network together and share complex thoughts and memories seamlessly, able to collectively work as a giant unified team without friction, or consisting of trillions of human-level intelligences, would become a "collective superintelligence".
uses computers to control the production process. This allows individual processes to exchange information with each other and initiate actions. Although the integration of computers can make manufacturing faster and less error-prone, its main advantage is the ability to create automated
The 21st century has seen a variety of skilled tasks partially taken over by machines, including translation, legal research, and journalism. Care work, entertainment, and other tasks requiring empathy, previously thought safe from automation, have also begun to be performed by robots.
are confident that superhuman artificial intelligence is physically possible, stating "there is no physical law precluding particles from being organised in ways that perform even more advanced computations than the arrangements of particles in human brains". Scholars like
intelligent AI with access to the internet could scatter backup copies of itself, gather financial and human resources (via cyberattacks or blackmail), persuade people on a large scale, and exploit societal vulnerabilities that are too subtle for humans to anticipate.
Top computer scientists in the US warned that the rise of artificial intelligence (AI) and robots in the workplace could cause mass unemployment and dislocated economies, rather than simply unlocking productivity gains and freeing us all up to watch TV and play
agent that will aid its creators, while avoiding inadvertently building a superintelligence that will harm its creators. Some scholars argue that solutions to the control problem might also find applications in existing non-superintelligent AI.
was an infamous children's event whose imagery and scripts were created using artificial intelligence models, to the dismay of the children, parents, and actors involved. There is an ongoing lawsuit filed against
Unfortunately, it might also be the last, unless we learn how to avoid the risks." Hawking believed that in the coming decades, AI could offer "incalculable benefits and risks" such as "technology outsmarting
". According to Bostrom, such capability control proposals are not reliable or sufficient to solve the control problem in the long term, but may potentially act as valuable supplements to alignment efforts.
in ways that may automatically destroy the entire human race. An unfriendly AI, on the other hand, can optimize for an arbitrary goal structure, which does not need to be invariant under self-modification.
control of the vehicle. Among the obstacles to widespread adoption of autonomous vehicles are concerns about the resulting loss of driving-related jobs in the road transport industry. On March 18, 2018,
stems from the evolutionary background to our intelligence, where the survival and reproduction of genes in the face of human and non-human competitors was the central goal. According to AI researcher
Arthur C. Clarke's
Odyssey series and Charles Stross's Accelerando relate to humanity's narcissistic injuries in the face of powerful artificial intelligences threatening humanity's self-perception.
, which aims to reduce an AI system's capacity to harm humans or gain control. An example of "capability control" is to research whether a superintelligent AI could be successfully confined in an "
"We are approaching a time when machines will be able to outperform humans at almost any task," said Moshe Vardi, director of the Institute for Information Technology at Rice University in Texas.
The signatories "believe that research on how to make AI systems robust and beneficial is both important and timely, and that there are concrete research directions that can be pursued today."
Technology research: A machine with superhuman scientific research abilities would be able to beat the human research community to milestones such as nanotechnology or advanced biotechnology
Economic productivity: As long as a copy of the AI could produce more economic wealth than the cost of its hardware, individual humans would have an incentive to voluntarily allow the
debate how far off superhuman intelligence is, and whether it poses a risk to mankind. According to
Bostrom, a superintelligent machine would not necessarily be motivated by the same
Stephen
Hawking, Elon Musk and dozens of other top scientists and technology leaders have signed a letter warning of the potential dangers of developing artificial intelligence (AI).
Hacking: A superintelligence could find new exploits in computers connected to the
Internet, and spread copies of itself onto those systems, or might steal money to finance its plans
The traditional consensus among economists has been that technological progress does not cause long-term unemployment. However, recent innovation in the fields of
These tools can outperform human beings at a given task. This kind of A.I. is spreading to thousands of domains, and as it does, it will eliminate many jobs.
robotic and AI technology, and may need to focus on areas or services that cannot easily be replaced for continued viability in the face of such technology.
functions (say, playing chess at all costs), leading them to seek self-preservation and elimination of obstacles, including humans who might turn them off.
The 1920 play was a protest against the rapid growth of technology, featuring manufactured "robots" with increasing capabilities who eventually revolt.
in which it is claimed that there is copyright infringement due to the sampling methods their artificial intelligence models use to produce their outputs.
, out-inventing human researchers, out-manipulating human leaders, and developing weapons we cannot even understand." In January 2015,
aims to steer AI systems toward a person's or group's intended goals, preferences, and ethical principles. An AI system is considered
while the number of processors in a supercomputer can be indefinitely expanded. An AGI need not be limited by human constraints on
manufacturing processes. Computer-integrated manufacturing is used in the automotive, aviation, space, and shipbuilding industries.
The use of automated content has grown in relevance with technological advances in artificial intelligence models such as
AI could transform itself into something unfriendly) and a goal structure that aligns with human values and does not undergo
Among the feared consequences of the rise of the robots is the growing impact they will have on human jobs and economies.
Social manipulation: A superintelligence might be able to recruit human support, or covertly incite a war between humans
Many scholars dispute the likelihood of unanticipated cybernetic revolt as depicted in science fiction such as
, there is little reason to suppose that an artificially designed mind would have such an adaptation.
request and makes him a wife, they would reproduce and their kind would destroy humanity.
, argue that a superintelligent machine is likely to coexist peacefully with humans.
, but recent advancements have made the threat more real. Possible scenarios include
's open letter speaking to the potential risks and benefits associated with
to ensure future superintelligent machines remain under human control.
: A superintelligence might be able to simply outwit human opposition
"). Omohundro suggests that present-day automation systems are not
, which aims to align AI goal systems with human values, and
(1984) are two iconic examples of hostile AI in pop culture.
(published in 1818), as Victor ponders whether, if he grants
, a 1920 Czech play translated as "Rossum's Universal Robots"
Existential risk from artificial general intelligence
effectively take control of the planet away from the
Many scholars, including evolutionary psychologist
with human survival or well-being (as in the film
Possibility of unfriendly AI preceding friendly AI
Advantages of superhuman intelligence over humans
Major approaches to the control problem include
(AGI) to run a copy of itself on their systems
, and numerous AI researchers in signing the
if it advances the intended objectives. A
and that AIs may blindly optimize narrow
replacement of the entire human workforce
AI system pursues unintended objectives.
goals are unintentionally incompatible
Technologies that may displace workers
(AI) emerges as the dominant form of
Is strong AI inherently dangerous?
not "common sense". According to
AI takeover is a common theme in
Computer-integrated manufacturing
Computer-integrated manufacturing
is an imagined scenario in which
in 1921, and can be glimpsed in
Prevention through AI alignment
is the issue of how to build a
Artificial General Intelligence
AI takeovers in popular culture
, have advocated research into
. Some public figures, such as
, which introduced the word
Willy's Chocolate Experience
by an autonomous vehicle in
comes from the Czech word,
(ASI), and the notion of a
the first human was killed
Technological unemployment
. Stories of AI takeovers
Automation of the economy
972:joined Stephen Hawking,
885:and in the short story "
835:instrumental convergence
742:(1968) and the original
292:Artificial consciousness
3318:Artificial intelligence
2758:artificial intelligence
2602:. The Financial Times.
1673:10.1145/3600211.3604681
1494:Artificial Intelligence
1460:10.1073/pnas.1900949116
1097:Government by algorithm
1013:artificial intelligence
990:artificial intelligence
796:Sources of AI advantage
430:artificial intelligence
163:Evolutionary algorithms
53:Artificial intelligence
4651:Science fiction themes
4447:List of disaster films
4288:Apocalyptic literature
3388:Malthusian catastrophe
3314:Synthetic intelligence
2827:Intelligence explosion
2164:Next-Generation Ethics
1916:(2). Springer: 71–85.
1901:Bostrom, Nick (2012).
1639:. New York, New York.
1141:Intelligence explosion
765:intelligence explosion
726:The word "robot" from
486:precautionary measures
64:
41:
3829:Coronal mass ejection
3291:Electromagnetic pulse
3252:Kinetic energy weapon
2782:AI capability control
887:The Evitable Conflict
734:, meaning laborer or
543:White-collar machines
63:
33:
4231:War of Gog and Magog
3908:Near-Earth supernova
3658:Human overpopulation
3462:Mass mortality event
3356:Anthropogenic hazard
2873:Center for AI Safety
2160:Yampolskiy, Roman V.
1606:, see esp Chpt. 9".
750:Contributing factors
583:AI-generated content
105:General game playing
4303:Blood moon prophecy
4096:Number of the Beast
3694:Holocene extinction
3640:Earth Overshoot Day
3588:Ocean acidification
3551:Habitat destruction
3541:Ecological collapse
3247:Kinetic bombardment
3182:Future of the Earth
3088:Our Final Invention
2404:The basic AI drives
2166:. pp. 90–112.
1518:. London, England.
1492:Bond, Dave (2017).
1451:2019PNAS..116.6531F
1338:Businessinsider.com
891:designed for safety
652:paperclip maximizer
636:Scientists such as
549:White-collar worker
470:superintelligent AI
257:Machine translation
173:Systems integration
110:Knowledge reasoning
47:Part of a series on
4641:Doomsday scenarios
4618:Doomsday scenarios
4184:Beast of the Earth
4076:Book of Revelation
3864:Virtual black hole
3844:False vacuum decay
3794:Pollinator decline
3757:Biological warfare
3747:Biotechnology risk
3598:Resource depletion
1910:Minds and Machines
1837:MĂĽller, Vincent C.
1793:Russell, Stuart J.
1791:Hawking, Stephen;
1771:The New York Times
1233:The New York Times
1093:(existential risk)
1076:Effective altruism
976:, Elon Musk, Lord
929:capability control
913:AI control problem
907:AI control problem
620:The New York Times
579:self-driving car.
450:human intelligence
448:, which relies on
65:
42:
4628:
4627:
4459:Zombie apocalypse
4360:Postmillennialism
4155:Great Tribulation
3928:Stellar collision
3889:Near-Earth object
3839:Geomagnetic storm
3807:
3806:
3730:Biodiversity loss
3714:Genetic pollution
3668:
3667:
3457:Biodiversity loss
3420:Societal collapse
3398:Nuclear holocaust
3383:Economic collapse
3366:Doomsday argument
3284:Antimatter weapon
3137:
3136:
3054:Eliezer Yudkowsky
3029:Stuart J. Russell
2847:Superintelligence
2475:978-0-525-55862-0
2320:978-3-642-22886-5
2133:The Seattle Times
2064:978-1-5266-0023-3
1999:. 22 April 2011.
1869:978-3-319-26483-7
1683:979-8-4007-0231-0
1508:Skidelsky, Robert
1496:. pp. 67–69.
1445:(14): 6531–6539.
1146:Superintelligence
966:financial markets
844:Eliezer Yudkowsky
611:Glasgow, Scotland
454:have been popular
438:computer programs
422:
421:
158:Bayesian networks
85:Intelligent agent
34:Robots revolt in
16:(Redirected from
4668:
4593:
4592:
4584:World portal
4582:
4581:
4543:Financial crisis
4372:Nibiru cataclysm
4350:Premillennialism
4194:Dhul-Suwayqatayn
4162:Son of perdition
4150:Olivet Discourse
4126:Whore of Babylon
3752:Biological agent
3699:Human extinction
3689:Extinction event
3677:
3648:Overexploitation
3578:Land consumption
3573:Land degradation
3440:
3302:Micro black hole
3210:Chemical warfare
3164:
3157:
3150:
3141:
3129:
3128:
3076:Human Compatible
3049:Roman Yampolskiy
2797:Consequentialism
2754:Existential risk
2747:
2740:
2733:
2724:
2697:
2696:
2694:
2692:
2669:
2663:
2662:
2622:
2616:
2615:
2613:
2611:
2595:
2589:
2588:
2586:
2584:
2569:
2563:
2562:
2560:
2558:
2536:
2530:
2529:
2527:
2525:
2506:
2500:
2499:
2497:
2495:
2459:
2453:
2452:
2450:
2448:
2432:
2426:
2425:
2423:
2422:
2416:
2409:
2398:
2392:
2373:
2367:
2366:
2364:
2362:
2342:
2333:
2332:
2298:
2292:
2291:
2289:
2288:
2282:
2271:
2262:
2256:
2255:
2243:
2237:
2236:
2234:
2232:
2210:
2204:
2203:
2175:
2155:
2149:
2148:
2146:
2144:
2124:
2118:
2117:
2107:
2075:
2069:
2068:
2050:
2044:
2043:
2041:
2039:
2019:
2013:
2012:
2010:
2008:
1986:
1980:
1979:
1969:
1948:
1947:
1945:
1944:
1938:
1907:
1898:
1892:
1891:
1885:
1884:
1878:
1849:
1833:
1827:
1826:
1824:
1822:
1808:
1788:
1782:
1781:
1779:
1777:
1762:
1756:
1755:
1753:
1751:
1736:
1730:
1729:
1727:
1715:
1709:
1708:
1706:
1694:
1688:
1687:
1675:
1659:
1653:
1652:
1650:
1648:
1628:
1622:
1621:
1599:
1593:
1592:
1590:
1588:
1569:
1563:
1562:
1560:
1558:
1549:. Archived from
1538:
1532:
1531:
1529:
1527:
1504:
1498:
1497:
1489:
1483:
1482:
1472:
1462:
1430:
1424:
1423:
1421:
1397:
1391:
1390:
1388:
1387:
1368:
1362:
1361:
1355:
1354:
1343:Business Insider
1328:
1322:
1321:
1316:
1315:
1293:
1287:
1286:
1281:
1280:
1258:
1252:
1251:
1246:
1245:
1223:
1217:
1216:
1211:
1209:
1179:
1126:Self-replication
1112:Machine learning
1102:Human extinction
1054:Industrial robot
1049:Autonomous robot
1011:In the field of
917:superintelligent
850:Odds of conflict
768:against humans:
603:Stable Diffusion
468:, takeover by a
414:
407:
400:
321:Existential risk
143:Machine learning
44:
21:
4676:
4675:
4671:
4670:
4669:
4667:
4666:
4665:
4646:Future problems
4631:
4630:
4629:
4624:
4603:Future problems
4576:
4567:
4501:
4470:
4437:Climate fiction
4403:
4278:2012 phenomenon
4261:
4167:Sheep and Goats
4054:2 Thessalonians
3932:
3849:Gamma-ray burst
3803:
3718:
3709:Genetic erosion
3664:
3634:
3625:Water pollution
3593:Ozone depletion
3489:Desertification
3429:
3344:
3279:Doomsday device
3259:Nuclear warfare
3198:
3173:
3168:
3138:
3133:
3119:
3058:
3014:Steve Omohundro
2994:Geoffrey Hinton
2984:Stephen Hawking
2969:Paul Christiano
2949:Scott Alexander
2937:
2908:Google DeepMind
2856:
2842:Suffering risks
2760:
2751:
2706:
2701:
2700:
2690:
2688:
2686:
2671:
2670:
2666:
2624:
2623:
2619:
2609:
2607:
2597:
2596:
2592:
2582:
2580:
2571:
2570:
2566:
2556:
2554:
2538:
2537:
2533:
2523:
2521:
2516:. 8 June 2016.
2508:
2507:
2503:
2493:
2491:
2476:
2461:
2460:
2456:
2446:
2444:
2439:. Defense One.
2434:
2433:
2429:
2420:
2418:
2414:
2407:
2400:
2399:
2395:
2384:Wayback Machine
2374:
2370:
2360:
2358:
2351:Popular Science
2344:
2343:
2336:
2321:
2300:
2299:
2295:
2286:
2284:
2280:
2269:
2264:
2263:
2259:
2245:
2244:
2240:
2230:
2228:
2212:
2211:
2207:
2192:
2157:
2156:
2152:
2142:
2140:
2126:
2125:
2121:
2077:
2076:
2072:
2065:
2052:
2051:
2047:
2037:
2035:
2021:
2020:
2016:
2006:
2004:
1988:
1987:
1983:
1972:Bostrom, Nick.
1971:
1970:
1951:
1942:
1940:
1936:
1905:
1900:
1899:
1895:
1882:
1880:
1876:
1870:
1847:
1835:
1834:
1830:
1820:
1818:
1811:The Independent
1790:
1789:
1785:
1775:
1773:
1764:
1763:
1759:
1749:
1747:
1738:
1737:
1733:
1717:
1716:
1712:
1696:
1695:
1691:
1684:
1661:
1660:
1656:
1646:
1644:
1630:
1629:
1625:
1618:
1601:
1600:
1596:
1586:
1584:
1583:on 25 June 2016
1571:
1570:
1566:
1556:
1554:
1540:
1539:
1535:
1525:
1523:
1506:
1505:
1501:
1491:
1490:
1486:
1432:
1431:
1427:
1399:
1398:
1394:
1385:
1383:
1370:
1369:
1365:
1352:
1350:
1330:
1329:
1325:
1313:
1311:
1295:
1294:
1290:
1278:
1276:
1260:
1259:
1255:
1243:
1241:
1225:
1224:
1220:
1207:
1205:
1181:
1180:
1176:
1171:
1166:
1161:
1034:
1029:
1028:
1009:
1001:
945:Stephen Hawking
941:
909:
903:
864:Steve Omohundro
852:
830:
824:
819:
798:
763:be a recursive
757:
752:
681:science fiction
677:
667:
661:
638:Stephen Hawking
634:
628:
591:
585:
560:
558:Autonomous cars
551:
545:
533:
527:
518:
505:
499:
494:
478:Stephen Hawking
458:science fiction
418:
389:
388:
379:
371:
370:
346:
336:
335:
307:Control problem
287:
277:
276:
188:
178:
177:
138:
130:
129:
100:Computer vision
75:
28:
23:
22:
15:
12:
11:
5:
4674:
4672:
4664:
4663:
4658:
4653:
4648:
4643:
4633:
4632:
4626:
4625:
4623:
4622:
4621:
4620:
4615:
4610:
4605:
4600:
4598:Apocalypticism
4587:
4572:
4569:
4568:
4566:
4565:
4560:
4555:
4550:
4545:
4540:
4535:
4530:
4525:
4520:
4515:
4509:
4507:
4503:
4502:
4500:
4499:
4494:
4489:
4484:
4478:
4476:
4472:
4471:
4469:
4468:
4467:
4466:
4456:
4451:
4450:
4449:
4442:Disaster films
4439:
4434:
4433:
4432:
4427:
4417:
4415:Alien invasion
4411:
4409:
4405:
4404:
4402:
4401:
4396:
4391:
4390:
4389:
4384:
4374:
4369:
4364:
4363:
4362:
4357:
4355:Amillennialism
4352:
4342:
4340:Millenarianism
4337:
4336:
4335:
4325:
4320:
4315:
4310:
4305:
4300:
4295:
4293:Apocalypticism
4290:
4285:
4280:
4275:
4269:
4267:
4263:
4262:
4260:
4259:
4258:
4257:
4247:
4242:
4241:
4240:
4239:
4238:
4233:
4228:
4218:
4217:
4216:
4211:
4206:
4201:
4196:
4191:
4189:Dhu al-Qarnayn
4186:
4181:
4171:
4170:
4169:
4164:
4159:
4158:
4157:
4147:
4142:
4137:
4135:Great Apostasy
4132:
4131:
4130:
4129:
4128:
4123:
4118:
4113:
4108:
4103:
4098:
4093:
4088:
4073:
4068:
4067:
4066:
4061:
4051:
4046:
4041:
4036:
4035:
4034:
4024:
4014:
4009:
4008:
4007:
4002:
3992:
3982:
3980:Last Judgement
3977:
3976:
3975:
3970:
3960:
3959:
3958:
3953:
3942:
3940:
3938:Eschatological
3934:
3933:
3931:
3930:
3925:
3920:
3915:
3910:
3905:
3904:
3903:
3898:
3893:
3892:
3891:
3881:
3876:
3866:
3861:
3856:
3851:
3846:
3841:
3836:
3831:
3826:
3821:
3815:
3813:
3809:
3808:
3805:
3804:
3802:
3801:
3796:
3791:
3786:
3781:
3776:
3771:
3766:
3765:
3764:
3759:
3754:
3744:
3743:
3742:
3737:
3726:
3724:
3720:
3719:
3717:
3716:
3711:
3706:
3701:
3696:
3691:
3685:
3683:
3674:
3670:
3669:
3666:
3665:
3663:
3662:
3661:
3660:
3653:Overpopulation
3650:
3644:
3642:
3636:
3635:
3633:
3632:
3630:Water scarcity
3627:
3622:
3617:
3616:
3615:
3605:
3603:Sea level rise
3600:
3595:
3590:
3585:
3580:
3575:
3570:
3569:
3568:
3566:on marine life
3563:
3553:
3548:
3543:
3538:
3533:
3528:
3523:
3521:Global warming
3518:
3513:
3511:Global dimming
3508:
3503:
3502:
3501:
3491:
3486:
3481:
3476:
3471:
3469:Cascade effect
3466:
3465:
3464:
3454:
3448:
3446:
3444:Climate change
3437:
3431:
3430:
3428:
3427:
3422:
3417:
3416:
3415:
3410:
3405:
3395:
3390:
3385:
3380:
3379:
3378:
3373:
3363:
3358:
3352:
3350:
3346:
3345:
3343:
3342:
3337:
3336:
3335:
3330:
3325:
3311:
3310:
3309:
3304:
3294:
3288:
3287:
3286:
3281:
3276:
3274:Doomsday Clock
3271:
3266:
3256:
3255:
3254:
3244:
3239:
3234:
3233:
3232:
3227:
3225:Cyberterrorism
3222:
3212:
3206:
3204:
3200:
3199:
3197:
3196:
3195:
3194:
3184:
3178:
3175:
3174:
3169:
3167:
3166:
3159:
3152:
3144:
3135:
3134:
3124:
3121:
3120:
3118:
3117:
3112:
3105:
3098:
3091:
3084:
3079:
3072:
3066:
3064:
3060:
3059:
3057:
3056:
3051:
3046:
3041:
3036:
3031:
3026:
3021:
3016:
3011:
3006:
3001:
2996:
2991:
2986:
2981:
2976:
2971:
2966:
2961:
2956:
2951:
2945:
2943:
2939:
2938:
2936:
2935:
2930:
2925:
2920:
2915:
2910:
2905:
2900:
2895:
2890:
2885:
2880:
2875:
2870:
2864:
2862:
2858:
2857:
2855:
2854:
2849:
2844:
2839:
2837:Machine ethics
2834:
2829:
2824:
2819:
2814:
2809:
2804:
2799:
2794:
2789:
2784:
2779:
2774:
2768:
2766:
2762:
2761:
2752:
2750:
2749:
2742:
2735:
2727:
2721:
2720:
2705:
2704:External links
2702:
2699:
2698:
2684:
2664:
2637:(2): 495–511.
2617:
2590:
2564:
2531:
2501:
2474:
2454:
2427:
2393:
2368:
2334:
2319:
2293:
2257:
2254:on 2012-06-15.
2238:
2205:
2190:
2150:
2119:
2090:(2): 113–118.
2070:
2063:
2045:
2014:
1997:(public radio)
1995:Science Friday
1981:
1949:
1893:
1868:
1828:
1803:(1 May 2014).
1801:Wilczek, Frank
1783:
1757:
1731:
1710:
1689:
1682:
1654:
1637:New York Times
1623:
1617:978-0393239355
1616:
1594:
1575:(March 2016).
1564:
1553:on 17 May 2016
1533:
1510:(2013-02-19).
1499:
1484:
1425:
1392:
1363:
1323:
1288:
1253:
1218:
1173:
1172:
1170:
1167:
1165:
1162:
1160:
1159:
1158:
1157:
1156:
1155:
1143:
1133:
1128:
1123:
1118:
1109:
1107:Machine ethics
1104:
1099:
1094:
1088:
1083:
1078:
1073:
1068:
1067:
1066:
1061:
1056:
1046:
1041:
1035:
1033:
1030:
1010:
1002:
1000:
997:
940:
937:
905:Main article:
902:
899:
851:
848:
826:Main article:
823:
820:
818:
815:
811:working memory
797:
794:
793:
792:
789:
782:
779:
773:
756:
753:
751:
748:
663:Main article:
660:
657:
630:Main article:
627:
624:
584:
581:
573:Tempe, Arizona
564:autonomous car
559:
556:
544:
541:
526:
523:
517:
514:
501:Main article:
498:
495:
493:
490:
474:robot uprising
420:
419:
417:
416:
409:
402:
394:
391:
390:
387:
386:
380:
377:
376:
373:
372:
369:
368:
363:
358:
353:
347:
342:
341:
338:
337:
334:
333:
328:
323:
318:
313:
304:
299:
294:
288:
283:
282:
279:
278:
275:
274:
269:
264:
259:
254:
253:
252:
242:
237:
232:
231:
230:
225:
220:
210:
205:
203:Earth sciences
200:
195:
193:Bioinformatics
189:
184:
183:
180:
179:
176:
175:
170:
165:
160:
155:
150:
145:
139:
136:
135:
132:
131:
128:
127:
122:
117:
112:
107:
102:
97:
92:
87:
82:
76:
71:
70:
67:
66:
56:
55:
49:
48:
26:
24:
18:Robot uprising
14:
13:
10:
9:
6:
4:
3:
2:
4673:
4662:
4659:
4657:
4654:
4652:
4649:
4647:
4644:
4642:
4639:
4638:
4636:
4619:
4616:
4614:
4613:Risk analysis
4611:
4609:
4606:
4604:
4601:
4599:
4596:
4595:
4588:
4586:
4585:
4580:
4574:
4573:
4570:
4564:
4561:
4559:
4558:Social crisis
4556:
4554:
4551:
4549:
4546:
4544:
4541:
4539:
4536:
4534:
4531:
4529:
4526:
4524:
4521:
4519:
4516:
4514:
4511:
4510:
4508:
4504:
4498:
4495:
4493:
4490:
4488:
4485:
4483:
4480:
4479:
4477:
4475:Organizations
4473:
4465:
4462:
4461:
4460:
4457:
4455:
4452:
4448:
4445:
4444:
4443:
4440:
4438:
4435:
4431:
4428:
4426:
4423:
4422:
4421:
4418:
4416:
4413:
4412:
4410:
4406:
4400:
4399:World to come
4397:
4395:
4392:
4388:
4385:
4383:
4380:
4379:
4378:
4375:
4373:
4370:
4368:
4365:
4361:
4358:
4356:
4353:
4351:
4348:
4347:
4346:
4345:Millennialism
4343:
4341:
4338:
4334:
4333:Messianic Age
4331:
4330:
4329:
4326:
4324:
4321:
4319:
4318:Gog and Magog
4316:
4314:
4311:
4309:
4308:Earth Changes
4306:
4304:
4301:
4299:
4296:
4294:
4291:
4289:
4286:
4284:
4281:
4279:
4276:
4274:
4271:
4270:
4268:
4264:
4256:
4253:
4252:
4251:
4248:
4246:
4243:
4237:
4234:
4232:
4229:
4227:
4224:
4223:
4222:
4219:
4215:
4212:
4210:
4207:
4205:
4202:
4200:
4197:
4195:
4192:
4190:
4187:
4185:
4182:
4180:
4177:
4176:
4175:
4172:
4168:
4165:
4163:
4160:
4156:
4153:
4152:
4151:
4148:
4146:
4145:New Jerusalem
4143:
4141:
4138:
4136:
4133:
4127:
4124:
4122:
4121:War in Heaven
4119:
4117:
4116:Two witnesses
4114:
4112:
4109:
4107:
4104:
4102:
4099:
4097:
4094:
4092:
4089:
4087:
4084:
4083:
4082:
4079:
4078:
4077:
4074:
4072:
4069:
4065:
4062:
4060:
4057:
4056:
4055:
4052:
4050:
4047:
4045:
4042:
4040:
4037:
4033:
4030:
4029:
4028:
4025:
4023:
4020:
4019:
4018:
4015:
4013:
4010:
4006:
4003:
4001:
3998:
3997:
3996:
3993:
3991:
3988:
3987:
3986:
3985:Second Coming
3983:
3981:
3978:
3974:
3971:
3969:
3966:
3965:
3964:
3961:
3957:
3954:
3952:
3949:
3948:
3947:
3944:
3943:
3941:
3939:
3935:
3929:
3926:
3924:
3921:
3919:
3916:
3914:
3911:
3909:
3906:
3902:
3899:
3897:
3894:
3890:
3887:
3886:
3885:
3882:
3880:
3877:
3875:
3872:
3871:
3870:
3867:
3865:
3862:
3860:
3857:
3855:
3852:
3850:
3847:
3845:
3842:
3840:
3837:
3835:
3832:
3830:
3827:
3825:
3822:
3820:
3817:
3816:
3814:
3810:
3800:
3797:
3795:
3792:
3790:
3787:
3785:
3782:
3780:
3777:
3775:
3772:
3770:
3767:
3763:
3760:
3758:
3755:
3753:
3750:
3749:
3748:
3745:
3741:
3738:
3736:
3733:
3732:
3731:
3728:
3727:
3725:
3721:
3715:
3712:
3710:
3707:
3705:
3702:
3700:
3697:
3695:
3692:
3690:
3687:
3686:
3684:
3682:
3678:
3675:
3671:
3659:
3656:
3655:
3654:
3651:
3649:
3646:
3645:
3643:
3641:
3637:
3631:
3628:
3626:
3623:
3621:
3618:
3614:
3611:
3610:
3609:
3606:
3604:
3601:
3599:
3596:
3594:
3591:
3589:
3586:
3584:
3581:
3579:
3576:
3574:
3571:
3567:
3564:
3562:
3559:
3558:
3557:
3554:
3552:
3549:
3547:
3544:
3542:
3539:
3537:
3534:
3532:
3529:
3527:
3524:
3522:
3519:
3517:
3514:
3512:
3509:
3507:
3504:
3500:
3497:
3496:
3495:
3492:
3490:
3487:
3485:
3484:Deforestation
3482:
3480:
3477:
3475:
3472:
3470:
3467:
3463:
3460:
3459:
3458:
3455:
3453:
3450:
3449:
3447:
3445:
3441:
3438:
3436:
3432:
3426:
3425:World War III
3423:
3421:
3418:
3414:
3411:
3409:
3406:
3404:
3401:
3400:
3399:
3396:
3394:
3391:
3389:
3386:
3384:
3381:
3377:
3374:
3372:
3369:
3368:
3367:
3364:
3362:
3359:
3357:
3354:
3353:
3351:
3347:
3341:
3340:Transhumanism
3338:
3334:
3331:
3329:
3326:
3324:
3321:
3320:
3319:
3315:
3312:
3308:
3305:
3303:
3300:
3299:
3298:
3295:
3292:
3289:
3285:
3282:
3280:
3277:
3275:
3272:
3270:
3267:
3265:
3262:
3261:
3260:
3257:
3253:
3250:
3249:
3248:
3245:
3243:
3240:
3238:
3235:
3231:
3228:
3226:
3223:
3221:
3218:
3217:
3216:
3213:
3211:
3208:
3207:
3205:
3203:Technological
3201:
3193:
3190:
3189:
3188:
3185:
3183:
3180:
3179:
3176:
3172:
3165:
3160:
3158:
3153:
3151:
3146:
3145:
3142:
3132:
3122:
3116:
3113:
3111:
3110:
3106:
3104:
3103:
3099:
3097:
3096:
3095:The Precipice
3092:
3090:
3089:
3085:
3083:
3080:
3078:
3077:
3073:
3071:
3068:
3067:
3065:
3061:
3055:
3052:
3050:
3047:
3045:
3044:Frank Wilczek
3042:
3040:
3037:
3035:
3032:
3030:
3027:
3025:
3022:
3020:
3017:
3015:
3012:
3010:
3007:
3005:
3002:
3000:
2997:
2995:
2992:
2990:
2989:Dan Hendrycks
2987:
2985:
2982:
2980:
2977:
2975:
2972:
2970:
2967:
2965:
2962:
2960:
2959:Yoshua Bengio
2957:
2955:
2952:
2950:
2947:
2946:
2944:
2940:
2934:
2931:
2929:
2926:
2924:
2921:
2919:
2916:
2914:
2911:
2909:
2906:
2904:
2901:
2899:
2896:
2894:
2891:
2889:
2886:
2884:
2881:
2879:
2876:
2874:
2871:
2869:
2866:
2865:
2863:
2861:Organizations
2859:
2853:
2850:
2848:
2845:
2843:
2840:
2838:
2835:
2833:
2830:
2828:
2825:
2823:
2820:
2818:
2815:
2813:
2810:
2808:
2805:
2803:
2800:
2798:
2795:
2793:
2790:
2788:
2785:
2783:
2780:
2778:
2775:
2773:
2770:
2769:
2767:
2763:
2759:
2755:
2748:
2743:
2741:
2736:
2734:
2729:
2728:
2725:
2719:
2715:
2711:
2708:
2707:
2703:
2691:September 12,
2687:
2685:9780134610993
2681:
2677:
2676:
2668:
2665:
2660:
2656:
2652:
2648:
2644:
2640:
2636:
2632:
2628:
2621:
2618:
2605:
2601:
2594:
2591:
2578:
2574:
2568:
2565:
2552:
2548:
2547:
2542:
2535:
2532:
2519:
2515:
2511:
2505:
2502:
2489:
2485:
2481:
2477:
2471:
2467:
2466:
2458:
2455:
2442:
2438:
2431:
2428:
2413:
2406:
2405:
2397:
2394:
2390:
2386:
2385:
2381:
2378:
2372:
2369:
2356:
2352:
2348:
2341:
2339:
2335:
2330:
2326:
2322:
2316:
2312:
2308:
2304:
2297:
2294:
2279:
2275:
2268:
2261:
2258:
2253:
2249:
2242:
2239:
2226:
2222:
2221:
2220:New Scientist
2216:
2209:
2206:
2201:
2197:
2193:
2191:9781108616188
2187:
2183:
2179:
2174:
2169:
2165:
2161:
2154:
2151:
2138:
2134:
2130:
2123:
2120:
2115:
2111:
2106:
2101:
2097:
2093:
2089:
2085:
2081:
2074:
2071:
2066:
2060:
2056:
2049:
2046:
2033:
2029:
2025:
2018:
2015:
2002:
1998:
1996:
1991:
1985:
1982:
1977:
1976:
1968:
1966:
1964:
1962:
1960:
1958:
1956:
1954:
1950:
1935:
1931:
1927:
1923:
1919:
1915:
1911:
1904:
1897:
1894:
1890:
1875:
1871:
1865:
1861:
1857:
1853:
1846:
1842:
1841:Bostrom, Nick
1838:
1832:
1829:
1816:
1812:
1807:
1802:
1798:
1794:
1787:
1784:
1772:
1768:
1761:
1758:
1746:
1742:
1735:
1732:
1726:
1721:
1714:
1711:
1705:
1700:
1693:
1690:
1685:
1679:
1674:
1669:
1665:
1658:
1655:
1642:
1638:
1634:
1627:
1624:
1619:
1613:
1609:
1605:
1598:
1595:
1582:
1578:
1574:
1573:Srnicek, Nick
1568:
1565:
1552:
1548:
1547:openDemocracy
1544:
1537:
1534:
1521:
1517:
1513:
1509:
1503:
1500:
1495:
1488:
1485:
1480:
1476:
1471:
1466:
1461:
1456:
1452:
1448:
1444:
1440:
1436:
1429:
1426:
1420:
1415:
1411:
1407:
1403:
1396:
1393:
1382:on 2017-10-18
1381:
1377:
1373:
1367:
1364:
1360:
1348:
1344:
1340:
1339:
1334:
1327:
1324:
1320:
1309:
1305:
1304:
1299:
1292:
1289:
1285:
1274:
1270:
1269:
1264:
1257:
1254:
1250:
1239:
1235:
1234:
1229:
1222:
1219:
1215:
1203:
1199:
1195:
1194:
1189:
1187:
1178:
1175:
1168:
1163:
1154:
1153:
1149:
1148:
1147:
1144:
1142:
1139:
1138:
1137:
1134:
1132:
1129:
1127:
1124:
1122:
1121:Transhumanism
1119:
1117:
1116:Deep learning
1113:
1110:
1108:
1105:
1103:
1100:
1098:
1095:
1092:
1089:
1087:
1084:
1082:
1079:
1077:
1074:
1072:
1069:
1065:
1062:
1060:
1057:
1055:
1052:
1051:
1050:
1047:
1045:
1042:
1040:
1037:
1036:
1031:
1026:
1022:
1018:
1014:
1007:
998:
996:
993:
991:
987:
983:
979:
975:
971:
967:
962:
958:
954:
950:
946:
938:
936:
934:
930:
926:
921:
918:
914:
908:
900:
898:
896:
892:
888:
884:
883:
878:
874:
873:
867:
865:
859:
857:
856:Steven Pinker
849:
847:
845:
839:
836:
829:
821:
816:
814:
812:
806:
802:
795:
790:
787:
783:
780:
777:
774:
771:
770:
769:
766:
761:
754:
749:
747:
745:
741:
737:
733:
729:
724:
721:
717:
714:According to
712:
710:
709:his monster's
706:
705:
700:
696:
692:
691:
686:
682:
676:
672:
666:
658:
656:
653:
648:
644:
639:
633:
625:
623:
621:
617:
612:
608:
604:
600:
596:
590:
582:
580:
578:
574:
570:
565:
557:
555:
550:
542:
540:
537:
532:
524:
522:
515:
513:
510:
504:
496:
491:
489:
487:
483:
479:
475:
471:
467:
463:
459:
455:
451:
447:
446:human species
443:
439:
436:on Earth and
435:
431:
427:
415:
410:
408:
403:
401:
396:
395:
393:
392:
385:
382:
381:
375:
374:
367:
364:
362:
359:
357:
354:
352:
349:
348:
345:
340:
339:
332:
329:
327:
324:
322:
319:
317:
314:
312:
308:
305:
303:
300:
298:
295:
293:
290:
289:
286:
281:
280:
273:
270:
268:
265:
263:
260:
258:
255:
251:
250:Mental health
248:
247:
246:
243:
241:
238:
236:
233:
229:
226:
224:
221:
219:
216:
215:
214:
213:Generative AI
211:
209:
206:
204:
201:
199:
196:
194:
191:
190:
187:
182:
181:
174:
171:
169:
166:
164:
161:
159:
156:
154:
153:Deep learning
151:
149:
146:
144:
141:
140:
134:
133:
126:
123:
121:
118:
116:
113:
111:
108:
106:
103:
101:
98:
96:
93:
91:
88:
86:
83:
81:
78:
77:
74:
69:
68:
62:
58:
57:
54:
50:
46:
45:
39:
38:
32:
19:
4661:Technophobia
4575:
4518:Cyberwarfare
4236:Third Temple
4091:Lake of fire
3901:Rogue planet
3869:Impact event
3859:Proton decay
3812:Astronomical
3762:Bioterrorism
3608:Supervolcano
3506:Flood basalt
3452:Anoxic event
3361:Collapsology
3349:Sociological
3322:
3220:Cyberwarfare
3107:
3100:
3093:
3086:
3074:
3034:Jaan Tallinn
2974:Eric Drexler
2964:Nick Bostrom
2791:
2777:AI alignment
2689:. Retrieved
2674:
2667:
2634:
2630:
2620:
2608:. Retrieved
2593:
2581:. Retrieved
2567:
2555:. Retrieved
2544:
2534:
2522:. Retrieved
2513:
2504:
2492:. Retrieved
2464:
2457:
2445:. Retrieved
2430:
2419:. Retrieved
2403:
2396:
2375:
2371:
2359:. Retrieved
2350:
2302:
2296:
2285:. Retrieved
2276:. Springer.
2273:
2260:
2252:the original
2241:
2231:21 September
2229:. Retrieved
2218:
2208:
2163:
2153:
2141:. Retrieved
2132:
2122:
2087:
2083:
2073:
2054:
2048:
2036:. Retrieved
2027:
2017:
2005:. Retrieved
1993:
1984:
1974:
1941:. Retrieved
1913:
1909:
1896:
1887:
1881:. Retrieved
1851:
1831:
1819:. Retrieved
1810:
1797:Tegmark, Max
1786:
1774:. Retrieved
1770:
1760:
1748:. Retrieved
1745:The Guardian
1744:
1734:
1713:
1692:
1663:
1657:
1645:. Retrieved
1636:
1626:
1607:
1603:
1597:
1585:. Retrieved
1581:the original
1567:
1555:. Retrieved
1551:the original
1536:
1524:. Retrieved
1516:The Guardian
1515:
1502:
1493:
1487:
1442:
1438:
1428:
1409:
1405:
1395:
1384:. Retrieved
1380:the original
1375:
1366:
1357:
1351:. Retrieved
1336:
1326:
1318:
1312:. Retrieved
1301:
1291:
1283:
1277:. Retrieved
1266:
1256:
1248:
1242:. Retrieved
1231:
1221:
1213:
1206:. Retrieved
1191:
1185:
1177:
1150:
1131:Technophobia
1059:Mobile robot
1024:
1020:
1017:AI alignment
1006:AI alignment
994:
982:Jaan Tallinn
970:Nick Bostrom
942:
928:
924:
922:
912:
910:
880:
870:
868:
860:
853:
840:
831:
828:AI alignment
807:
803:
799:
776:Strategizing
760:Nick Bostrom
758:
731:
727:
725:
720:sufficiently
719:
713:
704:Frankenstein
702:
699:Mary Shelley
694:
688:
678:
646:
643:Nick Bostrom
635:
592:
561:
552:
534:
519:
506:
473:
434:intelligence
425:
423:
310:
297:Chinese room
186:Applications
35:
4594:Categories
4563:Survivalism
4250:Zoroastrian
4106:Seven seals
4101:Seven bowls
4027:Historicism
3923:Solar flare
3799:Overfishing
3774:Defaunation
3561:coral reefs
3323:AI takeover
3242:Nanoweapons
3230:Cybergeddon
3215:Cyberattack
3039:Max Tegmark
3024:Martin Rees
2832:Longtermism
2792:AI takeover
2468:. Penguin.
1412:: 206–223.
1208:October 20,
1193:LiveScience
1071:Cyberocracy
978:Martin Rees
974:Max Tegmark
901:Precautions
685:Karel ÄŚapek
626:Eradication
456:throughout
426:AI takeover
326:Turing test
302:Friendly AI
73:Major goals
4635:Categories
4523:Depression
4513:Ransomware
4328:Messianism
4298:Armageddon
4283:Apocalypse
4071:Antichrist
4059:Man of sin
3956:Three Ages
3819:Big Crunch
3681:Extinction
3673:Biological
3435:Ecological
3307:Strangelet
3004:Shane Legg
2979:Sam Harris
2954:Sam Altman
2893:EleutherAI
2718:Sam Harris
2631:Neohelicon
2557:30 January
2484:1237420037
2421:2020-10-02
2287:2020-10-02
2173:1707.08476
1943:2022-06-16
1889:decades...
1883:2022-06-16
1725:2302.04222
1704:2209.07667
1386:2017-10-17
1353:2017-08-15
1314:2017-08-15
1279:2017-08-15
1244:2017-08-15
1169:References
1025:misaligned
953:Bill Gates
943:Physicist
872:The Matrix
744:Terminator
669:See also:
659:In fiction
587:See also:
547:See also:
529:See also:
466:automation
331:Regulation
285:Philosophy
240:Healthcare
235:Government
137:Approaches
4408:Fictional
4255:Saoshyant
4140:New Earth
4111:The Beast
4044:Preterism
4017:Christian
3973:Kali Yuga
3918:Micronova
3913:Hypernova
3779:Dysgenics
3620:Verneshot
3526:Hypercane
3269:Dead Hand
3019:Huw Price
3009:Elon Musk
2913:Humanity+
2787:AI safety
2659:253793613
2651:0324-4652
2494:2 January
2329:0302-9743
1930:254835485
1647:March 23,
1376:LeanStaff
961:Elon Musk
949:Microsoft
925:alignment
647:emotional
482:Elon Musk
361:AI winter
262:Military
125:AI safety
4548:Pandemic
4533:Epidemic
4528:Droughts
4382:Prewrath
4313:End time
4179:Al-Qa'im
4064:Katechon
4049:2 Esdras
4039:Idealism
4022:Futurism
3951:Maitreya
3946:Buddhist
3789:Pandemic
3237:Gray goo
3131:Category
2999:Bill Joy
2765:Concepts
2710:TED talk
2604:Archived
2583:29 March
2577:Archived
2551:Archived
2546:BBC News
2518:Archived
2514:BBC News
2488:Archived
2441:Archived
2412:Archived
2380:Archived
2355:Archived
2278:Archived
2225:Archived
2200:22007028
2143:30 April
2137:Archived
2114:25484946
2038:30 April
2032:Archived
2007:30 April
2001:Archived
1934:Archived
1874:Archived
1843:(2016).
1815:Archived
1641:Archived
1520:Archived
1479:30910965
1347:Archived
1308:Archived
1303:Phys.org
1273:Archived
1268:Phys.org
1238:Archived
1202:Archived
1032:See also
959:founder
951:founder
939:Warnings
882:I, Robot
740:HAL 9000
716:Toby Ord
509:robotics
384:Glossary
378:Glossary
356:Progress
351:Timeline
311:Takeover
272:Projects
245:Industry
208:Finance
198:Deepfake
148:Symbolic
120:Robotics
95:Planning
4608:Hazards
4506:General
4377:Rapture
4226:Messiah
4214:Sufyani
4204:Israfil
4174:Islamic
4012:Messiah
3990:1 Enoch
3824:Big Rip
3536:Ecocide
3531:Ice age
2610:4 March
2447:15 July
2105:4247417
1821:1 April
1776:4 April
1750:2 April
1526:14 July
1470:6452673
1447:Bibcode
1359:sports.
1021:aligned
895:utility
595:ChatGPT
464:due to
366:AI boom
344:History
267:Physics
4538:Famine
4464:Zombie
4266:Others
4221:Jewish
4199:Dajjal
4081:Events
3995:Daniel
3896:winter
3723:Others
3613:winter
3413:winter
3408:famine
3403:cobalt
2942:People
2933:OpenAI
2682:
2657:
2649:
2524:7 June
2482:
2472:
2391:, 2005
2361:8 June
2327:
2317:
2198:
2188:
2112:
2102:
2061:
1928:
1866:
1680:
1614:
1604:passim
1587:20 May
1557:20 May
1477:
1467:
1015:(AI),
957:SpaceX
955:, and
933:AI box
732:robota
728:R.U.R.
690:R.U.R.
616:OpenAI
601:, and
599:DALL-E
575:by an
442:robots
316:Ethics
37:R.U.R.
4553:Riots
4245:Norse
4209:Mahdi
3968:Kalki
3963:Hindu
3293:(EMP)
3063:Other
2756:from
2655:S2CID
2415:(PDF)
2408:(PDF)
2281:(PDF)
2270:(PDF)
2196:S2CID
2168:arXiv
1937:(PDF)
1926:S2CID
1906:(PDF)
1877:(PDF)
1848:(PDF)
1720:arXiv
1699:arXiv
1198:Purch
1164:Notes
695:robot
618:from
492:Types
228:Music
223:Audio
2693:2022
2680:ISBN
2647:ISSN
2612:2015
2585:2019
2559:2015
2526:2020
2496:2022
2480:OCLC
2470:ISBN
2449:2014
2363:2020
2325:ISSN
2315:ISBN
2233:2016
2186:ISBN
2145:2020
2110:PMID
2059:ISBN
2040:2020
2009:2020
1864:ISBN
1823:2016
1778:2024
1752:2024
1678:ISBN
1649:2018
1612:ISBN
1589:2016
1559:2016
1528:2015
1475:PMID
1210:2015
911:The
736:serf
673:and
577:Uber
480:and
2772:AGI
2716:by
2639:doi
2307:doi
2178:doi
2100:PMC
2092:doi
1918:doi
1856:doi
1668:doi
1465:PMC
1455:doi
1443:116
1414:doi
701:'s
687:'s
609:in
562:An
440:or
424:An
218:Art
4637::
3316:/
2712::
2653:.
2645:.
2635:49
2633:.
2629:.
2549:.
2543:.
2512:.
2486:.
2478:.
2387:-
2353:.
2349:.
2337:^
2323:.
2313:.
2272:.
2223:.
2217:.
2194:.
2184:.
2176:.
2135:.
2131:.
2108:.
2098:.
2086:.
2082:.
2030:.
2026:.
1992:.
1952:^
1932:.
1924:.
1914:22
1912:.
1908:.
1886:.
1872:.
1862:.
1850:.
1839:;
1813:.
1809:.
1799:;
1795:;
1769:.
1743:.
1676:.
1635:.
1545:.
1514:.
1473:.
1463:.
1453:.
1441:.
1437:.
1410:57
1408:.
1404:.
1374:.
1356:.
1345:.
1341:.
1335:.
1317:.
1306:.
1300:.
1282:.
1271:.
1265:.
1247:.
1236:.
1230:.
1212:.
1200:.
1196:.
1190:.
980:,
947:,
597:,
3163:e
3156:t
3149:v
2746:e
2739:t
2732:v
2695:.
2661:.
2641::
2614:.
2587:.
2561:.
2528:.
2498:.
2451:.
2424:.
2365:.
2331:.
2309::
2290:.
2235:.
2202:.
2180::
2170::
2147:.
2116:.
2094::
2088:1
2067:.
2042:.
2011:.
1978:.
1946:.
1920::
1858::
1825:.
1780:.
1754:.
1728:.
1722::
1707:.
1701::
1686:.
1670::
1651:.
1620:.
1591:.
1561:.
1530:.
1481:.
1457::
1449::
1422:.
1416::
1389:.
1188:"
1184:"
1114:/
1008:.
413:e
406:t
399:v
309:/
20:)
Text is available under the Creative Commons Attribution-ShareAlike License. Additional terms may apply.