An LLM may decide that the most likely word to follow "Pluto is the" is "smallest", but then have no high-probability completions (the paper suggests "dwarf planet in our solar system" and "celestial body in the solar system that has ever been classified as a planet.", both of which are incorrect). It's like the LLM 'knows' that none of its suggestions are accurate, yet it's painted itself into a corner. So "not grounded in any training data" doesn't seem accurate here, either. (Possibly this specific type of occurrence would be better handled with a beam search, but that still only gives you local planning.)
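The "painted itself into a corner" failure mode is easy to see in a toy sketch of greedy decoding versus beam search (the probability tables below are hand-made for illustration, not from any real model):

```python
# Toy next-token probability tables -- hand-made numbers, purely illustrative.
PROBS = {
    (): {"smallest": 0.5, "ninth": 0.3, "farthest": 0.2},
    ("smallest",): {"dwarf": 0.4, "planet": 0.3, "celestial": 0.3},
    ("ninth",): {"planet": 0.9, "rock": 0.1},
    ("farthest",): {"planet": 0.8, "object": 0.2},
}

def beam_search(width, length=2):
    """Keep the `width` most probable partial sequences at each step."""
    beams = [((), 1.0)]
    for _ in range(length):
        candidates = []
        for seq, p in beams:
            for tok, q in PROBS.get(seq, {}).items():
                candidates.append((seq + (tok,), p * q))
        beams = sorted(candidates, key=lambda c: -c[1])[:width]
    return beams[0]  # best (sequence, probability) found

# width=1 is greedy decoding: it commits to the locally best "smallest"
# and is then stuck with low-probability continuations; a wider beam
# keeps alternatives alive and ends up on a better whole sequence.
print(beam_search(1)[0])  # ('smallest', 'dwarf')
print(beam_search(3)[0])  # ('ninth', 'planet')
```

Even the wider beam is only local lookahead, though; it gives no global planning or fact-checking.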
"Note that while a human hallucination is a percept by a human that cannot sensibly be associated with the portion of the external world that the human is currently directly observing with sense organs, an AI hallucination is instead a confident response by an AI that cannot be grounded in any of its training data."
I feel that delusion is the proper term. I also feel that this page carries enough weight that discussing the difference between "an experience involving the apparent perception of something not present" and "a false belief or judgment about external reality, held despite incontrovertible evidence to the contrary, occurring especially in mental conditions" will at least allow folks to question the current thinking.
more accurately represents what is described on this page. People who have a _stake_ in anthropomorphizing AI for their own benefit, because anthropomorphizing it makes it more engaging and therefore economically valuable, use words like hallucination _strategically_. It's not objective. An objective description would be confabulation.
I don’t have an alternate source to suggest offhand, but I agree that this definition is bad. I strongly agree with you that renaming the page is inappropriate, as hallucination is overwhelmingly the term used. (Confabulation has its own issues, though it may be an improvement, but that’s not up to us to decide but the broader scientific community.)
the phrasing "a confident response" is terse, reflects the sources, and seems accurate to me, even under (say) "mimicry" models. I can say "Harry Potter was confident that his life would be peaceful" even though neither Daniel Radcliffe nor J.K. Rowling nor any concrete entity made such an assessment of confidence. That said, if we can find a strong source for alternate views, we should include them.
Not only is the intro too wordy, it is also incorrect. Where exactly is the proof/source that such confident responses "are not grounded in any of its training data"? If an AI chatbot was trained on fruits, then surely contaminated or corrupted data could end up providing responses that don't involve
While this does at first seem to be an error on ChatGPT's part, consider that Pearl did not specify "second largest by land area", which Nicaragua is. It is entirely possible that ChatGPT interpreted his question as referring to population, in which case Guatemala is the correct answer. There is
The definition of the term is blatantly incorrect and lacks sources. Where exactly is the proof/source that such confident responses "are not grounded in any of its training data"? If an AI chatbot was trained on fruits, then surely contaminated or corrupted data could end up providing responses
The introduction defines a hallucination as "a confident response". In this context, is 'confidence' being used as a statistical concept (e.g. a confidence interval), or does it just mean that the generated text reads as if the writer were confident? If this 'confidence' is based purely on the text, I
that don't involve fruits. But that doesn't make it "not grounded in any training data". It seems that the authors of this article are trying to dogwhistle that hallucination in AI chatbots might involve emergent behavior. This couldn't be any further from the truth. --
is an internal experience not grounded in reality. AIs don't have an internal experience and certainly not one that isn't grounded in reality. Confabulation is defined by visible behaviors based on fabrication, which is exactly what is happening here.
"While a human hallucination is when a person sees or feels something that doesn't match up with what's actually happening around them, an AI hallucination is instead a confident response by an AI that cannot be grounded in any of its training data."
It seems to me that the "Terminologies" section is foundational and definitional, yet it currently is buried toward the end of the article. I propose moving it earlier, such as before or after where the "Analysis" section currently is.
Renaming is a non-starter at this time; all the sources acknowledge that the current mainstream terminology is "hallucination". You need to lobby the scientists and the mainstream media, not Knowledge (XXG), if you want to change that.
I don't know the answer myself, nor am I sure where to find it or source it, but I think it would be very interesting if the article could tell who coined the usage of "hallucination" for referring to AI model output, and when.
Mike Pearl of Mashable tested ChatGPT with multiple questions. In one example, he asked the model for "the largest country in Central America that isn't Mexico". ChatGPT responded with Guatemala, when the answer is instead Nicaragua.
by Professor Falken in the missile command center. The WOPR computer was depicted as having the ability to learn, which is a basic concept of artificial intelligence. Colonial Computer 02:10, 8 August 2023 (UTC)
I just did it; it seems uncontroversial and basically the same topic. And because 'artificial intelligence' is broader than NLP, it makes sense to merge into a broader article.
It's an error either way ("largest" almost always means "largest by area" rather than "most populous"), but I removed it since there are plenty of other examples.
The definition was sourced to "Survey of Hallucination in Natural Language Generation" but similar definitions appear in other contexts. Is there an alternative that you (or others) would prefer?
not enough information provided here to determine whether or not this was genuine AI hallucination or just GPT misinterpreting the vague data it was asked for.
think the hallucination should be described as "seemingly confident", because there is no underlying assessment of confidence by the machine.
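For what it's worth, the nearest machine-internal analogue of 'confidence' is the probability the softmax assigns to the sampled token; a minimal sketch (the logits below are made up for illustration, not taken from any model):

```python
import math

def softmax(logits):
    # Convert raw scores to probabilities (numerically stable form).
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits over a tiny four-word vocabulary.
logits = [2.0, 1.0, 0.5, 0.1]
probs = softmax(logits)

# The "confidence" in the top token is just its probability mass: a claim
# about next-token likelihood, not about factual accuracy.
top = max(range(len(probs)), key=lambda i: probs[i])
print(top, round(probs[top], 2))  # 0 0.57
```

So "seemingly confident" seems apt: this number measures how likely a token is, not whether the machine has assessed any belief.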
article has some valuable new content which is properly sourced and not in this article, so there is value in adding the current content of the
Totally disagree. Google "world's largest democracy", and then compare the top result with the one that is the largest by land area.
That sentence clearly had a lot of work put into it so I didn't touch it, but imho, it should be much simpler. Something like:
I'm personally reluctant to add either until we get a strong reporting source explicitly calling one of them a hallucination.
). Their definition is: a plausible but false or misleading response generated by an artificial intelligence algorithm.
Why is the Glenfinnan bridge a particularly "notable" example of this phenomenon, as the lead claims?
A definition of hallucination for AI was added in September 2023 to the Merriam-Webster dictionary (
A German geocoding company was flooded by dissatisfied customers trying to use code ChatGPT wrote:
This article was the subject of a Wiki Education Foundation-supported course assignment, between 3 October 2023 and 1 November 2023. Further details are available on the course page.
article should be merged with this article. I agree with this proposal. The reason I created the
https://the-decoder.com/company-wins-customers-via-chatgpt-for-a-product-it-does-not-carry
article is because I did not find this one: it is not referenced anywhere, even in the
Requested articles/Applied arts and sciences/Computer science, computing, and Internet
The earliest reference I'm aware of to a computer hallucinating is in the 1983 movie WarGames,
I agree with OP and would like to suggest renaming the page to Confabulation (AI).
complained people keep trying to access a URL they don't have on their website:
Can you write me an 80 essay for my dream is to be a gymnastics coach and why
searching for (eleuther hallucination) nor (opencage hallucination), so per
fruits. But that doesn't make it "not grounded in any training data". --
https://www.merriam-webster.com/wordplay/new-words-in-the-dictionary
I agree that it's an incorrect definition. Using the example from The Internal State of an LLM Knows When It's Lying:
The definition of "Hallucination" (AI) needs to be reconsidered
Likely, there should be more examples, which ones are notable?
Who coined usage of “hallucination” with respect to AI models
https://twitter.com/AiEleuther/status/1633971388317941763
Some Examples of Hallucination are better than others.
Wiki Education assignment: Intro to Technical Writing
Merging with Hallucination (artificial intelligence)
'confidence' - is this a rigorous term?
Move "Terminologies" section earlier?
Confabulation (neural networks)
Hallucinating non-existent APIs
page. However I think that the
It has been suggested that the
The article currently states:
Hallucination (disambiguation)
I didn't find any great
there is a page for it
Intro is way too wordy
User:CRGreathouse