Inverted Dirichlet distribution

In statistics, the inverted Dirichlet distribution is a multivariate generalization of the beta prime distribution, and is related to the Dirichlet distribution. It was first described by Tiao and Cuttman in 1965.

The distribution has a density function given by

p(x_1, \ldots, x_k) = \frac{\Gamma(\nu_1 + \cdots + \nu_{k+1})}{\prod_{j=1}^{k+1} \Gamma(\nu_j)} \, x_1^{\nu_1 - 1} \cdots x_k^{\nu_k - 1} \left(1 + \sum_{i=1}^{k} x_i\right)^{-\sum_{j=1}^{k+1} \nu_j}, \qquad x_i > 0.

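The density is awkward to sample from directly, but a standard gamma-ratio construction is available: if G_1, …, G_{k+1} are independent Gamma(ν_j, 1) variables, the ratios x_i = G_i / G_{k+1} follow the density above. The sketch below assumes this representation and uses NumPy/SciPy; the helper names (`sample_inverted_dirichlet`, `log_pdf`) are illustrative, not part of any library.

```python
import numpy as np
from scipy.special import gammaln

def sample_inverted_dirichlet(nu, size, rng=None):
    """Draw `size` vectors via the gamma-ratio construction x_i = G_i / G_{k+1},
    where G_j ~ Gamma(nu_j, 1) are independent and nu has length k+1."""
    rng = np.random.default_rng(rng)
    nu = np.asarray(nu, dtype=float)
    g = rng.gamma(shape=nu, size=(size, nu.size))
    return g[:, :-1] / g[:, -1:]                      # shape (size, k)

def log_pdf(x, nu):
    """Log-density at a single point x (length k) for parameters nu (length k+1)."""
    x = np.asarray(x, dtype=float)
    nu = np.asarray(nu, dtype=float)
    log_norm = gammaln(nu.sum()) - gammaln(nu).sum()  # log of the Gamma-ratio constant
    return (log_norm
            + np.sum((nu[:-1] - 1.0) * np.log(x))
            - nu.sum() * np.log1p(np.sum(x)))
```

For instance, `sample_inverted_dirichlet([2.0, 3.0, 4.0], 10_000)` draws 10,000 two-dimensional positive vectors with ν = (2, 3, 4).
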
The distribution has applications in statistical regression and arises naturally when considering the multivariate Student distribution. It can be characterized by its mixed moments:

E\left[\prod_{i=1}^{k} x_i^{q_i}\right] = \frac{\Gamma\left(\nu_{k+1} - \sum_{j=1}^{k} q_j\right)}{\Gamma(\nu_{k+1})} \prod_{j=1}^{k} \frac{\Gamma(\nu_j + q_j)}{\Gamma(\nu_j)},

provided that q_j > -\nu_j for 1 \leqslant j \leqslant k, and \nu_{k+1} > q_1 + \ldots + q_k.

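As a quick sanity check of the moment formula, a Monte Carlo estimate of E[∏ x_i^{q_i}] can be compared against the closed form. The snippet below is a minimal sketch that reuses the hypothetical `sample_inverted_dirichlet` helper from the previous example.

```python
import numpy as np
from scipy.special import gammaln

def mixed_moment(nu, q):
    """Closed-form E[prod_i x_i^{q_i}]; requires q_j > -nu_j and nu_{k+1} > sum(q)."""
    nu = np.asarray(nu, dtype=float)   # length k+1
    q = np.asarray(q, dtype=float)     # length k
    log_m = (gammaln(nu[-1] - q.sum()) - gammaln(nu[-1])
             + np.sum(gammaln(nu[:-1] + q) - gammaln(nu[:-1])))
    return np.exp(log_m)

nu, q = [2.0, 3.0, 6.0], [1.0, 1.0]                  # nu_{k+1} = 6 > q_1 + q_2 = 2
samples = sample_inverted_dirichlet(nu, size=200_000, rng=0)
estimate = np.mean(np.prod(samples ** np.asarray(q), axis=1))
print(mixed_moment(nu, q), estimate)                 # both should be close to 0.3
```
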
The inverted Dirichlet distribution is conjugate to the negative multinomial distribution if a generalized form of the odds ratio is used instead of the categories' probabilities: if the negative multinomial parameter vector is given by p, the conjugacy is obtained by changing the parameters of the negative multinomial to

x_i = \frac{p_i}{p_0}, \qquad i = 1, \ldots, k,

where p_0 = 1 - \sum_{i=1}^{k} p_i.

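A small helper for moving between the probability vector p = (p_1, …, p_k) and the generalized odds ratios x_i = p_i / p_0 used above; the inverse map follows from p_0 = 1 / (1 + Σ_i x_i). The function names are illustrative only.

```python
import numpy as np

def probs_to_odds(p):
    """Map category probabilities p_1..p_k (with p_0 = 1 - sum(p)) to x_i = p_i / p_0."""
    p = np.asarray(p, dtype=float)
    return p / (1.0 - p.sum())

def odds_to_probs(x):
    """Inverse map: recover p from x_i = p_i / p_0, using p_0 = 1 / (1 + sum(x))."""
    x = np.asarray(x, dtype=float)
    return x / (1.0 + x.sum())

p = np.array([0.2, 0.3])                 # implies p_0 = 0.5
x = probs_to_odds(p)                     # -> array([0.4, 0.6])
assert np.allclose(odds_to_probs(x), p)  # round-trip check
```
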
T. Bdiri et al. have developed several models that use the inverted Dirichlet distribution to represent and model non-Gaussian data. They have introduced finite and infinite mixture models of inverted Dirichlet distributions, using the Newton–Raphson technique to estimate the parameters and the Dirichlet process to model infinite mixtures. T. Bdiri et al. have also used the inverted Dirichlet distribution to propose an approach to generate Support Vector Machine kernels based on Bayesian inference, and another approach to establish hierarchical clustering.

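To give a flavour of how such mixtures are used, the sketch below computes component responsibilities (the E-step of a generic EM scheme) for a finite mixture of inverted Dirichlet densities. It builds on the hypothetical `log_pdf` helper above and is a generic illustration, not the estimation procedure of the cited papers.

```python
import numpy as np

def responsibilities(X, weights, nus):
    """E-step for a finite mixture of inverted Dirichlet components.

    X       : (n, k) array of positive data vectors
    weights : (m,) mixing proportions summing to 1
    nus     : list of m parameter vectors, each of length k+1
    Returns an (n, m) array of posterior component probabilities.
    """
    X = np.asarray(X, dtype=float)
    log_r = np.stack(
        [np.log(w) + np.array([log_pdf(x, nu) for x in X])
         for w, nu in zip(weights, nus)],
        axis=1)                                       # (n, m) weighted log-densities
    log_r -= log_r.max(axis=1, keepdims=True)         # stabilise before exponentiating
    r = np.exp(log_r)
    return r / r.sum(axis=1, keepdims=True)
```
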
References

Tiao, George (1965). "The inverted Dirichlet distribution with applications". Journal of the American Statistical Association. 60 (311): 793–805. doi:10.1080/01621459.1965.10480828.
Ghorbel, M. (2010). "On the inverted Dirichlet distribution". Communications in Statistics - Theory and Methods. 39: 21–37. doi:10.1080/03610920802627062. S2CID 122956752.
Bdiri, Taoufik; Bouguila, Nizar (2011). "Learning Inverted Dirichlet Mixtures for Positive Data Clustering". Rough Sets, Fuzzy Sets, Data Mining and Granular Computing. Lecture Notes in Computer Science. Vol. 6743. pp. 265–272. doi:10.1007/978-3-642-21881-1_42. ISBN 978-3-642-21880-4.
Bdiri, Taoufik; Bouguila, Nizar (2011). "An Infinite Mixture of Inverted Dirichlet Distributions". Neural Information Processing. Lecture Notes in Computer Science. Vol. 7063. pp. 71–78. doi:10.1007/978-3-642-24958-7_9. ISBN 978-3-642-24957-0.
Bdiri, Taoufik; Bouguila, Nizar (2012). "Positive vectors clustering using inverted Dirichlet finite mixture models". Expert Systems with Applications. 39 (2): 1869–1882. doi:10.1016/j.eswa.2011.08.063.
Bdiri, Taoufik; Bouguila, Nizar (2013). "Bayesian learning of inverted Dirichlet mixtures for SVM kernels generation". Neural Computing and Applications. 23 (5): 1443–1458. doi:10.1007/s00521-012-1094-z. S2CID 254025619.
Bdiri, Taoufik; Bouguila, Nizar; Ziou, Djemel (2013). "Visual Scenes Categorization Using a Flexible Hierarchical Mixture Model Supporting Users Ontology". 2013 IEEE 25th International Conference on Tools with Artificial Intelligence. pp. 262–267. doi:10.1109/ICTAI.2013.48. ISBN 978-1-4799-2972-6.
Bdiri, Taoufik; Bouguila, Nizar; Ziou, Djemel (2014). "Object clustering and recognition using multi-finite mixtures for semantic classes and hierarchy modeling". Expert Systems with Applications. 41 (4): 1218–1235. doi:10.1016/j.eswa.2013.08.005.
