
Algorithmic information theory

Subfield of information theory and computer science

Algorithmic information theory (AIT) is a branch of theoretical computer science that concerns itself with the relationship between computation and information of computably generated objects (as opposed to stochastically generated), such as strings or any other data structure. In other words, it is shown within algorithmic information theory that computational incompressibility "mimics" (except for a constant that only depends on the chosen universal programming language) the relations or inequalities found in information theory. According to Gregory Chaitin, it is "the result of putting Shannon's information theory and Turing's computability theory into a cocktail shaker and shaking vigorously."
From this point of view, a 3000-page encyclopedia actually contains less information than 3000 pages of completely random letters, despite the fact that the encyclopedia is much more useful. This is because to reconstruct the entire sequence of random letters, one must know what every single letter is. On the other hand, if every vowel were removed from the encyclopedia, someone with reasonable knowledge of the English language could reconstruct it, just as one could likely reconstruct the sentence "Ths sntnc hs lw nfrmtn cntnt" from the context and the consonants present.
The major drawback of AC and AP is their incomputability. Time-bounded "Levin" complexity penalizes a slow program by adding the logarithm of its running time to its length. This leads to computable variants of AC and AP, and universal "Levin" search (US) solves all inversion problems in optimal time (apart from some unrealistically large multiplicative constant).
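Levin's time-bounded complexity is usually written as a trade-off between program length and running time; a common formulation (conventions vary slightly between authors), where t_U(p) is the number of steps program p takes on the reference machine U, is:

    \mathrm{Kt}(x) = \min_{p} \{\, |p| + \log t_U(p) \;:\; U(p) = x \text{ within } t_U(p) \text{ steps} \,\}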
AC and AP also allow a formal and rigorous definition of randomness of individual strings that does not depend on physical or philosophical intuitions about non-determinism or likelihood. Roughly, a string is algorithmic "Martin-Löf" random (AR) if it is incompressible in the sense that its algorithmic complexity is equal to its length.
A binary string is said to be random if the Kolmogorov complexity of the string is at least the length of the string. A simple counting argument shows that some strings of any given length are random, and almost all strings are very close to being random. Since Kolmogorov complexity depends on a fixed choice of universal Turing machine (informally, a fixed "description language" in which the "descriptions" are given), the collection of random strings does depend on the choice of fixed universal machine; any choice, however, gives identical asymptotic results, because the Kolmogorov complexity of a string is invariant up to an additive constant depending only on the choice of universal Turing machine. Nevertheless, the collection of random strings, as a whole, has similar properties regardless of the fixed machine, so one can (and often does) talk about the properties of random strings as a group without having to first specify a universal machine.
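The counting argument can be made explicit: there are 2^n binary strings of length n but strictly fewer descriptions (programs) of length less than n, so at least one string of each length has no description shorter than itself:

    \#\{\, p : |p| < n \,\} = \sum_{k=0}^{n-1} 2^{k} = 2^{n} - 1 < 2^{n} = \#\{\, x : |x| = n \,\}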
Besides the formalization of a universal measure for irreducible information content of computably generated objects, some main achievements of AIT were to show that: in fact algorithmic complexity follows (in the self-delimited case) the same inequalities (except for a constant) that entropy does, as in classical information theory; randomness is incompressibility; and, within the realm of randomly generated software, the probability of occurrence of any data structure is of the order of the shortest program that generates it when running on a universal machine.
Algorithmic information theory (AIT) is the information theory of individual objects, using computer science, and concerns itself with the relationship between computation, information, and randomness.
AC, AP, and AR are the core sub-disciplines of AIT, but AIT spawns into many other areas. It serves as the foundation of the Minimum Description Length (MDL) principle, can simplify proofs in computational complexity theory, has been used to define a universal similarity metric between objects, solves the Maxwell daemon problem, and many others.
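The universal similarity metric mentioned above is approximated in practice by the normalized compression distance (NCD), which replaces the uncomputable Kolmogorov complexity with the output length of a real compressor. A minimal sketch, with zlib standing in for the compressor (any lossless compressor would do):

    import zlib

    def c(data: bytes) -> int:
        # Compressed size in bytes: a computable stand-in for Kolmogorov complexity.
        return len(zlib.compress(data, 9))

    def ncd(x: bytes, y: bytes) -> float:
        # Normalized compression distance: (C(xy) - min(C(x), C(y))) / max(C(x), C(y)).
        cx, cy, cxy = c(x), c(y), c(x + y)
        return (cxy - min(cx, cy)) / max(cx, cy)

    a = b"abcabcabcabc" * 20
    b = bytes(range(256)) * 2
    print(ncd(a, a))  # small value: an object is highly similar to itself
    print(ncd(a, b))  # larger value: structurally unrelated inputs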
Paper from a conference on "Cerebral Systems and Computers", California Institute of Technology, February 8–11, 1960, cited in "A Formal Theory of Inductive Inference, Part 1", 1964, p. 1.
Zvonkin, A.K.; Levin, L.A. (1970). "The Complexity of Finite Objects and the Development of the Concepts of Information and Randomness by Means of the Theory of Algorithms". Russian Mathematical Surveys. 25 (6): 83–124.
Algorithmic information theory was founded by Ray Solomonoff, who published the basic ideas on which the field is based as part of his invention of algorithmic probability—a way to overcome serious problems associated with the application of Bayes' rules in statistics. He first described his results at a conference at Caltech in 1960, and in a report, February 1960, "A Preliminary Report on a General Theory of Inductive Inference." Algorithmic information theory was later developed independently by Andrey Kolmogorov, in 1965, and Gregory Chaitin, around 1966.
Some of the results of algorithmic information theory, such as Chaitin's incompleteness theorem, appear to challenge common mathematical and philosophical intuitions. Most notable among these is the construction of Chaitin's constant Ω, a real number that expresses the probability that a self-delimiting universal Turing machine will halt when its input is supplied by flips of a fair coin (sometimes thought of as the probability that a random computer program will eventually halt). Although Ω is easily defined, in any consistent axiomatizable theory one can only compute finitely many digits of Ω, so it is in some sense unknowable, providing an absolute limit on knowledge that is reminiscent of Gödel's incompleteness theorems. Although the digits of Ω cannot be determined, many properties of Ω are known; for example, it is an algorithmically random sequence and thus its binary digits are evenly distributed (in fact it is normal).
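For a fixed prefix-free universal machine U, Chaitin's constant can be written as a sum over all halting programs; the exact value depends on the choice of U:

    \Omega_U = \sum_{p \,:\, U(p)\ \text{halts}} 2^{-|p|}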
Per Martin-Löf also contributed significantly to the information theory of infinite sequences. An axiomatic approach to algorithmic information theory based on the Blum axioms (Blum 1967) was introduced by Mark Burgin in a paper presented for publication by Andrey Kolmogorov (Burgin 1982). The axiomatic approach encompasses other approaches in algorithmic information theory: it is possible to treat different measures of algorithmic information as particular cases of axiomatically defined measures, and instead of proving similar theorems, such as the basic invariance theorem, for each particular measure, all such results can be deduced from one corresponding theorem proved in the axiomatic setting. This is a general advantage of the axiomatic approach in mathematics. The axiomatic approach was further developed in the book (Burgin 2005) and applied to software metrics (Burgin and Debnath 2003; Debnath and Burgin 2003).
Informally, from the point of view of algorithmic information theory, the information content of a string is equivalent to the length of the most-compressed possible self-contained representation of that string. A self-contained representation is essentially a program—in some fixed but otherwise irrelevant universal programming language—that, when run, outputs the original string.
The information content or complexity of an object can be measured by the length of its shortest description. For instance the string "0101010101010101010101010101010101010101010101010101010101010101" has the short description "32 repetitions of '01'", while "1100100001100001110111101110110011111010010000100101011110010110" presumably has no simple description other than writing down the string itself.
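Kolmogorov complexity itself is uncomputable, but an off-the-shelf compressor gives a crude, computable upper-bound proxy for description length. The sketch below (plain Python, with zlib chosen arbitrarily) typically shows the repetitive string compressing to noticeably fewer bytes than the random-looking one; both compress somewhat, since the alphabet here is only '0' and '1', but the gap illustrates the point:

    import zlib

    repetitive = "01" * 32
    random_looking = "1100100001100001110111101110110011111010010000100101011110010110"

    def compressed_len(s: str) -> int:
        # Size in bytes of a zlib-compressed encoding of s (compression level 9).
        return len(zlib.compress(s.encode("ascii"), 9))

    for label, s in [("repetitive", repetitive), ("random-looking", random_looking)]:
        print(label, "raw length:", len(s), "compressed length:", compressed_len(s))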
or, for the mutual algorithmic information, informing the algorithmic complexity of the input along with the input itself.
There are several variants of Kolmogorov complexity or algorithmic information; the most widely used one is based on self-delimiting programs, and is mainly due to Leonid Levin (1974).
Kolmogorov, A.N. (1965). "Three approaches to the definition of the quantity of information". Problems of Information Transmission. 1 (1): 3–11.
Burgin, M. (1982). "Generalized Kolmogorov complexity and duality in theory of computations". Soviet Math. Dokl. 25 (3): 19–23.
Burgin, M. (1990). "Generalized Kolmogorov Complexity and other Dual Complexity Measures". Cybernetics. 26 (4): 481–490.
A closely related notion is the probability that a universal computer outputs some string x when fed with a program chosen at random. This algorithmic "Solomonoff" probability (AP) is key in addressing the old philosophical problem of induction in a formal way.
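A toy illustration of the idea (not Solomonoff's actual construction): fix a small, hypothetical prefix-free set of programs together with the strings they output; the algorithmic probability of a string is then the total weight 2^(-|p|) of the programs that produce it.

    # Hypothetical toy "machine": a prefix-free table mapping program bitstrings
    # to the output strings they print. Real AP sums over all halting programs
    # of a universal prefix machine, which is not computable.
    toy_machine = {
        "0":   "01" * 32,          # a 1-bit program printing the repetitive string
        "10":  "01" * 32,          # a longer program with the same output
        "110": "1100100001100001", # some other outputs
        "111": "0000000000000000",
    }

    def algorithmic_probability(x: str) -> float:
        # m(x) = sum of 2^(-|p|) over programs p whose output is x.
        return sum(2.0 ** -len(p) for p, out in toy_machine.items() if out == x)

    print(algorithmic_probability("01" * 32))          # 0.5 + 0.25 = 0.75
    print(algorithmic_probability("1100100001100001")) # 0.125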
Chaitin, G.J. (1966). "On the Length of Programs for Computing Finite Binary Sequences". Journal of the ACM. 13 (4): 547–569.
Chaitin, G.J. (1969). "On the Simplicity and Speed of Programs for Computing Definite Sets of Natural Numbers". Journal of the ACM. 16 (3): 407–412.
Vitanyi, P. "Obituary: Ray Solomonoff, Founding Father of Algorithmic Information Theory".
Levin, L.A. (1976). "Various Measures of Complexity for Finite Objects (Axiomatic Description)". Soviet Math. Dokl. 17: 522–526.
More formally, the algorithmic complexity (AC) of a string x is defined as the length of the shortest program that computes or outputs x, where the program is run on some fixed reference universal computer.

Unlike classical information theory, algorithmic information theory gives formal, rigorous definitions of a random string and a random infinite sequence that do not depend on physical or philosophical intuitions about non-determinism or likelihood.
An infinite binary sequence is said to be random if, for some constant c, for all n, the Kolmogorov complexity of the initial segment of length n of the sequence is at least n − c. It can be shown that almost every sequence (from the point of view of the standard measure—"fair coin" or Lebesgue measure—on the space of infinite binary sequences) is random. Also, since it can be shown that the Kolmogorov complexity relative to two different universal machines differs by at most a constant, the collection of random infinite sequences does not depend on the choice of universal machine (in contrast to finite strings). This definition of randomness is usually called Martin-Löf randomness, after Per Martin-Löf, to distinguish it from other similar notions of randomness. It is also sometimes called 1-randomness to distinguish it from other stronger notions of randomness (2-randomness, 3-randomness, etc.). In addition to Martin-Löf randomness concepts, there are also recursive randomness, Schnorr randomness, and Kurtz randomness. Yongge Wang showed that all of these randomness concepts are different. (Related definitions can be made for alphabets other than the set {0,1}.)
AIT principally studies measures of irreducible information content of strings (or other data structures). Because most mathematical objects can be described in terms of strings, or as the limit of a sequence of strings, it can be used to study a wide variety of mathematical objects, including integers. One of the main motivations behind AIT is the very study of the information carried by mathematical objects, as in the field of metamathematics, e.g., as shown by the incompleteness results mentioned below. Other main motivations came from surpassing the limitations of classical information theory for single and fixed objects, formalizing the concept of randomness, and finding a meaningful probabilistic inference without prior knowledge of the probability distribution (e.g., whether it is independent and identically distributed, Markovian, or even stationary). In this way, AIT is known to be basically founded upon three main mathematical concepts and the relations between them: algorithmic complexity, algorithmic randomness, and algorithmic probability.
Solomonoff, R.J. (1964). "A Formal Theory of Inductive Inference". Information and Control. 7 (1): 1–22.
Chaitin, G.J. (1975). "A Theory of Program Size Formally Identical to Information Theory". Journal of the ACM. 22 (3): 329–340.
Blum, M. (1967). "A Machine-independent Theory of Complexity of Recursive Functions". Journal of the ACM. 14 (2): 322–336.
Solomonoff, R.J. (2009). Emmert-Streib, F.; Dehmer, M. (eds.). Algorithmic Probability: Theory and Applications, Information Theory and Statistical Learning. Springer. ISBN 978-0-387-84815-0.
Solomonoff, R.J. (1960). A Preliminary Report on a General Theory of Inductive Inference (Technical report). Cambridge, Mass.: Zator Company. ZTB-138.
Solomonoff, R. (1960). A Preliminary Report on a General Theory of Inductive Inference (Report V-131, November revision of the February 4, 1960 report). Cambridge, Mass.: Zator Co.
Li, M.; Vitanyi, P. (2013). An Introduction to Kolmogorov Complexity and its Applications (2nd ed.). Springer-Verlag. ISBN 9781475726060.
Kolmogorov, A.N. (1968). "Logical basis for information theory and probability theory". IEEE Transactions on Information Theory. IT-14 (5): 662–664.
Chaitin, G.J. (1977). "Algorithmic information theory". IBM Journal of Research and Development. 21 (4): 350–359.
Calude, C.S. (2013). Information and Randomness: An Algorithmic Perspective. Texts in Theoretical Computer Science, An EATCS Series (2nd ed.). Springer-Verlag. ISBN 9783662049785.
Blum, M. (1967). "On the Size of Machines". Information and Control. 11 (3): 257–265.
Burgin, M. (2005). Super-recursive Algorithms. Monographs in Computer Science. Springer. ISBN 9780387955698.
Calude, C.S. (1996). "Algorithmic information theory: Open problems". J. UCS. 2 (5): 439–441. Archived from the original on November 28, 2021.
Chaitin, G.J. (1987). Algorithmic Information Theory. Cambridge University Press. ISBN 9780521343060.
Downey, Rodney G.; Hirschfeldt, Denis R. (2010). Algorithmic Randomness and Complexity. Springer. ISBN 978-0-387-68441-3.
Levin, L.A. (1974). "Laws of information (nongrowth) and aspects of the foundation of probability theory". Problems of Information Transmission. 10 (3): 206–210.
Van Lambalgen, M. (1989). "Algorithmic Information Theory". Journal of Symbolic Logic. 54 (4): 1389–1400.
Wang, Yongge (1996). Randomness and Complexity (PhD thesis). University of Heidelberg.
Zurek, W.H. (2018). "Algorithmic Information Content, Church-Turing Thesis, physical entropy, and Maxwell's demon". In Complexity, Entropy and the Physics of Information. Addison-Wesley. pp. 73–89. ISBN 9780429982514.
Algorithmic Information Theory at Scholarpedia.
Chaitin's account of the history of AIT.