
Talk:Differential entropy

to a base measure. For the most common, discrete, entropy the base measure is the counting measure. For "continuous" probability distributions the entropy is computed with respect to the Lebesgue measure on the real line. In principle, however, you can choose any base measure. If you change the base measure, the entropy changes as well. (This seems counter-intuitive, since most people think that entropy is some inherent quantity.) In this respect the article is sloppy: it does not mention with respect to which base measure the integral is computed, and at the same time it pretends that the measure space can be arbitrary. Either stick to the Lebesgue measure on the real line or do it in full generality.
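For what it is worth, the single definition being asked for can be written compactly (this is the standard measure-theoretic form; the notation h_\mu is mine): if P has density dP/d\mu with respect to a base measure \mu, then

h_\mu(P) = -\int \frac{dP}{d\mu}(x)\,\log\frac{dP}{d\mu}(x)\,d\mu(x).

Taking \mu to be the counting measure gives the usual discrete entropy; taking \mu to be Lebesgue measure on the real line gives differential entropy, and changing \mu changes the value, exactly as described above.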
will have a (continuous) distribution of voltages around 0 and around 1. If the distribution is very sloppy, and I compute entropy with the integral formula, I get a nice positive entropy. But if I sharpen up the oscillator, so that it hits the voltages 0 and 1 more and more precisely (more and more sharply peaked at 0 and 1), instead of being sloppy, the "continuous entropy" will go negative -- more and more negative, the sharper the peaks get. This is very disconcerting if one was hoping to get the entropy resembling that of a discrete coin toss out of the system.
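A quick numerical illustration of this (my own sketch, not from the article): model the two voltage levels as a 50/50 mixture of Gaussians of width sigma and integrate -∫ f log f on a grid. The value goes negative once the peaks are narrow enough, instead of settling at the log 2 one would hope for from a coin toss.

```python
import numpy as np

x = np.linspace(-1.0, 2.0, 200_001)
dx = x[1] - x[0]

def entropy_on_grid(f):
    """Riemann-sum approximation of h(f) = -∫ f(x) log f(x) dx, in nats."""
    f = np.clip(f, 1e-300, None)   # avoid log(0) where the density underflows
    return -np.sum(f * np.log(f)) * dx

for sigma in (0.3, 0.1, 0.03, 0.01):
    # two equally likely "voltage levels" at 0 V and 1 V, each blurred by Gaussian jitter
    f = 0.5 / (sigma * np.sqrt(2 * np.pi)) * (
        np.exp(-(x - 0.0) ** 2 / (2 * sigma ** 2)) +
        np.exp(-(x - 1.0) ** 2 / (2 * sigma ** 2)))
    print(f"sigma = {sigma:5.2f}   h ≈ {entropy_on_grid(f):+.3f} nats")

# Once the peaks are well separated, h ≈ log(2) + 0.5*log(2*pi*e*sigma**2),
# which crosses zero near sigma ≈ 0.12 and heads to -infinity as sigma -> 0.
```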
mathematical definition of information, but it is more important that all types of people with different backgrounds understand the problems, concepts, and solutions. My vote is to keep it simple (this is, after all, Knowledge, not a contest in how precise one can be with measure theory). The number of pages allowed on the internet is unlimited. Please use a separate section for measure-theory-based definitions. In fact, by separating things out this way, one will appreciate the beauty of measure theory for conceptualizing problems and solving them elegantly.
- it is probably possible to give such a unique definition, but it would add unnecessary complication to the article; say, a Computer Science student does not normally need to study measure theory (I have a certain informal knowledge of it), just as not all probability students need to know it. Beyond that, Shannon entropy is almost always used in the discrete version, because that is the natural application; moreover, the two versions have different properties (the other article dismisses differential entropy as being of little use).--

(1-k/2)*psi(k/a)*(k/2). It can be derived from the general formula for "Gamma" distributions (next lines in the same table) for the case of scale parameter θ=2 and shape parameter equal to k/2. Besides, the correct formula for the entropy is given on the wiki page for "Gamma distribution" in the section "Information entropy". So I ask anybody who is a specialist in this area and can check my supposition - is there really an error? If yes, could you please correct it.

, but perhaps there is something to discuss, so I bring it up here. The undone edit spoke of the limit of a normal distribution as the variance increases without bound, how that is similar to a uniform distribution, and how the uniform distribution on the real line maximizes differential entropy. Unfortunately, the uniform distribution on the real line is at best an

* Neumann shows that maximization of differential entropy under a constraint on expected model entropy is equivalent to maximization of relative entropy with a particular reference measure. That reference measure satisfies the demand from Jaynes that (up to a multiplicative factor) the reference measure also has to be the "total ignorance" prior.
I am an avg CS theory student, far from an expert. I was trying to find a general definition of entropy but I could not find anything. After a bit of thinking, I realized that entropy (differential or not) of a probability distribution (equivalently, of a random variable) must be defined with respect
I removed the erroneous proof. It can still be seen above. I have proved the result by using variational calculus together with the fact that differential entropy is a strictly concave function (when functions differing on a set of measure zero are taken as equivalent). That takes 5 pages as a LaTeX
is nearly zero outside of that interval, but the formula itself doesn't really mean anything. It is essentially unrelated to the ability to transmit information (which is the basis of entropy), and in fact, the entropy of any continuous distribution would always be infinite unless it is quantized. I
Anyway, this article *completely* glosses over the hard problem of defining the continuous entropy. Consider a digital electronic oscillator, which nominally outputs two voltages: 0 Volts and 1 Volt. Now, if the oscillator is running at very high speed, it typically won't hit exactly 0 and 1, it
I actually added this statement some time ago to the article (anonymously under my IP address), and it was recently marked as needing a citation or reference. I made the statement a little weaker by changing "quite general transformations" to "linear transformations", and added Reza's book as a
* Last but not least, in physics there are recurring claims that differential entropy is useful, or in some settings even more powerful than relative entropy (e.g. Garbaczewski). It would seem strange to me if such things did not find their counterparts in probability / information theory.
Note that the continuous mutual information I(X;Y) has the distinction of retaining its fundamental significance as a measure of discrete information since it is actually the limit of the discrete mutual information of partitions of X and Y as these partitions become finer and finer. Thus it is
(it is a differential cross entropy). In particular, the Kullback-Leibler divergence is not a difference of two differential entropies. Therefore, the first part of the proof does not apply. Note also that you are not using any of the assumptions: the first part works for any pdfs f and g.
I am a practicing statistician with a Ph.D. in statistics, and I think there is a need for the definitions the way the original developer developed the thought processes about problems, concepts, and solutions. Academically I have been trained in measure theory. It is not important for single
I added a warning to the top of the page redirecting people to LDDP a little more strongly than the previous versions had. I think this is the correct treatment for this subject. This page is worthy of note, and worth keeping, because differential entropy is used (incorrectly) all over the
Hmm. At the top of this talk page, there is discussion of measure theory, but there is not a drop of measure theory to be found in this article. FWIW, at least some textbooks on probability *start* with defining measure theory (I own one such), so I'm not sure why this should be so scary.
i am not sure how familiar the avg information theorist is with measure theory and integration. the article does use language such as "almost everywhere", "random variable", etc, so it must not be foreign. in that case, the division between "differential" and "non-differential" entropy is
In the properties section, it is stated that the entropy of a transformed RV can be calculated by adding the expected value of the log of the Jacobian to the entropy of the original RV. I believe, however, that this is only true for bijective transforms, but not for the general case.
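For what it's worth, a quick numerical check of that point (my own sketch, using SciPy): for X ~ N(0,1) and the non-injective map Y = X², the formula h(Y) = h(X) + E[log|dY/dX|] overshoots the true value by exactly log 2, i.e. by the number of x-branches folded onto each y.

```python
import numpy as np
from scipy.stats import norm, chi2

rng = np.random.default_rng(0)
x = rng.standard_normal(4_000_000)     # samples of X ~ N(0, 1)

h_x = norm.entropy()                   # 0.5*log(2*pi*e) ≈ 1.419 nats

# Non-injective transform Y = X**2, so Y ~ chi-squared with 1 degree of freedom
formula = h_x + np.mean(np.log(np.abs(2.0 * x)))   # h(X) + E[log|dY/dX|]
print("formula    :", formula)                     # ≈ 1.477
print("true h(X^2):", chi2(1).entropy())           # ≈ 0.784 = formula - log(2)
```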
Tilman Neumann: "Bayesian Inference Featuring Entropic Priors", in Proceedings of the 27th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, American Institute of Physics, vol. 954 (2007), pp. 283–292. Preprint: http://www.tilman-neumann.de/docs/BIEP.pdf
The article does not discuss these at all, and it is not at all clear from the present article how differential entropy would apply to an improper distribution. If someone has a source that discusses this, it might merit adding material to the article.
The normal distribution maximizes differential entropy for a distribution with a given mean and a given variance. Perhaps it should be emphasized that there are distributions for which the mean and variance do not exist, such as the Cauchy distribution.
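For reference, the maximum itself is the standard expression

h\bigl(\mathcal{N}(\mu,\sigma^2)\bigr) = \tfrac{1}{2}\log\bigl(2\pi e\,\sigma^2\bigr),

which presupposes that the variance constraint can be stated at all; for the Cauchy distribution neither the mean nor the variance exists, so the maximization statement simply does not apply to it.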
, but they give totally different (not contradictory) results and information. Since differential entropy is of little use in Computer Science (at least so I suppose), it can be seen as an out-of-topic section in
reference. However, it is still true that I(X;Y) is invariant under any bijective, continuous (and thus monotonic) transformations of the continuous spaces X and Y. This fact needs a reference, though.
= h_{f(x)}(X) + \underbrace{\int_{-\infty}^{\infty} f(x)\,\log\bigl(g(x)\bigr)\,dx}_{=\,-h_{g(x)}(X)}
invariant under quite general transformations of X and Y, and still represents the amount of discrete information that can be transmitted over a channel that admits a continuous space of values.
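A small numerical illustration of that limit (my own sketch, not from the article): bin a correlated Gaussian pair ever more finely and compare the discrete mutual information of the bins with the closed form -½ log(1-ρ²).

```python
import numpy as np

rng = np.random.default_rng(0)
rho, n = 0.8, 1_000_000
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

def mi_binned(x, y, bins):
    """Discrete mutual information of the quantized (binned) pair, in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

for bins in (4, 16, 64, 256):
    print(f"{bins:4d} bins  I ≈ {mi_binned(x, y, bins):.3f} nats")
print("closed form:", -0.5 * np.log(1 - rho**2))   # ≈ 0.511 nats
# (with finitely many samples the finest binnings are biased slightly upward)
```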
Maybe the requirement that the two have the same variance was dropped? If not, they won't cancel and you end up with the ratio of standard deviations in the answer. Or am I missing something?
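For what it's worth, the variance bookkeeping written out (standard Gaussian cross-entropy computation, my notation): for any density f with mean \mu and variance \sigma_f^2, and g = \mathcal{N}(\mu, \sigma_g^2),

-\int f(x)\,\log g(x)\,dx \;=\; \tfrac{1}{2}\log\bigl(2\pi\sigma_g^2\bigr) + \frac{\sigma_f^2}{2\sigma_g^2},

which equals h(g) = \tfrac{1}{2}\log(2\pi e\,\sigma_g^2) only when \sigma_f = \sigma_g; otherwise a ratio of the two variances is indeed left over, as described above.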
, so it cannot be fed into the logarithm without some sort of scaling factor applied. This entire formula was the result of a mistake made by Shannon when he was really trying to define the
The current proof that a normal distribution maximizes differential entropy seems erroneous to me. First, it does not use normality anywhere. Second, the last integral is actually not
961: 848:
This is an actual problem which does show up, and is partly solvable by using bins, and doing bin-counting. But this hardly merits the la-dee-da treatment given here! </rant>
403:
unnecessary, and quite misleading. one can simply use a single definition for entropy, for any random variable on an arbitrary measure space, and surely it has been done somewhere.
= \int_{-\infty}^{\infty} f(x)\,\underbrace{\log\!\left(\frac{g(x)}{f(x)}\right)}_{\log(x)\,\le\,x-1}\,dx
Regarding "Unfortunately, Shannon did not derive this formula, and rather just assumed it was the correct continuous analogue of discrete entropy, but it is not." I propose
757:
Matthew Brand: "Structure Learning in Conditional Probability Models via an Entropic Prior and Parameter Extinction". Neural Comp. vol. 11 (1999), pp. 1155-1182. Preprint: http://citeseer.ist.psu.edu/247075.html
2762: 785:
This article was automatically assessed because at least one WikiProject had rated the article as start, and the rating on other projects was brought up to start class.
2544: 2462: 2383: 2665: 566: 501: 666:
It seems to me there is one more error: in the table "Table of differential entropies" of the section "Differential entropies for various distributions", the line for "
2482: 2172: 640:
I believe there is an error in the example of a uniform(0,1/2) distribution; the integral should evaluate to log(1/2). I do not have access to TeX to correct it.
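Working it through (standard calculation): the density is f(x) = 2 on (0, 1/2), so

h(X) = -\int_0^{1/2} 2\,\log 2\;dx = -\log 2 = \log\tfrac{1}{2} \approx -0.693\ \text{nats},

which matches the value proposed above.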
334: 2075: 820:
Mutual information is invariant under diffeomorphisms of X and Y, which is not hard to show. Don't know though whether the same is true of homeomorphisms. --
2854: 114: 2869: 219: 773:
Piotr Garbaczewski: "Differential Entropy and Dynamics of Uncertainty". Journal of Statistical Physics, vol. 123 (2006), no. 2, pp. 315-355. Preprint: http://arxiv.org/abs/quant-ph/0408192v3
539:. As a result, this is useful in that it is part of the LDDP in the cases where the measure is constant in x over some interval, and the probability 2879: 2859: 324: 1590: 569:
think this article needs stronger pointers to LDDP, and the LDDP articles need to be much more clear about the relationship with discrete entropy.
643: 603:
I believe there is an error in the differential entropy for the multivariate Gaussian (it should have a minus sign). No access to TeX to correct it.
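For reference, the standard form for X ~ \mathcal{N}_k(\mu, \Sigma) is

h(X) = \tfrac{1}{2}\ln\bigl((2\pi e)^k \det\Sigma\bigr) = \tfrac{k}{2}\bigl(1+\ln 2\pi\bigr) + \tfrac{1}{2}\ln\det\Sigma,

so the determinant term enters with a plus sign here; a minus sign appears only if one writes it in terms of \det\Sigma^{-1}.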
195: 2437:
The first part of your proof shows that Kullback-Leibler divergence is non-negative. That is correct. The error is in that what you define as
277: 2864: 647: 361: 90: 1131: 2849: 2339: 2689:
Since differential entropy is translation invariant we can assume that f(x) has the same mean as g(x). I've added this note to the proof.
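(In symbols: h(X+c) = h(X) for any constant c, since substituting y = x + c gives -\int f(y-c)\log f(y-c)\,dy = -\int f(x)\log f(x)\,dx, so assuming equal means costs nothing.)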
474:
The problem here is that differential entropy doesn't really mean much. Among other things, it's not even dimensionally correct. The term
610: 536: 460: 728:
I think it would be too quick to call differential entropy "of little use" in information theory. There are some hints to the opposite:
2831: 2817: 435: 300: 2874: 694: 626: 186: 147: 81: 58: 1124: 2766: 2538:
The normal distribution maximizes the entropy for a fixed variance and MEAN. In the final step of the proof, it says that
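If I am reading the proof correctly, the step in question is the evaluation

\int_{-\infty}^{\infty} f(x)\,\frac{(x-\mu)^2}{2\sigma^2}\,dx = \tfrac{1}{2},

which holds only if \mu and \sigma^2 are in fact the mean and variance of f, exactly as the fixed-mean-and-variance caveat above requires.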
291: 252: 1579:{\displaystyle \leq \int _{-\infty }^{\infty }f\left(x\right)\left({\frac {g\left(x\right)}{f\left(x\right)}}-1\right)dx} 1315:{\displaystyle =-\int _{-\infty }^{\infty }f\left(x\right)\log \left({\frac {f\left(x\right)}{g\left(x\right)}}\right)dx} 2709: 1116: 913: 770:
Edwin T. Jaynes, "Probability Theory: The Logic of Science", Corrected reprint, Cambridge University Press 2004, p. 377.
33: 1877:{\displaystyle \int _{-\infty }^{\infty }f\left(x\right)\log \left({\frac {g\left(x\right)}{f\left(x\right)}}\right)dx} 1771:{\displaystyle \int _{-\infty }^{\infty }f\left(x\right)\log \left({\frac {g\left(x\right)}{f\left(x\right)}}\right)dx} 1076:{\displaystyle \int _{-\infty }^{\infty }g\left(x\right)xdx=0,\ \int _{-\infty }^{\infty }g\left(x\right){x}^{2}dx=1} 2267: 365: 2403:
PDF. I wonder whether the proof would be worthy as its own article? Are you aware of shorter routes to a proof?
651: 2694: 2510: 2423:
I do use properties of the normal distribution; please let me know, as I intend to get it back onto the page. --
2343: 614: 464: 439: 690: 630: 357: 2690: 2506: 39: 746: 172: 686: 73: 52: 2758: 2672: 2335: 954: 919: 870: 731:* Brand suggested minimization of posterior differential entropy as a criterium for model selection. 682: 606: 456: 431: 2782: 1087: 925: 881: 506: 21: 910: 807: 786: 715: 711: 414: 2424: 2313: 742: 584:
literature, so it's worth being aware of it, and understanding its properties, such as they are.
299:
2790: 2637:{\displaystyle \int _{-\infty }^{\infty }f(x){\frac {(x-\mu )^{2}}{2\sigma ^{2}}}={\frac {1}{2}}} 811: 589: 574: 178: 162: 141: 2786: 2428: 2317: 719: 418: 2256:{\displaystyle {h}_{f\left(x\right)}\left(X\right)-{h}_{g\left(x\right)}\left(X\right)\leq 0} 2794: 712:
Information entropy#Extending discrete entropy to the continuous case: differential entropy
415:
Information entropy#Extending_discrete_entropy_to_the_continuous_case:_differential_entropy
2524: 2492: 2440: 2411: 2393: 2361: 853: 825: 388: 2650: 542: 477: 2156:{\displaystyle ={h}_{f\left(x\right)}\left(X\right)-{h}_{g\left(x\right)}\left(X\right)} 867:
This property is given in the article; I thought it would be valuable to add a full proof.
2519:
Adding those missing steps would be much appreciated:) After that the proof is fine. --
283: 2467: 453:
It is not "unnecessary treatment */. It is very much a necessary treatment. Thanks
2843: 2680: 585: 570: 404: 267: 246: 191: 2835: 2821: 2798: 2770: 2698: 2684: 2528: 2514: 2496: 2432: 2415: 2397: 2347: 2321: 857: 829: 815: 789: 750: 722: 698: 655: 634: 618: 593: 578: 468: 443: 421: 407: 392: 369: 2520: 2488: 2407: 2389: 1661:{\displaystyle =\int _{-\infty }^{\infty }g\left(x\right)-f\left(x\right)dx=0} 849: 821: 625:
I'm gonna enjoy watching you die, so let me it with my own eyes.(Darth Vader)
384: 273: 168: 86: 765: 863:
Normal Distribution Maximizes The Differential Entropy For a Given Variance
774: 758: 2676: 2809: 1200:{\displaystyle -{D}_{KL}\left(f\left(x\right)\|g\left(x\right)\right)=} 358:
https://www.crmarsh.com/static/pdf/Charles_Marsh_Continuous_Entropy.pdf
296: 2505:
I can incorporate this into article if you think it will be useful. --
2420:
I don't see the error in the proof. Could you point me to the error?
2703:
The expression in the proof should give you two different sigmas:
413:
While not being an expert on the subject, I do not agree, see
15: 2534:
Maximization in the normal distribution - Proof is incorrect
718:, i.e. something deserving just a mention in "See also". -- 670:", column "Entropy in nats". I suppose there should be 2502: 2712: 2653: 2547: 2470: 2443: 2364: 2270: 2175: 2078: 1894: 1785: 1679: 1593: 1482: 1329: 1217: 1134: 1090: 964: 928: 884: 545: 509: 480: 2168:
Applying the inequality we showed previously we get:
295:, a collaborative effort to improve the coverage of 190:, a collaborative effort to improve the coverage of 85:, a collaborative effort to improve the coverage of 2810:
https://en.wikipedia.org/Differential_entropy#Proof
2745:{\displaystyle {\frac {\sigma _{g}}{2\sigma _{f}}}} 2744: 2659: 2636: 2476: 2456: 2377: 2304: 2255: 2155: 2063: 1876: 1770: 1660: 1578: 1467: 1314: 1199: 1107: 1075: 945: 901: 560: 527: 495: 2777:Application to improper probability distributions 2501:The proof seems ok. Here are some missing steps: 2305:{\displaystyle g\left(x\right)=f\left(x\right)} 869:One proof could be made for smooth PDF's using 8: 1172: 674:before (1-k/2), i.e. it should be ln2Г(k/2) 710:This article talks about the same thing as 2885:C-Class physics articles of Low-importance 2756: 2671:. I will give an alternate proof based on 766:http://www.tilman-neumann.de/docs/BIEP.pdf 241: 136: 47: 2733: 2719: 2713: 2711: 2652: 2624: 2612: 2597: 2578: 2560: 2552: 2546: 2469: 2448: 2442: 2369: 2363: 2269: 2219: 2214: 2182: 2177: 2174: 2125: 2120: 2088: 2083: 2077: 2030: 2025: 1952: 1944: 1937: 1935: 1904: 1899: 1893: 1828: 1798: 1790: 1784: 1722: 1692: 1684: 1678: 1609: 1601: 1592: 1523: 1498: 1490: 1481: 1380: 1367: 1365: 1345: 1337: 1328: 1266: 1236: 1228: 1216: 1144: 1139: 1133: 1089: 1055: 1050: 1029: 1021: 977: 969: 963: 927: 883: 544: 510: 508: 479: 2826:Oh, now the error is gone. Strang... -- 775:http://arxiv.org/abs/quant-ph/0408192v3 759:http://citeseer.ist.psu.edu/247075.html 243: 138: 49: 19: 2828:2A02:560:4109:AE00:8868:8CDC:7CBA:B775 2814:2A02:560:4109:AE00:8868:8CDC:7CBA:B775 1119:with the same statistical properties. 7: 375:Maximization of differential entropy 289:This article is within the scope of 184:This article is within the scope of 79:This article is within the scope of 2464:is not the differential entropy of 2327:Transformations of Random Variables 537:limiting density of discrete points 38:It is of interest to the following 2855:Mid-importance Statistics articles 2561: 2556: 1953: 1948: 1799: 1794: 1693: 1688: 1610: 1605: 1499: 1494: 1346: 1341: 1237: 1232: 1030: 1025: 978: 973: 14: 2870:Low-priority mathematics articles 2787:improper probability distribution 2264:With the Equality holds only for 204:Knowledge:WikiProject Mathematics 918:Without loss of generality (See 276: 266: 245: 207:Template:WikiProject Mathematics 171: 161: 140: 99:Knowledge:WikiProject Statistics 72: 51: 20: 2880:Low-importance physics articles 2860:WikiProject Statistics articles 1108:{\displaystyle f\left(x\right)} 946:{\displaystyle g\left(x\right)} 920:Differential Entropy Properties 902:{\displaystyle g\left(x\right)} 528:{\displaystyle {\frac {1}{dx}}} 329:This article has been rated as 224:This article has been rated as 119:This article has been rated as 102:Template:WikiProject Statistics 2594: 2581: 2575: 2569: 2416:00:11, 23 September 2010 (UTC) 2008: 876:There is a more general proof: 723:18:41, 10 September 2007 (UTC) 699:18:58, 13 September 2015 (UTC) 555: 549: 490: 484: 422:18:34, 10 September 2007 (UTC) 1: 2497:17:22, 15 February 2011 (UTC) 795:Continuous mutual information 790:09:48, 10 November 2007 (UTC) 751:12:52, 20 December 2007 (UTC) 444:22:22, 29 February 2008 (UTC) 309:Knowledge:WikiProject Physics 303:and see a list of open tasks. 198:and see a list of open tasks. 93:and see a list of open tasks. 2865:C-Class mathematics articles 2836:21:05, 15 October 2018 (UTC) 2822:21:04, 15 October 2018 (UTC) 2799:13:39, 13 January 2017 (UTC) 2763:2620:10D:C091:480:0:0:1:7F8F 2675:if no one has an objection. 
2433:16:33, 27 January 2011 (UTC) 2322:13:07, 30 January 2010 (UTC) 830:22:15, 4 November 2011 (UTC) 393:11:27, 20 October 2014 (UTC) 312:Template:WikiProject Physics 2850:C-Class Statistics articles 2699:12:17, 2 October 2011 (UTC) 2398:15:52, 30 August 2010 (UTC) 1125:Kullback–Leibler divergence 858:15:21, 29 August 2008 (UTC) 619:17:08, 6 October 2014 (UTC) 352:citation in first paragraph 2901: 2529:00:59, 20 April 2011 (UTC) 816:06:57, 12 March 2008 (UTC) 656:07:46, 30 March 2007 (UTC) 335:project's importance scale 2771:20:39, 1 March 2023 (UTC) 2685:01:56, 27 July 2011 (UTC) 2515:12:45, 5 April 2011 (UTC) 2348:09:08, 19 July 2010 (UTC) 635:09:23, 7 April 2024 (UTC) 594:15:41, 8 April 2016 (UTC) 579:15:33, 8 April 2016 (UTC) 408:03:53, 18 June 2006 (UTC) 370:20:29, 23 July 2019 (UTC) 328: 261: 223: 156: 118: 67: 46: 2875:C-Class physics articles 1123:We'll apply on them the 953:to be centered with its 781:WikiProject class rating 469:06:00, 19 May 2008 (UTC) 230:project's priority scale 187:WikiProject Mathematics 2746: 2661: 2647:which is true only if 2638: 2478: 2458: 2379: 2306: 2257: 2157: 2065: 1878: 1772: 1662: 1580: 1469: 1316: 1201: 1127:(Mind the Minus Sign): 1109: 1077: 947: 903: 803: 562: 529: 497: 82:WikiProject Statistics 28:This article is rated 2747: 2662: 2639: 2479: 2459: 2457:{\displaystyle h_{g}} 2380: 2378:{\displaystyle h_{g}} 2307: 2258: 2158: 2066: 1879: 1773: 1663: 1581: 1470: 1317: 1202: 1110: 1078: 948: 904: 835:WTF! This is a mess. 798: 646:comment was added by 563: 530: 498: 398:unnecessary treatment 2710: 2673:Lagrange multipliers 2667:is also the mean of 2660:{\displaystyle \mu } 2651: 2545: 2503:http://min.us/lkIWMq 2468: 2441: 2362: 2268: 2173: 2076: 1892: 1783: 1677: 1591: 1480: 1327: 1215: 1132: 1088: 962: 926: 882: 871:Lagrange multipliers 561:{\displaystyle p(x)} 543: 507: 503:would have units of 496:{\displaystyle f(x)} 478: 210:mathematics articles 2781:I undid an edit of 2565: 1957: 1803: 1697: 1614: 1503: 1350: 1241: 1034: 982: 911:Normal Distribution 716:Information entropy 292:WikiProject Physics 105:Statistics articles 2742: 2657: 2634: 2548: 2474: 2454: 2375: 2302: 2253: 2153: 2061: 2059: 2015: 1940: 1874: 1786: 1768: 1680: 1658: 1597: 1576: 1486: 1465: 1457: 1424: 1333: 1312: 1224: 1197: 1105: 1073: 1017: 965: 943: 922:) we could assume 899: 558: 525: 493: 179:Mathematics portal 34:content assessment 2773: 2761:comment added by 2740: 2632: 2619: 2477:{\displaystyle g} 2338:comment added by 1938: 1936: 1862: 1756: 1557: 1414: 1368: 1366: 1300: 1115:be any arbitrary 1016: 702: 685:comment added by 659: 609:comment added by 523: 471: 459:comment added by 446: 434:comment added by 349: 348: 345: 344: 341: 340: 240: 239: 236: 235: 135: 134: 131: 130: 2892: 2751: 2749: 2748: 2743: 2741: 2739: 2738: 2737: 2724: 2723: 2714: 2666: 2664: 2663: 2658: 2643: 2641: 2640: 2635: 2633: 2625: 2620: 2618: 2617: 2616: 2603: 2602: 2601: 2579: 2564: 2559: 2483: 2481: 2480: 2475: 2463: 2461: 2460: 2455: 2453: 2452: 2384: 2382: 2381: 2376: 2374: 2373: 2354:Erroneous proof? 
2350: 2311: 2309: 2308: 2303: 2301: 2284: 2262: 2260: 2259: 2254: 2246: 2235: 2234: 2233: 2218: 2209: 2198: 2197: 2196: 2181: 2162: 2160: 2159: 2154: 2152: 2141: 2140: 2139: 2124: 2115: 2104: 2103: 2102: 2087: 2070: 2068: 2067: 2062: 2060: 2058: 2057: 2046: 2045: 2044: 2029: 2016: 2011: 2001: 1997: 1996: 1971: 1956: 1951: 1931: 1920: 1919: 1918: 1903: 1883: 1881: 1880: 1875: 1867: 1863: 1861: 1860: 1845: 1844: 1829: 1817: 1802: 1797: 1777: 1775: 1774: 1769: 1761: 1757: 1755: 1754: 1739: 1738: 1723: 1711: 1696: 1691: 1673:Looking back on 1667: 1665: 1664: 1659: 1645: 1628: 1613: 1608: 1585: 1583: 1582: 1577: 1569: 1565: 1558: 1556: 1555: 1540: 1539: 1524: 1517: 1502: 1497: 1474: 1472: 1471: 1466: 1458: 1456: 1443: 1425: 1420: 1419: 1415: 1413: 1412: 1397: 1396: 1381: 1364: 1349: 1344: 1321: 1319: 1318: 1313: 1305: 1301: 1299: 1298: 1283: 1282: 1267: 1255: 1240: 1235: 1206: 1204: 1203: 1198: 1193: 1189: 1188: 1171: 1152: 1151: 1143: 1114: 1112: 1111: 1106: 1104: 1082: 1080: 1079: 1074: 1060: 1059: 1054: 1048: 1033: 1028: 1015: 996: 981: 976: 952: 950: 949: 944: 942: 908: 906: 905: 900: 898: 701: 679: 641: 621: 567: 565: 564: 559: 534: 532: 531: 526: 524: 522: 511: 502: 500: 499: 494: 454: 429: 317: 316: 315:physics articles 313: 310: 307: 286: 281: 280: 270: 263: 262: 257: 249: 242: 212: 211: 208: 205: 202: 181: 176: 175: 165: 158: 157: 152: 144: 137: 125:importance scale 107: 106: 103: 100: 97: 76: 69: 68: 63: 55: 48: 31: 25: 24: 16: 2900: 2899: 2895: 2894: 2893: 2891: 2890: 2889: 2840: 2839: 2806: 2779: 2729: 2725: 2715: 2708: 2707: 2649: 2648: 2608: 2604: 2593: 2580: 2543: 2542: 2536: 2466: 2465: 2444: 2439: 2438: 2365: 2360: 2359: 2356: 2333: 2329: 2291: 2274: 2266: 2265: 2236: 2223: 2213: 2199: 2186: 2176: 2171: 2170: 2142: 2129: 2119: 2105: 2092: 2082: 2074: 2073: 2047: 2034: 2024: 2017: 1986: 1982: 1978: 1961: 1939: 1921: 1908: 1898: 1890: 1889: 1850: 1846: 1834: 1830: 1824: 1807: 1781: 1780: 1744: 1740: 1728: 1724: 1718: 1701: 1675: 1674: 1635: 1618: 1589: 1588: 1545: 1541: 1529: 1525: 1522: 1518: 1507: 1478: 1477: 1433: 1426: 1402: 1398: 1386: 1382: 1376: 1369: 1354: 1325: 1324: 1288: 1284: 1272: 1268: 1262: 1245: 1213: 1212: 1178: 1161: 1157: 1153: 1138: 1130: 1129: 1094: 1086: 1085: 1049: 1038: 986: 960: 959: 932: 924: 923: 888: 880: 879: 865: 837: 797: 783: 708: 680: 648:141.149.218.145 642:—The preceding 604: 601: 541: 540: 515: 505: 504: 476: 475: 400: 377: 362:167.191.202.243 354: 314: 311: 308: 305: 304: 282: 275: 255: 209: 206: 203: 200: 199: 177: 170: 150: 104: 101: 98: 95: 94: 61: 32:on Knowledge's 29: 12: 11: 5: 2898: 2896: 2888: 2887: 2882: 2877: 2872: 2867: 2862: 2857: 2852: 2842: 2841: 2805: 2802: 2783:User:Portabili 2778: 2775: 2753: 2752: 2736: 2732: 2728: 2722: 2718: 2691:Vladimir Iofik 2656: 2645: 2644: 2631: 2628: 2623: 2615: 2611: 2607: 2600: 2596: 2592: 2589: 2586: 2583: 2577: 2574: 2571: 2568: 2563: 2558: 2555: 2551: 2535: 2532: 2507:Vladimir Iofik 2473: 2451: 2447: 2372: 2368: 2355: 2352: 2340:129.27.140.190 2328: 2325: 2300: 2297: 2294: 2290: 2287: 2283: 2280: 2277: 2273: 2263: 2252: 2249: 2245: 2242: 2239: 2232: 2229: 2226: 2222: 2217: 2212: 2208: 2205: 2202: 2195: 2192: 2189: 2185: 2180: 2169: 2166: 2165: 2164: 2163: 2151: 2148: 2145: 2138: 2135: 2132: 2128: 2123: 2118: 2114: 2111: 2108: 2101: 2098: 2095: 2091: 2086: 2081: 2071: 2056: 2053: 2050: 2043: 2040: 2037: 2033: 2028: 2023: 2020: 2014: 2010: 2007: 2004: 2000: 1995: 1992: 1989: 1985: 1981: 1977: 1974: 1970: 1967: 1964: 1960: 1955: 1950: 1947: 1943: 1934: 1930: 1927: 1924: 1917: 1914: 1911: 1907: 
1902: 1897: 1873: 1870: 1866: 1859: 1856: 1853: 1849: 1843: 1840: 1837: 1833: 1827: 1823: 1820: 1816: 1813: 1810: 1806: 1801: 1796: 1793: 1789: 1779: 1767: 1764: 1760: 1753: 1750: 1747: 1743: 1737: 1734: 1731: 1727: 1721: 1717: 1714: 1710: 1707: 1704: 1700: 1695: 1690: 1687: 1683: 1671: 1670: 1669: 1668: 1657: 1654: 1651: 1648: 1644: 1641: 1638: 1634: 1631: 1627: 1624: 1621: 1617: 1612: 1607: 1604: 1600: 1596: 1586: 1575: 1572: 1568: 1564: 1561: 1554: 1551: 1548: 1544: 1538: 1535: 1532: 1528: 1521: 1516: 1513: 1510: 1506: 1501: 1496: 1493: 1489: 1485: 1475: 1464: 1461: 1455: 1452: 1449: 1446: 1442: 1439: 1436: 1432: 1429: 1423: 1418: 1411: 1408: 1405: 1401: 1395: 1392: 1389: 1385: 1379: 1375: 1372: 1363: 1360: 1357: 1353: 1348: 1343: 1340: 1336: 1332: 1322: 1311: 1308: 1304: 1297: 1294: 1291: 1287: 1281: 1278: 1275: 1271: 1265: 1261: 1258: 1254: 1251: 1248: 1244: 1239: 1234: 1231: 1227: 1223: 1220: 1196: 1192: 1187: 1184: 1181: 1177: 1174: 1170: 1167: 1164: 1160: 1156: 1150: 1147: 1142: 1137: 1128: 1120: 1103: 1100: 1097: 1093: 1083: 1072: 1069: 1066: 1063: 1058: 1053: 1047: 1044: 1041: 1037: 1032: 1027: 1024: 1020: 1014: 1011: 1008: 1005: 1002: 999: 995: 992: 989: 985: 980: 975: 972: 968: 958: 941: 938: 935: 931: 917: 897: 894: 891: 887: 877: 868: 864: 861: 836: 833: 796: 793: 787:BetacommandBot 782: 779: 778: 777: 771: 768: 761: 754: 753: 739: 738: 735: 732: 729: 707: 706:Merge proposal 704: 664:September 2015 662: 638: 637: 600: 597: 557: 554: 551: 548: 521: 518: 514: 492: 489: 486: 483: 425: 424: 399: 396: 376: 373: 353: 350: 347: 346: 343: 342: 339: 338: 331:Low-importance 327: 321: 320: 318: 301:the discussion 288: 287: 284:Physics portal 271: 259: 258: 256:Low‑importance 250: 238: 237: 234: 233: 222: 216: 215: 213: 196:the discussion 183: 182: 166: 154: 153: 145: 133: 132: 129: 128: 121:Mid-importance 117: 111: 110: 108: 91:the discussion 77: 65: 64: 62:Mid‑importance 56: 44: 43: 37: 26: 13: 10: 9: 6: 4: 3: 2: 2897: 2886: 2883: 2881: 2878: 2876: 2873: 2871: 2868: 2866: 2863: 2861: 2858: 2856: 2853: 2851: 2848: 2847: 2845: 2838: 2837: 2833: 2829: 2824: 2823: 2819: 2815: 2811: 2803: 2801: 2800: 2796: 2792: 2788: 2784: 2776: 2774: 2772: 2768: 2764: 2760: 2734: 2730: 2726: 2720: 2716: 2706: 2705: 2704: 2701: 2700: 2696: 2692: 2687: 2686: 2682: 2678: 2674: 2670: 2654: 2629: 2626: 2621: 2613: 2609: 2605: 2598: 2590: 2587: 2584: 2572: 2566: 2553: 2549: 2541: 2540: 2539: 2533: 2531: 2530: 2526: 2522: 2517: 2516: 2512: 2508: 2504: 2499: 2498: 2494: 2490: 2485: 2471: 2449: 2445: 2435: 2434: 2430: 2426: 2421: 2418: 2417: 2413: 2409: 2404: 2400: 2399: 2395: 2391: 2386: 2370: 2366: 2353: 2351: 2349: 2345: 2341: 2337: 2326: 2324: 2323: 2319: 2315: 2298: 2295: 2292: 2288: 2285: 2281: 2278: 2275: 2271: 2250: 2247: 2243: 2240: 2237: 2230: 2227: 2224: 2220: 2215: 2210: 2206: 2203: 2200: 2193: 2190: 2187: 2183: 2178: 2149: 2146: 2143: 2136: 2133: 2130: 2126: 2121: 2116: 2112: 2109: 2106: 2099: 2096: 2093: 2089: 2084: 2079: 2072: 2054: 2051: 2048: 2041: 2038: 2035: 2031: 2026: 2021: 2018: 2012: 2005: 2002: 1998: 1993: 1990: 1987: 1983: 1979: 1975: 1972: 1968: 1965: 1962: 1958: 1945: 1941: 1932: 1928: 1925: 1922: 1915: 1912: 1909: 1905: 1900: 1895: 1888: 1887: 1886: 1885: 1884: 1871: 1868: 1864: 1857: 1854: 1851: 1847: 1841: 1838: 1835: 1831: 1825: 1821: 1818: 1814: 1811: 1808: 1804: 1791: 1787: 1765: 1762: 1758: 1751: 1748: 1745: 1741: 1735: 1732: 1729: 1725: 1719: 1715: 1712: 1708: 1705: 1702: 1698: 1685: 1681: 1655: 1652: 1649: 1646: 1642: 1639: 1636: 1632: 1629: 1625: 1622: 1619: 1615: 1602: 1598: 
1594: 1587: 1573: 1570: 1566: 1562: 1559: 1552: 1549: 1546: 1542: 1536: 1533: 1530: 1526: 1519: 1514: 1511: 1508: 1504: 1491: 1487: 1483: 1476: 1462: 1459: 1453: 1450: 1447: 1444: 1440: 1437: 1434: 1430: 1427: 1421: 1416: 1409: 1406: 1403: 1399: 1393: 1390: 1387: 1383: 1377: 1373: 1370: 1361: 1358: 1355: 1351: 1338: 1334: 1330: 1323: 1309: 1306: 1302: 1295: 1292: 1289: 1285: 1279: 1276: 1273: 1269: 1263: 1259: 1256: 1252: 1249: 1246: 1242: 1229: 1225: 1221: 1218: 1211: 1210: 1209: 1208: 1207: 1194: 1190: 1185: 1182: 1179: 1175: 1168: 1165: 1162: 1158: 1154: 1148: 1145: 1140: 1135: 1126: 1121: 1118: 1101: 1098: 1095: 1091: 1070: 1067: 1064: 1061: 1056: 1051: 1045: 1042: 1039: 1035: 1022: 1018: 1012: 1009: 1006: 1003: 1000: 997: 993: 990: 987: 983: 970: 966: 956: 955:second moment 939: 936: 933: 929: 921: 915: 912: 895: 892: 889: 885: 874: 872: 862: 860: 859: 855: 851: 846: 842: 839:<rant: --> 834: 832: 831: 827: 823: 818: 817: 813: 809: 802: 794: 792: 791: 788: 780: 776: 772: 769: 767: 762: 760: 756: 755: 752: 748: 744: 741: 740: 736: 733: 730: 727: 726: 725: 724: 721: 717: 713: 705: 703: 700: 696: 692: 688: 684: 677: 673: 669: 665: 660: 657: 653: 649: 645: 636: 632: 628: 624: 623: 622: 620: 616: 612: 611:93.173.58.105 608: 599:Error in Text 598: 596: 595: 591: 587: 581: 580: 576: 572: 552: 546: 538: 519: 516: 512: 487: 481: 472: 470: 466: 462: 461:198.160.96.25 458: 451: 447: 445: 441: 437: 433: 423: 420: 416: 412: 411: 410: 409: 406: 397: 395: 394: 390: 386: 381: 374: 372: 371: 367: 363: 359: 351: 336: 332: 326: 323: 322: 319: 302: 298: 294: 293: 285: 279: 274: 272: 269: 265: 264: 260: 254: 251: 248: 244: 231: 227: 221: 218: 217: 214: 197: 193: 189: 188: 180: 174: 169: 167: 164: 160: 159: 155: 149: 146: 143: 139: 126: 122: 116: 113: 112: 109: 92: 88: 84: 83: 78: 75: 71: 70: 66: 60: 57: 54: 50: 45: 41: 35: 27: 23: 18: 17: 2825: 2807: 2804:MathML error 2780: 2757:— Preceding 2754: 2702: 2688: 2668: 2646: 2537: 2518: 2500: 2486: 2436: 2422: 2419: 2405: 2401: 2387: 2357: 2330: 2167: 1672: 1122: 875: 866: 847: 843: 838: 819: 804: 799: 784: 720:Blaisorblade 709: 681:— Preceding 675: 671: 667: 663: 661: 639: 605:— Preceding 602: 582: 473: 452: 448: 436:129.97.84.19 426: 419:Blaisorblade 401: 382: 378: 360:(dead link) 355: 330: 290: 226:Low-priority 225: 185: 151:Low‑priority 120: 80: 40:WikiProjects 2808:in section 2334:—Preceding 687:Ashc~ukwiki 668:Chi-squared 627:5.176.76.35 455:—Preceding 430:—Preceding 201:Mathematics 192:mathematics 148:Mathematics 2844:Categories 96:Statistics 87:statistics 59:Statistics 672:plus sign 2759:unsigned 2336:unsigned 957:to be 1: 808:Deepmath 695:contribs 683:unsigned 644:unsigned 607:unsigned 586:Vertigre 571:Vertigre 457:unsigned 432:unsigned 2791:𝕃eegrc 1778:we get: 743:Webtier 405:Mct mht 333:on the 306:Physics 297:Physics 253:Physics 228:on the 123:on the 30:C-class 2425:Royi A 2314:Royi A 36:scale. 2521:Kaba3 2489:Kaba3 2408:Kaba3 2390:Kaba3 909:be a 850:linas 822:Kaba3 385:Kaba3 2832:talk 2818:talk 2795:talk 2767:talk 2695:talk 2681:talk 2669:f(x) 2525:talk 2511:talk 2493:talk 2429:talk 2412:talk 2394:talk 2344:talk 2318:talk 2312:. 
-- 1084:Let 878:Let 854:talk 826:talk 812:talk 747:talk 691:talk 652:talk 631:talk 615:talk 590:talk 575:talk 465:talk 440:talk 389:talk 366:talk 2677:PAR 1973:log 1819:log 1713:log 1428:log 1371:log 1257:log 1117:PDF 914:PDF 325:Low 220:Low 115:Mid 2846:: 2834:) 2820:) 2812:-- 2797:) 2769:) 2731:σ 2717:σ 2697:) 2683:) 2655:μ 2610:σ 2591:μ 2588:− 2562:∞ 2557:∞ 2554:− 2550:∫ 2527:) 2513:) 2495:) 2487:-- 2431:) 2414:) 2406:-- 2396:) 2388:-- 2385:. 2346:) 2320:) 2248:≤ 2211:− 2117:− 2022:− 2013:⏟ 1976:⁡ 1954:∞ 1949:∞ 1946:− 1942:∫ 1822:⁡ 1800:∞ 1795:∞ 1792:− 1788:∫ 1716:⁡ 1694:∞ 1689:∞ 1686:− 1682:∫ 1630:− 1611:∞ 1606:∞ 1603:− 1599:∫ 1560:− 1500:∞ 1495:∞ 1492:− 1488:∫ 1484:≤ 1451:− 1445:≤ 1431:⁡ 1422:⏟ 1374:⁡ 1347:∞ 1342:∞ 1339:− 1335:∫ 1260:⁡ 1238:∞ 1233:∞ 1230:− 1226:∫ 1222:− 1173:‖ 1136:− 1031:∞ 1026:∞ 1023:− 1019:∫ 979:∞ 974:∞ 971:− 967:∫ 873:. 856:) 828:) 814:) 749:) 697:) 693:• 654:) 633:) 617:) 592:) 577:) 467:) 442:) 391:) 383:-- 368:) 2830:( 2816:( 2793:( 2765:( 2735:f 2727:2 2721:g 2693:( 2679:( 2630:2 2627:1 2622:= 2614:2 2606:2 2599:2 2595:) 2585:x 2582:( 2576:) 2573:x 2570:( 2567:f 2523:( 2509:( 2491:( 2472:g 2450:g 2446:h 2427:( 2410:( 2392:( 2371:g 2367:h 2342:( 2316:( 2299:) 2296:x 2293:( 2289:f 2286:= 2282:) 2279:x 2276:( 2272:g 2251:0 2244:) 2241:X 2238:( 2231:) 2228:x 2225:( 2221:g 2216:h 2207:) 2204:X 2201:( 2194:) 2191:x 2188:( 2184:f 2179:h 2150:) 2147:X 2144:( 2137:) 2134:x 2131:( 2127:g 2122:h 2113:) 2110:X 2107:( 2100:) 2097:x 2094:( 2090:f 2085:h 2080:= 2055:) 2052:X 2049:( 2042:) 2039:x 2036:( 2032:g 2027:h 2019:= 2009:) 2006:x 2003:d 1999:) 1994:) 1991:x 1988:( 1984:g 1980:( 1969:) 1966:x 1963:( 1959:f 1933:+ 1929:) 1926:X 1923:( 1916:) 1913:x 1910:( 1906:f 1901:h 1896:= 1872:x 1869:d 1865:) 1858:) 1855:x 1852:( 1848:f 1842:) 1839:x 1836:( 1832:g 1826:( 1815:) 1812:x 1809:( 1805:f 1766:x 1763:d 1759:) 1752:) 1749:x 1746:( 1742:f 1736:) 1733:x 1730:( 1726:g 1720:( 1709:) 1706:x 1703:( 1699:f 1656:0 1653:= 1650:x 1647:d 1643:) 1640:x 1637:( 1633:f 1626:) 1623:x 1620:( 1616:g 1595:= 1574:x 1571:d 1567:) 1563:1 1553:) 1550:x 1547:( 1543:f 1537:) 1534:x 1531:( 1527:g 1520:( 1515:) 1512:x 1509:( 1505:f 1463:x 1460:d 1454:1 1448:x 1441:) 1438:x 1435:( 1417:) 1410:) 1407:x 1404:( 1400:f 1394:) 1391:x 1388:( 1384:g 1378:( 1362:) 1359:x 1356:( 1352:f 1331:= 1310:x 1307:d 1303:) 1296:) 1293:x 1290:( 1286:g 1280:) 1277:x 1274:( 1270:f 1264:( 1253:) 1250:x 1247:( 1243:f 1219:= 1195:= 1191:) 1186:) 1183:x 1180:( 1176:g 1169:) 1166:x 1163:( 1159:f 1155:( 1149:L 1146:K 1141:D 1102:) 1099:x 1096:( 1092:f 1071:1 1068:= 1065:x 1062:d 1057:2 1052:x 1046:) 1043:x 1040:( 1036:g 1013:, 1010:0 1007:= 1004:x 1001:d 998:x 994:) 991:x 988:( 984:g 940:) 937:x 934:( 930:g 916:. 896:) 893:x 890:( 886:g 852:( 824:( 810:( 745:( 689:( 676:+ 658:. 650:( 629:( 613:( 588:( 573:( 556:) 553:x 550:( 547:p 520:x 517:d 513:1 491:) 488:x 485:( 482:f 463:( 438:( 387:( 364:( 337:. 232:. 127:. 42::
