The most informative distribution would occur when one of the propositions was known to be true. In that case, the information entropy would be equal to zero. The least informative distribution would occur when there is no reason to favor any one of the propositions over the others. In that case, the only reasonable probability distribution would be uniform, and then the information entropy would be equal to its maximum possible value,
{\displaystyle {\begin{aligned}\lim _{N\to \infty }\left({\frac {1}{N}}\log W\right)&={\frac {1}{N}}\left(N\log N-\sum _{i=1}^{m}Np_{i}\log(Np_{i})\right)\\&=\log N-\sum _{i=1}^{m}p_{i}\log(Np_{i})\\&=\log N-\log N\sum _{i=1}^{m}p_{i}-\sum _{i=1}^{m}p_{i}\log p_{i}\\&=\left(1-\sum _{i=1}^{m}p_{i}\right)\log N-\sum _{i=1}^{m}p_{i}\log p_{i}\\&=-\sum _{i=1}^{m}p_{i}\log p_{i}\\&=H(\mathbf {p} ).\end{aligned}}}
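The limit above can also be checked numerically. The following sketch (an illustrative check, not part of the derivation) computes (1/N) log W through log-gamma and compares it with H(p) for a hypothetical assignment p:

```python
import math

def log_multiplicity(counts):
    # log W = log( N! / (n_1! ... n_m!) ), via lgamma for numerical stability
    n_total = sum(counts)
    return math.lgamma(n_total + 1) - sum(math.lgamma(n + 1) for n in counts)

def shannon_entropy(p):
    # H(p) = -sum_i p_i log p_i, in nats
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.3, 0.2]  # hypothetical probability assignment
for N in (100, 10_000, 1_000_000):
    counts = [round(pi * N) for pi in p]  # n_i = N p_i
    print(N, log_multiplicity(counts) / N, shannon_entropy(p))
```

As N grows, (1/N) log W approaches H(p) for this choice of p, in agreement with the limit.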
This asserts that the distribution function characterizing particles entering a collision can be factorized. Though this statement can be understood as a strictly physical hypothesis, it can also be interpreted as a heuristic hypothesis regarding the most probable configuration of particles before colliding.
although the conceptual emphasis is quite different. It has the advantage of being strictly combinatorial in nature, making no reference to information entropy as a measure of 'uncertainty', 'uninformativeness', or any other imprecisely defined concept. The information entropy function is not assumed
4367:
and the principle of maximum entropy are completely compatible and can be seen as special cases of the "method of maximum relative entropy". They state that this method reproduces every aspect of orthodox
Bayesian inference methods. In addition this new method opens the door to tackling problems that
2845:
By choosing to use the distribution with the maximum entropy allowed by our information, the argument goes, we are choosing the most uninformative distribution possible. To choose a distribution with lower entropy would be to assume information we do not possess. Thus the maximum entropy distribution
3189:
Now, in order to reduce the 'graininess' of the probability assignment, it will be necessary to use quite a large number of quanta of probability. Rather than actually carry out, and possibly have to repeat, the rather long random experiment, the protagonist decides to simply calculate and use the
531:
In ordinary language, the principle of maximum entropy can be said to express a claim of epistemic modesty, or of maximum ignorance. The selected distribution is the one that makes the least claim to being informed beyond the stated prior data, that is to say the one that admits the most ignorance
4390:
as the given prior), independently of any
Bayesian considerations by treating the problem formally as a constrained optimisation problem, the Entropy functional being the objective function. For the case of given average values as testable information (averaged over the sought after probability
4354:
All that remains for the protagonist to do is to maximize entropy under the constraints of his testable information. He has found that the maximum entropy distribution is the most probable of all "fair" random distributions, in the limit as the probability levels go from discrete to continuous.
3038:
buckets while blindfolded. In order to be as fair as possible, each throw is to be independent of any other, and every bucket is to be the same size.) Once the experiment is done, he will check if the probability assignment thus obtained is consistent with his information. (For this step to be
{\displaystyle {\begin{aligned}{\frac {1}{N}}\log W&={\frac {1}{N}}\log {\frac {N!}{n_{1}!\,n_{2}!\,\dotsb \,n_{m}!}}\\&={\frac {1}{N}}\log {\frac {N!}{(Np_{1})!\,(Np_{2})!\,\dotsb \,(Np_{m})!}}\\&={\frac {1}{N}}\left(\log N!-\sum _{i=1}^{m}\log((Np_{i})!)\right).\end{aligned}}}
, sometimes called the principle of insufficient reason), may be adopted. Thus, the maximum entropy principle is not merely an alternative way to view the usual methods of inference of classical statistics, but represents a significant conceptual generalization of those methods.
414:
Another way of stating this: Take precisely stated prior data or testable information about a probability distribution function. Consider the set of all trial probability distributions that would encode the prior data. According to this principle, the distribution with maximal
2721:
is a normalization constant. The invariant measure function is actually the prior density function encoding 'lack of relevant information'. It cannot be determined by the principle of maximum entropy, and must be determined by some other logical method, such as the
could not be addressed by either the maximal entropy principle or orthodox
Bayesian methods individually. Moreover, recent contributions (Lazar 2003, and Schennach 2005) show that frequentist relative-entropy-based inference approaches (such as
Jaynes was a strong advocate of this approach, claiming the maximum entropy distribution represented the least informative distribution. A large amount of literature is now dedicated to the elicitation of maximum entropy priors and links with
propositions. He has some testable information, but is not sure how to go about including this information in his probability assessment. He therefore conceives of the following random experiment. He will distribute
700:
Entropy maximization with no testable information respects the universal "constraint" that the sum of the probabilities is one. Under this constraint, the maximum entropy discrete probability distribution is the
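As an illustration (a brute-force sketch, not a proof), one can search a discretized probability simplex and confirm that the uniform distribution attains the largest entropy; the choice of n = 4 outcomes and the grid resolution here are arbitrary:

```python
import itertools
import math

def entropy(p):
    # Shannon entropy in nats; terms with p_i = 0 contribute 0
    return -sum(x * math.log(x) for x in p if x > 0)

n, steps = 4, 20  # probabilities restricted to multiples of 1/20
candidates = [
    tuple(k / steps for k in ks)
    for ks in itertools.product(range(steps + 1), repeat=n)
    if sum(ks) == steps  # exact normalization on the integer grid
]
best = max(candidates, key=entropy)
print(best)  # the uniform distribution (0.25, 0.25, 0.25, 0.25)
print(math.isclose(entropy(best), math.log(n)))  # maximum value is log n
```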
successful, the information must be a constraint given by an open set in the space of probability measures). If it is inconsistent, he will reject it and try again. If it is consistent, his assessment will be
are observables. We also require the probability density to integrate to one, which may be viewed as a primitive constraint on the identity function and an observable equal to 1 giving the constraint
5462:
Technical Report 97-08, Institute for
Research in Cognitive Science, University of Pennsylvania. An easy-to-read introduction to maximum entropy methods in the context of natural language processing.
903:
problem, and thus provide a sparse mixture model as the optimal density estimator. One important advantage of the method is its ability to incorporate prior information in the density estimation.
1116:
are observables. We also require the probability density to sum to one, which may be viewed as a primitive constraint on the identity function and an observable equal to 1 giving the constraint
2712:
The information entropy can therefore be seen as a numerical measure which describes how uninformative a particular probability distribution is, ranging from zero (completely informative) to
but rather is found in the course of the argument; and the argument leads naturally to the procedure of maximizing the information entropy, rather than treating it in some other way.
5466:
Tang, A.; Jackson, D.; Hobbs, J.; Chen, W.; Smith, J. L.; Patel, H.; Prieto, A.; Petrusca, D.; Grivich, M. I.; Sher, A.; Hottowy, P.; Dabrowski, W.; Litke, A. M.; Beggs, J. M. (2008).
Alternatively, the principle is often invoked for model specification: in this case the observed data itself is assumed to be the testable information. Such models are widely used in
Proponents of the principle of maximum entropy justify its use in assigning probabilities in several ways, including the following two arguments. These arguments take the use of
1902:(although it is sometimes, confusingly, defined as the negative of this). The inference principle of minimizing this, due to Kullback, is known as the
The maximum entropy principle is also needed to guarantee the uniqueness and consistency of probability assignments obtained by different methods,
1631:
parameters are
Lagrange multipliers. In the case of equality constraints their values are determined from the solution of the nonlinear equations
4379:
Jaynes stated Bayes' theorem was a way to calculate a probability, while maximum entropy was a way to assign a prior probability distribution.
Testable information is a statement about a probability distribution whose truth or falsity is well-defined. For example, the statements
the parameters of which must be solved for in order to achieve minimum cross entropy and satisfy the given testable information.
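As a sketch of such a solve, the following code treats the dice example often associated with Jaynes (one observable f(x) = x on outcomes 1 through 6 with prescribed mean 4.5, a hypothetical constraint chosen for illustration) and finds the single Lagrange multiplier by bisection:

```python
import math

# Hypothetical testable information: outcomes x = 1..6,
# observable f(x) = x, and prescribed mean <x> = 4.5.
outcomes = list(range(1, 7))
F = 4.5

def mean_under(lam):
    # <f> under Pr(x) = exp(lam * f(x)) / Z(lam), with f(x) = x
    weights = [math.exp(lam * x) for x in outcomes]
    z = sum(weights)
    return sum(x * w for x, w in zip(outcomes, weights)) / z

# mean_under is increasing in lam, so solve mean_under(lam) = F by bisection
lo, hi = -5.0, 5.0
for _ in range(200):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if mean_under(mid) < F else (lo, mid)
lam = (lo + hi) / 2
z = sum(math.exp(lam * x) for x in outcomes)
p = [math.exp(lam * x) / z for x in outcomes]
print(lam)  # approximately 0.371
print(p)    # probabilities increase with x, satisfying the mean constraint
```

With several constraints the same fixed-point condition holds per multiplier, and a multidimensional root finder or convex solver replaces the bisection.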
The probability distribution with maximum information entropy subject to these inequality/equality constraints is of the form:
702:
, subject to the constraints of the information. This constrained optimization problem is typically solved using the method of
It is however, possible in concept to solve for a posterior distribution directly from a stated prior distribution using the
Open access article containing pointers to various papers and software implementations of maximum entropy models on the net.
4770:
Palmieri, Francesco A. N.; Ciuonzo, Domenico (2013-04-01). "Objective priors from maximum entropy in data classification".
– see e.g. Owen 2001 and Kitamura 2006) can be combined with prior information to perform Bayesian posterior analysis.
implies a similar equivalence for these two ways of specifying the testable information in the maximum entropy method.
4932:
Kesavan, H. K.; Kapur, J. N. (1990). "Maximum
Entropy and Minimum Cross-Entropy Principles". In Fougère, P. F. (ed.).
, i.e. we require our probability density function to satisfy the inequality (or purely equality) moment constraints:
In particular, Jaynes argued that the Gibbsian method of statistical mechanics is sound by also arguing that the
Botev, Z. I.; Kroese, D. P. (2008). "Non-asymptotic
Bandwidth Selection for Density Estimation of Discrete Data".
In the case with inequality moment constraints the
Lagrange multipliers are determined from the solution of a
Bajkova, A. T. (1992). "The generalization of maximum entropy method for reconstruction of complex functions".
2601:{\displaystyle F_{k}={\frac {\partial }{\partial \lambda _{k}}}\log Z(\lambda _{1},\dotsc ,\lambda _{m}).}
1918:
{\displaystyle F_{k}={\frac {\partial }{\partial \lambda _{k}}}\log Z(\lambda _{1},\ldots ,\lambda _{m}).}
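This identity between the constraint values and the gradient of log Z can be verified numerically; the observable and the multiplier value below are arbitrary choices for illustration:

```python
import math

# Numerical check of F_k = d(log Z)/d(lambda_k), with one arbitrary
# observable f(x) = x on x = 1..6 and an arbitrary multiplier value.
outcomes = range(1, 7)
lam = 0.4

def log_z(l):
    return math.log(sum(math.exp(l * x) for x in outcomes))

def expectation(l):
    # <f> under Pr(x) proportional to exp(l * f(x))
    z = sum(math.exp(l * x) for x in outcomes)
    return sum(x * math.exp(l * x) for x in outcomes) / z

h = 1e-6
numeric_grad = (log_z(lam + h) - log_z(lam - h)) / (2 * h)
print(numeric_grad, expectation(lam))  # the two values agree
```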
; that is, we require our probability distribution to satisfy the moment inequality/equality constraints:
5468:"A Maximum Entropy Model Applied to Spatial and Temporal Correlations from Cortical Networks in Vitro"
2879:
is however a source of criticisms of the approach since this dominating measure is in fact arbitrary.
2071:
, the Shannon entropy cannot be used, as it is only defined for discrete probability spaces. Instead
1736:
In the case of inequality constraints, the
Lagrange multipliers are determined from the solution of a
, i.e. as the probability levels go from grainy discrete values to smooth continuous values. Using
), and that no other information is given. Then the maximum entropy probability density function is
2479:
As in the discrete case, in the case where all moment constraints are equalities, the values of the
5133:"Application of Bayesian reasoning and the Maximum Entropy Method to some reconstruction problems"
4735:
Bousquet, N. (2008). "Eliciting vague but proper maximal entropy priors in Bayesian experiments".
One of the main applications of the maximum entropy principle is in discrete and continuous
467:
In most practical cases, the stated prior data or testable information is given by a set of
399:
which best represents the current state of knowledge about a system is the one with largest
174:
4892:"The Generalized Cross Entropy Method, with Applications to Probability Density Estimation"
directly, the protagonist could equivalently maximize any monotonic increasing function of
However these statements do not imply that thermodynamical systems need not be shown to be
of bounded dimension is that it have the general form of a maximum entropy distribution.)
863:. However, maximum entropy is not a generalisation of all such sufficient updating rules.
The principle of maximum entropy is commonly applied in two ways to inferential problems:
(all integrals below are over this interval). We assume this information has the form of
states that the necessary and sufficient condition for a sampling distribution to admit
At this point, in order to simplify the expression, the protagonist takes the limit as
to E. T. Jaynes in 1962. It is essentially the same mathematical argument used for the
2769:
{\displaystyle \Pr(x_{i})={\frac {1}{Z(\lambda _{1},\ldots ,\lambda _{m})}}\exp \left[\lambda _{1}f_{1}(x_{i})+\dotsb +\lambda _{m}f_{m}(x_{i})\right],}
The maximum entropy principle makes explicit our freedom in using different forms of
{\displaystyle p(x)={\frac {1}{Z(\lambda _{1},\dotsc ,\lambda _{m})}}q(x)\exp \left[\lambda _{1}f_{1}(x)+\dotsb +\lambda _{m}f_{m}(x)\right]}
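A minimal numerical sketch of this form, assuming a standard normal invariant measure q(x) and a single observable f(x) = x with multiplier 1 (both hypothetical choices), tilts q on a grid and normalizes:

```python
import math

# Sketch on a grid: take q(x) to be a standard normal density and one
# observable f(x) = x with multiplier lambda = 1, then normalize
# q(x) * exp(lambda * x) numerically.
dx = 0.01
xs = [i * dx for i in range(-1000, 1001)]  # grid on [-10, 10]
q = [math.exp(-x * x / 2) / math.sqrt(2 * math.pi) for x in xs]

lam = 1.0
unnorm = [qi * math.exp(lam * x) for qi, x in zip(q, xs)]
z = sum(unnorm) * dx  # numerical partition function
p = [u / z for u in unnorm]

mean = sum(x * pi for x, pi in zip(xs, p)) * dx
print(mean)  # close to 1: tilting a standard normal shifts its mean to lambda
```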
1079:{\displaystyle \sum _{i=1}^{n}\Pr(x_{i})f_{k}(x_{i})\geq F_{k}\qquad k=1,\ldots ,m.}
Given testable information, the maximum entropy procedure consists of seeking the
5460:"A simple introduction to maximum entropy models for natural language processing"
in question. This is the way the maximum entropy principle is most often used in
804:{\displaystyle p_{i}={\frac {1}{n}}\ {\rm {for\ all}}\ i\in \{\,1,\dots ,n\,\}.}
Schennach, S. M. (2005). "Bayesian exponentially tilted empirical likelihood".
5296:, Cowles Foundation Discussion Papers 1569, Cowles Foundation, Yale University.
5160:
1764:(1963, 1968, 2003) gave the following formula, which is closely related to the
{\displaystyle Z(\lambda _{1},\ldots ,\lambda _{m})=\sum _{i=1}^{n}\exp \left[\lambda _{1}f_{1}(x_{i})+\dotsb +\lambda _{m}f_{m}(x_{i})\right],}
5649:
5359:"Can the Maximum Entropy Principle be explained as a consistency requirement?"
{\displaystyle Z(\lambda _{1},\dotsc ,\lambda _{m})=\int q(x)\exp \left[\lambda _{1}f_{1}(x)+\dotsb +\lambda _{m}f_{m}(x)\right]\,dx.}
1887:
is known; we will discuss it further after the solution equations are given.
907:
General solution for the maximum entropy distribution with linear constraints
Fornalski, K.W.; Parzych, G.; Pylak, M.; Satuła, D.; Dobrzyński, L. (2010).
A closely related quantity, the relative entropy, is usually defined as the
in two papers in 1957, where he emphasized a natural correspondence between
The principle of maximum entropy is useful explicitly only when applied to
455:
should be considered a particular application of a general tool of logical
2738:
For several examples of maximum entropy distributions, see the article on
5614:
5269:, J. H. Justice (ed.), Cambridge University Press, Cambridge, p. 26.
4403:
The principle of maximum entropy bears a relation to a key assumption of
899:
estimators, the maximum entropy principle may require the solution to a
Maximum-Entropy and Bayesian Methods in Science and Engineering (Vol. 1)
1879:), which Jaynes called the "invariant measure", is proportional to the
Guiasu, S.; Shenitzer, A. (1985). "The principle of maximum entropy".
most probable result. The probability of any particular result is the
4386:(or the Principle of Maximum Entropy being a special case of using a
3347:
The most probable result is the one which maximizes the multiplicity
1744:, and the computation of the Lagrange multipliers usually requires
682:
are probabilities of events) are statements of testable information.
Clarke, B. (2006). "Information optimality and Bayesian modelling".
2906:
Suppose an individual wishes to make a probability assignment among
2028:{\displaystyle \int p(x)f_{k}(x)\,dx\geq F_{k}\qquad k=1,\dotsc ,m.}
Soofi, E.S. (2000). "Principal Information Theoretic Approaches".
5293:
Empirical Likelihood Methods in Econometrics: Theory and Practice
2506:
parameters are determined by the system of nonlinear equations:
5515:
{\displaystyle W={\frac {N!}{n_{1}!\,n_{2}!\,\dotsb \,n_{m}!}}}
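The multiplicity can be computed exactly with integer arithmetic; for a small hypothetical example with N = 12 quanta over m = 3 buckets, the even split dominates, as the Wallis argument asserts:

```python
import math

def multiplicity(counts):
    # W = N! / (n_1! ... n_m!), exact integer arithmetic
    w = math.factorial(sum(counts))
    for n in counts:
        w //= math.factorial(n)
    return w

# N = 12 quanta over m = 3 buckets: the even split has the largest W
print(multiplicity([4, 4, 4]))   # 34650
print(multiplicity([6, 4, 2]))   # 13860
print(multiplicity([12, 0, 0]))  # 1
```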
471:(average values of some moment functions), associated with the
3166:
proposition (i.e. the number of balls that ended up in bucket
distribution), the sought after distribution is formally the
2887:
The following argument is the result of a suggestion made by
1861:{\displaystyle H_{c}=-\int p(x)\log {\frac {p(x)}{q(x)}}\,dx}
1740:
program with linear constraints. In both cases, there is no
, in the context of precisely stated prior data (such as a
5267:
Maximum-Entropy and Bayesian Methods in Applied Statistics
827:
The principle of maximum entropy is often used to obtain
483:
of the probability distribution. The equivalence between
3344:
is sometimes known as the multiplicity of the outcome.
2758:
Information entropy as a measure of 'uninformativeness'
2754:
as given, and are thus subject to the same postulates.
5057:
Chliamovitch, G.; Malaspinas, O.; Chopard, B. (2017).
5041:"The Relation of Bayesian and Maximum Entropy Methods"
possibilities. (One might imagine that he will throw
4975:"Invariant {HPD} credible sets and {MAP} estimators"
4636:
IEEE Transactions on Systems Science and Cybernetics
Studies in History and Philosophy of Modern Physics
2746:
Justifications for the principle of maximum entropy
2707:{\displaystyle p(x)=A\cdot q(x),\qquad a<x<b}
5300:Lazar, N (2003). "Bayesian empirical likelihood".
3146:is the number of quanta that were assigned to the
847:Maximum entropy is a sufficient updating rule for
5278:Entropy Optimization Principles with Applications
4597:Sivia, Devinderjit; Skilling, John (2006-06-02).
4549:"Information Theory and Statistical Mechanics II"
1929:constraints on the expectations of the functions
1416:{\displaystyle \lambda _{1},\ldots ,\lambda _{m}}
955:constraints on the expectations of the functions
5256:Jaynes, E. T., 1986 (new version online 1996), "
4899:Methodology and Computing in Applied Probability
4852:Methodology and Computing in Applied Probability
4693:Journal of the American Statistical Association
1904:Principle of Minimum Discrimination Information
1427:. The normalization constant is determined by:
5246:"Information Theory and Statistical Mechanics"
4496:"Information Theory and Statistical Mechanics"
{\displaystyle \Pr(\mathbf {p} )=W\cdot m^{-N}}
2116:The probability density function with maximum
951:}. We assume this information has the form of
5527:
4973:Druilhet, Pierre; Marin, Jean-Michel (2007).
5171:Updating Probabilities with Data and Moments
1175:{\displaystyle \sum _{i=1}^{n}\Pr(x_{i})=1.}
5425:Boyd, Stephen; Lieven Vandenberghe (2004).
5104:Astronomical and Astrophysical Transactions
5059:"Kinetic theory beyond the Stosszahlansatz"
2626:) can be best understood by supposing that
479:. Another possibility is to prescribe some
5023:, Cambridge University Press, p. 351-355.
4810:(1987). "Updating, supposing and MAXENT".
4374:exponentially tilted empirical likelihood
2850:on the dominating measure represented by
2846:is the only reasonable distribution. The
2740:maximum entropy probability distributions
5021:Probability Theory: The Logic of Science
4446:Maximum entropy probability distribution
3082:{\displaystyle p_{i}={\frac {n_{i}}{N}}}
913:Maximum entropy probability distribution
252:Integrated nested Laplace approximations
5047:, Kluwer Academic Publishers, p. 25-29.
Giffin and Caticha (2007) state that
427:The principle was first expounded by
7:
4936:Maximum Entropy and Bayesian Methods
4890:Botev, Z. I.; Kroese, D. P. (2011).
2630:is known to take values only in the
451:are the same concept. Consequently,
4451:Maximum entropy spectral estimation
1881:limiting density of discrete points
5252:. New York: Benjamin. p. 181.
5168:Giffin, A. and Caticha, A., 2007,
4600:Data Analysis: A Bayesian Tutorial
4384:principle of minimum cross-entropy
quanta of probability (each worth
2724:principle of transformation groups
1909:We have some testable information
922:We have some testable information
618:{\displaystyle p_{2}+p_{3}>0.6}
4393:Gibbs (or Boltzmann) distribution
4359:Compatibility with Bayes' theorem
2764:discrete probability distribution
2123:subject to these constraints is:
1612:and is conventionally called the
443:of statistical mechanics and the
5570:
4327:
3214:
2757:
2106:{\displaystyle \int p(x)\,dx=1.}
1883:. For now, we shall assume that
875:. An example of such a model is
346:
262:Approximate Bayesian computation
2688:
2618:The invariant measure function
2000:
1051:
829:prior probability distributions
509:. As a special case, a uniform
288:Maximum a posteriori estimation
5485:10.1523/JNEUROSCI.3359-07.2008
5208:"Maximum entropy fundamentals"
5206:Harremoës, P.; Topsøe (2001).
5179:The Mathematical Intelligencer
4456:Maximum entropy thermodynamics
4331:
4323:
4019:
4003:
3936:
3920:
3806:
3764:
3726:
3720:
3704:
3701:
3625:
3609:
3598:
3582:
3575:
3559:
3218:
3210:
2866:
2860:
2682:
2676:
2661:
2655:
2592:
2560:
2448:
2442:
2410:
2404:
2370:
2364:
2352:
2320:
2282:
2276:
2244:
2238:
2204:
2198:
2189:
2157:
2142:
2136:
2087:
2081:
1977:
1971:
1958:
1952:
1845:
1839:
1831:
1825:
1810:
1804:
1717:
1685:
1588:
1575:
1543:
1530:
1472:
1440:
1353:
1340:
1308:
1295:
1258:
1226:
1211:
1198:
1163:
1150:
1035:
1022:
1009:
996:
887:Probability density estimation
883:for independent observations.
532:beyond the stated prior data.
1:
4677:10.1016/j.jeconom.2006.05.003
1423:. It is sometimes called the
5557:Principle of maximum entropy
5396:10.1016/1355-2198(95)00015-1
4950:10.1007/978-94-009-0683-9_29
4794:10.1016/j.inffus.2012.01.012
4461:Principle of maximum caliber
4426:Akaike information criterion
3773:{\displaystyle N\to \infty }
2893:Maxwell–Boltzmann statistics
2842:(completely uninformative).
2499:{\displaystyle \lambda _{k}}
1618:Pitman–Koopman theorem
393:principle of maximum entropy
195:Principle of maximum entropy
1917:which takes values in some
1892:Kullback–Leibler divergence
879:, which corresponds to the
873:natural language processing
165:Bernstein–von Mises theorem
5920:
5581:Statistical thermodynamics
5436:Cambridge University Press
5280:, Boston: Academic Press.
5161:10.12693/APhysPolA.117.892
4441:Maximum entropy classifier
3119:is the probability of the
2848:dependence of the solution
910:
881:maximum entropy classifier
524:to justify treatment as a
477:statistical thermodynamics
29:
5568:
5124:10.1080/10556799208230532
4911:10.1007/s11009-009-9133-7
4864:10.1007/s11009-007-9057-z
4749:10.1007/s00362-008-0149-9
4466:Thermodynamic equilibrium
3407:. He decides to maximize
3367:. Rather than maximizing
861:maximum entropy inference
515:principle of indifference
190:Principle of indifference
5841:Condensed matter physics
5824:Statistical field theory
5324:, Chapman and Hall/CRC.
4648:10.1109/TSSC.1968.300117
3782:Stirling's approximation
3192:multinomial distribution
1758:continuous distributions
687:probability distribution
473:probability distribution
459:and information theory.
397:probability distribution
242:Markov chain Monte Carlo
5904:Mathematical principles
5884:Entropy and information
5699:Mathematical approaches
5688:Lennard-Jones potential
5604:thermodynamic potential
5472:Journal of Neuroscience
5314:10.1093/biomet/90.2.319
5258:Monkeys, kangaroos and
5140:Acta Physica Polonica A
5116:1992A&AT....1..313B
4664:Journal of Econometrics
4576:10.1103/PhysRev.108.171
4523:10.1103/PhysRev.106.620
4405:kinetic theory of gases
843:Posterior probabilities
247:Laplace's approximation
234:Posterior approximation
5899:Probability assessment
5894:Statistical principles
5735:conformal field theory
5458:Ratnaparkhi A. (1997)
5350:10.1093/biomet/92.1.31
4995:(inactive 2024-04-27).
4358:
4345:
4283:
4223:
4175:
4107:
4073:
3986:
3900:
3774:
3745:
3694:
3401:
3381:
3361:
3335:
3247:
3180:
3160:
3133:
3113:
3083:
3032:
3012:
2992:
2978:) at random among the
2972:
2944:
2920:
2873:
2836:
2835:{\displaystyle \log m}
2810:
2809:{\displaystyle \log m}
2780:
2728:marginalization theory
2708:
2602:
2500:
2470:
2294:
2107:
2059:
2029:
1862:
1727:
1603:
1498:
1417:
1368:
1176:
1146:
1110:
1080:
992:
897:support vector machine
867:Maximum entropy models
857:probability kinematics
805:
676:
649:
619:
566:
353:Mathematics portal
296:Evidence approximation
5650:Ferromagnetism models
5543:Statistical mechanics
5248:. In Ford, K. (ed.).
5039:Jaynes, E. T. (1988)
5019:Jaynes, E. T. (2003)
4629:"Prior Probabilities"
4346:
4263:
4203:
4155:
4087:
4053:
3966:
3880:
3775:
3746:
3674:
3402:
3382:
3362:
3336:
3248:
3181:
3161:
3134:
3114:
3112:{\displaystyle p_{i}}
3084:
3033:
3013:
2993:
2973:
2945:
2921:
2897:statistical mechanics
2883:The Wallis derivation
2874:
2837:
2811:
2781:
2709:
2603:
2501:
2471:
2295:
2108:
2060:
2058:{\displaystyle F_{k}}
2030:
1863:
1728:
1622:sufficient statistics
1604:
1478:
1418:
1369:
1177:
1126:
1111:
1109:{\displaystyle F_{k}}
1081:
972:
901:quadratic programming
859:is a special case of
806:
677:
675:{\displaystyle p_{3}}
650:
648:{\displaystyle p_{2}}
620:
567:
496:statistical mechanics
453:statistical mechanics
433:statistical mechanics
257:Variational inference
5357:Uffink, Jos (1995).
5322:Empirical Likelihood
5290:Kitamura, Y., 2006,
4399:Relevance to physics
4388:uniform distribution
4370:empirical likelihood
3791:
3758:
3414:
3391:
3371:
3351:
3263:
3201:
3170:
3150:
3123:
3096:
3046:
3022:
3002:
2982:
2954:
2934:
2910:
2872:{\displaystyle m(x)}
2854:
2820:
2794:
2770:
2752:Bayesian probability
2649:
2513:
2483:
2314:
2130:
2072:
2042:
1943:
1779:
1770:differential entropy
1742:closed form solution
1638:
1434:
1381:
1192:
1123:
1093:
969:
712:
703:uniform distribution
695:Lagrange multipliers
659:
632:
583:
556:
542:testable information
536:Testable information
526:statistical ensemble
485:conserved quantities
469:conserved quantities
419:is the best choice.
409:testable information
335:Posterior predictive
304:Evidence lower bound
185:Likelihood principle
155:Bayesian probability
5889:Bayesian statistics
5829:elementary particle
5594:partition functions
5428:Convex Optimization
5378:1995SHPMP..26..223U
5320:Owen, A. B., 2001,
5250:Statistical Physics
5224:2001Entrp...3..191H
5152:2010AcPPA.117..892F
5075:2017Entrp..19..381C
4812:Theory and Decision
4568:1957PhRv..108..171J
4515:1957PhRv..106..620J
3139:proposition, while
2971:{\displaystyle 1/N}
2786:mutually exclusive
2613:convex optimization
1738:convex optimization
877:logistic regression
849:radical probabilism
823:Prior probabilities
691:information entropy
513:density (Laplace's
445:information entropy
417:information entropy
108:Bayesian statistics
102:Part of a series on
5856:information theory
5763:correlation length
5758:Critical exponents
5745:Critical phenomena
5726:stochastic process
5706:Boltzmann equation
5599:equations of state
5272:Kapur, J. N.; and
5191:10.1007/bf03023004
4824:10.1007/BF00134086
4772:Information Fusion
4737:Statistical Papers
4700:(452): 1349–1353.
4341:
4339:
3813:
3770:
3741:
3739:
3397:
3377:
3357:
3331:
3243:
3176:
3156:
3129:
3109:
3079:
3028:
3008:
2988:
2968:
2940:
2927:mutually exclusive
2916:
2869:
2832:
2806:
2776:
2704:
2598:
2496:
2466:
2305:partition function
2290:
2103:
2055:
2025:
1858:
1723:
1614:partition function
1599:
1425:Gibbs distribution
1413:
1364:
1172:
1106:
1076:
930:taking values in {
893:density estimation
833:Bayesian inference
801:
672:
645:
615:
562:
487:and corresponding
449:information theory
437:information theory
278:Bayesian estimator
226:Hierarchical model
150:Bayesian inference
History

The principle was first expounded by E. T. Jaynes in two papers in 1957, where he emphasized a natural correspondence between statistical mechanics and information theory. In particular, Jaynes argued that the Gibbsian method of statistical mechanics is sound by also arguing that the entropy of statistical mechanics and the information entropy of information theory are the same concept. Consequently, statistical mechanics should be considered a particular application of a general tool of logical inference and information theory.

Overview

In most practical cases, the stated prior data or testable information is given by a set of conserved quantities (average values of some moment functions), associated with the probability distribution in question. This is the way the maximum entropy principle is most often used in statistical thermodynamics. Another possibility is to prescribe some symmetries of the probability distribution. The equivalence between conserved quantities and corresponding symmetry groups implies a similar equivalence for these two ways of specifying the testable information in the maximum entropy method.

The maximum entropy principle is also needed to guarantee the uniqueness and consistency of probability assignments obtained by different methods, statistical mechanics and logical inference in particular. However, these statements do not imply that thermodynamical systems need not be shown to be ergodic to justify treatment as a statistical ensemble.

The maximum entropy principle makes explicit our freedom in using different forms of prior data. As a special case, a uniform prior probability density (Laplace's principle of indifference) may be adopted. Thus, the maximum entropy principle is not merely an alternative way to view the usual methods of inference of classical statistics; it represents a significant conceptual generalization of those methods.
Testable information

The principle of maximum entropy is useful explicitly only when applied to testable information. Testable information is a statement about a probability distribution whose truth or falsity is well-defined. For example, the statements

    the expectation of the variable x is 2.87

and

    p_2 + p_3 > 0.6

(where p_2 and p_3 are probabilities of events) are statements of testable information.

Given testable information, the maximum entropy procedure consists of seeking the probability distribution which maximizes information entropy, subject to the constraints of the information. This constrained optimization problem is typically solved using the method of Lagrange multipliers.

Entropy maximization with no testable information respects the universal "constraint" that the sum of the probabilities is one. Under this constraint, the maximum entropy discrete probability distribution is the uniform distribution,

    p_i = 1/n   for all i ∈ {1, ..., n}.
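A mean constraint such as E[x] = 2.87 can be handled numerically. The following is a minimal sketch, not taken from the article: it assumes, hypothetically, that x ranges over the outcomes {1, ..., 6}. By the general maximum-entropy solution, the constrained maximizer has the exponential form p_i ∝ exp(λ x_i), and the single Lagrange multiplier λ can be found by bisection on the mean, since the mean is monotone in λ.

```python
import math

# Maximum entropy distribution on outcomes {1,...,6} subject to E[x] = 2.87.
# (The outcome set {1,...,6} is an illustrative assumption.)
# The solution has the exponential form p_i = exp(lam * x_i) / Z(lam); we
# solve for the Lagrange multiplier lam by bisection on the mean constraint.
xs = [1, 2, 3, 4, 5, 6]
target_mean = 2.87

def mean_for(lam):
    weights = [math.exp(lam * x) for x in xs]
    Z = sum(weights)
    return sum(w * x for w, x in zip(weights, xs)) / Z

lo, hi = -5.0, 5.0          # mean_for is increasing in lam, so bisect
for _ in range(100):
    mid = (lo + hi) / 2
    if mean_for(mid) < target_mean:
        lo = mid
    else:
        hi = mid
lam = (lo + hi) / 2

weights = [math.exp(lam * x) for x in xs]
Z = sum(weights)
p = [w / Z for w in weights]
entropy = -sum(pi * math.log(pi) for pi in p)
print(p, entropy)
```

Because the mean constraint binds (2.87 differs from the unconstrained mean 3.5), the resulting entropy is strictly below log 6, the entropy of the uniform distribution.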
Applications

The principle of maximum entropy is commonly applied in two ways to inferential problems: to obtain prior probabilities, and to specify models directly from observed data. It is also exploited in areas such as channel coding and density estimation.

Prior probabilities

The principle of maximum entropy is often used to obtain prior probability distributions for Bayesian inference. Jaynes was a strong advocate of this approach, claiming the maximum entropy distribution represented the least informative distribution.

Posterior probabilities

Maximum entropy is a sufficient updating rule for radical probabilism. Richard Jeffrey's probability kinematics is a special case of maximum entropy inference.

Maximum entropy models

Alternatively, the principle is often invoked for model specification: in this case the observed data itself is taken to be the testable information. Such models are widely used in natural language processing. An example of such a model is logistic regression, which corresponds to the maximum entropy classifier for independent observations.

Probability density estimation

One of the main applications of the maximum entropy principle is in discrete and continuous density estimation. Similar to support vector machine estimators, the maximum entropy principle may require the solution of a quadratic programming problem, and thus provide a sparse mixture model as the optimal density estimator.
General solution

Discrete case

We have some testable information I about a quantity x taking values in {x_1, x_2, ..., x_n}. We assume this information has the form of m constraints on the expectations of the functions f_k; that is, we require our probability distribution to satisfy the moment constraints

    Σ_{i=1}^n Pr(x_i) f_k(x_i) = F_k,   k = 1, ..., m.

Furthermore, the probabilities must sum to one, giving the constraint

    Σ_{i=1}^n Pr(x_i) = 1.

The probability distribution with maximum information entropy subject to these constraints is

    Pr(x_i) = (1 / Z(λ_1, ..., λ_m)) exp[λ_1 f_1(x_i) + ... + λ_m f_m(x_i)],

for some λ_1, ..., λ_m. It is sometimes called the Gibbs distribution. The normalization constant is determined by

    Z(λ_1, ..., λ_m) = Σ_{i=1}^n exp[λ_1 f_1(x_i) + ... + λ_m f_m(x_i)],

and is conventionally called the partition function. The λ_k parameters are Lagrange multipliers. In the case of equality constraints, their values are determined from the solution of the nonlinear equations

    F_k = (∂/∂λ_k) log Z(λ_1, ..., λ_m).

In the case of inequality constraints, the Lagrange multipliers are determined from the solution of a convex optimization program. In general there is no closed form solution, and numerical methods are required.
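The identity F_k = ∂ log Z / ∂λ_k can be checked directly: for fixed multipliers, the expectations of the f_k under the Gibbs distribution equal the partial derivatives of log Z. A minimal sketch, with an illustrative four-point outcome set and two moment functions chosen here for concreteness, verifies this with a finite difference:

```python
import math

# For the discrete Gibbs form, E[f_k] under p equals d(log Z)/d(lam_k).
# Outcome set and moment functions below are illustrative choices.
xs = [0.0, 1.0, 2.0, 3.0]
fs = [lambda x: x, lambda x: x * x]   # two moment functions f_1, f_2

def log_Z(lams):
    return math.log(sum(math.exp(sum(l * f(x) for l, f in zip(lams, fs)))
                        for x in xs))

def expectations(lams):
    Z = math.exp(log_Z(lams))
    ps = [math.exp(sum(l * f(x) for l, f in zip(lams, fs))) / Z for x in xs]
    return [sum(p * f(x) for p, x in zip(ps, xs)) for f in fs]

lams = [0.3, -0.2]
eps = 1e-6
for k in range(2):
    bumped = list(lams)
    bumped[k] += eps
    deriv = (log_Z(bumped) - log_Z(lams)) / eps   # finite-difference d log Z / d lam_k
    assert abs(deriv - expectations(lams)[k]) < 1e-4
print(expectations(lams))
```

In practice one runs this relationship in reverse: given target moments F_k, the multipliers are found numerically (e.g., by Newton's method on these equations).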
Continuous case

For continuous distributions, the Shannon entropy cannot be used directly, as it is only defined for discrete probability spaces. Instead Jaynes (1963, 1968) gave the following formula, which is closely related to the relative entropy (see also differential entropy):

    H_c = -∫ p(x) log[p(x) / q(x)] dx,

where q(x), which Jaynes called the "invariant measure", is proportional to the limiting density of discrete points.

We have some testable information I about a quantity x which takes values in some interval of the real numbers (all integrals below are over this interval). We assume this information has the form of m constraints on the expectations of the functions f_k; that is, we require our probability density function to satisfy

    ∫ p(x) f_k(x) dx = F_k,   k = 1, ..., m,

and the probability density must integrate to one:

    ∫ p(x) dx = 1.

The probability density function with maximum H_c subject to these constraints is

    p(x) = (1 / Z(λ_1, ..., λ_m)) q(x) exp[λ_1 f_1(x) + ... + λ_m f_m(x)],

with the partition function determined by

    Z(λ_1, ..., λ_m) = ∫ q(x) exp[λ_1 f_1(x) + ... + λ_m f_m(x)] dx.

As in the discrete case, the values of the λ_k parameters in the case of equality constraints are determined by the system of nonlinear equations

    F_k = (∂/∂λ_k) log Z(λ_1, ..., λ_m),

while for inequality constraints they are determined from the solution of a convex optimization program.

The invariant measure function q(x) can be best understood by supposing that x is known to take values only in the bounded interval (a, b), and that no other information is given. Then the maximum entropy probability density function is

    p(x) = A · q(x),   a < x < b,

where A is a normalization constant. The invariant measure function is thus the prior density function encoding "lack of relevant information"; it cannot be determined by the principle of maximum entropy itself, and must be determined by some other logical method, such as the principle of transformation groups or marginalization theory.
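A small numeric sketch of the continuous case, with illustrative choices not taken from the article: take q uniform on [0, 1] and a single mean constraint E[x] = 0.3. The maximum-entropy density is then p(x) ∝ exp(λx), and λ can be found by bisection, approximating the integrals on a grid:

```python
import math

# Continuous sketch: invariant measure q uniform on [0, 1], one constraint
# E[x] = 0.3 (both illustrative). The maximum-entropy density is
# p(x) = exp(lam*x) / Z(lam); find lam by bisection, integrating with a
# midpoint rule on a fine grid.
M = 10000
grid = [(i + 0.5) / M for i in range(M)]     # midpoints of an [0,1] grid

def mean_for(lam):
    w = [math.exp(lam * x) for x in grid]
    return sum(wi * xi for wi, xi in zip(w, grid)) / sum(w)

lo, hi = -50.0, 50.0
for _ in range(100):
    mid = (lo + hi) / 2
    if mean_for(mid) < 0.3:
        lo = mid
    else:
        hi = mid
lam = (lo + hi) / 2
print(lam, mean_for(lam))
```

Since the target mean 0.3 is below 0.5 (the mean under the unconstrained uniform density), the multiplier λ comes out negative, tilting the density toward small x.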
Examples

For several examples of maximum entropy distributions, see the article on maximum entropy probability distributions.

Justifications for the principle of maximum entropy

Proponents of the principle of maximum entropy justify its use in assigning probabilities in several ways, including the following two arguments. These arguments take the use of Bayesian probability as given, and are thus subject to the same postulates.

Information entropy as a measure of 'uninformativeness'

Consider a discrete probability distribution among m mutually exclusive propositions. The most informative distribution would occur when one of the propositions was known to be true. In that case, the information entropy would be equal to zero. The least informative distribution would occur when there is no reason to favor any one of the propositions over the others. In that case, the only reasonable probability distribution would be uniform, and then the information entropy would be equal to its maximum possible value, log m. The information entropy can therefore be seen as a numerical measure which describes how uninformative a particular probability distribution is, ranging from zero (completely informative) to log m (completely uninformative).

By choosing to use the distribution with the maximum entropy allowed by our information, the argument goes, we are choosing the most uninformative distribution possible. To choose a distribution with lower entropy would be to assume information we do not possess. Thus the maximum entropy distribution is the only reasonable distribution.
The Wallis derivation

The following argument is the result of a suggestion made by Graham Wallis to E. T. Jaynes in 1962. It is essentially the same mathematical argument used for the Maxwell–Boltzmann statistics in statistical mechanics, although the conceptual emphasis is quite different. It has the advantage of being strictly combinatorial in nature, making no reference to information entropy as a measure of 'uncertainty', 'uninformativeness', or any other imprecisely defined concept. The information entropy function is not assumed a priori, but rather is found in the course of the argument; and the argument leads naturally to the procedure of maximizing the information entropy.

Suppose an individual wishes to make a probability assignment among m mutually exclusive propositions. He has some testable information, but is not sure how to go about including this information in his probability assessment. He therefore conceives of the following random experiment. He will distribute N quanta of probability (each worth 1/N) at random among the m possibilities. (One might imagine that he will throw N balls into m buckets while blindfolded. In order to be as fair as possible, each throw is to be independent of any other, and every bucket is to be equally likely.) Once the experiment is done, he will check whether the probability assignment thus obtained is consistent with his information. If it is inconsistent, he will reject it and try again. If it is consistent, his assessment will be

    p_i = n_i / N,

where p_i is the probability of the i-th proposition, while n_i is the number of quanta that were assigned to the i-th proposition (i.e. the number of balls that ended up in bucket i).

Now, in order to reduce the 'graininess' of the probability assignment, it will be necessary to use quite a large number of quanta of probability. Rather than actually carry out, and possibly have to repeat, the rather long random experiment, the protagonist decides to simply calculate and use the most probable result. The probability of any particular result is the multinomial distribution,

    Pr(p) = W · m^{-N},

where

    W = N! / (n_1! n_2! ... n_m!)

is sometimes known as the multiplicity of the outcome.

The most probable result is the one which maximizes the multiplicity W. Rather than maximizing W directly, the protagonist could equivalently maximize any monotonic increasing function of W. He decides to maximize

    (1/N) log W = (1/N) log [N! / (n_1! n_2! ... n_m!)]
                = (1/N) log [N! / ((N p_1)! (N p_2)! ... (N p_m)!)]
                = (1/N) (log N! - Σ_{i=1}^m log((N p_i)!)).

At this point, in order to simplify the expression, the protagonist takes the limit as N → ∞, i.e. as the probability levels go from grainy discrete values to smooth continuous values. Using Stirling's approximation, he finds

    lim_{N→∞} (1/N) log W = (1/N) (N log N - Σ_{i=1}^m N p_i log(N p_i))
                          = log N - Σ_{i=1}^m p_i log(N p_i)
                          = log N - log N Σ_{i=1}^m p_i - Σ_{i=1}^m p_i log p_i
                          = (1 - Σ_{i=1}^m p_i) log N - Σ_{i=1}^m p_i log p_i
                          = -Σ_{i=1}^m p_i log p_i
                          = H(p).

All that remains for the protagonist to do is to maximize entropy under the constraints of his testable information. He has found that the maximum entropy distribution is the most probable of all "fair" random distributions, in the limit as the probability levels go from discrete to continuous.
1583:
1579:
1570:
1566:
1560:
1556:
1552:
1549:
1546:
1538:
1534:
1525:
1521:
1515:
1511:
1506:
1502:
1499:
1494:
1489:
1486:
1483:
1479:
1475:
1467:
1463:
1459:
1456:
1453:
1448:
1444:
1437:
1430:
1429:
1428:
1426:
1408:
1404:
1400:
1397:
1394:
1389:
1385:
1361:
1357:
1348:
1344:
1335:
1331:
1325:
1321:
1317:
1314:
1311:
1303:
1299:
1290:
1286:
1280:
1276:
1271:
1267:
1264:
1253:
1249:
1245:
1242:
1239:
1234:
1230:
1223:
1219:
1214:
1206:
1202:
1188:
1187:
1186:
1169:
1166:
1158:
1154:
1142:
1137:
1134:
1131:
1127:
1119:
1118:
1117:
1101:
1097:
1073:
1070:
1067:
1064:
1061:
1058:
1055:
1052:
1046:
1042:
1038:
1030:
1026:
1017:
1013:
1004:
1000:
988:
983:
980:
977:
973:
965:
964:
963:
961:
954:
950:
943:
936:
929:
925:
918:Discrete case
917:
914:
906:
904:
902:
898:
895:. Similar to
894:
886:
884:
882:
878:
874:
866:
864:
862:
858:
854:
850:
842:
840:
838:
834:
830:
822:
820:
814:
798:
791:
788:
785:
782:
779:
772:
769:
733:
730:
725:
720:
716:
708:
707:
706:
704:
698:
696:
692:
688:
683:
667:
663:
640:
636:
612:
609:
604:
600:
596:
591:
587:
579:
578:
577:
559:
551:
547:
546:
545:
543:
535:
533:
529:
527:
523:
518:
516:
512:
508:
503:
501:
497:
492:
490:
486:
482:
478:
474:
470:
462:
460:
458:
454:
450:
446:
442:
438:
434:
430:
422:
420:
418:
412:
410:
406:
402:
398:
394:
382:
377:
375:
370:
368:
363:
362:
360:
359:
354:
349:
344:
343:
342:
341:
336:
333:
331:
328:
326:
323:
322:
321:
320:
315:
310:
307:
305:
302:
301:
300:
299:
294:
289:
286:
284:
281:
279:
276:
275:
274:
273:
268:
263:
260:
258:
255:
253:
250:
248:
245:
243:
240:
239:
238:
237:
232:
227:
224:
222:
219:
217:
214:
212:
209:
208:
207:
206:
201:
196:
193:
191:
188:
186:
183:
181:
178:
176:
175:Cox's theorem
173:
171:
168:
166:
163:
161:
158:
156:
153:
151:
148:
147:
146:
145:
140:
137:
133:
129:
125:
122:
121:
117:
113:
112:
109:
105:
101:
100:
91:
88:
80:
70:
66:
60:
59:
53:
48:
39:
38:
33:
19:
5817:Applications
5768:size scaling
5556:
5475:
5471:
5449:. Retrieved
5427:
5408:the original
5369:
5365:
5344:(1): 31–46.
5341:
5337:
5321:
5305:
5301:
5292:
5277:
5266:
5260:
5249:
5215:
5211:
5185:(1): 42–48.
5182:
5178:
5170:
5143:
5139:
5107:
5103:
5066:
5062:
5052:
5044:
5035:
5020:
5015:
5001:cite journal
4982:
4978:
4968:
4935:
4927:
4902:
4898:
4855:
4851:
4815:
4811:
4802:
4775:
4771:
4765:
4740:
4736:
4730:
4697:
4691:
4685:
4668:
4662:
4656:
4639:
4635:
4619:
4599:
4592:
4559:
4555:
4539:
4506:
4502:
4486:
4436:Info-metrics
4412:
4402:
4381:
4378:
4362:
References

- Jaynes, E. T. (1957). "Information Theory and Statistical Mechanics". Physical Review. Series II. 106 (4): 620–630. doi:10.1103/PhysRev.106.620.
- Jaynes, E. T. (1957). "Information Theory and Statistical Mechanics II". Physical Review. Series II. 108 (2): 171–190. doi:10.1103/PhysRev.108.171.
- Jaynes, E. T. (1963). "Information Theory and Statistical Mechanics". In Ford, K. (ed.). Statistical Physics. New York: Benjamin.
- Jaynes, E. T. (1968). "Prior Probabilities". IEEE Transactions on Systems Science and Cybernetics. 4 (3): 227–241.
- Jaynes, E. T. (2003). Probability Theory: The Logic of Science. Cambridge University Press. ISBN 978-0521592710.
- Giffin, A.; Caticha, A. (2007). "Updating Probabilities with Data and Moments".
- Harremoës, P.; Topsøe, F. (2001). "Maximum Entropy Fundamentals". Entropy. 3 (3): 191–226. doi:10.3390/e3030191.
- Uffink, Jos (1995). "Can the maximum entropy principle be explained as a consistency requirement?". Studies in History and Philosophy of Modern Physics. 26B (3): 223–261. hdl:1874/2649.
- Kapur, J. N.; Kesavan, H. K. (1992). Entropy Optimization Principles with Applications. Academic Press. ISBN 0-12-397670-7.
- Owen, A. B. (2001). Empirical Likelihood. Chapman and Hall/CRC. ISBN 1-58-488071-6.
Further reading

- Boyd, Stephen; Vandenberghe, Lieven (2004). Convex Optimization. Cambridge University Press. p. 362. ISBN 0-521-83378-7. Retrieved 2008-08-24.
Text is available under the Creative Commons Attribution-ShareAlike License. Additional terms may apply.