
Talk:Entropy (information theory)/Archive 2


For a given statistical distribution of symbols we can calculate the number of possible permutations and enumerate all messages. If we do that, we can send the statistics and the index of the message in the enumeration list instead of the message itself, and the message can be restored. But the index has a length as well, and it can be very long, so we consider the worst case and take the longest index, which is the number of possible permutations. For example, take a message 1000 symbols long over the alphabet A, B, C with symbol counts 700, 200 and 100. The number of possible permutations is 1000! / (700! * 200! * 100!). The approximate bit length of this number divided by the number of symbols is (log(1000!) − log(700!) − log(200!) − log(100!))/1000 = 1.147 bits/symbol, where all logarithms have base 2. If you calculate the entropy it is 1.157. The figures are close, and they asymptotically approach each other as the message grows; the limits are explained by Stirling's formula, so there is no trick, just approximation. Obviously, when writing his famous article Claude Shannon did not have a clear idea of what was going on and could not explain clearly what entropy is. He simply noticed that in compression by building binary trees similar to the Huffman tree the bit length of a symbol is close to −log(p) but always larger, and introduced entropy as a compression limit without a clear understanding. The article was published in 1948 and the Huffman algorithm did not exist yet, but there were other similar algorithms that produced slightly different binary trees built on the same concept, so Shannon knew them. What is surprising is not Shannon's entropy but the other scientists who have used obscure and misleading terminology for 60 years. Entropy is a measure of the number of different messages that can possibly be constructed under the constraint given by the frequency of every symbol; that is all, simple and clear.
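For what it's worth, the two figures quoted above are easy to reproduce; here is a minimal sketch in Python (the helper log2_factorial and the variable names are mine, not from the discussion):

    from math import lgamma, log, log2

    def log2_factorial(n):
        # log2(n!) via the log-gamma function, so no huge integers are built
        return lgamma(n + 1) / log(2)

    counts = [700, 200, 100]   # occurrences of A, B, C
    length = sum(counts)       # message length: 1000 symbols

    # bits per symbol needed to index one arrangement among all permutations
    index_bits = (log2_factorial(length)
                  - sum(log2_factorial(c) for c in counts)) / length

    # Shannon entropy of the same symbol frequencies, in bits per symbol
    entropy = -sum((c / length) * log2(c / length) for c in counts)

    print(round(index_bits, 3), round(entropy, 3))   # 1.147 and 1.157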
use the smallest quantumly distinguishable value for L. If x is truly a length, then L could be Planck's length. But this is already too obfuscating for me. I would rather recommend concentrating on the discrete formula of entropy: S = −Σ Pᵢ log Pᵢ. Now, in the continuous case, the probability is infinitesimal and it is dP = f(x) dx. Thus, the exact transcription of the above formula with this probability would give S = −Σ (f(x) dx) log(f(x) dx). Now the sum would become an integral, and log(f(x) dx) is a functional which must take the form L(x) dx. The worst problem now is that there are two dx under one integral. This problem appears in the above modified formula for S. It must be worked out somehow; its source is in the product in the initial Shannon entropy.
Transcendental functions (such as logarithms) of a variable x with units present problems for determining the units of the result. This is why scientists and engineers try to form ratios of quantities in which all the units cancel, and then apply transcendental functions to these ratios rather than to the original quantities. As an example, in exp(−E/(kT)) the constant k has the proper units for canceling the units of energy E and temperature T, so the units cancel in the quantity E/(kT). Then the result of applying a typical transcendental function to its dimensionless argument is also dimensionless.
differences between receivers and in what's called "English") is the reason a range instead of a single value is given. It's true that knowing more about the language (i.e. having more ability to predict the text) decreases the entropy; the studies on which the referenced statement is based are generally assuming something like the average user of English. Anyway, the statement in the article is what's in the reference and it's not appropriate for us to second-guess it.
- Sounds sensible to me. I favour the name "Entropy (information theory)" rather than "Shannon entropy" - because that will make it clearer to newbies that this is where they come to find out what the unqualified word "entropy" means when they come across it in an information-theory context. Very often "entropy" is discussed in the literature without specifying that it's "Shannon entropy", even in many cases where the discussion only applies to Shannon entropy. --

"Shannon's entropy measures the information contained in a message as opposed to the portion of the message that is determined (or predictable). Examples of the latter include redundancy in language structure or statistical properties relating to the occurrence frequencies of letter or word pairs, triplets etc. See Markov chain."

, ie the Kullback-Leibler distance from some prior distribution, rather than the Shannon entropy. This avoids all the problems of the infinities and the physical dimensionality; and often, when you think it through, you'll find that it may make a lot more sense philosophically in the context of your application, too.

, and not just Shannon entropy (which this article exclusively discusses). Most if not all uses of the term "entropy" in some sense quantify the "information", diversity, dissipation, or "mixing up" that is present in a probability distribution, stochastic process, or the microstates of a physical system.
Very many scientists like to make simple things complicated and earn respect by doing so. Information entropy is a very good example of such an attempt. Actually, entropy is only the number of possible permutations, expressed in bits, divided by the length of the message. And the concept is simple as well.
The example given about the sequence ABABAB... sounds like utter nonsense to me: a source that always produces the same sequence has entropy 0, regardless of whether the sequence consists of a single symbol or not. For instance, the sequence of integers produced by counting from 0 has entropy 0, even though each symbol (integer) is different.
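To illustrate how strongly the estimated rate depends on the probabilistic model, here is a small Python sketch (my own example; empirical frequencies stand in for the model probabilities):

    from collections import Counter
    from math import log2

    def entropy_per_character(blocks, block_length):
        counts = Counter(blocks)
        total = sum(counts.values())
        h = -sum(c / total * log2(c / total) for c in counts.values())
        return h / block_length

    text = "AB" * 1000

    letters = list(text)                                     # model 1: independent letters
    pairs = [text[i:i + 2] for i in range(0, len(text), 2)]  # model 2: "AB" blocks

    print(entropy_per_character(letters, 1))  # 1.0 bit per character
    print(entropy_per_character(pairs, 2))    # 0 bits per character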
f(x) will have the unit 1/x. Unless x is dimensionless, the unit of entropy will include the log of a unit, which is weird. This is a strong reason why it is more useful for the continuous case to use the relative entropy of a distribution, where the general form is the Kullback-Leibler divergence from
Hmmm, this article seems to assume that logs must always be taken to base 2 - which is not the case. We can define entropy to whatever base we like (in coding it often makes things easier to define it to a base equal to the number of code symbols, which in computer science is typically 2). This leads
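For illustration, the same toy distribution expressed in three different units; a minimal Python sketch (the probabilities are arbitrary):

    from math import e, log

    def entropy(probabilities, base):
        # H = -sum p * log_base(p); changing the base only changes the unit
        return -sum(p * log(p, base) for p in probabilities if p > 0)

    p = [0.7, 0.2, 0.1]
    print(entropy(p, 2))    # bits      (~1.157)
    print(entropy(p, e))    # nats      (~0.802)
    print(entropy(p, 10))   # hartleys  (~0.348)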
Of course, the relative entropy is very good for the continuous case, but, unlike Shannon entropy, it is relative, as it needs a second distribution from which to depart. I was thinking of a formula that would give a good absolute entropy, similar to the Shannon entropy, for the continuous case. This is purely speculative, though.
for a howto. A redirect will automatically be put in place. In general there's no need to go around fixing the 403 articles - they will gradually get fixed, either by bots or by users, and most users won't even notice the difference since the redirect thing happens so transparently. Some things will
Thank you both for the input. If no major objections are forthcoming in the next few days, I say we go ahead with the move to "Entropy (information theory)". However, I counted 403 articles that link here, excluding user and talk pages. Is there a bot somewhere we could use to at least pipe those
Since "uncertainty" (whatever that may mean) is used as a motivating factor in this article, it might be good to have a brief discussion about what is meant by "uncertainty." Should the reader simply assume the common definition of uncertainty? Or is there a specific technical meaning to this word
And by the way, as to Army1987's suggestions, it might be a little confusing for newbies to talk about Rényi entropy right in the intro. I did try to edit the intro a little, but if you feel you can word things there a little more clearly, please go right ahead. Or perhaps a section later in the
quote Consider a source that produces the string ABABABABAB... in which A is always followed by B and vice versa. If the probabilistic model considers individual letters as independent, the entropy rate of the sequence is 1 bit per character. But if the sequence is considered as "AB AB AB AB AB..."
I agree. That reference reads more like a rant than a discussion. Its author appears to lack some basic understanding of thermodynamic vs. information-theoretic entropy. The above comment is absolutely correct in that "the more random a system is the more information we need in order to describe
This is the way that Kardar introduced the information entropy in his book Statistical Physics of Particles. There is also a wikibook in the external links named An Intuitive Guide to the Concept of Entropy Arising in Various Sectors of Science, to which this kind of opinion might be contributed.
of the number of choices available. He did not attempt to analyze the mathematics behind unequal probability distributions like Shannon did, but he basically invented the concept of "bandwidth" as we know it today: that the rate of information that can be transmitted over a continuous channel is
For canceling the inverse unit of length (actually the inverse unit of x), there should appear a product of f(x) and a length L under the logarithm, i.e. log(f(x)·L). This would be, indeed, bizarre, as any length L would work - unless we are in the frame of quantum mechanics. In that case, we would simply
The article currently says "The entropy of English text is between 1.0 and 1.5 bits per letter.". Shouldn't the entropy in question decrease as one discovers more and more patterns in the language, making a text more predictable? If so, I think it would be a good idea to be a little less precise,
I haven't read the article or preceding comments, but AFAIK it is not a copyright violation to copy GFDL-licensed material to Wikipedia as long as it has proper attribution (maybe you need to change the attribution above to more closely reflect the kind of attribution PlanetMath wants, at most).
As I see it, there is by definition no uncertainty with respect to the survival of the parents, and a moment matrix of their characters may as well exist. Thus a Gaussian distribution may serve as a good approximation of the region of acceptability, A, determining the possible spread of parents
For instance, if we replace a generation with Gaussian distributed quantitative characters of one billion individuals in a large population with a new generation, the situation is quite different. This is like sending one billion different Gaussian distributed messages in parallel from parents to
Compression software does give a nice rule-of-thumb entropy estimate, but in this case the actual entropy is a lot lower because compression software designed for general-purpose use doesn't have the extensive knowledge of the language that allows humans to see more redundancy in the text. More
The extension to the continuous case has a subtle problem: the distribution f(x) has units of inverse length and the integral contains "log f(x)" in it. Logarithms should be taken on dimensionless quantities (quantities without units). Thus, the logarithm should be of the ratio of f(x) to some
No, that's like saying "The sum of 2 plus 2 can be regarded as 4." Entropy has a precise mathematical definition. It isn't just possible to "regard" it as having an exact value, it actually does have an exact value. At most it can be said that entropy is hard to measure, which (along with
The problem with taking a transcendental function of a quantity with units arises from the way we define arithmetic operations for quantities with units. 5 m + 2 m is defined (5 m + 2 m = 7 m) but 5 m + 2 kg is not defined because the units are different among the quantities to be added.
The statement at the end of the second paragraph is simply not true: "the shortest number of bits necessary to transmit the message is the Shannon entropy in bits/symbol multiplied by the number of symbols in the original message." -- the formula of (bit/symbol * number of symbols) does
In the roulette example, the probability of a combination of numbers hit over P spins is defined as Omega/T, but the entropy is given as lg(Omega), which then calculates to the Shannon definition. Why is lg(Omega) used? (Note: I'm using the notation "lg" to denote "log base 2")
I suggest renaming this article to either "Entropy (information theory)", or preferably, "Shannon entropy". The term "Information entropy" seems to be rarely used in a serious academic context, and I believe the term is redundant and unnecessarily confusing. Information
The article states: “Equivalently, the Shannon entropy is a measure of the average information content the recipient is missing when he does not know the value of the random variable.” This has also been interpreted as an uncertainty in a system, not a measure of the
links to the new article title? That would let the people watching those articles know about the new title and avoid all those annoying redirects for people who are just browsing Wikipedia. I guess I'm just not familiar with what's generally done in cases like this.
Boltzmann, Ludwig (1896, 1898). Vorlesungen über Gastheorie, 2 volumes, Leipzig 1895/98, UB: O 5262-6. English version: Lectures on gas theory, translated by Stephen G. Brush (1964). Berkeley: University of California Press; (1995) New York: Dover.
Thus, the last definition of h could not even be used. I recommend checking with a reliable source on this, then, maybe, if that formula is wrong, its erasure. Unfortunately, I have no knowledge of the way formulas are written in Wikipedia (yet).
The referenced article is mistaken. It refutes the claim that "information is proportional to physical randomness". However, the more random a system is, the more information we need in order to describe it. I suggest we remove this reference.
Ok, so you're angry and thinking Claude Shannon sucks. Even as I type I realize this is a pointless post, but seriously, expressing it unambiguously in mathematical terms that are irrefutable is essential, especially in a subject area such as
This interpretation is valid if we are sending a message from a sender to a receiver along a noisy channel, which may make the message uncertain. But there is an alternative interpretation where information entropy is hardly a measure of
Since the entropy was given as a definition, it does not need to be derived. On the other hand, a "derivation" can be given which gives a sense of the motivation for the definition as well as the link to thermodynamic entropy.
article about generalizations of Shannon's entropy would be more appropriate for mentioning Rényi entropy. Also by the way, I read Hartley's 1928 paper "Transmission of Information" after somebody posted a link to it in the
There appears to have been a confusion between two meanings of the word "outcome". Previously, the word was being used on these pages in a loose, informal, everyday sense to mean "the range of the random variable
I'm sorry if the above concept is a bit basic and present in basic textbooks. I have not studied the subject formally, but I may have to apply the entropy concept in a small analysis for my master's dissertation.
This talk page mentions that the article incorporates material from PlanetMath, which is licensed under the GFDL, but I am not sure that is enough? So, should the section be removed as a copyright violation?
I'm not trying to suggest it is. The pre-existing edit history for the target page simply makes it technically more difficult to accomplish the move without an administrator's help. I'll see what I can do.
−Σ p log p, was already known to Boltzmann and Gibbs from their study of the entropy discovered by Clausius, based upon Carnot's work improving the efficiency (and theoretical understanding) of steam engines.
etc., "Shannon entropy" is the term almost universally used. For me, the term "information entropy" is too vague and could easily be interpreted to include such concepts as
This section needs a major rewrite. It correctly states that Shannon entropy depends crucially on a probabilistic model. Several important points need to be made, though.
entropy in the context of Shannon's theory, and when it is necessary to disambiguate this type of information-theoretic entropy from other concepts such as thermodynamic
My suggested solution to the problem with the units raises another question: what choice of length L should be used in the expression log(f(x)·L)? I think any choice can work.
which should correspond to a rectangular distribution of m(x) between xmin and xmax. It is the entropy of a general bounded signal, and it gives the entropy in bits.
So how can entropy be the expectation of self-information? I sort-of understand what the formula is coming from, but it doesn't look theoretically sound... Thanks.
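One way to see the distinction numerically; a small Python sketch (the toy distribution and the names are mine): the self-information of a single outcome is a plain number, while I(X) is a random variable whose expectation is the entropy.

    import random
    from math import log2

    p = {"a": 0.5, "b": 0.25, "c": 0.25}             # toy distribution

    info = {w: -log2(pr) for w, pr in p.items()}     # I(w) for each outcome: plain numbers

    # I(X) is a random variable: sample X, then read off its self-information.
    samples = random.choices(list(p), weights=list(p.values()), k=100_000)
    mean_info = sum(info[w] for w in samples) / len(samples)

    entropy = sum(pr * info[w] for w, pr in p.items())   # H(X) = E[I(X)] = 1.5 bits
    print(mean_info, entropy)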
rigorous experiments usually show lower entropy rates for English, typically between 1.0 and 1.5 bits per character, as described in the reference I've added.
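For anyone who wants to reproduce the rule-of-thumb comparison, a rough Python sketch (the file name is a placeholder for any plain English text file):

    import bz2

    with open("sample_english_text.txt", "rb") as f:    # placeholder path
        data = f.read()

    compressed = bz2.compress(data, compresslevel=9)
    bits_per_character = 8 * len(compressed) / len(data)
    # General-purpose compressors usually land noticeably above the
    # 1.0-1.5 bits/character that careful experiments report for English.
    print(bits_per_character)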
also in the next paragraph quote However, if we use very large blocks, then the estimate of per-character entropy rate may become artificially low. endquote
Regarding the reference: Information is not entropy, information is not uncertainty! - a discussion of the use of the terms "information" and "entropy".
http://etext.lib.virginia.edu/etcbin/toccer-new2?id=StoCabi.sgm&images=images/modeng&data=/texts/english/modeng/parsed&tag=public&part=all
Already having an edit history isn't a valid reason not to move - the edit history would just have to be copied to the talk page to preserve it.
article. This guy was apparently the first one to recognize that the amount of information that could be transmitted was proportional to the
is, as stated, a measure of the complexity of an individual message, independent of any probability distribution; however, it is only defined
Uh-oh, what have I done? "Failed to parse (Missing texvc executable; please see math/README to configure.)" Could you please fix? Thank you.
, not entropy. Entropy is a measure of the complexity of the whole probability distribution, not of an individual message. Entropy is the

H(X) ≤ H(Ω), with equality holding only if the mapping is one-to-one over all subsets of Ω with non-zero measure (the "data processing theorem").
proportional to the width of the range of frequencies that one is allowed to use. And the formula for unequal probability distributions,
H = \int_1^P \log x \, dx - \int_1^{A_1} \log x \, dx - \int_1^{A_2} \log x \, dx - \cdots - \int_1^{A_n} \log x \, dx.
This would ensure the complete canceling of the second sum in H^Delta. With the current formula, there would remain a non-canceling term:
the distribution to a reference measure m(x). It could be pointed out that a useful special case of the relative entropy is:

H_{relative} = -\int_{x_{min}}^{x_{max}} f(x) \log_2\big(f(x)\,(x_{max}-x_{min})\big)\,dx,
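A quick numerical check of that special case; a Python sketch (the example densities are arbitrary): the rectangular density gives zero, and a more concentrated one gives a negative value.

    from math import log2

    def relative_entropy(f, xmin, xmax, steps=100_000):
        # H_rel = -integral of f(x) * log2( f(x) * (xmax - xmin) ) dx, midpoint rule
        dx = (xmax - xmin) / steps
        total = 0.0
        for i in range(steps):
            x = xmin + (i + 0.5) * dx
            fx = f(x)
            if fx > 0:
                total -= fx * log2(fx * (xmax - xmin)) * dx
        return total

    print(relative_entropy(lambda x: 0.25, 0.0, 4.0))                  # uniform: 0.0
    print(relative_entropy(lambda x: (2 - abs(x - 2)) / 4, 0.0, 4.0))  # triangular: about -0.28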
offspring. Every new message is a random, noisy recombination of messages from two randomly chosen parents, for instance.
saying "The entropy of English text can be regarded as being between 1.0 and 1.5 bits per letter." or similar instead. —
= \sum_i^P \log i - \sum_i^{A_1} \log i - \sum_i^{A_2} \log i - \cdots - \sum_i^{A_n} \log i
pockets which are all equally likely to be landed on by the ball, what is the probability of obtaining a distribution (
Still can't do the move, even though I tried to move the old page out of the way. An administrator needs to do this.
isn't the `per-character entropy rate' redundant? It should be either the `per-character entropy' or the `entropy rate'.
The last limit does not go to zero. Actually, through l'Hôpital's rule applied to (1 - Sum) / (1 / log Delta), it would go to
The last definition of the differential entropy (second last formula) seems to malfunction. Actually, it should read
H = (P \log P - P + 1) - (A_1 \log A_1 - A_1 + 1) - (A_2 \log A_2 - A_2 + 1) - \cdots - (A_n \log A_n - A_n + 1)
Ok, maybe I understand. I(omega) is a number, but I(X) is itself a random variable. I have fixed the formula.
226: 194: 4133: 3755: 3567: 3134: 141: 4278: 4228: 4160: 681:
is the number of possible combinations of outcomes (for the events) which fit the given distribution, and
I would do this myself, but this article is rather frequently viewed, so I am seeking some input first.
3144: 3042: 687: 101: 3651: 3004: 2971: 2962: 2953: 2636:
But "outcome" also has a technical meaning in probability, meaning the possible states of the universe {
2352: 281: 4286: 4252: 4236: 4198: 4168: 4153: 4137: 4106: 4033: 4019: 3988: 3974: 3965: 3950: 3941: 3927: 3910: 3891: 3876: 3839: 3821: 3759: 3737: 3622: 3571: 3541: 3496: 3147: 3107: 3045: 3034: 3016: 3007: 2996: 2234: 284: 230: 198: 169: 145: 4178:
The section about the entropy of a continuous function refers to a figure, but no figure is present.
3480:{\displaystyle H_{relative}=-\int _{x_{min}}^{x_{max}}f(x)\log _{2}(f(x)(x_{max}-x_{min}))\,dx,\quad } 4248: 4182: 4125: 3747: 3639:
When we are talking about the information content of an individual message, we are talking about its
3559: 3529: 3126: 3083: 3079: 3075: 3013: 2993: 2344: 2250: 2199: 97: 93:
message! The original should be replaced with something like the "shortest possible representation".
4186: 4098: 3970:
Oh, I see now. :-) I'm an admin and I'll do the move once the discussion settles (has it already?)
3957: 3933: 3919: 3883: 3813: 3729: 161:
If you want to work with continuous variables, you're on much stronger ground if you work with the
4262: 4220: 4065: 4190: 4102: 4054: 4012: 3961: 3937: 3923: 3887: 3869: 3845: 3817: 3733: 2335: 4266: 3860:, which is a generalization of Shannon entropy, . Throughout this article, the unqualified word 3493: 2988:, its about a megabyte of text. If I compress it using winzip I get 395K bytes. bzip2: 295KB. 1871:{\displaystyle =(P\log P+1)-(A_{1}\log A_{1}+1)-(A_{2}\log A_{2}+1)-\cdots -(A_{n}\log A_{n}+1)} 4001: 3510:
with symbols as two-character blocks, then the entropy rate is 0 bits per character. endquote
3065: 2226: 3857: 3802: 3794: 3640: 3618: 3057: 2348: 162: 3173:
I think there needs to be some explanation of the matter of units for the continuous case.
860:{\displaystyle H=\log \Omega =\log {\frac {P!}{A_{1}!\ A_{2}!\ A_{3}!\cdots \ A_{n}!}}\,\!} 4244: 3806: 3798: 3725: 3031: 2381: 4029: 3984: 3906: 3835: 4209:, and apparently the figure itself never was added as an image but only as a comment: 3901:
need tweaking I think, but nothing like 403. But that link I gave has all the info. --
3724:
Such a bound would be extremely difficult to obtain in the case of a single message, due to the
4009: 3971: 3947: 3866: 3852:
is a measure of . Several types of entropy can be introduced, the most common one is
3104: 3897: 3158:
I'm looking for reliable, hard references for the following phrase in the article:
2924:{\displaystyle H(\Omega )=-\sum _{\omega \in \Omega }p(\omega )\log _{2}p(\omega )} 2218: 262:
1), so it would cancel the first Delta in the limit above, and there would be only
2368:
Recent edits to this page now stress the word "outcome" in the opening sentence:
993:{\displaystyle =\log P!-\log A_{1}!-\log A_{2}!-\log A_{3}!-\cdots -\log A_{n}!\ } 4274: 3513:
the average number of bits needed to encode this string is zero (asymptotically)
4317:
4223:) without proper attribution of the authors as I think would be required by the 3658:
an additive constant, which depends on the specific model of computation chosen.
3614: 3516:
Also, treating this as a Markov chain (order 1), we can see from the formula in
2935: 166: 46:
4270: 4216: 671:{\displaystyle \Omega ={P! \over A_{1}!\ A_{2}!\ A_{3}!\ \cdots \ A_{n}!}\,\!} 2949:
Self-information of an event is a number, right? Not a random variable. Yes?
2593:{\displaystyle H(X)=-\sum _{\omega \in \Omega }p(\omega )\log _{2}p(\omega )} 1168:
The summations can be approximated closely by being replaced with integrals:
4025: 3980: 3902: 3831: 3506:
not sure about the section `Limitations of entropy as information content'.
741: 2989: 1470:{\displaystyle \int \log x\,dx=x\log x-\int x\,{dx \over x}=x\log x-x.\,\!} 3272:{\displaystyle H=-\int _{-\infty }^{\infty }f(x)\log _{2}f(x)\,dx,\quad } 308: 3790: 211:
Extending discrete entropy to the continuous case: differential entropy
2992:
235KB. This isn't normal English text, but I think you get the idea.
724:
is the number of all possible combinations of outcomes for the set of
2705:(Ω) are not in general the same. In fact we can say definitely that 2376:
is a measure of the average information content associated with the
123:
characteristic length L. Something like log(f(x)·L) would be more proper.
3611:
along A. See also the article about "Entropy in thermodynamics ...
2830:{\displaystyle H(X)=-\sum _{i=1}^{n}p(x_{i})\log _{2}p(x_{i}),\,\!} 2499:{\displaystyle H(X)=-\sum _{i=1}^{n}p(x_{i})\log _{2}p(x_{i}),\,\!} 3661:
Nonetheless the information entropy provides a lower bound on the
3655: 1976:{\displaystyle =P\log P-\sum _{x=1}^{n}A_{x}\log A_{x}+(1-n)\,\!} 89:
give the entropy when multiplied by the number of symbols in the
3100:
to different units of measurements: bits vs. nats vs. hartleys.
4296: 3517: 25: 3647:
self-information of a message, given our probabilistic model.
2217:
Not sure what you mean. At first glance it looks good to me.
4215:
Furthermore, apparently the text was copied and pasted from
2296: 2102:) can be dropped since it is a constant, independent of the 2244:
Thus, the Shannon entropy is a consequence of the equation
2088:{\displaystyle H=(1-n)-\sum _{x=1}^{n}p_{x}\log p_{x}\,\!} 3864:
will refer to Shannon entropy.", or something similar. --
4273:, all visible versions of that article were authored by 3103:
The article should probably be modified to reflect this
4206: 290:
moved to talk page because wikipedia is not a textbook
4068: 3675: 3292: 3182: 2849: 2729: 2518: 2398: 2293: 2253: 2182:{\displaystyle H=-\sum _{x=1}^{n}p_{x}\log p_{x}\,\!} 2118: 2009: 1885: 1711: 1489: 1377: 1177: 1013: 878: 753: 740:
The entropy of the distribution is obtained from the
690: 580: 430: 356: 3520:
and also in this article that the entropy rate is 0
2984:
If I take the text of the book "Uncle Tom's Cabin",
4261:I think you are right. As far as I understand the 2196:(Isn't factor of P dropped in the formula above?) 4089: 3713: 3479: 3271: 2923: 2829: 2657:...), which are then mapped down onto the states { 2592: 2498: 2318: 2273: 2181: 2087: 1975: 1870: 1696: 1469: 1357: 1155: 992: 859: 713: 670: 560: 400: 2689:It is important the mapping X may in general be 2825: 2494: 2177: 2083: 1971: 1465: 1353: 1151: 855: 666: 556: 396: 3766:Name change suggestion to alleviate confusion 3631:Limitations of entropy as information content 2682:(considered to be a function mapping Ω -: --> 8: 4205:The corresponding text apparently was added 3744:though each symbol (integer) is different. 2934:But in general the two are not the same. -- 411:is the total number of ball-landing events? 2319:{\displaystyle {\mathcal {S}}=k\ln \Omega } 3844:Yes, and the article could begin with "In 3714:{\displaystyle H(M)\leq \mathbb {E} K(M).} 401:{\displaystyle P=\sum _{i=1}^{n}A_{i}\,\!} 4117:Scientists make simple things complicated 4067: 3856:, defined as . Other definitions include 3692: 3691: 3674: 3665:Kolmogorov complexity of a message, i.e.: 3446: 3427: 3396: 3366: 3361: 3348: 3343: 3297: 3291: 3235: 3213: 3205: 3181: 3056:Since entropy was formally introduced by 2900: 2872: 2848: 2811: 2792: 2779: 2763: 2752: 2728: 2569: 2541: 2517: 2480: 2461: 2448: 2432: 2421: 2397: 2295: 2294: 2292: 2284:which relates to Boltzmann's definition, 2252: 2169: 2153: 2143: 2132: 2117: 2075: 2059: 2049: 2038: 2008: 2000:and doing some simple algebra we obtain: 1945: 1929: 1919: 1908: 1884: 1853: 1837: 1806: 1790: 1765: 1749: 1710: 1679: 1666: 1650: 1619: 1606: 1590: 1565: 1552: 1536: 1488: 1424: 1376: 1323: 1318: 1313: 1275: 1270: 1265: 1233: 1228: 1223: 1193: 1188: 1176: 1132: 1127: 1122: 1092: 1087: 1082: 1058: 1053: 1048: 1026: 1021: 1012: 979: 951: 929: 907: 877: 841: 823: 808: 793: 778: 752: 703: 691: 689: 652: 632: 617: 602: 587: 579: 548: 534: 517: 497: 482: 467: 452: 442: 437: 429: 388: 378: 367: 355: 3464: 3256: 2823: 2492: 2175: 2081: 1969: 1463: 1422: 1390: 1351: 1340: 1292: 1250: 1208: 1149: 853: 664: 554: 394: 4315:Do not edit the contents of this page. 3060:the article should refer to his work: 44:Do not edit the contents of this page. 261:infinity as 1/Delta (since Sum -: --> 7: 3518:http://en.wikipedia.org/Entropy_rate 3113:Mistake inside an external reference 2720:The correct equations are therefore 714:{\displaystyle \mathrm {T} =n^{P}\ } 248:0) = Integral - -lim (Delta -: --> 3896:Moving the page is quite easy, see 3051: 4174:Missing figure for continuous case 3214: 3209: 3012:Thanks, that was a good one.  :-) 2879: 2856: 2548: 2364:H(X), H(Ω), and the word 'outcome' 2313: 2266: 766: 692: 581: 443: 439: 24: 1368:The integral of the logarithm is 18:Talk:Entropy (information theory) 4300: 3979:seems pretty settled to me... -- 3771: 2629:...) that might be revealed for 2330:of thermodynamic entropy, where 2274:{\displaystyle H=\log \Omega \ } 29: 3052:Boltzmann's lectures on entropy 2389:and have changed formulas like 296:Derivation of Shannon's entropy 183:is purely speculative, though. 
4225:GNU Free Documentation License 4034:14:27, 19 September 2008 (UTC) 4020:10:00, 19 September 2008 (UTC) 3989:08:22, 19 September 2008 (UTC) 3705: 3699: 3685: 3679: 3461: 3458: 3420: 3417: 3411: 3405: 3389: 3383: 3330: 3324: 3253: 3247: 3228: 3222: 3192: 3186: 3148:16:36, 25 September 2007 (UTC) 2918: 2912: 2893: 2887: 2859: 2853: 2817: 2804: 2785: 2772: 2739: 2733: 2587: 2581: 2562: 2556: 2528: 2522: 2486: 2473: 2454: 2441: 2408: 2402: 2028: 2016: 1966: 1954: 1865: 1830: 1818: 1783: 1777: 1742: 1736: 1715: 1691: 1643: 1631: 1583: 1577: 1529: 1523: 1496: 343:is the number of times pocket 1: 4287:22:21, 22 December 2008 (UTC) 4253:21:38, 22 December 2008 (UTC) 4237:21:28, 22 December 2008 (UTC) 4199:17:10, 22 December 2008 (UTC) 4154:02:15, 25 November 2008 (UTC) 4090:{\displaystyle -\sum p\log p} 3918:already has an edit history. 3475: 3267: 2235:23:13, 18 November 2008 (UTC) 2212:22:43, 18 November 2008 (UTC) 2109:distribution. The result is 231:10:18, 12 February 2007 (UTC) 146:18:06, 17 December 2006 (UTC) 118:Units and the Continuous Case 4169:09:02, 3 December 2010 (UTC) 3997:moved, indeed. I'm adding a 3916:Entropy (information theory) 3760:19:30, 21 January 2010 (UTC) 3572:19:41, 27 January 2008 (UTC) 3556:that should be introduced? 3542:07:23, 16 January 2008 (UTC) 3169:Units in the continuous case 2678:...) by the random variable 199:13:52, 8 February 2007 (UTC) 170:19:32, 7 February 2007 (UTC) 4107:22:03, 23 August 2008 (UTC) 3975:07:09, 27 August 2008 (UTC) 3966:06:22, 27 August 2008 (UTC) 3951:04:27, 27 August 2008 (UTC) 3942:06:42, 27 August 2008 (UTC) 3928:00:46, 27 August 2008 (UTC) 3911:13:40, 25 August 2008 (UTC) 3892:21:05, 23 August 2008 (UTC) 3877:10:59, 23 August 2008 (UTC) 3840:08:36, 23 August 2008 (UTC) 3822:01:29, 23 August 2008 (UTC) 3497:13:38, 6 October 2007 (UTC) 2980:Compression of English Text 2608:" -- ie the set of values { 4361: 3769: 285:20:41, 31 March 2006 (UTC) 4138:17:47, 24 June 2008 (UTC) 3738:21:28, 15 July 2008 (UTC) 3108:01:16, 13 June 2007 (UTC) 3046:13:18, 26 June 2007 (UTC) 2975:13:30, 4 March 2007 (UTC) 2966:13:27, 4 March 2007 (UTC) 2957:13:19, 4 March 2007 (UTC) 2939:11:37, 4 March 2007 (UTC) 734:And what is the entropy? 3623:13:30, 8 June 2008 (UTC) 3035:11:43, 7 June 2007 (UTC) 3017:19:35, 21 May 2007 (UTC) 3008:19:23, 21 May 2007 (UTC) 2997:19:06, 13 May 2007 (UTC) 419:multinomial distribution 3793:, topological entropy, 3502:Entropy vs Entropy Rate 3025:Entropy of English text 295: 4207:almost three years ago 4091: 3715: 3481: 3273: 2944: 2925: 2831: 2768: 2594: 2500: 2437: 2363: 2359:) 17:34, 1 March 2007. 2320: 2275: 2183: 2148: 2089: 2054: 1977: 1924: 1872: 1698: 1471: 1359: 1157: 1139: 1099: 1065: 1031: 994: 861: 715: 672: 562: 402: 383: 4313:of past discussions. 4092: 3716: 3652:Kolmogorov complexity 3482: 3274: 3154:Looking for reference 3137:) 07:32, 13 June 2007 3129:comment was added by 3090:) 19:35, 7 June 2007. 3078:comment was added by 2945:Sorry, I don't get it 2926: 2832: 2748: 2595: 2501: 2417: 2347:comment was added by 2321: 2276: 2184: 2128: 2090: 2034: 1978: 1904: 1873: 1699: 1472: 1360: 1158: 1118: 1078: 1044: 1017: 995: 862: 716: 673: 563: 417:The probability is a 403: 363: 247:h = lim (Delta -: --> 240:h = lim (Delta -: --> 221:comment was added by 189:comment was added by 136:comment was added by 42:of past discussions. 
4066: 3673: 3290: 3180: 2847: 2727: 2516: 2396: 2291: 2251: 2116: 2007: 1883: 1709: 1487: 1375: 1175: 1011: 876: 751: 688: 578: 428: 354: 259:and, as Delta -: --> 3379: 3218: 2374:information entropy 1330: 1282: 1240: 1198: 265:- lim (Delta -: --> 255:- lim (Delta -: --> 4087: 4055:Information theory 4024:oops ok, thanks -- 4010:A r m y 1 9 8 7 ! 3846:information theory 3711: 3477: 3476: 3465: 3339: 3269: 3268: 3257: 3201: 2921: 2883: 2827: 2826: 2824: 2590: 2552: 2496: 2495: 2493: 2336:Boltzmann constant 2316: 2271: 2179: 2178: 2176: 2098:and the term (1 − 2085: 2084: 2082: 1973: 1972: 1970: 1868: 1694: 1480:So the entropy is 1467: 1466: 1464: 1423: 1391: 1355: 1354: 1352: 1341: 1309: 1293: 1261: 1251: 1219: 1209: 1184: 1153: 1152: 1150: 990: 857: 856: 854: 711: 668: 667: 665: 558: 557: 555: 398: 397: 395: 347:was landed on and 4348: 4347: 4325: 4324: 4319:current talk page 4202: 4185:comment added by 4140: 4128:comment added by 3750:comment added by 3574: 3562:comment added by 3544: 3532:comment added by 3138: 3091: 2868: 2537: 2360: 2270: 2222: 2202:comment added by 1437: 989: 851: 836: 818: 803: 710: 662: 647: 642: 627: 612: 542: 527: 512: 507: 492: 477: 447: 234: 202: 149: 114: 100:comment added by 77: 76: 54: 53: 48:current talk page 4352: 4339: 4327: 4326: 4304: 4303: 4297: 4279:Tobias Bergemann 4229:Tobias Bergemann 4201: 4179: 4123: 4096: 4094: 4093: 4088: 4018: 4007:tag at the top. 4006: 4000: 3875: 3867:A r m y 1 9 8 7 3781: 3775: 3774: 3762: 3720: 3718: 3717: 3712: 3695: 3641:self-information 3557: 3527: 3486: 3484: 3483: 3478: 3457: 3456: 3438: 3437: 3401: 3400: 3378: 3377: 3376: 3360: 3359: 3358: 3323: 3322: 3278: 3276: 3275: 3270: 3240: 3239: 3217: 3212: 3124: 3073: 3058:Ludwig Boltzmann 2930: 2928: 2927: 2922: 2905: 2904: 2882: 2836: 2834: 2833: 2828: 2816: 2815: 2797: 2796: 2784: 2783: 2767: 2762: 2599: 2597: 2596: 2591: 2574: 2573: 2551: 2505: 2503: 2502: 2497: 2485: 2484: 2466: 2465: 2453: 2452: 2436: 2431: 2342: 2325: 2323: 2322: 2317: 2300: 2299: 2280: 2278: 2277: 2272: 2269: 2220: 2214: 2188: 2186: 2185: 2180: 2174: 2173: 2158: 2157: 2147: 2142: 2094: 2092: 2091: 2086: 2080: 2079: 2064: 2063: 2053: 2048: 1982: 1980: 1979: 1974: 1950: 1949: 1934: 1933: 1923: 1918: 1877: 1875: 1874: 1869: 1858: 1857: 1842: 1841: 1811: 1810: 1795: 1794: 1770: 1769: 1754: 1753: 1703: 1701: 1700: 1695: 1684: 1683: 1671: 1670: 1655: 1654: 1624: 1623: 1611: 1610: 1595: 1594: 1570: 1569: 1557: 1556: 1541: 1540: 1476: 1474: 1473: 1468: 1438: 1433: 1425: 1364: 1362: 1361: 1356: 1329: 1328: 1327: 1317: 1281: 1280: 1279: 1269: 1239: 1238: 1237: 1227: 1197: 1192: 1162: 1160: 1159: 1154: 1138: 1137: 1136: 1126: 1098: 1097: 1096: 1086: 1064: 1063: 1062: 1052: 1030: 1025: 999: 997: 996: 991: 988: 984: 983: 956: 955: 934: 933: 912: 911: 866: 864: 863: 858: 852: 850: 846: 845: 835: 828: 827: 817: 813: 812: 802: 798: 797: 787: 779: 720: 718: 717: 712: 709: 708: 707: 695: 677: 675: 674: 669: 663: 661: 657: 656: 646: 641: 637: 636: 626: 622: 621: 611: 607: 606: 596: 588: 567: 565: 564: 559: 553: 552: 547: 543: 535: 528: 526: 522: 521: 511: 506: 502: 501: 491: 487: 486: 476: 472: 471: 461: 453: 448: 446: 438: 407: 405: 404: 399: 393: 392: 382: 377: 275:Roulette Example 216: 184: 163:relative entropy 131: 113: 94: 68: 56: 55: 33: 32: 26: 4360: 4359: 4355: 4354: 4353: 4351: 4350: 4349: 4335: 4301: 4213: 4180: 4176: 4161:Tschijnmotschau 4119: 4064: 4063: 4008: 4004: 3998: 3865: 3854:Shannon entropy 3807:Tsallis entropy 3799:Tsallis entropy 3782: 3779: 3777: 3772: 3768: 3745: 3726:halting problem 3671: 3670: 
3633: 3553: 3504: 3442: 3423: 3392: 3362: 3344: 3293: 3288: 3287: 3231: 3178: 3177: 3171: 3156: 3125:—The preceding 3115: 3097: 3074:—The preceding 3054: 3027: 3014:Daniel.Cardenas 2994:Daniel.Cardenas 2982: 2947: 2896: 2845: 2844: 2807: 2788: 2775: 2725: 2724: 2677: 2670: 2663: 2656: 2649: 2642: 2628: 2621: 2614: 2565: 2514: 2513: 2476: 2457: 2444: 2394: 2393: 2382:random variable 2366: 2343:—The preceding 2340: 2289: 2288: 2249: 2248: 2197: 2165: 2149: 2114: 2113: 2107: 2071: 2055: 2005: 2004: 1997: 1993: 1941: 1925: 1881: 1880: 1849: 1833: 1802: 1786: 1761: 1745: 1707: 1706: 1675: 1662: 1646: 1615: 1602: 1586: 1561: 1548: 1532: 1485: 1484: 1426: 1373: 1372: 1319: 1271: 1229: 1173: 1172: 1128: 1088: 1054: 1009: 1008: 975: 947: 925: 903: 874: 873: 837: 819: 804: 789: 788: 780: 749: 748: 699: 686: 685: 648: 628: 613: 598: 597: 589: 576: 575: 530: 529: 513: 493: 478: 463: 462: 454: 426: 425: 384: 352: 351: 341: 334: 327: 320: 298: 292: 277: 217:—The preceding 213: 185:—The preceding 132:—The preceding 120: 95: 82: 64: 30: 22: 21: 20: 12: 11: 5: 4358: 4356: 4346: 4345: 4340: 4333: 4323: 4322: 4305: 4294: 4292: 4291: 4290: 4289: 4256: 4255: 4211: 4175: 4172: 4146:85.224.240.204 4118: 4115: 4114: 4113: 4112: 4111: 4110: 4109: 4086: 4083: 4080: 4077: 4074: 4071: 4050: 4049: 4048: 4047: 4046: 4045: 4044: 4043: 4042: 4041: 4040: 4039: 4038: 4037: 4036: 4022: 3930: 3770: 3767: 3764: 3742: 3722: 3721: 3710: 3707: 3704: 3701: 3698: 3694: 3690: 3687: 3684: 3681: 3678: 3667: 3666: 3659: 3648: 3632: 3629: 3628: 3627: 3626: 3625: 3605: 3604: 3603: 3602: 3595: 3594: 3593: 3592: 3585: 3584: 3583: 3582: 3552: 3549: 3547: 3534:71.137.215.129 3503: 3500: 3488: 3487: 3474: 3471: 3468: 3463: 3460: 3455: 3452: 3449: 3445: 3441: 3436: 3433: 3430: 3426: 3422: 3419: 3416: 3413: 3410: 3407: 3404: 3399: 3395: 3391: 3388: 3385: 3382: 3375: 3372: 3369: 3365: 3357: 3354: 3351: 3347: 3342: 3338: 3335: 3332: 3329: 3326: 3321: 3318: 3315: 3312: 3309: 3306: 3303: 3300: 3296: 3280: 3279: 3266: 3263: 3260: 3255: 3252: 3249: 3246: 3243: 3238: 3234: 3230: 3227: 3224: 3221: 3216: 3211: 3208: 3204: 3200: 3197: 3194: 3191: 3188: 3185: 3170: 3167: 3155: 3152: 3151: 3150: 3145:198.145.196.71 3114: 3111: 3096: 3093: 3053: 3050: 3049: 3048: 3043:216.75.189.154 3026: 3023: 3022: 3021: 3020: 3019: 2981: 2978: 2946: 2943: 2932: 2931: 2920: 2917: 2914: 2911: 2908: 2903: 2899: 2895: 2892: 2889: 2886: 2881: 2878: 2875: 2871: 2867: 2864: 2861: 2858: 2855: 2852: 2838: 2837: 2822: 2819: 2814: 2810: 2806: 2803: 2800: 2795: 2791: 2787: 2782: 2778: 2774: 2771: 2766: 2761: 2758: 2755: 2751: 2747: 2744: 2741: 2738: 2735: 2732: 2675: 2668: 2661: 2654: 2647: 2640: 2626: 2619: 2612: 2601: 2600: 2589: 2586: 2583: 2580: 2577: 2572: 2568: 2564: 2561: 2558: 2555: 2550: 2547: 2544: 2540: 2536: 2533: 2530: 2527: 2524: 2521: 2507: 2506: 2491: 2488: 2483: 2479: 2475: 2472: 2469: 2464: 2460: 2456: 2451: 2447: 2443: 2440: 2435: 2430: 2427: 2424: 2420: 2416: 2413: 2410: 2407: 2404: 2401: 2387: 2386: 2365: 2362: 2328: 2327: 2315: 2312: 2309: 2306: 2303: 2298: 2282: 2281: 2268: 2265: 2262: 2259: 2256: 2242: 2241: 2240: 2239: 2238: 2237: 2204:128.200.203.33 2191: 2190: 2172: 2168: 2164: 2161: 2156: 2152: 2146: 2141: 2138: 2135: 2131: 2127: 2124: 2121: 2105: 2096: 2095: 2078: 2074: 2070: 2067: 2062: 2058: 2052: 2047: 2044: 2041: 2037: 2033: 2030: 2027: 2024: 2021: 2018: 2015: 2012: 1995: 1991: 1986: 1985: 1984: 1983: 1968: 1965: 1962: 1959: 1956: 1953: 1948: 1944: 1940: 1937: 1932: 1928: 1922: 1917: 1914: 1911: 1907: 1903: 1900: 1897: 1894: 1891: 1888: 1878: 1867: 
1864: 1861: 1856: 1852: 1848: 1845: 1840: 1836: 1832: 1829: 1826: 1823: 1820: 1817: 1814: 1809: 1805: 1801: 1798: 1793: 1789: 1785: 1782: 1779: 1776: 1773: 1768: 1764: 1760: 1757: 1752: 1748: 1744: 1741: 1738: 1735: 1732: 1729: 1726: 1723: 1720: 1717: 1714: 1693: 1690: 1687: 1682: 1678: 1674: 1669: 1665: 1661: 1658: 1653: 1649: 1645: 1642: 1639: 1636: 1633: 1630: 1627: 1622: 1618: 1614: 1609: 1605: 1601: 1598: 1593: 1589: 1585: 1582: 1579: 1576: 1573: 1568: 1564: 1560: 1555: 1551: 1547: 1544: 1539: 1535: 1531: 1528: 1525: 1522: 1519: 1516: 1513: 1510: 1507: 1504: 1501: 1498: 1495: 1492: 1478: 1477: 1462: 1459: 1456: 1453: 1450: 1447: 1444: 1441: 1436: 1432: 1429: 1421: 1418: 1415: 1412: 1409: 1406: 1403: 1400: 1397: 1394: 1389: 1386: 1383: 1380: 1366: 1365: 1350: 1347: 1344: 1339: 1336: 1333: 1326: 1322: 1316: 1312: 1308: 1305: 1302: 1299: 1296: 1291: 1288: 1285: 1278: 1274: 1268: 1264: 1260: 1257: 1254: 1249: 1246: 1243: 1236: 1232: 1226: 1222: 1218: 1215: 1212: 1207: 1204: 1201: 1196: 1191: 1187: 1183: 1180: 1166: 1165: 1164: 1163: 1148: 1145: 1142: 1135: 1131: 1125: 1121: 1117: 1114: 1111: 1108: 1105: 1102: 1095: 1091: 1085: 1081: 1077: 1074: 1071: 1068: 1061: 1057: 1051: 1047: 1043: 1040: 1037: 1034: 1029: 1024: 1020: 1016: 1003: 1002: 1001: 1000: 987: 982: 978: 974: 971: 968: 965: 962: 959: 954: 950: 946: 943: 940: 937: 932: 928: 924: 921: 918: 915: 910: 906: 902: 899: 896: 893: 890: 887: 884: 881: 868: 867: 849: 844: 840: 834: 831: 826: 822: 816: 811: 807: 801: 796: 792: 786: 783: 777: 774: 771: 768: 765: 762: 759: 756: 722: 721: 706: 702: 698: 694: 679: 678: 660: 655: 651: 645: 640: 635: 631: 625: 620: 616: 610: 605: 601: 595: 592: 586: 583: 569: 568: 551: 546: 541: 538: 533: 525: 520: 516: 510: 505: 500: 496: 490: 485: 481: 475: 470: 466: 460: 457: 451: 445: 441: 436: 433: 409: 408: 391: 387: 381: 376: 373: 370: 366: 362: 359: 339: 332: 325: 318: 297: 294: 293: 291: 288: 276: 273: 223:193.254.231.71 212: 209: 208: 207: 206: 205: 204: 203: 191:193.254.231.71 175: 174: 173: 172: 156: 155: 119: 116: 102:139.149.31.232 81: 78: 75: 74: 69: 62: 52: 51: 34: 23: 15: 14: 13: 10: 9: 6: 4: 3: 2: 4357: 4344: 4341: 4338: 4334: 4332: 4329: 4328: 4320: 4316: 4312: 4311: 4306: 4299: 4298: 4295: 4288: 4284: 4280: 4276: 4272: 4268: 4264: 4260: 4259: 4258: 4257: 4254: 4250: 4246: 4241: 4240: 4239: 4238: 4234: 4230: 4226: 4222: 4218: 4210: 4208: 4203: 4200: 4196: 4192: 4188: 4184: 4173: 4171: 4170: 4166: 4162: 4156: 4155: 4151: 4147: 4141: 4139: 4135: 4131: 4130:63.144.61.175 4127: 4116: 4108: 4104: 4100: 4084: 4081: 4078: 4075: 4072: 4069: 4060: 4056: 4051: 4035: 4031: 4027: 4023: 4021: 4017: 4014: 4011: 4003: 3996: 3992: 3991: 3990: 3986: 3982: 3978: 3977: 3976: 3973: 3969: 3968: 3967: 3963: 3959: 3954: 3953: 3952: 3949: 3945: 3944: 3943: 3939: 3935: 3931: 3929: 3925: 3921: 3917: 3914: 3913: 3912: 3908: 3904: 3899: 3895: 3894: 3893: 3889: 3885: 3880: 3879: 3878: 3874: 3871: 3868: 3863: 3859: 3858:Rényi entropy 3855: 3851: 3847: 3843: 3842: 3841: 3837: 3833: 3829: 3826: 3825: 3824: 3823: 3819: 3815: 3810: 3808: 3804: 3803:Rényi entropy 3800: 3796: 3795:Rényi entropy 3792: 3788: 3765: 3763: 3761: 3757: 3753: 3752:99.65.138.158 3749: 3740: 3739: 3735: 3731: 3727: 3708: 3702: 3696: 3688: 3682: 3676: 3669: 3668: 3664: 3660: 3657: 3653: 3649: 3646: 3642: 3638: 3637: 3636: 3630: 3624: 3620: 3616: 3612: 3609: 3608: 3607: 3606: 3599: 3598: 3597: 3596: 3589: 3588: 3587: 3586: 3579: 3578: 3577: 3576: 3575: 3573: 3569: 3565: 3564:131.215.7.196 3561: 3550: 3548: 3545: 3543: 3539: 3535: 3531: 3524: 3521: 3519: 
3514: 3511: 3507: 3501: 3499: 3498: 3495: 3491: 3472: 3469: 3466: 3453: 3450: 3447: 3443: 3439: 3434: 3431: 3428: 3424: 3414: 3408: 3402: 3397: 3393: 3386: 3380: 3373: 3370: 3367: 3363: 3355: 3352: 3349: 3345: 3340: 3336: 3333: 3327: 3319: 3316: 3313: 3310: 3307: 3304: 3301: 3298: 3294: 3286: 3285: 3284: 3264: 3261: 3258: 3250: 3244: 3241: 3236: 3232: 3225: 3219: 3206: 3202: 3198: 3195: 3189: 3183: 3176: 3175: 3174: 3168: 3166: 3162: 3159: 3153: 3149: 3146: 3141: 3140: 3139: 3136: 3132: 3131:89.139.67.125 3128: 3122: 3118: 3112: 3110: 3109: 3106: 3101: 3094: 3092: 3089: 3085: 3081: 3077: 3071: 3070: 3069:0-486-68455-5 3067: 3061: 3059: 3047: 3044: 3039: 3038: 3037: 3036: 3033: 3024: 3018: 3015: 3011: 3010: 3009: 3006: 3005:129.97.79.144 3001: 3000: 2999: 2998: 2995: 2991: 2987: 2979: 2977: 2976: 2973: 2972:83.67.217.254 2968: 2967: 2964: 2963:83.67.217.254 2959: 2958: 2955: 2954:83.67.217.254 2950: 2942: 2940: 2937: 2915: 2909: 2906: 2901: 2897: 2890: 2884: 2876: 2873: 2869: 2865: 2862: 2850: 2843: 2842: 2841: 2820: 2812: 2808: 2801: 2798: 2793: 2789: 2780: 2776: 2769: 2764: 2759: 2756: 2753: 2749: 2745: 2742: 2736: 2730: 2723: 2722: 2721: 2718: 2716: 2712: 2708: 2704: 2700: 2696: 2692: 2687: 2685: 2681: 2674: 2667: 2660: 2653: 2646: 2639: 2634: 2632: 2625: 2618: 2611: 2607: 2584: 2578: 2575: 2570: 2566: 2559: 2553: 2545: 2542: 2538: 2534: 2531: 2525: 2519: 2512: 2511: 2510: 2489: 2481: 2477: 2470: 2467: 2462: 2458: 2449: 2445: 2438: 2433: 2428: 2425: 2422: 2418: 2414: 2411: 2405: 2399: 2392: 2391: 2390: 2384: 2383: 2377: 2375: 2371: 2370: 2369: 2361: 2358: 2354: 2350: 2346: 2339: 2337: 2333: 2310: 2307: 2304: 2301: 2287: 2286: 2285: 2263: 2260: 2257: 2254: 2247: 2246: 2245: 2236: 2232: 2228: 2224: 2216: 2215: 2213: 2209: 2205: 2201: 2195: 2194: 2193: 2192: 2170: 2166: 2162: 2159: 2154: 2150: 2144: 2139: 2136: 2133: 2129: 2125: 2122: 2119: 2112: 2111: 2110: 2108: 2101: 2076: 2072: 2068: 2065: 2060: 2056: 2050: 2045: 2042: 2039: 2035: 2031: 2025: 2022: 2019: 2013: 2010: 2003: 2002: 2001: 1999: 1963: 1960: 1957: 1951: 1946: 1942: 1938: 1935: 1930: 1926: 1920: 1915: 1912: 1909: 1905: 1901: 1898: 1895: 1892: 1889: 1886: 1879: 1862: 1859: 1854: 1850: 1846: 1843: 1838: 1834: 1827: 1824: 1821: 1815: 1812: 1807: 1803: 1799: 1796: 1791: 1787: 1780: 1774: 1771: 1766: 1762: 1758: 1755: 1750: 1746: 1739: 1733: 1730: 1727: 1724: 1721: 1718: 1712: 1705: 1704: 1688: 1685: 1680: 1676: 1672: 1667: 1663: 1659: 1656: 1651: 1647: 1640: 1637: 1634: 1628: 1625: 1620: 1616: 1612: 1607: 1603: 1599: 1596: 1591: 1587: 1580: 1574: 1571: 1566: 1562: 1558: 1553: 1549: 1545: 1542: 1537: 1533: 1526: 1520: 1517: 1514: 1511: 1508: 1505: 1502: 1499: 1493: 1490: 1483: 1482: 1481: 1460: 1457: 1454: 1451: 1448: 1445: 1442: 1439: 1434: 1430: 1427: 1419: 1416: 1413: 1410: 1407: 1404: 1401: 1398: 1395: 1392: 1387: 1384: 1381: 1378: 1371: 1370: 1369: 1348: 1345: 1342: 1337: 1334: 1331: 1324: 1320: 1314: 1310: 1306: 1303: 1300: 1297: 1294: 1289: 1286: 1283: 1276: 1272: 1266: 1262: 1258: 1255: 1252: 1247: 1244: 1241: 1234: 1230: 1224: 1220: 1216: 1213: 1210: 1205: 1202: 1199: 1194: 1189: 1185: 1181: 1178: 1171: 1170: 1169: 1146: 1143: 1140: 1133: 1129: 1123: 1119: 1115: 1112: 1109: 1106: 1103: 1100: 1093: 1089: 1083: 1079: 1075: 1072: 1069: 1066: 1059: 1055: 1049: 1045: 1041: 1038: 1035: 1032: 1027: 1022: 1018: 1014: 1007: 1006: 1005: 1004: 985: 980: 976: 972: 969: 966: 963: 960: 957: 952: 948: 944: 941: 938: 935: 930: 926: 922: 919: 916: 913: 908: 904: 900: 897: 894: 891: 888: 885: 882: 879: 872: 871: 870: 869: 847: 842: 838: 832: 
829: 824: 820: 814: 809: 805: 799: 794: 790: 784: 781: 775: 772: 769: 763: 760: 757: 754: 747: 746: 745: 743: 739: 735: 733: 729: 727: 704: 700: 696: 684: 683: 682: 658: 653: 649: 643: 638: 633: 629: 623: 618: 614: 608: 603: 599: 593: 590: 584: 574: 573: 572: 549: 544: 539: 536: 531: 523: 518: 514: 508: 503: 498: 494: 488: 483: 479: 473: 468: 464: 458: 455: 449: 434: 431: 424: 423: 422: 420: 416: 412: 389: 385: 379: 374: 371: 368: 364: 360: 357: 350: 349: 348: 346: 342: 335: 328: 321: 314: 310: 306: 302: 289: 287: 286: 283: 282:66.151.13.191 274: 272: 268: 263: 260:0, Sum -: --> 257: 253: 250: 249:0) -1 ) ] . 245: 242: 238: 235: 232: 228: 224: 220: 210: 200: 196: 192: 188: 181: 180: 179: 178: 177: 176: 171: 168: 164: 160: 159: 158: 157: 152: 151: 150: 147: 143: 139: 135: 128: 124: 117: 115: 111: 107: 103: 99: 92: 88: 79: 73: 70: 67: 63: 61: 58: 57: 49: 45: 41: 40: 35: 28: 27: 19: 4336: 4314: 4308: 4293: 4275:Kenneth Shum 4214: 4204: 4177: 4157: 4142: 4120: 4058: 3994: 3993:The article 3861: 3853: 3849: 3827: 3811: 3786: 3783: 3741: 3723: 3662: 3644: 3634: 3591:uncertainty. 3581:information. 3554: 3546: 3525: 3522: 3515: 3512: 3508: 3505: 3492: 3489: 3281: 3172: 3163: 3160: 3157: 3123: 3119: 3116: 3102: 3098: 3072: 3062: 3055: 3028: 2983: 2969: 2960: 2951: 2948: 2933: 2839: 2719: 2714: 2710: 2706: 2702: 2698: 2694: 2690: 2688: 2683: 2679: 2672: 2665: 2658: 2651: 2644: 2637: 2635: 2630: 2623: 2616: 2609: 2605: 2602: 2508: 2388: 2379: 2373: 2372: 2367: 2341: 2331: 2329: 2283: 2243: 2103: 2099: 2097: 1989: 1987: 1479: 1367: 1167: 737: 736: 731: 730: 725: 723: 680: 570: 414: 413: 410: 344: 337: 330: 323: 316: 312: 304: 303: 299: 278: 269: 264: 258: 254: 251: 246: 243: 239: 236: 214: 138:75.85.88.234 129: 125: 121: 90: 86: 83: 65: 43: 37: 4307:This is an 4181:—Preceding 4124:—Preceding 3780:Page moved. 3746:—Preceding 3558:—Preceding 3551:Uncertainty 3528:—Preceding 2691:many-to-one 2349:MisterSheik 2198:—Preceding 1988:By letting 267:- infinity 96:—Preceding 36:This is an 4271:PlanetMath 4245:Shreevatsa 4217:PlanetMath 3080:Algorithms 3032:Bromskloss 266:0) -: --> 4343:Archive 3 4337:Archive 2 4331:Archive 1 4059:logarithm 3095:log basis 742:logarithm 72:Archive 3 66:Archive 2 60:Archive 1 4195:contribs 4187:Halberdo 4183:unsigned 4126:unsigned 4099:Deepmath 4002:resolved 3972:Dcoetzee 3958:Deepmath 3948:Dcoetzee 3934:Deepmath 3920:Deepmath 3884:Deepmath 3814:Deepmath 3776:Resolved 3748:unsigned 3730:Deepmath 3663:expected 3645:expected 3560:unsigned 3530:unsigned 3127:unsigned 3105:HyDeckar 3088:contribs 3076:unsigned 2713:) <= 2378:outcome 2357:contribs 2345:unsigned 2200:unsigned 728:events. 336:) where 309:roulette 307:Given a 219:unsigned 187:unsigned 134:unsigned 110:contribs 98:unsigned 91:original 4310:archive 4267:article 4265:of the 4263:history 3862:entropy 3850:entropy 3828:Support 3791:entropy 2334:is the 421:, viz. 256:0) ], 80:minimum 39:archive 3615:Kjells 2936:Jheald 2701:) and 744:of Ω: 571:where 241:0) ] 167:Jheald 4219:(see 4144:this. 3898:WP:MV 3656:up to 3494:Petkr 3143:it." 2990:paq8l 2693:: so 2380:of a 2221:RETOG 329:, …, 311:with 16:< 4283:talk 4277:. — 4249:talk 4233:talk 4221:here 4191:talk 4165:talk 4159:to. 4150:talk 4134:talk 4103:talk 4030:talk 4026:mcld 3985:talk 3981:mcld 3962:talk 3938:talk 3924:talk 3907:talk 3903:mcld 3888:talk 3836:talk 3832:mcld 3818:talk 3805:and 3756:talk 3734:talk 3650:The 3619:talk 3568:talk 3538:talk 3135:talk 3084:talk 3066:ISBN 2353:talk 2208:talk 227:talk 195:talk 142:talk 106:talk 4269:at 4079:log 3995:was 3728:. 
3613:]-- 3394:log 3233:log 2898:log 2840:or 2790:log 2686:). 2567:log 2509:to 2459:log 2261:log 2160:log 2066:log 1994:= A 1936:log 1893:log 1844:log 1797:log 1756:log 1722:log 1657:log 1597:log 1543:log 1503:log 1446:log 1405:log 1382:log 1332:log 1284:log 1242:log 1200:log 1141:log 1101:log 1067:log 1033:log 970:log 942:log 920:log 898:log 883:log 773:log 761:log 87:not 4285:) 4251:) 4235:) 4197:) 4193:• 4167:) 4152:) 4136:) 4105:) 4082:⁡ 4073:∑ 4070:− 4032:) 4016:! 4013:! 4005:}} 3999:{{ 3987:) 3964:) 3940:) 3926:) 3909:) 3890:) 3873:! 3870:! 3848:, 3838:) 3820:) 3797:, 3787:is 3778:– 3758:) 3736:) 3689:≤ 3621:) 3570:) 3540:) 3440:− 3403:⁡ 3341:∫ 3337:− 3242:⁡ 3215:∞ 3210:∞ 3207:− 3203:∫ 3199:− 3086:• 2941:. 2916:ω 2907:⁡ 2891:ω 2880:Ω 2877:∈ 2874:ω 2870:∑ 2866:− 2857:Ω 2799:⁡ 2750:∑ 2746:− 2671:, 2664:, 2650:, 2643:, 2633:. 2622:, 2615:, 2585:ω 2576:⁡ 2560:ω 2549:Ω 2546:∈ 2543:ω 2539:∑ 2535:− 2468:⁡ 2419:∑ 2415:− 2355:• 2338:. 2314:Ω 2311:⁡ 2308:ln 2267:Ω 2264:⁡ 2233:) 2210:) 2163:⁡ 2130:∑ 2126:− 2069:⁡ 2036:∑ 2032:− 2023:− 1998:/P 1961:− 1939:⁡ 1906:∑ 1902:− 1896:⁡ 1847:⁡ 1828:− 1825:⋯ 1822:− 1800:⁡ 1781:− 1759:⁡ 1740:− 1725:⁡ 1673:− 1660:⁡ 1641:− 1638:⋯ 1635:− 1613:− 1600:⁡ 1581:− 1559:− 1546:⁡ 1527:− 1512:− 1506:⁡ 1455:− 1449:⁡ 1417:∫ 1414:− 1408:⁡ 1385:⁡ 1379:∫ 1335:⁡ 1311:∫ 1307:− 1304:⋯ 1301:− 1287:⁡ 1263:∫ 1259:− 1245:⁡ 1221:∫ 1217:− 1203:⁡ 1186:∫ 1144:⁡ 1120:∑ 1116:− 1113:⋯ 1110:− 1104:⁡ 1080:∑ 1076:− 1070:⁡ 1046:∑ 1042:− 1036:⁡ 1019:∑ 973:⁡ 967:− 964:⋯ 961:− 945:⁡ 939:− 923:⁡ 917:− 901:⁡ 895:− 886:⁡ 833:⋯ 776:⁡ 767:Ω 764:⁡ 738:A. 732:Q. 644:⋯ 582:Ω 509:⋯ 440:Ω 415:A. 365:∑ 322:, 305:Q. 229:) 215:Q 197:) 144:) 112:) 108:• 4321:. 4281:( 4247:( 4231:( 4189:( 4163:( 4148:( 4132:( 4101:( 4085:p 4076:p 4028:( 3983:( 3960:( 3936:( 3922:( 3905:( 3886:( 3834:( 3816:( 3754:( 3732:( 3709:. 3706:) 3703:M 3700:( 3697:K 3693:E 3686:) 3683:M 3680:( 3677:H 3617:( 3566:( 3536:( 3473:, 3470:x 3467:d 3462:) 3459:) 3454:n 3451:i 3448:m 3444:x 3435:x 3432:a 3429:m 3425:x 3421:( 3418:) 3415:x 3412:( 3409:f 3406:( 3398:2 3390:) 3387:x 3384:( 3381:f 3374:x 3371:a 3368:m 3364:x 3356:n 3353:i 3350:m 3346:x 3334:= 3331:] 3328:f 3325:[ 3320:e 3317:v 3314:i 3311:t 3308:a 3305:l 3302:e 3299:r 3295:H 3265:, 3262:x 3259:d 3254:) 3251:x 3248:( 3245:f 3237:2 3229:) 3226:x 3223:( 3220:f 3196:= 3193:] 3190:f 3187:[ 3184:H 3133:( 3082:( 2919:) 2913:( 2910:p 2902:2 2894:) 2888:( 2885:p 2863:= 2860:) 2854:( 2851:H 2821:, 2818:) 2813:i 2809:x 2805:( 2802:p 2794:2 2786:) 2781:i 2777:x 2773:( 2770:p 2765:n 2760:1 2757:= 2754:i 2743:= 2740:) 2737:X 2734:( 2731:H 2715:H 2711:X 2709:( 2707:H 2703:H 2699:X 2697:( 2695:H 2684:R 2680:X 2676:3 2673:x 2669:2 2666:x 2662:1 2659:x 2655:3 2652:ω 2648:2 2645:ω 2641:1 2638:ω 2631:X 2627:3 2624:x 2620:2 2617:x 2613:1 2610:x 2606:X 2588:) 2582:( 2579:p 2571:2 2563:) 2557:( 2554:p 2532:= 2529:) 2526:X 2523:( 2520:H 2490:, 2487:) 2482:i 2478:x 2474:( 2471:p 2463:2 2455:) 2450:i 2446:x 2442:( 2439:p 2434:n 2429:1 2426:= 2423:i 2412:= 2409:) 2406:X 2403:( 2400:H 2385:. 2351:( 2332:k 2326:, 2305:k 2302:= 2297:S 2258:= 2255:H 2231:c 2229:/ 2227:t 2225:( 2223:8 2219:C 2206:( 2189:. 
2171:x 2167:p 2155:x 2151:p 2145:n 2140:1 2137:= 2134:x 2123:= 2120:H 2106:x 2104:p 2100:n 2077:x 2073:p 2061:x 2057:p 2051:n 2046:1 2043:= 2040:x 2029:) 2026:n 2020:1 2017:( 2014:= 2011:H 1996:x 1992:x 1990:p 1967:) 1964:n 1958:1 1955:( 1952:+ 1947:x 1943:A 1931:x 1927:A 1921:n 1916:1 1913:= 1910:x 1899:P 1890:P 1887:= 1866:) 1863:1 1860:+ 1855:n 1851:A 1839:n 1835:A 1831:( 1819:) 1816:1 1813:+ 1808:2 1804:A 1792:2 1788:A 1784:( 1778:) 1775:1 1772:+ 1767:1 1763:A 1751:1 1747:A 1743:( 1737:) 1734:1 1731:+ 1728:P 1719:P 1716:( 1713:= 1692:) 1689:1 1686:+ 1681:n 1677:A 1668:n 1664:A 1652:n 1648:A 1644:( 1632:) 1629:1 1626:+ 1621:2 1617:A 1608:2 1604:A 1592:2 1588:A 1584:( 1578:) 1575:1 1572:+ 1567:1 1563:A 1554:1 1550:A 1538:1 1534:A 1530:( 1524:) 1521:1 1518:+ 1515:P 1509:P 1500:P 1497:( 1494:= 1491:H 1461:. 1458:x 1452:x 1443:x 1440:= 1435:x 1431:x 1428:d 1420:x 1411:x 1402:x 1399:= 1396:x 1393:d 1388:x 1349:. 1346:x 1343:d 1338:x 1325:n 1321:A 1315:1 1298:x 1295:d 1290:x 1277:2 1273:A 1267:1 1256:x 1253:d 1248:x 1235:1 1231:A 1225:1 1214:x 1211:d 1206:x 1195:P 1190:1 1182:= 1179:H 1147:i 1134:n 1130:A 1124:i 1107:i 1094:2 1090:A 1084:i 1073:i 1060:1 1056:A 1050:i 1039:i 1028:P 1023:i 1015:= 986:! 981:n 977:A 958:! 953:3 949:A 936:! 931:2 927:A 914:! 909:1 905:A 892:! 889:P 880:= 848:! 843:n 839:A 830:! 825:3 821:A 815:! 810:2 806:A 800:! 795:1 791:A 785:! 782:P 770:= 758:= 755:H 726:P 705:P 701:n 697:= 693:T 659:! 654:n 650:A 639:! 634:3 630:A 624:! 619:2 615:A 609:! 604:1 600:A 594:! 591:P 585:= 550:P 545:) 540:n 537:1 532:( 524:! 519:n 515:A 504:! 499:3 495:A 489:! 484:2 480:A 474:! 469:1 465:A 459:! 456:P 450:= 444:T 435:= 432:p 390:i 386:A 380:n 375:1 372:= 369:i 361:= 358:P 345:i 340:i 338:A 333:n 331:A 326:2 324:A 319:1 317:A 313:n 233:. 225:( 201:. 193:( 148:. 140:( 104:( 50:.
