
Talk:Entropy (information theory)/Archive 3


is not useful. The problems outlined are the result of applying the concept of entropy wrongly, and do not therefore constitute a "limitation". For example, suppose I have a three-bit password, for a total of eight possible passwords, each equally likely. Guessing by brute force, I will get it in about four guesses on average. The entropy is three bits. If the passwords are labelled 0, 1, ..., 7, then I can certainly get it by asking three questions - for example: is the password ≄ 4? If yes, is it ≄ 6? If no, then is it 5? If no, then we know it is 4 - that's three questions, and the entropy is three bits. This type of analysis also works when the probabilities are skewed. Since you cannot ask intelligent questions in a password situation, the concept of entropy is not applicable. This section should either be deleted or used as an example of the incorrect application of the concept of entropy.
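A minimal sketch of the point being made (mine, not part of the original discussion): with eight equally likely passwords, a halving strategy of yes/no questions always pins the secret down in exactly log2(8) = 3 questions, matching the three bits of entropy, whereas guessing candidates one at a time takes 4.5 attempts on average.

# Sketch only: 8 equally likely "passwords" 0..7; halving questions vs. brute force.
import math

def questions_needed(secret, candidates):
    """Count yes/no questions of the form 'is it >= m?' needed to identify secret."""
    lo, hi = 0, len(candidates) - 1
    count = 0
    while lo < hi:
        mid = (lo + hi + 1) // 2
        count += 1
        if secret >= candidates[mid]:   # "is the password >= candidates[mid]?"
            lo = mid
        else:
            hi = mid - 1
    return count

passwords = list(range(8))
assert all(questions_needed(p, passwords) == 3 for p in passwords)
print("entropy:", math.log2(len(passwords)), "bits")          # 3.0
print("average brute-force guesses:", sum(range(1, 9)) / 8)   # 4.5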
indistinguishable from one another. Therefore the entropy of a series of tosses will be 0, since the result of each and every toss will be indistinguishably heads. If, however, a normal, fair, two-sided (1 heads, 1 tails) coin is tossed multiple times, the entropy will be 1 bit per toss, since the results will be entirely unpredictable.
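For what it's worth, a small illustrative sketch (not from the original thread) of the two cases, using the standard binary entropy formula:

# Sketch: entropy in bits of a single toss of a coin that lands heads with probability p.
import math

def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0            # outcome is certain, so zero entropy
    q = 1.0 - p
    return -(p * math.log2(p) + q * math.log2(q))

print(binary_entropy(1.0))    # two-headed coin: 0.0 bits
print(binary_entropy(0.5))    # fair coin: 1.0 bit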
of some) can be best understood as a particular concrete application of the more general idea of entropy that arises in information theory. So I don't see any great value in a move; but I do agree with Djr32 that a section towards the end introducing mathematical generalisations of the idea could be a useful addition.
This criticism is bogus. Entropy is not the number of brute-force guesses needed to find a password, nor its logarithm; it is the average number of "intelligent" questions you ask about the password in order to find it. In a password situation, these questions will not be answered, so the concept of entropy
You placed two links to the article into the discussion. I think the article speaks for itself. Beyond that I didn't say anything. The basic question is, would it be helpful for readers to be able to easily link to the basic term somewhere in the article? This question is not asked to SudoGhost.
is a thermodynamic property that can be used to determine the energy available for useful work in a thermodynamic process. Placing the link there is confusing at best, because the article being linked to has nothing to do with the information being discussed. A wikilink is placed within an article
The question is: is it appropriate to have one link to entropy in the article, or would the need to use two links to get to the basic article via the disambiguation link at the top of the page be irritating to some readers? Does anyone think that one, but only one, direct link would contribute to
In this case the term 'predictable' is in reference to knowing what can happen as a result of the coin toss. It will either be heads or tails with no other options. People tend to get confused between the probability of one particular outcome vs. the predictability of it either being heads or tails.
in relation to the probability mass function, where uncertainty is the integral of the probability mass function. While this was probably quite obvious to the writer, I'm not sure it would be to all readers. The way it's worded almost makes it sound like the log relationship is something that came out
When describing the case of a set of n possible outcomes (events) \left\{ x_i : i = 1, \ldots, n \right\}, the article says that the probability mass function is given by p(x_i) = 1/n, and then states that the uncertainty for such a set of n outcomes is defined by u = \log_b (n).
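As an illustrative aside (my own, not in the original post): for equally likely outcomes the general formula -sum p_i log_b p_i reduces to exactly this log_b(n), e.g. 3 bits for n = 8 and b = 2.

# Sketch: the uniform case collapses to log_b(n).
import math

n, b = 8, 2
p = [1.0 / n] * n
entropy = -sum(pi * math.log(pi, b) for pi in p)
print(entropy, math.log(n, b))   # 3.0 3.0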
Now that I have read this discussion page, it is clear to me that it is #3. The section Layman's terms begins "Entropy is a measure of disorder". This sentence is leading me down the primrose path into utter befuddlement. In thermodynamics, entropy is the tendency toward disorder, thus the words
My instinct is to leave put. In information theory, in a book like say Cover & Thomas, I think this is now more commonly just called "Entropy" rather than "Shannon Entropy"; and in many ways it is actually the more fundamental concept than Entropy in thermodynamics, which (at least in the view
I think this article is about both the general concept of Entropy in information theory, and Shannon's entropy (which is by far the most common example, and hence the primary topic for "Entropy (information theory)"). Most of the other definitions seem to be generalisations of this concept to very
The article states that "The Shannon entropy is restricted to random variables taking discrete values". Is this technically true? My understanding is that the Shannon entropy of a continuous random variable is defined, but infinite. The infinite Shannon entropy of a continuous r.v. is an important
level things are different. There are still an enormous number of different possible microscopic states that all the electrons, all the atoms, all the photons, all the parts of the universe together could be in. So if you wanted a total description of the state of the universe, down at the most
Parenthesized names are artificial and don't have primary topics. "Entropy" can have a primary topic. "Entropy (information theory)" is a name purely created for disambiguation and therefore primary topics aren't applicable. Splitting the article into 2 separate ones about the general concept and
I would agree with that statement. Would you agree that the link would be appropriate for the end of the phrase 'The inspiration for adopting the word entropy' in the section 'Aspects - Relationship to thermodynamic entropy'? If you would think that it was a different entropy being referred to there, what entropy would it be referring to?
I've put in a short section titled "Layman's terms" to make it more user-friendly to the curious layman who is not familiar with hard sums or long-winded technical definitions. It is my belief that every scientific and technical article should have one of these to encourage public interest in
I agree with this last modification. There is *information entropy* and *thermodynamic entropy*. The concept of information entropy can be applied to the special case of a statistical ensemble of physical particles. The thermodynamic entropy is then equal to the Boltzmann constant times this
As for whether the entropy is "fairly low", surely what the article is doing is comparing the entropy of English with that of a random sequence of letters. English clearly does have systematic regularities compared to such a stream. But the article could make more explicit that this is the
Perhaps the confusion arises from a misunderstanding of the term "two-headed coin" in the sentence "A series of tosses of a two-headed coin will have zero entropy." This is distinct from the notion of a normal two-sided coin. A two-headed coin has two sides, both of which are heads and
Actually "abcd - abcd - abcd - abcd" is highly ordered. If we read these as sets of four hexadecimal digits, "aaaa - bbbb - cccc - dddd" is four different 16-bit characters, while "abcd - abcd - abcd - abcd" is four copies of the same 16-bit character, and therefore more ordered. Any pattern is
In such a state nothing differs from anything else, so there is no information. Yet Shannon calls information disorder, and therefore entropy is information. According to Shannon, the heat death of the universe is maximum information, which is a distinctly odd way of viewing it.
Just adding that, if it's any consolation, you're certainly not the first person that's been tripped up by the use of the word "disorder" to describe entropy. For further discussion of what the term "disorder" means in thermodynamical discussions of entropy, see the article
The paragraph that discusses the entropy of the English language, stating "English text has fairly low entropy. In other words, it is fairly predictable.", is a bold claim and appears to be original research. I would like to see citations to back up this discussion.
(cur | prev) 09:26, 2 October 2011 SudoGhost (talk | contribs) (42,074 bytes) (Undid revision by 67.206.184.19 (talk) The wikilink doesn't belong there. The word being defined is NOT the word you're linking to. Period. That's what the disambiguation is for) (undo)
(cur | prev) 09:21, 2 October 2011 SudoGhost (talk | contribs) (42,074 bytes) (Undid revision by 67.206.184.19 (talk) The use of the word defining the information theory is not an appropriate place to link Entropy. There is a disambiguation for it above) (undo)
result, for example, combined with the source coding theorem, it predicts that an information channel must have infinite capacity to perfectly transmit continuously distributed random variables (which is also true, and also an important result in the field). --
(cur | prev) 09:31, 2 October 2011 67.206.184.19 (talk) (42,078 bytes) (The word being defined in the introduction is entropy. The word being linked to is entropy. One more revert and I will attempt to transfer these headers to the discussion page.) (undo)
and was even more surprised to see the latter not even linked! I'd like to see this in the otheruses template at the top but am not sure how to phrase it succinctly. Can someone give it a shot? It's a hairy issue since the articles are so tightly related.
stack so I am guessing the connection is chaos. However, I agree that it is more suited to an "Entropy in popular culture" section or article. Problem is, we currently don't have a better image for the lede to replace it with - do you have a suggestion?
to help give the reader a better understanding of the content of an article, and placing that wikilink there not only does not accomplish that task, it does the opposite. There's no reason to place that wikilink there, but plenty of reason not to. -
(cur | prev) 09:24, 2 October 2011 67.206.184.19 (talk) (42,078 bytes) (Your view is interesting. The disambiguation page takes two links to get to the article. This could be irritating to many readers who would want to get to the basic term.) (undo)
specialised mathematical contexts. Would it be better to keep this page where it is, but to make the links to the other mathematical entropies more explicit (e.g. to have a section about mathematical generalisations of the idea)?
you will find a whole section on different measures and generalisations of entropy which are used in information theory -- most notably RĆ©nyi entropy, also the entropies listed under the "Mathematics" section.
Where /usr/share/dict/words is a mostly complete list of English words, these lines count the occurrence of 'th', 'no' and 'na' in that file. I'm sure that there are others that are more frequent still.
However, a hatnote may still be appropriate when even a more specific name is still ambiguous. For example, Matt Smith (comics) might still be confused for the comics illustrator Matt Smith (illustrator).
save someone a single click. That's what disambiguations are for. The entropy article you linked has nothing to do with the entropy definition in the article, and as such it doesn't belong there. -
should discuss some of the problems with seeing entropy only (or even primarily/preferentially) as something related to energy. But between the two articles, I hope you may find something of use.)
be able to figure this out by themselves, but I'm wondering if pointing this out would be better policy. At the very least, avoiding the misnomer of a "definition" would avoid some confusion.
says that information entropy is "also called" log probability. Is it true that they're the same thing? If so, a mention or brief discussion of this in the article might be appropriate.
It's the most common character sequence in a corpus made of English sentences, not a list of unique words. Consider some of the most common words: the, then, this, that, with, etc.
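A rough way to see this (my own sketch; the sample sentence below is invented purely for illustration) is to count two-letter sequences over running prose rather than over a list of unique words:

# Sketch: bigram counts over running text. Because "the", "this", "that", "with"
# and similar words recur constantly in sentences, 'th' tends to dominate in a
# real corpus even if it does not dominate in /usr/share/dict/words.
from collections import Counter

text = ("this is the kind of thing that the claim is about: "
        "the bigram th shows up with the most common words").lower()
bigrams = Counter(text[i:i + 2] for i in range(len(text) - 1)
                  if text[i:i + 2].isalpha())
print(bigrams.most_common(5))   # 'th' comes out on top for this sample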
Maybe we could use the example of drawing a number between 2 and 12 out of an equiprobable hat, versus rolling dice. I think the focus would be on the probability of the number 7.
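A quick sketch of that suggested comparison (mine, for illustration only): a uniform draw from {2, ..., 12} has entropy log2(11) ā‰ˆ 3.46 bits, while the sum of two fair dice, peaked at 7, comes out lower, roughly 3.27 bits.

# Sketch: uniform "hat" over 2..12 versus the sum of two fair dice.
import math
from collections import Counter

hat = {k: 1 / 11 for k in range(2, 13)}
dice_counts = Counter(a + b for a in range(1, 7) for b in range(1, 7))
dice = {k: v / 36 for k, v in dice_counts.items()}

def entropy_bits(dist):
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

print(entropy_bits(hat))    # ~3.459 bits
print(entropy_bits(dice))   # ~3.274 bits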
(cur | prev) 09:18, 2 October 2011 67.206.184.19 (talk) (42,078 bytes) (Could not find a single link to basic entropy in article. A vast number is bad but one seems reasonable.) (undo)
and thought the word "entropy" was used backwards. So I came to this WP article to read about Shannon entropy. I quickly realized one of the following had to be true:
I don't think it needs an image; not all articles do, and the image shouldn't be kept simply because there is no alternative. Anyone else think it should be removed? --
But something remains constant in both universes, because the second universe is a big replica of each of the small particles or sub-universes of the first universe.
In other words, changing the base is nothing more than changing the unit of measurement. All reasoning and comparisons between entropies are independent of the base.
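A small numerical check of this point (my own sketch, using the base-change identity H_b(X) = H_r(X) / log_r(b) quoted elsewhere on this page):

# Sketch: entropy in bits versus nats differs only by the constant factor ln 2.
import math

p = [0.5, 0.25, 0.125, 0.125]

def entropy(dist, base):
    return -sum(pi * math.log(pi, base) for pi in dist if pi > 0)

h_bits = entropy(p, 2)
h_nats = entropy(p, math.e)
print(h_bits)                    # 1.75
print(h_nats / math.log(2))      # same value: H_2(X) = H_e(X) / ln 2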
Normally information from a blog is not authoritative and I wouldn't use this way to post primary information on Wikipedia; but math is math and stands on its own.
Can these two be linked? According to computer scientist Rolf Landauer, no: "...there is no unavoidable minimal energy requirement per transmitted bit."
In the introduction, the article states that 'th' is the most common character sequence in the English language. A quick test seems to contradict this:

grep -i th /usr/share/dict/words | wc -l
21205
grep -i no /usr/share/dict/words | wc -l
22801
grep -i na /usr/share/dict/words | wc -l
22103
I agree the poster of this image meant well, and it would be a great analogy if it were right. Unfortunately it's wrong, for reasons I get into below.
"A single toss of a fair coin has an entropy of one bit, but a particular result (e.g. "heads") has zero entropy, since it is entirely 'predictable'."
The bigger quantity of information in the wave case gets compensated by, or gets correlated to, the smaller quantity of entropy in that wave's case.
A series of coin tosses with a fair coin has one bit of entropy, since there are two possible states, each of which is independent of the others.
I was thinking this article (or a subarticle) should discuss the entropy chain rule as discussed in Cover & Thomas (1991, pg 21) (see
They are not the same. It's misleading and inaccurate. You cannot place that wikilink there for the same reason you cannot say that "

"measure of disorder" imply you are measuring the same thing meant by thermodynamic entropy. The ultimate thermodynamic entropy is the heat death of the universe.
What does this sculpture have to do with entropy? While nice, I don't really see the connection, nor the relevance of this picture. --
Two cites for numerical estimates for the entropy of English are given in the lead. But they could happily be repeated lower down.
Isn't that wrong? Since each toss of a fair coin has one bit of entropy, wouldn't a series of N tosses have N bits of entropy?
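A quick back-of-the-envelope check supporting the objection (my own note): N independent fair tosses have 2^N equally likely outcome sequences, so the joint entropy is log2(2^N) = N bits, i.e. one bit per toss rather than one bit for the whole series.

# Sketch: joint entropy of N independent fair tosses.
import math

for n in (1, 2, 10):
    print(n, math.log2(2 ** n))   # 1.0, 2.0, 10.0 bits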
It calls the result of a random coin toss entirely predictable. That doesn't make any sense. Can someone please clarify?
this statement is not followed up with something that uses the premise it states: "Since the probability of each event is 1 / n"
More information = more entropy. Also, I don't think this is very relevant to the information theory definition of entropy.
is what makes heat death the state of maximum entropy -- both maximum thermodynamic entropy, and maximum Shannon entropy.
Subsequent comments should be made in a new section on this talk page. No further edits should be made to this section.
Subsequent comments should be made in a new section on the talk page. No further edits should be made to this section.
\max H_b(X) = -\sum_i p_i \log_b p_i = -N \cdot \frac{1}{N} \log_b \frac{1}{N} = \log_b N
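A small numerical check of the identity above (my own sketch): the uniform distribution over N outcomes attains the maximum log_b(N), and a skewed distribution stays strictly below it.

# Sketch: uniform distribution maximises entropy at log_b(N).
import math

N, b = 4, 2
uniform = [1 / N] * N
skewed = [0.7, 0.1, 0.1, 0.1]

def entropy(dist, base):
    return -sum(p * math.log(p, base) for p in dist if p > 0)

print(entropy(uniform, b), math.log(N, b))   # 2.0 2.0
print(entropy(skewed, b))                    # ~1.357, below log2(4) = 2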
The article speaks for itself? That requires clarification, because your meaning is unclear. What you linked is
Reference: Rolf Landauer, "Minimal Energy Requirements in Communication", Science, vol. 272, pp. 1914-1918, 28 June 1996.
of thin air by some definition when it's really the result of the previous equation. I know math students probably
It's the same amount of information, but the information in the first one can be described more succinctly. See Kolmogorov complexity.
information entropy. Best of all worlds would be two articles: Thermodynamic entropy and Information entropy.
There are multiple notions of entropy in information theory, which makes the current title not unambiguous.
If there are other entropies in information theory, then the title of this article isn't fully disambiguated.
Here: aaa, there is less information than here: abc. But here: aaa, entropy is bigger than here: abc.
Hi. I've drawn this graphic. I'd like to know your comments about the idea it describes, thank you very much.
I wanted to relate Shannon to DNA and cell biology by searching for answers to the following four questions:
I think this article requires some review and attention to mathematical pedantry to maintain B-class level.
For a discussion of difficulties that the word "disorder" can lead to, see e.g. material near the start of Entropy (energy dispersal).
the number of microscopic states compatible with that macroscopic state -- i.e. the state that requires the
The article introduces the concept of "compression" without explaining what it is. Explanation needed.
→ Was the Shannon information content required to bootstrap life into existence lost after life began?
It is true that there is something that remains constant in both graphs, but it is not information:
Hello, I have posted some (relatively?) interesting properties of the entropy function on my blog.
No consensus for proposal. No prejudice regarding other editorial proposals and potential renames.
page would include some of the discussion as to why the "disorder" can lead to confusion; and the
The article should be changed to acknowledge that this is the reverse of thermodynamic entropy.
Yes, that would be a completely appropriate place for it. I've edited the article so that "
→ Is the Shannon information content of DNA sufficient to animate matter and energy into life?

H(X_1, X_2, \dots, X_n) = \sum_{i=1}^{n} H(X_i \mid X_{i-1}, X_{i-2}, \dots, X_1)
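A small sketch verifying the chain rule on a toy joint distribution (the numbers are made up purely for illustration):

# Sketch: H(X1, X2) = H(X1) + H(X2 | X1) for a hand-made joint distribution.
import math

joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}   # p(x1, x2)

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

p_x1 = {x1: sum(p for (a, _), p in joint.items() if a == x1) for x1 in (0, 1)}
h_joint = H(joint.values())
h_x1 = H(p_x1.values())
h_x2_given_x1 = sum(
    p_x1[x1] * H([joint[(x1, x2)] / p_x1[x1] for x2 in (0, 1)])
    for x1 in (0, 1)
)
print(math.isclose(h_joint, h_x1 + h_x2_given_x1))   # True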
This other universe: aaaa - bbbb - cccc - dddd, is minimum entropy and maximum information.
91: 2140: 1866: 1839: 1509: 1467: 1415: 1277: 1205: 1181: 1129:ā†’ Hypothetically, was that (now lost) bootstrap information derived from NP-hard processes? 415: 1794: 1353: 1253: 1242: 1219: 429: 425: 334: 312: 246: 192: 164: 146: 123: 115: 1406:
further information to fully specify the microscopic state given the macroscopic state.
This universe: abcd - abcd - abcd - abcd, is maximum entropy and minimum information.
is an operating system for mobile devices such as smartphones and tablet computers."
is a measure of disorder, or more precisely unpredictability" is completely untrue.
So, certainly, there is something that remains constant. But it is not information.
amount of information to find. In fact, heat death is the macroscopic state that
If you wish to start a new discussion or revive an old one, please do so on the
Regarding the last point, I've added some more text in a specific new section (
In such a state nothing differs from anything else, so there is no information.
The question arises then of choosing a reference base. The maximal entropy being \log_b N, a natural choice would then be to choose b = N, so that \log_b N = 1. In that case, 0 \leq H_b(X) \leq 1, with H_b(X) = 0 for certain distributions and H_b(X) = 1 for the uniform distribution. This justifies the use of b = 2 when analysing binary data.
I was misreading the article (and the LiveScience writer made the same mistake)
http://fulldecent.blogspot.com/2009/12/interesting-properties-of-entropy.html
1594: 2088:" (which sent the reader to the disambiguation page) became "thermodynamic 2053:
article at a random spot where the word "Android" is used just because it
with an explanation on the talk page, to at least visibly flag up some of
\log_b x = \frac{\log_r x}{\log_r b}
For it to be unpredictable you would have to have other unknown options.
Problems with "Limitations of entropy as a measure of unpredictability"
I also think it should be removed from this article. It's also on the 'Entropy' article, where it may have at least a little relevance, but it's not appropriate in the information theory context.
I searched the web, PubMed, xxx.lanl.gov ... and found no references.
I believe that the uncertainty is not defined this way but is really
Changing the base of a logarithm is tantamount to applying a scaling factor:
I'll remove it again unless some valid reason is given to keep it.
The section currently looks okay to me. Does this problem remain?
page to at least start to bring up this issue; and added a tag to
The above discussion is preserved as an archive of the proposal.
No. You misunderstand the notion of "heat death of the universe".
I have made a few edits today and merged in some of that blog to
This article is also assessed within the mathematics field
Entropy_(disambiguation)#Information_theory_and_mathematics
H_b(X) = \frac{1}{\log_r b} H_r(X)
http://www.cse.msu.edu/~cse842/Papers/CoverThomas-Ch2.pdf
science. Hope my idea meets with general approval :-) --
The article, as the hatnote says, is specifically about
Appropriateness of one link to basic entropy in article
It is asked to other viewers of the discussion page.
The following is a closed discussion of the proposal.
I was surprised to see Shannon entropy here and not the explicit collapse of information
Interesting properties

This article is using the word "entropy" backwards

Shannon entropy specifically is another option. --

About the relation between entropy and information

I think it's not the same amount of information. There is more information in the wave frame.
