Knowledge:Reference desk/Archives/Mathematics/2008 May 20

893:
taking an author's conclusion and converting what it means. Any reliability issues with 4.4 must be sourced from the author. If the author states the 4.4 figure is not reliable then we can say so too. Of course, I can't find the original book so I don't know if he says it's normal or not. If in another article someone states X is 1.96 SDs above the normal we could state that this is 2.5% - it would not be appropriate to go into a discussion of sample bias, etc. etc. if the respected source doesn't do so. My feeling is that either the 1/x probabilities stay or the whole table goes. If we feel that the author's original statistics are not a valid reference then fair enough. However, if we feel that the author's reference is valid and he himself does not discuss the 4.4 then there is no reason to remove it. These numbers make the SDs easier for a reader to understand, and if we have problems with them anyway the whole table shouldn't be there.
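(For reference, the conversion being debated is just the upper tail of a normal distribution. A minimal sketch in Python, standard library only, reproduces the figures quoted in this thread - 2.5% for 1.96 SDs and roughly 1 in 147,000 / 185,000 / 232,000 for 4.35 / 4.4 / 4.45 SDs - though it inherits the same normality assumption discussed above.)

    import math

    def upper_tail(z):
        # P(X > mean + z*SD) for a normally distributed X
        return 0.5 * math.erfc(z / math.sqrt(2))

    for z in (1.96, 4.35, 4.4, 4.45):
        p = upper_tail(z)
        print(f"{z} SDs: p = {p:.3e}  (about 1 in {round(1 / p):,})")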
141:, don't use REML for one and ML for the other; in fact, you should probably use ML for both and not REML (which is the default for some functions, BTW), as in many cases the comparison of REML-fitted models can be meaningless. Second, check the "type" argument of the relevant anova method. Make sure it will do the equivalent test to the comparison of the deviances, as some "types" do different tests. 1645:
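(To illustrate the "equivalent test" mentioned here: comparing two nested models by their deviances is a likelihood-ratio test against a chi-square distribution. A sketch in Python with made-up numbers, not values from the R session above:)

    from scipy.stats import chi2

    # placeholder deviances and residual degrees of freedom for two nested models
    dev_reduced, df_reduced = 210.4, 95   # smaller (nested) model
    dev_full, df_full = 202.1, 93         # larger model

    lr_stat = dev_reduced - dev_full      # equals 2*(loglik_full - loglik_reduced)
    df_diff = df_reduced - df_full
    print(f"LR statistic {lr_stat:.2f} on {df_diff} df, "
          f"p = {chi2.sf(lr_stat, df_diff):.4f}")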
vote, and the group wins if the majority vote correctly. A variant allows weighted voting, so you can (say) cast 100 votes that you have a red hat. I was reminded of these because the forced voting clause makes it easy to prove the optimal strategies are indeed optimal, which I can't quite see how to
1498:
The key point to note is that the method by which the hats were assigned was made explicit; there is a prior distribution on what the hats can be. While the description seems trivial when reading the problem, it turns out to be crucial. If no information about that is known then of course there is no
645:
Yes, the probabilities derive from the number of SDs. They are accurate (except that the first should be 185,000, I think) subject to the following caveats: firstly, it is not clear on what basis they have been rounded to their current values. Without access to the book in question, I don't know the
108:
In doing a negative binomial glm in R, I found two oddities. Doing the analysis of deviance I get the message "Dispersion parameter for the negative binomial(0.553) family taken to be 1." I interpret this to mean no dispersion parameter, as in the case of a quasilikelihood, is estimated. Is this
1610:
Having read the spoiler I don't think that's right. The events don't influence each other; the strategy just means that when a wrong guess is made it will be by all three people at the same time, whereas a correct guess will be made by just one. Since each guess has a 50/50 chance, three out of four
1658:
Consider that regardless of the strategy everyone will guess wrong 50% of the time. If two people guess wrong together in a case, that provides correctness of at most 67%. If all three people guess wrong together in a case, then this provides correctness of at most 75%. This exhausts all cases. It
892:
I mean 'factual' in the sense that (if the author is stating the statistics are a normal distribution) it is a straight interpretation of the number 4.4. The discussion of statistically what 4.4 means in terms of how it can be interpreted or how reliable it is, is not under discussion here. We are
830:
a straight factual conversion, it is a conversion predicated on the assumption that the data are normally distributed. This assumption is of course straightforwardly wrong in a strict sense (the data are all nonnegative) but it might be a sensible approximation. I do not have the means to discover
1483:
You’re right that each person who guesses has a 50% chance of guessing right or wrong, regardless of any other factors. But the problem has three people, and whether one person guesses right or wrong is not independent of whether somebody else does (in an optimal strategy, which to my
1415:
You have to assume that they can in order to give them a better than 50% chance of getting it right. And it took me a while to work out what the strategy is but I knew it had something to do with what to guess given what you see. (Here's a hint: no matter what the arrangement of hats, there will
935:
Algebraist is correct that it's not possible to score <0 in these statistics. Davis seems to have only cited the SDs. To show their impact in a practical sense, rather than looking at probabilities of finding others with such skill, he's applied the SDs to the leading stats in various sports,
842:
Agree with Algebraist. In practice a data point like this that is more than 4 SDs away from the mean is far too much of an outlier to provide any useful likelihood estimates. I doubt that the sample size was anywhere near large enough to support the "1 in 184,000" conclusion. I would delete the whole
1379:
I had first confused this with another problem where after receiving the hats, the players are asked one by one if they know their own hat's color, and after at most 5 (IIRC) queries, someone will know. BTW, Tango's going to sleep quip is actually quite a cute coincidence, as a New York Times
112:
The second question is more detailed. In comparing two (nested, neg. binom.) models, the anova comparison, anova(model1,model2), shows a different log-likelihood than taking deviance(model1)-deviance(model2). Not very different, mind you, about 0.1, which is more than I could attribute to
794:
0.000005413 from published tables; see below
   4.45: 0.99999570276356 (1/232,000) */
function normdist($X, $mean, $sigma) {
    $res = 0;
    $x = ($X - $mean) / $sigma;
    if ($x == 0) {
        $res = 0.5;
    } else {
        $oor2pi = 1 / (sqrt(2 * 3.14159265358979323846));
        $t = 1 / (1 +
612:
As the requirements of FAC mean that every claim needs to be referenced, and the user seems to have left the Project (with no email address enabled) I wondered whether the probability figures he's used derive from the (cited) standard deviation figures. As my maths is rubbish, I have no idea.
843:"Probability" column and the paragraph immediately following the table, and just leave the SD figures, which presumably come straight from Davis. It is reasonable to state the SD figures, if they are given by Davis, but we should not embroider them with additional interpretations. 1659:
also assumes that one can partition the sample space though, which isn’t always possible, but nonetheless it should provide a sufficient upper bound. (For instance, I don’t think it’s possible to have a strategy which wins two thirds of the time, because 8 isn’t divisible by 3.)
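(A brute-force check of this bound, in Python: enumerate all 8 equally likely hat assignments and apply the strategy hinted at above - as I read it, guess the opposite colour if the two hats you can see match, otherwise pass. This is only a sketch for verification, not anyone's official solution.)

    from itertools import product

    wins = 0
    for hats in product("RB", repeat=3):
        guesses = []
        for i in range(3):
            others = [hats[j] for j in range(3) if j != i]
            if others[0] == others[1]:
                # the two visible hats match: guess the other colour
                guesses.append((i, "B" if others[0] == "R" else "R"))
            # otherwise pass (no guess)
        if guesses and all(hats[i] == g for i, g in guesses):
            wins += 1
    print(f"wins in {wins} of 8 equally likely cases")   # 6 of 8, i.e. 75%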
869:
and I'm afraid they'll just have to go. We have sourced comment giving context for just how good people in other sports would have to be to reach such an exalted SD, which will do just fine. Thanks all, especially Macgruder who responded very quickly to my request for help.
1450:
Am I missing something? If all are given with 50% probability then whatever colour hats the other people have won't affect the probability of guessing your own colour. The obvious answer seems to be that they should decide that just one of them should guess, giving a 50% chance. --
133:
Are you using glm or something from the MASS package? (Or something else???) Whichever you are, look at the help page for the relevant function, especially where it talks about "family". But I would suspect your hunch is true regardless, that the dispersion is assumed
1555:
Pure coincidence - I read the problem just as I was about to turn the computer off and go to sleep, attempted it, gave up, turned the computer off, immediately realised the answer and turned the computer back on again! (I'm not entirely sure why... bragging rights, I
1350:
Nope, not happening. At least one person has to guess, and they have a 50% chance of getting it wrong and blowing everything, so there is no way to improve on 50%. However, if the answer was 50%, you wouldn't have asked the question... Damn you... Let me sleep on it.
1315:
The situation is that there are 3 people, and they will enter the same room and receive a hat (either red or blue, each with a 50% chance). They cannot communicate whatsoever once in the room, but they can collaborate and determine a strategy before entering the
1251:. You have a triangle with one known angle and two known sides. The law of sines, combined with the fact that angles in a triangle add up to 180 degrees should be enough to determine all the other angles and sides. That will then give you the point in 795:
        0.2316419 * abs($x));
        $t *= $oor2pi * exp(-0.5 * $x * $x) * (0.31938153 + $t * (-0.356563782
            + $t * (1.781477937 + $t * (-1.821255978 + $t * 1.330274429))));
        if ($x <
146:
Check also the R-Help mailing list if you need to. The readers there are very helpful and knowledgeable—just make sure you read the manuals that came with your R distribution, the relevant function help pages, and search the archives of that list
1065:
circle, extending outwards as a line in the direction of the second circle. How does one find the point for the rightmost intersection of circle B's perimeter, expressed as a θ(b)? Ah, I will upload something to help illustrate my example! Right
666:
Davis includes the probabilities, but getting hold of a copy of the book has been a bit of a problem! Your comment of "'1 in somewhere between 147,000 and 233,000'"... does that derive from a log chart or something that I could reference?
650:: for example, the Don's probability should be '1 in somewhere between 147,000 and 233,000'. Secondly, they depend on the assumption that the underlying distribution is approximately normal. I do not know if this is a sensible assumption. 810:
If you link to that table it indicates that the calculations are by and large correct (if we consider 4.4 to be correct). There is no more need to cite an SD-to-probability conversion than an inches-to-centimetres conversion. It's a factual conversion.
906:
Although thinking about it, major championships is obviously not Normal. It will be heavily skewed towards zero, but some of the other stats like goals might be, and cricket scores might be too. I wish I could see the original
169:
Thanks! It is the glm.nb function from the MASS library; I was unaware of any ordinary GLMs being fitted using REML (although I have seen it used for lme and glmm models). I may have to send it over to r-help, thanks!
1416:
always be at least two people with the same colour. If you're a prisoner who can see two hats the same colour, or two hats of different colour, see if that affects your optimum strategy for guessing your own colour.)
1319:
Then at the exact same time each of them is to guess what color their hat is (or they may choose not to guess at all). If at least one person guesses correctly and nobody guesses wrong, they win a prize.
1401:
I'm having a hard time understanding the problem completely. Each one gets a hat (at the same time?), but can't see what color their own hat is? Can they see the color of each of the others' hats? --
1061:
Assume a circle (call it A if you desire) on a line with radius r. Constructed at 2r from its center point (rightwards direction) is an identical circle B of radius r. Then assume an angle θ in the
Maybe this is a red herring and was just a typo when you restated the problem, but I think you introduced a sign error at some point in the velocity of C too. At one point you said -1
59: 1312:
I’ve seen this problem, which is somewhat similar, cause much debate (and I remember it because I was convinced I had the right answer for several hours until I figured out otherwise).
1148: 196:? Since it IS a PDE, it is quite a lot of grunt work for not much benefit. But if you were to get (1/2)sin(3x)e^(3t)+(1/2)sin(3x)e^(-3t), I would be happy and grateful. My answer is 55: 1195: 1224: 1641:
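(For the record, the two answers quoted in this thread agree: the exponential form above is the hyperbolic-cosine form mentioned elsewhere, since (1/2)sin(3x)e^(3t) + (1/2)sin(3x)e^(-3t) = sin(3x) * (e^(3t) + e^(-3t))/2 = sin(3x)cosh(3t).)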
I saw more or less the same problem a while ago. Here's one form: you have (say) 1023 (this is a hint) people, each provided with a random hat as before. This time everyone
1290:
What kind of more complex situations did you have in mind? There are all kinds of situations in probability where intuition turns out to be wildly incorrect. Take a look at
Yeah, you're missing it. The external link given by Baccyak4H contains a pretty thorough spoiler which should have you slapping your forehead and saying "Of course!" --
1595:
So it's not really correct that events which aren't linked don't "influence" each other? You can get the flow of the event by looking at other manifestations of it?
1385:
covering the problem mentions someone going to bed after solving it and recognizing its relevance to coding theory as he fell asleep. And yes, the answer is 75%
1730:(5i - 3j) + 0.2(2i + 3j). Due to the conservation of momentum, the particle C has to have the same mass and momentum, right? So I set that expression equal to ( 25: 85:
193: 653:
As for sourcing, I doubt you'll be able to find these specific figures anywhere if they're not present in Davis' work (which I don't have access to).
1706:
kg and velocity (5i - 3j) m s^-1. Particle B has mass 0.2 kg and velocity (2i + 3j) m s^-1. The particles collide and form a single particle C with velocity (
1626:
Incidentally, is there a simple proof that I'm missing that the 75% strategy is optimal? I'm pretty sure it is, but I can't see how to prove it. --
865:
and I have just come to. The probabilities are just that bit too close to debatable (and are definitely very hard to source) for the strictures of
1536:
What if the prior was probability 1 for all hats being red? What is the chance that the strategy will work in that case? (end spoiler)
1226:), the latter by trigonometric identities.) Then it's just a simultaneous equation which may be able to be solved (haven't gone any further). 1275:
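(A sketch of finishing that simultaneous equation, in Python, using the same coordinates as above: the line y = (x + r)tan(theta) through (-r, 0), and the right-hand circle (x - r)^2 + y^2 = r^2. Substituting gives (1 + m^2)x^2 + 2r(m^2 - 1)x + m^2 r^2 = 0 with m = tan(theta); keep the larger root and convert to an angle about the right circle's centre. This is my own working, not from the thread, so treat it as unchecked.)

    import math

    def rightmost_intersection(r, theta):
        m = math.tan(theta)
        disc = 1.0 - 3.0 * m * m          # discriminant of the quadratic, divided by 4*r^2
        if disc < 0:
            raise ValueError("line misses the right-hand circle (|tan(theta)| > 1/sqrt(3))")
        x = r * ((1 - m * m) + math.sqrt(disc)) / (1 + m * m)   # rightmost root
        y = m * (x + r)
        theta_b = math.atan2(y, x - r)    # angle measured at the circle's centre (r, 0)
        return (x, y), theta_b

    point, theta_b = rightmost_intersection(1.0, math.radians(20))
    print(point, math.degrees(theta_b))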
I came across the Monty Hall problem some time ago, and I'm curious about how it applies to more complex situations. Anybody have a clue?
The Bradman one is so far out you need to find a special website that doesn't have rounding errors, but I calculated it to be about 1/184,000
646:
precise SD figures, but assuming the current figures are accurate to 1 decimal place only, the probability figures are being given far
Well, yes, but we were given the prior. I wasn't suggesting you were wrong, just that there's a less technical approach.
1237: 393: 748:
This is not 'original' research. This is a straight correspondence to the statistical meaning of Standard Deviation.
609:
edit was made, with no edit summary or discussion at the article talk page. It's never been challenged or referenced.
681:
I couldn't find precise enough normal distribution tables online (though they doubtless exist somewhere), so I used
1255:
with the centre of the second circle as origin, which you can then convert to Cartesian coordinates if you want. --
Was your sleeping comment intentional, or just a coincidence, with respect to the article I linked to above?
494: 1699:
Hopefully this is okay here, I have an exam tomorrow and I'm going over past papers and am very confused :(
1760:
You have two unknowns, but if you look carefully, you also have two equations - one for i and one for j. --
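(If it helps to see the two equations worked through: a sketch using sympy, taking A's mass as the unknown m and C's velocity as (k i + 1 j) as in the statement above; with -1 j instead you get a different pair, which is the sign point raised elsewhere in this thread.)

    import sympy as sp

    m, k = sp.symbols("m k", positive=True)

    # m*(5i - 3j) + 0.2*(2i + 3j) = (m + 0.2)*(k i + 1 j), taken component by component
    eq_i = sp.Eq(5*m + sp.Rational(2, 5), (m + sp.Rational(1, 5)) * k)
    eq_j = sp.Eq(-3*m + sp.Rational(3, 5), (m + sp.Rational(1, 5)) * 1)

    print(sp.solve([eq_i, eq_j], [m, k], dict=True))   # m = 1/10, k = 3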
1738:
i + 1j). But now I appear to have two unknowns and am thoroughly stuck. Any pointers would be appreciated.
579:
Cool, thanks. I would never have thought of using a hyperbolic trig function to express it; that's neat. --
The above poster has a good point in that 4.4 is anywhere between 4.35 and 4.45, which would give a range.
566: 192:
Heh...does anyone wanna help me out and solve the partial differential equation given in the last problem
strategy with 2/3 chance of winning, if you allow non-deterministic strategies, it should be doable. --
793:/* 4.35: 0.99999318792039 (1/147,000) 4.4: 0.99999458304695 (1/185,000) Agreeing with 4.4
The question is, given an optimal strategy, what is the probability that they will get the prize?
times a correct guess will be made, and one out of four times three incorrect guesses will be made. --
So I'm stuck. I don't know what figures to include or how to reference them. Please help! --
200:
solution to the PDE, but I'd like to know if it's the general solution, as was asked for. --
improvement over 50%. So look at this as a conditional probability problem, and consider
796:= 0) { $res = 1 - $t; } else { $res = $t; } } return $res; }
Two thoughts. You will want to make sure the fitting method for both models is the same,
114: 682: 626:
If we're so far at a "yes" and "yes" scenario, how can I possibly cite the probabilities?
1702:
Two particles, A and B, are moving on a smooth horizontal surface. Particle A has mass
showing what a (say) baseball batting average would need to be to be Bradmanesque. --
716:
You can work these out by counting how many standard deviations you are from the mean
1612: 1500: 1452: 1248: 937: 871: 866: 772: 668: 631: 1655:
This might work...although I haven’t tried to actually prove the optimality before
1294:
for one example. Also, if you haven't seen it already, we have an article on the
962:
For the field of integers mod p, I cannot find an inverse element for all cases.
0 is the only element with no multiplicative inverse. You've missed nothing. –
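(A quick numerical check in Python, if that helps convince you: when the modulus is prime, every nonzero residue has an inverse; pow(x, -1, p) computes it and only fails for x = 0.)

    p = 101  # any prime
    for x in range(1, p):
        inv = pow(x, -1, p)   # modular inverse (Python 3.8+)
        assert (x * inv) % p == 1
    print(f"all {p - 1} nonzero residues mod {p} are invertible")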
151:
posting a question there. If you don't, they can be quite, er, uncivil.  ;-)
74: 1535:
Right, but that still presupposes the prior. (spoiler follows)
1726:
I formed an expression for the total momentum of the two particles:
1432:
Interesting how the game show contestants became prisoners there. --
1521:
Or just write out all the possible combinations - worked for me! --
807:
which agrees with my calculations. Look at the 'far right' numbers
722:
More than 2 standard deviations above the mean is about 2.5% (1/40)
1197:(you can work this out: 2 points on the line are (-r, 0) and (0, 735:
http://photos1.blogger.com/blogger/4850/1908/1600/sd%20table1.jpg
605:, trying to get the article to FA. As long ago as Dec 2006, 79:
Welcome to the Knowledge Mathematics Reference Desk Archives
1365:
75%. Turning my computer off and going to sleep: Take 2. --
1036:
Hurf, didn't read the definition properly. Blargh burp
1004:
where x = 0 does not seem to work. What have I missed?
http://www.math.unb.ca/~knight/utility/NormTble.htm
620:
Do the probability figures derive from the SD ones?
1218: 1189: 1142: 996: 557: 484:{\displaystyle u(x,t)=c(x)\cosh(3t)+d(x)\sinh(3t)} 483: 382: 288: 249: 491:). The initial conditions narrow that down to 383:{\displaystyle u(x,t)=a(x)e^{3t}+b(x)e^{-3t}} 8: 726:http://davidmlane.com/hyperstat/z_table.html 1202: 1155: 1134: 1121: 1108: 1090: 979: 967: 496: 395: 368: 340: 301: 268: 262: 226: 220: 558:{\displaystyle u(x,t)=\sin(3x)\cosh(3t)} 49: 36: 65: 43: 1143:{\displaystyle (x-r)^{2}+y^{2}=r^{2}} 597:Standard deviations and probabilities 7: 1190:{\displaystyle y=(x+r)\tan \theta } 826:I'd like to reiterate that this is 32: 1247:Alternatively, you could use the 1695:Momentum and vectors - Mechanics 731:Or look at the x-numbers here: 728:Type 1.96 into the 'above' box 296:which has the general solution 215:I think you got it. Obviously 113:rounding error. Any ideas? -- 1175: 1163: 1105: 1092: 1085:The right circle has equation 861:That's exactly the conclusion 708:. The maths is beyond (I mean 552: 543: 534: 525: 513: 501: 478: 469: 460: 454: 445: 436: 427: 421: 412: 400: 361: 355: 333: 327: 318: 306: 1: 1219:{\displaystyle r\tan \theta } 188:Partial differential equation 33: 1815:, and at another you used +1 1673:It's not possible to have a 1484:knowledge results in 75%). 1070:I'm thankful for any help. 1057:Geometry question on angles 1847: 1150:and the line has equation 997:{\displaystyle x*x^{-1}=1} 766:) 13:28, 20 May 2008 (UTC) 565:just like you have it. -- 250:{\displaystyle u_{xx}=-9u} 1646:do for the problem here. 289:{\displaystyle u_{tt}=9u} 1829:05:02, 21 May 2008 (UTC) 1805:11:47, 21 May 2008 (UTC) 1791:23:12, 20 May 2008 (UTC) 1770:23:03, 20 May 2008 (UTC) 1755:22:52, 20 May 2008 (UTC) 1687:18:25, 21 May 2008 (UTC) 1669:16:42, 21 May 2008 (UTC) 1651:12:53, 21 May 2008 (UTC) 1636:11:46, 21 May 2008 (UTC) 1621:10:54, 21 May 2008 (UTC) 1605:08:40, 21 May 2008 (UTC) 1568:18:30, 21 May 2008 (UTC) 1549:16:37, 21 May 2008 (UTC) 1531:15:17, 21 May 2008 (UTC) 1517:14:28, 21 May 2008 (UTC) 1494:10:52, 21 May 2008 (UTC) 1479:07:58, 21 May 2008 (UTC) 1461:07:36, 21 May 2008 (UTC) 1446:07:10, 21 May 2008 (UTC) 1428:06:42, 21 May 2008 (UTC) 1411:04:55, 21 May 2008 (UTC) 1395:04:11, 21 May 2008 (UTC) 1375:23:02, 20 May 2008 (UTC) 1361:22:50, 20 May 2008 (UTC) 1346:22:44, 20 May 2008 (UTC) 1332:22:36, 20 May 2008 (UTC) 1308:21:55, 20 May 2008 (UTC) 1285:21:42, 20 May 2008 (UTC) 1265:16:43, 20 May 2008 (UTC) 1243:16:34, 20 May 2008 (UTC) 1080:16:09, 20 May 2008 (UTC) 1046:12:14, 20 May 2008 (UTC) 1028:11:29, 20 May 2008 (UTC) 1014:11:10, 20 May 2008 (UTC) 946:15:06, 20 May 2008 (UTC) 924:14:53, 20 May 2008 (UTC) 880:14:37, 20 May 2008 (UTC) 853:14:25, 20 May 2008 (UTC) 836:14:15, 20 May 2008 (UTC) 821:14:09, 20 May 2008 (UTC) 781:13:46, 20 May 2008 (UTC) 740:So Pele is 3.7 ---: --> 694:11:48, 20 May 2008 (UTC) 677:11:36, 20 May 2008 (UTC) 658:11:30, 20 May 2008 (UTC) 640:10:41, 20 May 2008 (UTC) 589:02:21, 21 May 2008 (UTC) 575:05:59, 20 May 2008 (UTC) 210:02:42, 20 May 2008 (UTC) 180:20:39, 20 May 2008 (UTC) 161:03:20, 20 May 2008 (UTC) 123:01:57, 20 May 2008 (UTC) 104:Neg. Binom. GLM question 18:Knowledge:Reference desk 1418:Confusing Manifestation 1384:</SPOILER ALERT: --> 712:beyond) my poor brain: 1380:<SPOILER ALERT: --> 1336:Give me a minute... -- 1220: 1191: 1144: 998: 831:whether or not it is. 
769: 742:1/9259 (about 1/9300) 559: 485: 384: 290: 251: 87:current reference desk 1221: 1192: 1145: 999: 863:User:The Rambling Man 800:refer to this table : 714: 616:So, three questions: 560: 486: 385: 291: 252: 1774:I love you tango :) 1292:Prosecutor's fallacy 1201: 1154: 1089: 966: 495: 394: 300: 261: 219: 1795:I love you, too! -- 704:I had a reply from 687:normal distribution 601:Hi. I'm working on 1710:i - 1j) ms, where 1296:Monty Hall problem 1216: 1187: 1140: 994: 741:0.000108 ----: --> 623:Are they accurate? 555: 481: 380: 286: 247: 1789: 1753: 1557: 1539: 1426: 1271:Game show theorem 1253:polar coordinates 767: 754:comment added by 390:(or equivalently 93: 92: 73: 72: 1838: 1786: 1780: 1775: 1750: 1744: 1739: 1554: 1537: 1420: 1241: 1240: 1225: 1223: 1222: 1217: 1196: 1194: 1193: 1188: 1149: 1147: 1146: 1141: 1139: 1138: 1126: 1125: 1113: 1112: 1003: 1001: 1000: 995: 987: 986: 789:You can use this 749: 662:Thanks. I don't 581:M1ss1ontomars2k4 564: 562: 561: 556: 490: 488: 487: 482: 389: 387: 386: 381: 379: 378: 348: 347: 295: 293: 292: 287: 276: 275: 256: 254: 253: 248: 234: 233: 202:M1ss1ontomars2k4 75: 38:Mathematics desk 34: 1846: 1845: 1841: 1840: 1839: 1837: 1836: 1835: 1821:Prestidigitator 1784: 1778: 1748: 1742: 1714:is a constant. 1697: 1403:Prestidigitator 1273: 1233: 1227: 1199: 1198: 1152: 1151: 1130: 1117: 1104: 1087: 1086: 1059: 975: 964: 963: 960: 797: 599: 567:Prestidigitator 493: 492: 392: 391: 364: 336: 298: 297: 264: 259: 258: 222: 217: 216: 190: 106: 101: 30: 29: 28: 12: 11: 5: 1844: 1842: 1834: 1833: 1832: 1831: 1809: 1808: 1807: 1724: 1723: 1696: 1693: 1692: 1691: 1690: 1689: 1656: 1653: 1624: 1623: 1593: 1592: 1591: 1590: 1589: 1588: 1587: 1586: 1585: 1584: 1583: 1582: 1581: 1580: 1579: 1578: 1577: 1576: 1575: 1574: 1573: 1572: 1571: 1570: 1399: 1398: 1397: 1363: 1348: 1320: 1317: 1313: 1272: 1269: 1268: 1267: 1245: 1215: 1212: 1209: 1206: 1186: 1183: 1180: 1177: 1174: 1171: 1168: 1165: 1162: 1159: 1137: 1133: 1129: 1124: 1120: 1116: 1111: 1107: 1103: 1100: 1097: 1094: 1058: 1055: 1053: 1051: 1050: 1049: 1048: 1038:81.187.252.174 1031: 1030: 1006:81.187.252.174 993: 990: 985: 982: 978: 974: 971: 959: 956: 955: 954: 953: 952: 951: 950: 949: 948: 913: 912: 911: 910: 909: 908: 899: 898: 897: 896: 895: 894: 885: 884: 883: 882: 856: 855: 839: 838: 802: 801: 792: 791: 790: 787: 739: 706:User:Macgruder 703: 701: 700: 699: 698: 697: 696: 685:, linked from 651: 628: 627: 624: 621: 603:Donald Bradman 598: 595: 594: 593: 592: 591: 554: 551: 548: 545: 542: 539: 536: 533: 530: 527: 524: 521: 518: 515: 512: 509: 506: 503: 500: 480: 477: 474: 471: 468: 465: 462: 459: 456: 453: 450: 447: 444: 441: 438: 435: 432: 429: 426: 423: 420: 417: 414: 411: 408: 405: 402: 399: 377: 374: 371: 367: 363: 360: 357: 354: 351: 346: 343: 339: 335: 332: 329: 326: 323: 320: 317: 314: 311: 308: 305: 285: 282: 279: 274: 271: 267: 246: 243: 240: 237: 232: 229: 225: 189: 186: 185: 184: 183: 182: 164: 163: 143: 142: 135: 130: 129: 105: 102: 100: 97: 95: 91: 90: 82: 81: 71: 70: 64: 48: 41: 40: 31: 15: 14: 13: 10: 9: 6: 4: 3: 2: 1843: 1830: 1826: 1822: 1818: 1814: 1810: 1806: 1802: 1798: 1794: 1793: 1792: 1788: 1787: 1781: 1773: 1772: 1771: 1767: 1763: 1759: 1758: 1757: 1756: 1752: 1751: 1745: 1737: 1733: 1729: 1721: 1717: 1716: 1715: 1713: 1709: 1705: 1700: 1694: 1688: 1684: 1680: 1676: 1675:deterministic 1672: 1671: 1670: 1666: 1662: 1657: 1654: 1652: 1649: 1644: 1640: 1639: 1638: 1637: 1633: 1629: 1622: 1618: 1614: 1609: 1608: 1607: 1606: 1602: 1598: 1569: 1565: 1561: 1552: 1551: 1550: 1546: 1542: 
1534: 1533: 1532: 1528: 1524: 1520: 1519: 1518: 1514: 1510: 1506: 1502: 1497: 1496: 1495: 1491: 1487: 1482: 1481: 1480: 1476: 1472: 1468: 1464: 1463: 1462: 1458: 1454: 1449: 1448: 1447: 1443: 1439: 1435: 1431: 1430: 1429: 1424: 1419: 1414: 1413: 1412: 1408: 1404: 1400: 1396: 1392: 1388: 1383: 1378: 1377: 1376: 1372: 1368: 1364: 1362: 1358: 1354: 1349: 1347: 1343: 1339: 1335: 1334: 1333: 1329: 1325: 1321: 1318: 1314: 1311: 1310: 1309: 1305: 1301: 1297: 1293: 1289: 1288: 1287: 1286: 1282: 1278: 1270: 1266: 1262: 1258: 1254: 1250: 1246: 1244: 1239: 1236: 1232: 1231: 1213: 1210: 1207: 1204: 1184: 1181: 1178: 1172: 1169: 1166: 1160: 1157: 1135: 1131: 1127: 1122: 1118: 1114: 1109: 1101: 1098: 1095: 1084: 1083: 1082: 1081: 1077: 1073: 1069: 1064: 1056: 1054: 1047: 1043: 1039: 1035: 1034: 1033: 1032: 1029: 1026: 1025: 1024: 1018: 1017: 1016: 1015: 1011: 1007: 991: 988: 983: 980: 976: 972: 969: 957: 947: 943: 939: 934: 933: 932: 931: 930: 929: 928: 927: 926: 925: 921: 917: 905: 904: 903: 902: 901: 900: 891: 890: 889: 888: 887: 886: 881: 877: 873: 868: 864: 860: 859: 858: 857: 854: 850: 846: 841: 840: 837: 834: 829: 825: 824: 823: 822: 818: 814: 808: 806: 799: 798: 788: 785: 784: 783: 782: 778: 774: 768: 765: 761: 757: 753: 746: 743: 737: 736: 732: 729: 727: 723: 720: 717: 713: 711: 707: 695: 692: 688: 684: 680: 679: 678: 674: 670: 665: 661: 660: 659: 656: 652: 649: 648:too precisely 644: 643: 642: 641: 637: 633: 625: 622: 619: 618: 617: 614: 610: 608: 604: 596: 590: 586: 582: 578: 577: 576: 572: 568: 549: 546: 540: 537: 531: 528: 522: 519: 516: 510: 507: 504: 498: 475: 472: 466: 463: 457: 451: 448: 442: 439: 433: 430: 424: 418: 415: 409: 406: 403: 397: 375: 372: 369: 365: 358: 352: 349: 344: 341: 337: 330: 324: 321: 315: 312: 309: 303: 283: 280: 277: 272: 269: 265: 244: 241: 238: 235: 230: 227: 223: 214: 213: 212: 211: 207: 203: 199: 195: 187: 181: 177: 173: 168: 167: 166: 165: 162: 158: 154: 150: 145: 144: 140: 136: 132: 131: 127: 126: 125: 124: 120: 116: 110: 103: 98: 96: 88: 84: 83: 80: 77: 76: 68: 61: 57: 53: 47: 42: 39: 35: 27: 23: 19: 1816: 1812: 1776: 1740: 1735: 1731: 1727: 1725: 1719: 1711: 1707: 1703: 1701: 1698: 1674: 1642: 1625: 1597:Bastard Soap 1594: 1501:Thomas Bayes 1277:Bastard Soap 1274: 1249:law of sines 1229: 1062: 1060: 1052: 1021: 1020: 961: 914: 827: 809: 803: 770: 747: 744: 738: 733: 730: 724: 721: 719:For example 718: 715: 709: 702: 663: 629: 615: 611: 600: 197: 191: 148: 138: 128:Suggestions. 111: 107: 94: 78: 750:—Preceding 26:Mathematics 1718:Show that 1648:Algebraist 833:Algebraist 691:Algebraist 655:Algebraist 172:TeaDrinker 115:TeaDrinker 1661:GromXXVII 1541:Baccyak4H 1509:Baccyak4H 1486:GromXXVII 1467:tcsetattr 1434:tcsetattr 1387:Baccyak4H 1324:GromXXVII 916:Macgruder 845:Gandalf61 813:Macgruder 756:Macgruder 683:this tool 630:Cheers -- 153:Baccyak4H 109:correct? 50:<< 1475:contribs 1442:contribs 1023:King Bee 764:contribs 752:unsigned 24:‎ | 22:Archives 20:‎ | 1734:+ 0.2)( 1613:Q Chris 1556:guess.) 1453:Q Chris 1423:Say hi! 1382:article 1072:Scaller 938:Dweller 907:source. 872:Dweller 773:Dweller 669:Dweller 632:Dweller 89:pages. 1779:naerii 1743:naerii 1505:friend 1230:x42bn6 958:Fields 149:before 134:fixed. 99:May 20 67:May 21 46:May 19 1819:. -- 1797:Tango 1762:Tango 1722:= 0.1 1679:Tango 1628:Tango 1560:Tango 1523:Tango 1503:your 1367:Tango 1353:Tango 1338:Tango 1316:room. 1300:Tango 1257:Tango 1068:here! 
1063:first 867:WP:FA 664:think 257:, so 69:: --> 63:: --> 62:: --> 44:< 16:< 1825:talk 1801:talk 1785:talk 1766:talk 1749:talk 1683:talk 1665:talk 1643:must 1632:talk 1617:talk 1601:talk 1564:talk 1545:Yak! 1527:talk 1513:Yak! 1507:... 1490:talk 1471:talk 1457:talk 1438:talk 1407:talk 1391:Yak! 1371:talk 1357:talk 1342:talk 1328:talk 1304:talk 1298:. -- 1281:talk 1261:talk 1238:Mess 1235:Talk 1076:talk 1042:talk 1010:talk 942:talk 920:talk 876:talk 849:talk 817:talk 777:talk 760:talk 673:talk 636:talk 607:this 585:talk 571:talk 538:cosh 464:sinh 431:cosh 206:talk 194:here 176:talk 157:Yak! 139:i.e. 119:talk 1208:tan 1179:tan 828:not 710:way 520:sin 60:Jun 56:May 52:Apr 1827:) 1803:) 1782:- 1768:) 1746:- 1685:) 1667:) 1634:) 1619:) 1603:) 1566:) 1558:-- 1547:) 1529:) 1515:) 1492:) 1477:) 1473:/ 1459:) 1444:) 1440:/ 1409:) 1393:) 1373:) 1359:) 1351:-- 1344:) 1330:) 1306:) 1283:) 1263:) 1214:θ 1211:⁥ 1185:θ 1182:⁥ 1099:− 1078:) 1044:) 1012:) 981:− 973:∗ 944:) 922:) 878:) 870:-- 851:) 819:) 779:) 762:• 689:. 675:) 667:-- 638:) 587:) 573:) 541:⁥ 523:⁥ 467:⁥ 434:⁥ 370:− 239:− 208:) 178:) 159:) 121:) 58:| 54:| 1823:( 1817:j 1813:j 1799:( 1764:( 1736:k 1732:m 1728:m 1720:m 1712:k 1708:k 1704:m 1681:( 1663:( 1630:( 1615:( 1599:( 1562:( 1543:( 1525:( 1511:( 1488:( 1469:( 1455:( 1436:( 1425:) 1421:( 1405:( 1389:( 1369:( 1355:( 1340:( 1326:( 1302:( 1279:( 1259:( 1205:r 1176:) 1173:r 1170:+ 1167:x 1164:( 1161:= 1158:y 1136:2 1132:r 1128:= 1123:2 1119:y 1115:+ 1110:2 1106:) 1102:r 1096:x 1093:( 1074:( 1040:( 1008:( 992:1 989:= 984:1 977:x 970:x 940:( 918:( 874:( 847:( 815:( 775:( 758:( 671:( 634:( 583:( 569:( 553:) 550:t 547:3 544:( 535:) 532:x 529:3 526:( 517:= 514:) 511:t 508:, 505:x 502:( 499:u 479:) 476:t 473:3 470:( 461:) 458:x 455:( 452:d 449:+ 446:) 443:t 440:3 437:( 428:) 425:x 422:( 419:c 416:= 413:) 410:t 407:, 404:x 401:( 398:u 376:t 373:3 366:e 362:) 359:x 356:( 353:b 350:+ 345:t 342:3 338:e 334:) 331:x 328:( 325:a 322:= 319:) 316:t 313:, 310:x 307:( 304:u 284:u 281:9 278:= 273:t 270:t 266:u 245:u 242:9 236:= 231:x 228:x 224:u 204:( 198:a 174:( 170:- 155:( 117:(
