Frisch–Waugh–Lovell theorem

In econometrics, the Frisch–Waugh–Lovell (FWL) theorem is named after the econometricians Ragnar Frisch, Frederick V. Waugh, and Michael C. Lovell.

The Frisch–Waugh–Lovell theorem states that if the regression we are concerned with is expressed in terms of two separate sets of predictor variables:

{\displaystyle Y=X_{1}\beta _{1}+X_{2}\beta _{2}+u}

where X_1 and X_2 are matrices, β_1 and β_2 are vectors (and u is the error term), then the estimate of β_2 will be the same as the estimate of it from a modified regression of the form:

{\displaystyle M_{X_{1}}Y=M_{X_{1}}X_{2}\beta _{2}+M_{X_{1}}u,}

where M_{X_1} projects onto the orthogonal complement of the image of the projection matrix X_1(X_1^T X_1)^{-1} X_1^T. Equivalently, M_{X_1} projects onto the orthogonal complement of the column space of X_1. Specifically,

{\displaystyle M_{X_{1}}=I-X_{1}(X_{1}^{\mathsf {T}}X_{1})^{-1}X_{1}^{\mathsf {T}},}

and this particular orthogonal projection matrix is known as the residual maker matrix or annihilator matrix. The vector M_{X_1}Y is the vector of residuals from the regression of Y on the columns of X_1.
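The properties of the residual maker matrix can be checked directly. The following is a minimal numpy sketch (the variable names and simulated data are our own, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
X1 = rng.normal(size=(n, 3))

# Residual maker (annihilator) matrix: M_X1 = I - X1 (X1' X1)^{-1} X1'
M_X1 = np.eye(n) - X1 @ np.linalg.inv(X1.T @ X1) @ X1.T

# M_X1 is symmetric and idempotent (an orthogonal projection)...
assert np.allclose(M_X1, M_X1.T)
assert np.allclose(M_X1 @ M_X1, M_X1)

# ...and it annihilates the columns of X1:
assert np.allclose(M_X1 @ X1, 0)

# Applied to Y, it returns the residuals from regressing Y on X1:
Y = rng.normal(size=n)
beta_hat, *_ = np.linalg.lstsq(X1, Y, rcond=None)
assert np.allclose(M_X1 @ Y, Y - X1 @ beta_hat)
```

In practice one would compute M_X1 @ Y via a least-squares solve rather than forming the n × n matrix explicitly; the explicit form above simply mirrors the definition.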
The most relevant consequence of the theorem is that the parameters in β_2 do not apply to X_2 but to M_{X_1}X_2, that is, to the part of X_2 uncorrelated with X_1. This is the basis for understanding the contribution of each single variable to a multivariate regression (see, for instance, Ch. 13 in Mosteller and Tukey 1977).

The theorem also implies that the secondary regression used for obtaining M_{X_1} is unnecessary when the predictor variables are uncorrelated: using projection matrices to make the explanatory variables orthogonal to each other leads to the same results as running the regression with all of the non-orthogonal explanators included.

Moreover, the standard errors from the partial regression equal those from the full regression.
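The equality of the estimates can be verified numerically. Below is a minimal sketch in numpy (the data-generating process and names are our own illustrative choices, not from the source):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
X1 = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # includes intercept
X2 = rng.normal(size=(n, 2))
X = np.hstack([X1, X2])
Y = X @ np.array([1.0, 2.0, -1.0, 0.5, 3.0]) + rng.normal(size=n)

# Full regression: Y on [X1, X2]
beta_full, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Partial (FWL) regression: residualize Y and X2 on X1, then regress
M1 = np.eye(n) - X1 @ np.linalg.inv(X1.T @ X1) @ X1.T
beta2_fwl, *_ = np.linalg.lstsq(M1 @ X2, M1 @ Y, rcond=None)

# The X2 coefficients from the two regressions coincide...
assert np.allclose(beta_full[3:], beta2_fwl)

# ...and so do the residual vectors of the two regressions
resid_full = Y - X @ beta_full
resid_fwl = M1 @ Y - (M1 @ X2) @ beta2_fwl
assert np.allclose(resid_full, resid_fwl)
```

Because the residual vectors coincide, the usual standard-error formulas agree as well once the same residual degrees of freedom are used in both computations.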
History

The origin of the theorem is uncertain, but it was well established in the realm of linear regression before the Frisch and Waugh paper. George Udny Yule's comprehensive analysis of partial regressions, published in 1907, included the theorem in section 9 on page 184. Yule emphasized the theorem's importance for understanding multiple and partial regression and correlation coefficients, as mentioned in section 10 of the same paper.

By 1933, Yule's findings were generally recognized, thanks in part to his detailed discussion of partial correlation and the innovative notation he introduced in 1907. The theorem, later associated with Frisch, Waugh, and Lovell, was also included in chapter 10 of Yule's successful statistics textbook, first published in 1911; the book reached its tenth edition by 1932.

In a 1931 paper co-authored with Mudgett, Frisch cited Yule's results: Yule's formulas for partial regressions were quoted and explicitly attributed to him in order to rectify a misquotation by another author. Although Yule was not explicitly mentioned in the 1933 paper by Frisch and Waugh, they utilized the notation for partial regression coefficients introduced by Yule in 1907, which was widely accepted by 1933.

In 1963, Lovell published a proof considered more straightforward and intuitive; in recognition, his name is generally added to the name of the theorem.
References

Frisch, Ragnar; Waugh, Frederick V. (1933). "Partial Time Regressions as Compared with Individual Trends". Econometrica. 1 (4): 387–401. doi:10.2307/1907330. JSTOR 1907330.
Lovell, M. (1963). "Seasonal Adjustment of Economic Time Series and Multiple Regression Analysis". Journal of the American Statistical Association. 58 (304): 993–1010. doi:10.1080/01621459.1963.10480682.
Lovell, M. (2008). "A Simple Proof of the FWL Theorem". Journal of Economic Education. 39 (1): 88–91. doi:10.3200/JECE.39.1.88-91. S2CID 154907484.
Davidson, James (2000). Econometric Theory. Malden: Blackwell. p. 7. ISBN 0-631-21584-0.
Hayashi, Fumio (2000). Econometrics. Princeton: Princeton University Press. pp. 18–19. ISBN 0-691-01018-8.
Peng, Ding (2021). "The Frisch–Waugh–Lovell theorem for standard errors". Statistics and Probability Letters. 168: 108945.
Mosteller, F.; Tukey, J. W. (1977). Data Analysis and Regression: a Second Course in Statistics. Addison-Wesley.
Yule, George Udny (1907). "On the Theory of Correlation for any Number of Variables, Treated by a New System of Notation". Proceedings of the Royal Society A. 79 (529): 182–193. doi:10.1098/rspa.1907.0028. hdl:2027/coo.31924081088423.
Yule, George Udny (1932). An Introduction to the Theory of Statistics (10th ed.). London: Charles Griffin & Co.
Frisch, Ragnar; Mudgett, B. D. (1931). "Statistical Correlation and the Theory of Cluster Types". Journal of the American Statistical Association. 26 (176): 375–392. doi:10.1080/01621459.1931.10502225.

Further reading

Davidson, Russell; MacKinnon, James G. (1993). Estimation and Inference in Econometrics. New York: Oxford University Press. pp. 19–24. ISBN 0-19-506011-3.
Davidson, Russell; MacKinnon, James G. (2004). Econometric Theory and Methods. New York: Oxford University Press. pp. 62–75. ISBN 0-19-512372-7.
Hastie, Trevor; Tibshirani, Robert; Friedman, Jerome (2017). "Multiple Regression from Simple Univariate Regression". The Elements of Statistical Learning: Data Mining, Inference, and Prediction (2nd ed.). New York: Springer. pp. 52–55. ISBN 978-0-387-84857-0.
Ruud, P. A. (2000). An Introduction to Classical Econometric Theory. New York: Oxford University Press. pp. 54–60. ISBN 0-19-511164-8.
Stachurski, John (2016). A Primer in Econometric Theory. MIT Press. pp. 311–314. ISBN 9780262337465.