Knowledge talk:WikiProject Mathematics/equivlist

The parameters and variables of factor analysis can be given a geometrical interpretation. The data ($z_{ai}$), the factors ($F_{pi}$) and the errors ($\varepsilon_{ai}$) can be viewed as vectors in an $N_i$-dimensional Euclidean space (sample space), represented as $\mathbf{z}_a$, $\mathbf{F}_p$ and $\boldsymbol{\varepsilon}_a$ respectively. Since the data is standardized, the data vectors are of unit length ($\mathbf{z}_a \cdot \mathbf{z}_a = 1$). The factor vectors define an $N_p$-dimensional linear subspace (i.e. a hyperplane) in this space, upon which the data vectors are projected orthogonally. This follows from the model equation

$$\mathbf{z}_a = \sum_p \ell_{ap}\mathbf{F}_p + \boldsymbol{\varepsilon}_a$$

and the independence of the factors and the errors: $\mathbf{F}_p \cdot \boldsymbol{\varepsilon}_a = 0$. In the above example, the hyperplane is just a 2-dimensional plane defined by the two factor vectors. The projection of the data vectors onto the hyperplane is given by

$$\hat{\mathbf{z}}_a = \sum_p \ell_{ap}\mathbf{F}_p$$

and the errors are vectors from that projected point to the data point and are perpendicular to the hyperplane.
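To make the geometry concrete, the following is a minimal numerical sketch (hypothetical data and variable names; NumPy is assumed): two orthonormal factor vectors span the fitting hyperplane, standardized data vectors are built from the model equation, and the projections $\hat{\mathbf{z}}_a$ are recovered from the factor vectors.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_factors, n_vars = 100, 2, 5   # N_i, N_p, number of observed variables

# Two orthonormal factor vectors F_p spanning the fitting hyperplane.
F = np.linalg.qr(rng.normal(size=(n_samples, n_factors)))[0].T

# Hypothetical loadings; the error vectors are forced to be perpendicular
# to the hyperplane so that F_p . eps_a = 0 holds exactly.
load = rng.normal(size=(n_vars, n_factors))
eps = 0.3 * rng.normal(size=(n_vars, n_samples))
eps -= (eps @ F.T) @ F

# Model equation z_a = sum_p l_ap F_p + eps_a, standardized to unit length.
z = load @ F + eps
z /= np.linalg.norm(z, axis=1, keepdims=True)

# Orthogonal projection onto the hyperplane: z_hat_a = sum_p (z_a . F_p) F_p.
z_hat = (z @ F.T) @ F
```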
The goal of factor analysis is to find a hyperplane which is a "best fit" to the data in some sense. Any set of factor vectors which lie in the hyperplane and are independent will serve to define the hyperplane, so we are free to specify them as both orthogonal and normal ($\mathbf{F}_p \cdot \mathbf{F}_q = \delta_{pq}$) with no loss of generality. After a suitable set of factors is found, they may also be arbitrarily rotated within the hyperplane, so that any such rotation of the factor vectors will define the same hyperplane and also be a solution. As a result, in the above example, in which the fitting hyperplane is two-dimensional, if we do not know beforehand that the two types of intelligence are uncorrelated, then we cannot interpret the two factors as the two different types of intelligence. Even if they are uncorrelated, we cannot tell which factor corresponds to verbal intelligence and which corresponds to mathematical intelligence, or whether the factors are linear combinations of both, without an outside argument.

The data vectors $\mathbf{z}_a$ have unit length. The correlation matrix for the data is given by

$$r_{ab} = \mathbf{z}_a \cdot \mathbf{z}_b$$

Each element of the correlation matrix can be geometrically interpreted as the cosine of the angle between the two data vectors $\mathbf{z}_a$ and $\mathbf{z}_b$. The diagonal elements will clearly be 1's and the off-diagonal elements will have absolute values less than or equal to unity. The "reduced correlation matrix" is defined as

$$\hat{r}_{ab} = \hat{\mathbf{z}}_a \cdot \hat{\mathbf{z}}_b$$
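Continuing the sketch above, both matrices are simply Gram matrices, of the data vectors and of their projections respectively:

```python
# Correlation matrix r_ab = z_a . z_b: cosines of the angles between data vectors.
r = z @ z.T
# Reduced correlation matrix r_hat_ab = z_hat_a . z_hat_b.
r_hat = z_hat @ z_hat.T

assert np.allclose(np.diag(r), 1.0)      # unit diagonal from standardization
assert np.all(np.abs(r) <= 1.0 + 1e-9)   # cosines are bounded by unity
```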
The goal of factor analysis is to choose the fitting hyperplane such that the reduced correlation matrix reproduces the correlation matrix as nearly as possible, except for the diagonal elements of the correlation matrix, which are known to have unit value. In other words, the goal is to reproduce as accurately as possible the cross-correlations in the data. Specifically, for the fitting hyperplane, the mean square error in the off-diagonal components

$$\varepsilon^2 = \sum_{a,\,b\neq a}\left(r_{ab} - \hat{r}_{ab}\right)^2$$

is to be minimized, and this is accomplished by minimizing it with respect to a set of orthonormal factor vectors. It can be seen that

$$r_{ab} - \hat{r}_{ab} = \boldsymbol{\varepsilon}_a \cdot \boldsymbol{\varepsilon}_b$$

The term on the right is just the covariance of the errors. In the model, the error covariance is stated to be a diagonal matrix, and so the above minimization problem will in fact yield a "best fit" to the model: it will yield a sample estimate of the error covariance which has its off-diagonal components minimized in the mean square sense.
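Continuing the sketch, both the identity and the objective can be checked numerically; because the error vectors are perpendicular to the hyperplane, the cross terms vanish and the residual equals the error covariance exactly:

```python
eps_vec = z - z_hat                            # error vectors
assert np.allclose(r - r_hat, eps_vec @ eps_vec.T)

# Objective: sum of squared off-diagonal residuals.
off_diag = ~np.eye(n_vars, dtype=bool)
eps_sq = np.sum((r - r_hat)[off_diag] ** 2)
```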
It can be seen that since the $\hat{\mathbf{z}}_a$ are orthogonal projections of the data vectors, their length will be less than or equal to the length of the data vector being projected, which is unity. The squares of these lengths are just the diagonal elements of the reduced correlation matrix. These diagonal elements of the reduced correlation matrix are known as "communalities":

$$h_a^2 = \hat{\mathbf{z}}_a \cdot \hat{\mathbf{z}}_a = \sum_p \ell_{ap}\ell_{ap}$$

Large values of the communalities indicate that the fitting hyperplane is rather accurately reproducing the correlation matrix.
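In the sketch, the communalities can be read off either from the loadings or from the diagonal of the reduced correlation matrix:

```python
loadings = z @ F.T                        # l_ap = z_a . F_p (orthonormal factors)
h_sq = np.sum(loadings ** 2, axis=1)      # h_a^2 = sum_p l_ap^2
assert np.allclose(h_sq, np.diag(r_hat))  # diagonal of the reduced correlation matrix
assert np.all(h_sq <= 1.0 + 1e-9)         # a projection is no longer than its vector
```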
The optimization problem stated above is intractable without high-speed computation. Before the advent of high-speed computers, considerable effort was made to arrive at approximate solutions to the problem, particularly in estimating the communalities by other means, which then simplifies the problem considerably. With the advent of high-speed computers, the minimization problem can be solved quickly and directly, and the communalities are calculated in the process rather than being needed beforehand. The MinRes algorithm is particularly suited to this problem, but is hardly the only means of finding an exact solution.
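The following is not the historical MinRes procedure itself, only a generic least-squares sketch of the same objective (SciPy's `least_squares` and the reuse of `r` and `n_vars` from the sketch above are assumptions): choose loadings $L$ so that the off-diagonal entries of $R - LL^{\mathsf{T}}$ are as small as possible; the communalities then fall out as row sums of squared loadings.

```python
import numpy as np
from scipy.optimize import least_squares

def offdiag_residuals(flat, r, n_vars, n_factors):
    """Upper-triangular residuals of r - L L^T; the diagonal is left free."""
    L = flat.reshape(n_vars, n_factors)
    return (r - L @ L.T)[np.triu_indices(n_vars, k=1)]

fit = least_squares(offdiag_residuals, 0.5 * np.ones(n_vars * n_factors),
                    args=(r, n_vars, n_factors))
L_fit = fit.x.reshape(n_vars, n_factors)
communalities = np.sum(L_fit ** 2, axis=1)  # obtained in the process, not needed beforehand
```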
