Oja's rule

Oja's rule is suited in particular to problems in which feature extraction is of primary interest. It therefore has an important place in image and speech processing. It is also useful because it extends easily to higher dimensions of processing and can thus integrate multiple outputs quickly. A canonical example is its use in binocular vision.
Hebb's rule has synaptic weights approaching infinity with a positive learning rate. We can stop this by normalizing the weights so that each weight's magnitude is restricted between 0, corresponding to no weight, and 1, corresponding to being the only input neuron with any weight. We do this by normalizing the weight vector to be of length one:

w_i(n+1) = \frac{w_i(n) + \eta\, y(\mathbf{x})\, x_i}{\left( \sum_{j=1}^{m} \left[ w_j(n) + \eta\, y(\mathbf{x})\, x_j \right]^p \right)^{1/p}}

For a small learning rate |\eta| \ll 1, this expression can be expanded as a power series in \eta:

w_i(n+1) = \frac{w_i(n)}{\left( \sum_j w_j^p(n) \right)^{1/p}} + \eta \left( \frac{y x_i}{\left( \sum_j w_j^p(n) \right)^{1/p}} - \frac{w_i(n) \sum_j y x_j\, w_j^{p-1}(n)}{\left( \sum_j w_j^p(n) \right)^{1+1/p}} \right) + O(\eta^2)
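The expansion can be checked numerically in a few lines. The sketch below is only an illustration, assuming p = 2 normalization and an arbitrary learning rate, sample and dimensionality that are not part of the original derivation; it compares one explicitly renormalized Hebbian step with the first-order update given by the expansion.

```python
import numpy as np

# Minimal numerical check of the first-order expansion (p = 2); the data,
# learning rate and dimensionality are illustrative assumptions.
rng = np.random.default_rng(0)
x = rng.normal(size=5)                 # one input sample
w = rng.normal(size=5)
w /= np.linalg.norm(w)                 # unit-length starting weights
eta = 0.01                             # small learning rate, |eta| << 1

y = w @ x                              # linear neuron output

hebb = w + eta * y * x                        # unnormalized Hebbian step
w_explicit = hebb / np.linalg.norm(hebb)      # explicit 2-norm renormalization
w_oja = w + eta * y * (x - y * w)             # first-order (Oja) approximation

print(np.max(np.abs(w_explicit - w_oja)))     # difference is O(eta**2)
```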
There is clear evidence for both long-term potentiation and long-term depression in biological neural networks, along with a normalization effect in both input weights and neuron outputs. However, while there is no direct experimental evidence yet of Oja's rule active in a biological neural network, a biophysical derivation of a generalization of the rule is possible.
By taking the pre- and post-synaptic functions (defined below) into frequency space and combining the integration terms with the convolution, we find that this gives an arbitrary-dimensional generalization of Oja's rule known as Oja's Subspace, namely

\Delta w = Cx \cdot w - w \cdot Cy
These results are derived using a Lyapunov function analysis, and they show that Oja's neuron necessarily converges on strictly the first principal component if certain conditions are met in our original learning rule. The most important of these conditions concern the learning rate and the activation function, described below.
The biophysically derived generalization of Oja's rule takes the form

\Delta w_{ij} \propto \langle x_i y_j \rangle - \epsilon \left\langle \left( c_{\mathrm{pre}} * \sum_k w_{ik} y_k \right) \cdot \left( c_{\mathrm{post}} * y_j \right) \right\rangle
For small \eta, the higher-order terms O(\eta^2) go to zero. We again make the specification of a linear neuron, that is, the output of the neuron is equal to the sum of the product of each input and its synaptic weight raised to the power p - 1, which in the case of p = 2 is the synaptic weight itself:

y(\mathbf{x}) = \sum_{j=1}^{m} x_j w_j^{p-1}
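A tiny sketch of this generalized linear neuron (the helper name and test values are illustrative, not from the article) shows that for p = 2 the output reduces to the ordinary weighted sum.

```python
import numpy as np

# Generalized linear neuron used in the derivation; for p = 2 the output
# reduces to the ordinary dot product of inputs and weights.
def output(x, w, p=2):
    return np.sum(x * w ** (p - 1))

x, w = np.array([0.5, -1.0, 2.0]), np.array([0.2, 0.4, 0.1])
assert np.isclose(output(x, w, p=2), x @ w)
```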
Such a derivation requires retrograde signalling from the postsynaptic neuron, which is biologically plausible (see neural backpropagation), and it yields the generalized rule given above.
Oja's rule requires a number of simplifications to derive, but in its final form it is demonstrably stable, unlike Hebb's rule. It is a single-neuron special case of the Generalized Hebbian Algorithm. However, Oja's rule can also be generalized in other ways to varying degrees of stability and success.
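The multi-neuron extension mentioned here can be sketched as follows. This is a minimal illustration in the spirit of the Generalized Hebbian Algorithm (Sanger's rule); the sizes, data and learning rate are assumptions chosen for the example, not values from the article.

```python
import numpy as np

# Sketch of a multi-output extension (Generalized Hebbian Algorithm / Sanger's
# rule); each row of W behaves like an Oja neuron decorrelated from the rows
# above it, so the rows approximate the leading principal components.
rng = np.random.default_rng(2)
X = rng.normal(size=(20_000, 5)) @ rng.normal(size=(5, 5))  # correlated inputs

k = 3                                  # number of components to extract
W = rng.normal(size=(k, 5)) * 0.1
eta = 0.005

for x in X:
    y = W @ x                          # outputs of the k linear neurons
    # Oja-like term y*x minus a lower-triangular decorrelation term
    W += eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)

# Rows of W now approximate the top-k eigenvectors of the input correlation matrix.
```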
PCA also had a long history of use before Oja's rule formalized its use in network computation in 1989. The model can thus be applied to any problem of self-organizing mapping.
In this expression, w_{ij} is the synaptic weight between the i-th input and j-th output neurons, x is the input, y is the postsynaptic output, \epsilon is a constant analogous to the learning rate, and c_{\mathrm{pre}} and c_{\mathrm{post}} are presynaptic and postsynaptic functions that model the weakening of signals over time. Note that the angle brackets denote the average and the * operator is a convolution.
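One way to make the structure of this rule concrete is to implement the pre- and post-synaptic kernels as exponentially decaying traces. The sketch below is purely structural and hedged: the exponential kernels, decay constants, sizes and the proportionality/learning constants are illustrative assumptions, and the time averages are taken implicitly through a small online learning rate.

```python
import numpy as np

# Structural sketch of the convolution-based generalization of Oja's rule,
# assuming exponential kernels c_pre and c_post realized as leaky traces.
def generalized_oja_step(W, x, pre_trace, post_trace,
                         eps=0.1, eta=0.001, decay_pre=0.9, decay_post=0.9):
    """One online update of the (inputs x outputs) weight matrix W."""
    y = W.T @ x                                   # postsynaptic outputs y_j
    pre_trace = decay_pre * pre_trace + W @ y     # c_pre * sum_k w_ik y_k, per input i
    post_trace = decay_post * post_trace + y      # c_post * y_j, per output j
    # Hebbian term <x_i y_j> minus eps * <(filtered pre)_i (filtered post)_j>;
    # the averages are approximated by the small learning rate (online averaging).
    dW = np.outer(x, y) - eps * np.outer(pre_trace, post_trace)
    return W + eta * dW, pre_trace, post_trace
```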
Friston, K.J.; Frith, C.D.; Frackowiak, R.S.J. (22 October 1993). "Principal Component Analysis Learning Algorithms: A Neurobiological Analysis". Proceedings: Biological Sciences. 254 (1339): 47–54. doi:10.1098/rspb.1993.0125.
Oja's rule was originally described in Oja's 1982 paper, but the principle of self-organization to which it is applied is first attributed to Alan Turing in 1952.
We can also define, given a set of input vectors X_i, the correlation matrix R_{ij} = X_i X_j, which has an associated eigenvector q_j with eigenvalue \lambda_j.
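This convergence can be demonstrated in simulation. The sketch below is a minimal illustration with assumed data, dimensionality and learning rate (none of these come from the original paper): it trains a single Oja neuron on correlated inputs and compares its weight vector and output variance with the leading eigenvector and eigenvalue of the correlation matrix.

```python
import numpy as np

# Minimal convergence sketch; data, learning rate and sizes are illustrative.
rng = np.random.default_rng(1)
X = rng.normal(size=(20_000, 3)) @ rng.normal(size=(3, 3))  # correlated, zero-mean inputs

w = rng.normal(size=3)
w /= np.linalg.norm(w)
eta = 0.01

for x in X:
    y = w @ x                      # neuron output
    w += eta * y * (x - y * w)     # Oja's rule

R = X.T @ X / len(X)               # correlation matrix R_ij = X_i X_j
eigvals, eigvecs = np.linalg.eigh(R)
q1, lam1 = eigvecs[:, -1], eigvals[-1]

print(abs(w @ q1))                 # alignment with the first principal component, ~1
print(((X @ w) ** 2).mean(), lam1) # output variance approaches the principal eigenvalue
```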
Oja's rule, named after Finnish computer scientist Erkki Oja, is a model of how neurons in the brain or in artificial neural networks change connection strength, or learn, over time. It is a modification of the standard Hebb's rule (see Hebbian learning) that, through multiplicative normalization, solves all stability problems and generates an algorithm for principal components analysis. This is a computational form of an effect which is believed to happen in biological neurons.
The output activation function y(\mathbf{x}(n)) is also allowed to be nonlinear and nonstatic, but it must be continuously differentiable in both \mathbf{x} and \mathbf{w} and have derivatives bounded in time.
In analyzing the convergence of a single neuron evolving by Oja's rule, one extracts the first principal component, or feature, of a data set. Furthermore, with extensions using the Generalized Hebbian Algorithm, one can create a multi-Oja neural network that can extract as many features as desired, allowing for principal components analysis. A principal component a_j is extracted from a dataset \mathbf{x} through some associated vector \mathbf{q}_j, that is a_j = \mathbf{q}_j \cdot \mathbf{x}, and we can restore our original dataset by taking

\mathbf{x} = \sum_j a_j \mathbf{q}_j
Note that in Oja's original paper, p = 2, corresponding to quadrature (root sum of squares), which is the familiar Cartesian normalization rule. However, any type of normalization, even linear, will give the same result without loss of generality.
In the case of a single neuron trained by Oja's rule, we find the weight vector converges to \mathbf{q}_1, the first principal component, as time or the number of iterations approaches infinity.
Here n defines a discrete time iteration. The rule can also be made for continuous iterations as

\frac{d\mathbf{w}}{dt} = \eta\, y(t) \left( \mathbf{x}(t) - y(t)\, \mathbf{w}(t) \right)
In scalar form with implicit n-dependence, Hebb's rule is written

w_i(n+1) = w_i(n) + \eta\, y(\mathbf{x})\, x_i

where y(\mathbf{x}) is again the output, this time explicitly dependent on its input vector \mathbf{x}.
The variance of the outputs of our Oja neuron, \sigma^2(n) = \langle y^2(n) \rangle, then converges with time iterations to the principal eigenvalue:

\lim_{n \to \infty} \sigma^2(n) = \lambda_1

This follows because, once the unit-norm weight vector has aligned with \mathbf{q}_1, \sigma^2 = \mathbf{q}_1^{\top} R\, \mathbf{q}_1 = \lambda_1.
We also specify that our weights normalize to 1, which will be a necessary condition for stability, so

|\mathbf{w}| = \left( \sum_{j=1}^{m} w_j^p \right)^{1/p} = 1

which, when substituted into our expansion, gives Oja's rule:

w_i(n+1) = w_i(n) + \eta\, y \left( x_i - w_i(n)\, y \right)
Here \eta is the learning rate, which can also change with time. Note that the bold symbols are vectors.
The learning rate \eta is allowed to vary with time, but only such that its sum is divergent while its power sum is convergent, that is

\sum_{n=1}^{\infty} \eta(n) = \infty, \qquad \sum_{n=1}^{\infty} \eta(n)^p < \infty, \quad p > 1

For example, \eta(n) = 1/n satisfies both conditions.
The simplest learning rule known is Hebb's rule, which states in conceptual terms that neurons that fire together, wire together. In component form as a difference equation, it is written

\Delta \mathbf{w} = \eta\, y(\mathbf{x}_n)\, \mathbf{x}_n
Consider a simplified model of a neuron that returns a linear combination y of its inputs \mathbf{x} using presynaptic weights \mathbf{w}:

y(\mathbf{x}) = \sum_{j=1}^{m} x_j w_j

Oja's rule defines the change in presynaptic weights \mathbf{w}, given the output response y of a neuron to its inputs \mathbf{x}, to be

\Delta \mathbf{w} = \mathbf{w}_{n+1} - \mathbf{w}_n = \eta\, y_n \left( \mathbf{x}_n - y_n \mathbf{w}_n \right)

References

Oja, Erkki (November 1982). "Simplified neuron model as a principal component analyzer". Journal of Mathematical Biology. 15 (3): 267–273. doi:10.1007/BF00275687.
Haykin, Simon (1998). Neural Networks: A Comprehensive Foundation (2nd ed.). Prentice Hall. ISBN 978-0-13-273350-2.
Oja, Erkki (1989). "Neural Networks, Principal Components, and Subspaces". International Journal of Neural Systems. 1 (1): 61–68. doi:10.1142/S0129065789000475.
Intrator, Nathan (2007). "Unsupervised Learning". Neural Computation lectures. Tel-Aviv University.
