One of the newest and most effective ways to resolve the vanishing gradient problem is to use residual neural networks, or ResNets (not to be confused with recurrent neural networks). ResNets are networks in which skip connections, or residual connections, are part of the architecture. These skip connections let gradient information pass through the layers by creating "highways" of information, where the output of an earlier layer or activation is added to the output of a deeper layer. This allows information from the earlier parts of the network to reach the deeper parts, helping to maintain signal propagation even in deeper networks. Skip connections were a critical component of what allowed successful training of deeper neural networks.
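As a concrete illustration, the following is a minimal sketch of a residual block in plain NumPy. The two-layer structure, the layer sizes, and the relu helper are assumptions made for the example, not any particular published architecture.

import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def residual_block(x, W1, W2):
    # Compute F(x) with two transformations, then add the input back in:
    # y = x + F(x). The identity path gives gradients a direct route
    # back to earlier layers during backpropagation.
    h = relu(W1 @ x)
    fx = W2 @ h
    return x + fx

rng = np.random.default_rng(0)
d = 8
x = rng.normal(size=d)
W1 = rng.normal(scale=0.1, size=(d, d))
W2 = rng.normal(scale=0.1, size=(d, d))
print(residual_block(x, W1, W2))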
{\displaystyle {\begin{aligned}dx_{t}&=\nabla _{\theta }F(x_{t-1},u_{t},\theta )d\theta +\nabla _{x}F(x_{t-1},u_{t},\theta )dx_{t-1}\\&=\nabla _{\theta }F(x_{t-1},u_{t},\theta )d\theta +\nabla _{x}F(x_{t-1},u_{t},\theta )(\nabla _{\theta }F(x_{t-2},u_{t-1},\theta )d\theta +\nabla _{x}F(x_{t-2},u_{t-1},\theta )dx_{t-2})\\&=\cdots \\&=\left(\nabla _{\theta }F(x_{t-1},u_{t},\theta )+\nabla _{x}F(x_{t-1},u_{t},\theta )\nabla _{\theta }F(x_{t-2},u_{t-1},\theta )+\cdots \right)d\theta \end{aligned}}}
Hardware advances meant that from 1991 to 2015, computer power increased around a million-fold, making standard backpropagation feasible for networks several layers deeper than when the vanishing gradient problem was first recognized. Schmidhuber notes that this "is basically what is winning many of the image recognition competitions now", but also that it "does not really overcome the problem in a fundamental way", since the original models tackling the vanishing gradient problem by Hinton and others were trained on a Xeon processor, not on GPUs.
{\displaystyle {\begin{aligned}\nabla _{x}F(x_{t-1},u_{t},\theta )&\nabla _{x}F(x_{t-2},u_{t-1},\theta )\cdots \nabla _{x}F(x_{t-k},u_{t-k+1},\theta )\\=W_{rec}\mathop {diag} (\sigma '(x_{t-1}))&W_{rec}\mathop {diag} (\sigma '(x_{t-2}))\cdots W_{rec}\mathop {diag} (\sigma '(x_{t-k}))\end{aligned}}}
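The shrinking of these repeated factors can be observed numerically. The following sketch (with hypothetical sizes and random states) multiplies k Jacobians of the form W_rec diag(σ'(x)) and tracks the operator norm:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
d = 10
# Scale the recurrent matrix so its largest singular value is below 1.
W_rec = rng.normal(size=(d, d))
W_rec *= 0.9 / np.linalg.norm(W_rec, 2)

J = np.eye(d)
for k in range(1, 21):
    x = rng.normal(size=d)              # stand-in hidden state
    s = sigmoid(x) * (1 - sigmoid(x))   # sigma'(x), entries at most 1/4
    J = J @ (W_rec @ np.diag(s))
    if k % 5 == 0:
        # The operator norm decays roughly like gamma^k.
        print(k, np.linalg.norm(J, 2))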
The gradient in this regime neither decays to zero nor blows up to infinity. Indeed, it is the only well-behaved gradient, which explains why early research focused on learning or designing recurrent systems that could perform long-range computations (such as outputting the first input seen at the very end of an episode) by shaping their stable attractors.
ResNets yielded lower training error (and test error) than their shallower counterparts simply by reintroducing outputs from shallower layers in the network to compensate for the vanishing signal. Note that ResNets behave like an ensemble of relatively shallow networks: they do not resolve the vanishing gradient problem by preserving gradient flow throughout the entire depth of the network, but rather avoid the problem by constructing ensembles of many short networks together ("ensemble by construction").
Kumar suggested that the distribution of initial weights should vary according to the activation function used, and proposed initializing the weights of networks with the logistic activation function using a Gaussian distribution with zero mean and a standard deviation chosen according to the number of neurons in a layer.
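A sketch of such a width-dependent initialization is below; the scale constant is a placeholder for illustration, not the specific value proposed by Kumar.

import numpy as np

def init_logistic_layer(fan_in, fan_out, scale=1.0, rng=None):
    # Zero-mean Gaussian whose spread shrinks with the layer width,
    # in the spirit of width-dependent schemes for logistic units.
    # 'scale' is a placeholder constant, not Kumar's published value.
    rng = rng or np.random.default_rng()
    std = scale / np.sqrt(fan_in)
    return rng.normal(0.0, std, size=(fan_out, fan_in))

W = init_logistic_layer(256, 128)
print(W.std())   # roughly scale / sqrt(256)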
In machine learning, the vanishing gradient problem is encountered when training neural networks with gradient-based learning methods and backpropagation. In such methods, during each iteration of training, each of the neural network's weights receives an update proportional to the partial derivative of the error function with respect to the current weight. The problem is that as the network depth or sequence length increases, the gradient magnitude typically is expected to decrease (or grow uncontrollably), slowing the training process. In the worst case, this may completely stop the neural network from further training. As one example of the cause, traditional activation functions such as the hyperbolic tangent have gradients in the range [-1, 1], and backpropagation computes gradients by the chain rule. This has the effect of multiplying n of these small numbers to compute the gradients of the early layers in an n-layer network, meaning that the gradient (error signal) decreases exponentially with n while the early layers train very slowly.
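The multiplicative effect is easy to reproduce. The sketch below (with arbitrary random pre-activations) chains 50 tanh derivatives, each of which lies in (0, 1]:

import numpy as np

rng = np.random.default_rng(2)
grad = 1.0
for layer in range(50):
    z = rng.normal(scale=2.0)      # stand-in pre-activation
    grad *= 1.0 - np.tanh(z) ** 2  # derivative of tanh at z
print(grad)  # typically vanishingly small after 50 layers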
Back-propagation allowed researchers to train deep artificial neural networks from scratch, initially with little success. Hochreiter's Diplom thesis of 1991 formally identified the reason for this failure in the "vanishing gradient problem", which not only affects many-layered feedforward networks but also recurrent networks. The latter are trained by unfolding them into very deep feedforward networks, where a new layer is created for each time step of an input sequence processed by the network. (The combination of unfolding and backpropagation is termed backpropagation through time.)
More recently, Yilmaz and Poli performed a theoretical analysis of how gradients are affected by the mean of the initial weights in deep neural networks using the logistic activation function, and found that gradients do not vanish if the mean of the initial weights is set according to a formula that depends on the number of neurons in a layer.
The deep belief network model by Hinton et al. (2006) involves learning the distribution of a high-level representation using successive layers of binary or real-valued latent variables. It uses a restricted Boltzmann machine to model each new layer of higher-level features. Each new layer guarantees an increase in the lower bound of the log likelihood of the data, thus improving the model, if trained properly. Once sufficiently many layers have been learned, the deep architecture may be used as a generative model by reproducing the data when sampling down the model (an "ancestral pass") from the top-level feature activations. Hinton reports that his models are effective feature extractors over high-dimensional, structured data.
Training the network requires us to define a loss function to be minimized. Let it be {\displaystyle L(x_{T},u_{1},...,u_{T})}; then minimizing it by gradient descent gives

{\displaystyle dL=\nabla _{x}L(x_{T},u_{1},...,u_{T})\left(\nabla _{\theta }F(x_{t-1},u_{t},\theta )+\nabla _{x}F(x_{t-1},u_{t},\theta )\nabla _{\theta }F(x_{t-2},u_{t-1},\theta )+\cdots \right)d\theta }

and therefore

{\displaystyle \nabla _{\theta }L=\nabla _{x}L(x_{T},u_{1},...,u_{T})\left(\nabla _{\theta }F(x_{t-1},u_{t},\theta )+\nabla _{x}F(x_{t-1},u_{t},\theta )\nabla _{\theta }F(x_{t-2},u_{t-1},\theta )+\cdots \right)}
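The sum above can be accumulated backwards in time, which is how backpropagation through time evaluates it in practice. Below is a sketch for a scalar state; the particular F, its partial derivatives, and the sequence length are illustrative assumptions.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Scalar instance of x_t = F(x_{t-1}, u_t, theta) = sigmoid(theta*x + u).
def F(x, u, theta):
    return sigmoid(theta * x + u)

def dF_dx(x, u, theta):
    y = F(x, u, theta)
    return y * (1 - y) * theta

def dF_dtheta(x, u, theta):
    y = F(x, u, theta)
    return y * (1 - y) * x

theta, T = 1.5, 30
u = np.zeros(T)
xs = [0.5]
for t in range(T):
    xs.append(F(xs[-1], u[t], theta))

# dL/dtheta = dL/dx_T * (sum of products), walking backwards in time;
# each earlier term picks up one more factor of dF/dx.
dL_dxT = 2 * (xs[-1] - 0.855)        # e.g. L = (x_T - 0.855)^2
carry, grad = dL_dxT, 0.0
for t in reversed(range(T)):
    grad += carry * dF_dtheta(xs[t], u[t], theta)
    carry *= dF_dx(xs[t], u[t], theta)
print(grad)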
Hochreiter, S.; Bengio, Y.; Frasconi, P.; Schmidhuber, J. (2001). "Gradient flow in recurrent nets: the difficulty of learning long-term dependencies". In Kremer, S. C.; Kolen, J. F. (eds.). A Field Guide to Dynamical Recurrent Neural Networks. IEEE Press. ISBN 0-7803-5369-2.
[Figure: bifurcation diagram of the one-neuron recurrent network. The horizontal axis is b and the vertical axis is x. The black curve is the set of stable and unstable equilibria. The system exhibits bistability, and can be used as a one-bit memory.]
Graves, A.; Liwicki, M.; Fernandez, S.; Bertolami, R.; Bunke, H.; Schmidhuber, J. (2009). "A Novel Connectionist System for Improved Unconstrained Handwriting Recognition". IEEE Transactions on Pattern Analysis and Machine Intelligence. 31 (5): 855–868. doi:10.1109/tpami.2008.137.
Similar ideas have been used in feed-forward neural networks for unsupervised pre-training to structure a neural network, making it first learn generally useful feature detectors. The network is then trained further by supervised backpropagation to classify labeled data.
As b decreases, the system first has one stable point, then two stable points and one unstable point, and finally one stable point again. Explicitly, the stable points are given by the pairs

{\displaystyle (x,b)=\left(x,\ln \left({\frac {x}{1-x}}\right)-5x\right)}
Consequently, attempting to train b by gradient descent would "hit a wall in the loss landscape" and cause an exploding gradient. A slightly more complex situation is plotted in Figure 6.
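This wall can be reproduced directly by simulating the one-neuron network from the dynamical systems model and sweeping b across -2.5; the step size and iteration count below are illustrative choices.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def x_final(b, w=5.0, x0=0.5, eps=0.1, T=1000):
    # Iterate x <- (1-eps)*x + eps*sigmoid(w*x + b) until the state
    # settles into one of the attractors.
    x = x0
    for _ in range(T):
        x = (1 - eps) * x + eps * sigmoid(w * x + b)
    return x

for b in (-2.4, -2.49, -2.51, -2.6):
    loss = (0.855 - x_final(b)) ** 2
    print(f"b={b:+.2f}  loss={loss:.3f}")  # loss jumps as b crosses -2.5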
Veit, Andreas; Wilber, Michael; Belongie, Serge (20 May 2016). "Residual Networks Behave Like Ensembles of Relatively Shallow Networks".
{\displaystyle {\frac {\Delta x(T)}{\Delta b}}\approx {\frac {\partial x(T)}{\partial b}}=\left({\frac {1}{x(T)(1-x(T))}}-5\right)^{-1}}
This simple strategy allows networks with 10 or 15 hidden layers to be trained very efficiently and effectively using standard backpropagation.
Another technique particularly suited to recurrent neural networks is the long short-term memory (LSTM) network of 1997 by Hochreiter and Schmidhuber. In 2009, deep multidimensional LSTM networks demonstrated the power of deep learning with many nonlinear layers by winning three ICDAR 2009 competitions in connected handwriting recognition, without any prior knowledge about the three different languages to be learned.
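The key mechanism in an LSTM is the additive update of its cell state, which gives error signals a path that is not repeatedly squashed. A minimal sketch of a single step follows; the gate packing and the sizes are illustrative, not the 1997 formulation verbatim.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    # One LSTM step. W, U, b pack the input (i), forget (f),
    # output (o) and candidate (g) transformations.
    z = W @ x + U @ h + b
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    c = f * c + i * np.tanh(g)   # additive cell-state update
    h = o * np.tanh(c)
    return h, c

d_in, d_h = 4, 8
rng = np.random.default_rng(3)
W = rng.normal(scale=0.1, size=(4 * d_h, d_in))
U = rng.normal(scale=0.1, size=(4 * d_h, d_h))
b = np.zeros(4 * d_h)
h, c = np.zeros(d_h), np.zeros(d_h)
for t in range(5):
    h, c = lstm_step(rng.normal(size=d_in), h, c, W, U, b)
print(h)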
The vanishing/exploding gradient problem appears because there are repeated multiplications of the form

{\displaystyle \nabla _{x}F(x_{t-1},u_{t},\theta )\nabla _{x}F(x_{t-2},u_{t-1},\theta )\nabla _{x}F(x_{t-3},u_{t-2},\theta )\cdots }
Pascanu, Razvan; Mikolov, Tomas; Bengio, Yoshua (21 November 2012). "On the difficulty of training Recurrent Neural Networks".
Neural networks can also be optimized by using a universal search algorithm on the space of the network's weights, e.g., by random guessing or, more systematically, by a genetic algorithm. This approach is not based on gradients and avoids the vanishing gradient problem.
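A toy version of the idea, using pure random search on the weights of a one-layer network, is sketched below; the network, data, and budget are all hypothetical.

import numpy as np

def fitness(w, X, y):
    # Negative squared error of a one-layer tanh network.
    return -np.mean((np.tanh(X @ w) - y) ** 2)

rng = np.random.default_rng(4)
X = rng.normal(size=(64, 5))
true_w = rng.normal(size=5)
y = np.tanh(X @ true_w)

# Random search: no gradients are ever computed, so none can vanish.
best_w, best_f = None, -np.inf
for _ in range(2000):
    w = rng.normal(size=5)
    f = fitness(w, X, y)
    if f > best_f:
        best_w, best_f = w, f
print(best_f)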
Weight initialization is another approach that has been proposed to reduce the vanishing gradient problem in deep networks.
When {\displaystyle \gamma \geq 1}, the above analysis does not quite work. For the prototypical exploding gradient problem, the next model is clearer.
When activation functions are used whose derivatives can take on larger values, one risks encountering the related exploding gradient problem.
Goh, Garrett B.; Hodas, Nathan O.; Vishnu, Abhinav (15 June 2017). "Deep learning for computational chemistry". Journal of Computational Chemistry. 38 (16): 1291–1307. doi:10.1002/jcc.24764.
The effect of a vanishing gradient is that the network cannot learn long-range effects. Recall the expression for the loss gradient:

{\displaystyle \nabla _{\theta }L=\nabla _{x}L(x_{T},u_{1},...,u_{T})\left(\nabla _{\theta }F(x_{t-1},u_{t},\theta )+\nabla _{x}F(x_{t-1},u_{t},\theta )\nabla _{\theta }F(x_{t-2},u_{t-1},\theta )+\cdots \right)}

The components of {\displaystyle \nabla _{\theta }F(x,u,\theta )} are just components of {\displaystyle \sigma (x)} and {\displaystyle u}, so if the inputs {\displaystyle u_{t},u_{t-1},...} are bounded, then {\displaystyle \|\nabla _{\theta }F(x_{t-k-1},u_{t-k},\theta )\|} is also bounded by some {\displaystyle M>0}, and so the terms in {\displaystyle \nabla _{\theta }L} decay as {\displaystyle M\gamma ^{k}}. This means that, effectively, {\displaystyle \nabla _{\theta }L} is affected only by the first {\displaystyle O(\gamma ^{-1})} terms in the sum.
Graves, Alex; Schmidhuber, Jürgen (2009). "Offline Handwriting Recognition with Multidimensional Recurrent Neural Networks". In Bengio, Yoshua; Schuurmans, Dale; Lafferty, John; Williams, Chris K. I.; Culotta, Aron (eds.). Advances in Neural Information Processing Systems 22 (NIPS'22), December 7th–10th, 2009, Vancouver, BC. Neural Information Processing Systems (NIPS) Foundation. pp. 545–552.
Jürgen Schmidhuber's multi-level hierarchy of networks (1992) pre-trained one level at a time through unsupervised learning, fine-tuned through backpropagation. Here each level learns a compressed representation of the observations that is fed to the next level.
A more general loss function could depend on the entire sequence of outputs, as

{\displaystyle L(x_{1},...,x_{T},u_{1},...,u_{T})=\sum _{t=1}^{T}{\mathcal {E}}(x_{t},u_{1},...,u_{t})}

for which the problem is the same, just with more complex notation.
Schmidhuber, J. "Learning complex, extended sequences using the principle of history compression". Neural Computation, 4, pp. 234–242, 1992.
Ioffe, Sergey; Szegedy, Christian (1 June 2015). "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift". International Conference on Machine Learning. PMLR: 448–456.
Rectifiers such as ReLU suffer less from the vanishing gradient problem, because they only saturate in one direction.
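The one-sided saturation is visible by comparing derivatives directly, as in this small sketch:

import numpy as np

z = np.linspace(-6, 6, 7)
s = 1.0 / (1.0 + np.exp(-z))
d_sigmoid = s * (1.0 - s)        # tends to 0 on BOTH sides
d_relu = (z > 0).astype(float)   # exactly 1 for every positive input
for zi, ds, dr in zip(z, d_sigmoid, d_relu):
    print(f"z={zi:+.0f}  sigmoid'={ds:.4f}  relu'={dr:.0f}")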
{\displaystyle \nabla _{x}F(x_{t-1},u_{t},\theta )=W_{rec}\mathop {diag} (\sigma '(x_{t-1}))}
Following (Doya, 1993), consider this one-neuron recurrent network with sigmoid activation:
Batch normalization is a standard method for solving both the exploding and the vanishing gradient problems.
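A minimal sketch of the normalization itself (without the running statistics used at inference time) is given below; the batch shape and parameters are illustrative.

import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature over the batch, then rescale and shift.
    # Keeping pre-activations near zero mean and unit variance keeps
    # activation derivatives away from their saturated regions.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

rng = np.random.default_rng(5)
batch = rng.normal(loc=3.0, scale=10.0, size=(32, 4))  # badly scaled
out = batch_norm(batch, gamma=np.ones(4), beta=np.zeros(4))
print(out.mean(axis=0).round(6), out.std(axis=0).round(3))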
Any activation function works, as long as it is differentiable with bounded derivative.
{\displaystyle x_{t+1}=(1-\epsilon )x_{t}+\epsilon \sigma (wx_{t}+b)+\epsilon w'u_{t}}
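Simulating this network shows the bistability noted in the bifurcation diagram: two different initial states settle into two different stable points, so the network remembers one bit of its initial condition. The step size and horizon below are illustrative.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def run(x0, b=-2.5, w=5.0, eps=0.1, T=2000):
    x = x0
    for _ in range(T):
        x = (1 - eps) * x + eps * sigmoid(w * x + b)
    return x

print(run(0.4))  # settles near the lower stable point (~0.145)
print(run(0.6))  # settles near the upper stable point (~0.855)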
For a concrete example, consider a typical recurrent network defined by

{\displaystyle x_{t}=F(x_{t-1},u_{t},\theta )=W_{rec}\sigma (x_{t-1})+W_{in}u_{t}+b}

where {\displaystyle \theta =(W_{rec},W_{in})} is the network parameter, {\displaystyle \sigma } is the sigmoid activation function (applied to each vector coordinate separately), and {\displaystyle b} is the bias vector.
Santurkar, Shibani; Tsipras, Dimitris; Ilyas, Andrew; Madry, Aleksander (2018). "How Does Batch Normalization Help Optimization?". Advances in Neural Information Processing Systems. Curran Associates, Inc.
Doya, K. (1992). "Bifurcations in the learning of recurrent neural networks". [Proceedings] 1992 IEEE International Symposium on Circuits and Systems. Vol. 6. IEEE. pp. 2777–2780. doi:10.1109/iscas.1992.230622. ISBN 0-7803-0593-0.
Kumar, Siddharth Krishna. "On weight initialization in deep neural networks". arXiv preprint arXiv:1704.08863.
As an example of the exploding case, consider the matrices

{\displaystyle W_{rec}={\begin{bmatrix}0&2\\\epsilon &0\end{bmatrix}},\qquad D={\begin{bmatrix}c&0\\0&c\end{bmatrix}}}

with {\displaystyle c\in (0,1)}. The matrix {\displaystyle W_{rec}} has spectral radius {\displaystyle {\sqrt {2\epsilon }}}, which exceeds 1 when {\displaystyle \epsilon >{\frac {1}{2}}}. Then

{\displaystyle (W_{rec}D)^{2N}=(2\epsilon \cdot c^{2})^{N}I_{2\times 2}}

which might go to infinity or zero depending on the choice of {\displaystyle c}.
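Numerically, the claimed identity and the resulting blow-up are easy to check; the particular values of epsilon and c below are illustrative choices with 2*epsilon*c**2 > 1.

import numpy as np

eps, c = 0.7, 0.95
W_rec = np.array([[0.0, 2.0], [eps, 0.0]])
D = np.diag([c, c])
M = W_rec @ D

for N in (1, 5, 10, 20):
    P = np.linalg.matrix_power(M, 2 * N)
    # P equals (2*eps*c**2)**N times the identity, growing without bound.
    print(N, P[0, 0])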
Schmidhuber, Jürgen (2015). "Deep learning in neural networks: An overview". Neural Networks. 61: 85–117. doi:10.1016/j.neunet.2014.09.003.
These statements hold provided that {\displaystyle T} is large enough that the system has settled into one of the stable points.
If {\displaystyle (x(0),b)} puts the system very close to an unstable point, then a tiny variation in {\displaystyle x(0)} or {\displaystyle b} would make {\displaystyle x(T)} move from one stable point to the other. This makes {\displaystyle {\frac {\Delta x(T)}{\Delta x(0)}}} and {\displaystyle {\frac {\Delta x(T)}{\Delta b}}} both very large, a case of the exploding gradient.
Basodi, Sunitha; Ji, Chunyan; Zhang, Haiping; Pan, Yi (September 2020). "Gradient amplification: An efficient way to train deep neural networks". Big Data Mining and Analytics. doi:10.26599/BDMA.2020.9020004.
For the general case, the intuition still holds (see Figures 3, 4, and 5).
If {\displaystyle (x(0),b)} puts the system far from an unstable point, then a small variation in {\displaystyle x(0)} would have no effect on {\displaystyle x(T)}: {\displaystyle {\frac {\Delta x(T)}{\Delta x(0)}}=0}, a case of the vanishing gradient.
This section is based on the paper "On the difficulty of training Recurrent Neural Networks" by Pascanu, Mikolov, and Bengio. A generic recurrent network has hidden states {\displaystyle h_{1},h_{2},...}, inputs {\displaystyle u_{1},u_{2},...}, and outputs {\displaystyle x_{1},x_{2},...}. Let it be parametrized by {\displaystyle \theta }, so that the system evolves as

{\displaystyle (h_{t},x_{t})=F(h_{t-1},u_{t},\theta )}

Often, the output {\displaystyle x_{t}} is a function of {\displaystyle h_{t}}, as {\displaystyle x_{t}=G(h_{t})}. The vanishing gradient problem already presents itself clearly when {\displaystyle x_{t}=h_{t}}, so we simplify our notation to the special case with {\displaystyle x_{t}=F(x_{t-1},u_{t},\theta )}.
Behnke relied only on the sign of the gradient (Rprop) when training his Neural Abstraction Pyramid to solve problems like image reconstruction and face localization.
Hochreiter, S. (1991). Untersuchungen zu dynamischen neuronalen Netzen (Diplom thesis). Institut f. Informatik, Technische Univ. Munich.
Glorot, Xavier; Bordes, Antoine; Bengio, Yoshua (14 June 2011). "Deep Sparse Rectifier Neural Networks".
He, Kaiming; Zhang, Xiangyu; Ren, Shaoqing; Sun, Jian (2016). "Deep Residual Learning for Image Recognition". 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). Las Vegas, NV, USA: IEEE. pp. 770–778. doi:10.1109/CVPR.2016.90. ISBN 978-1-4673-8851-1.
At the small {\displaystyle \epsilon } limit, the dynamics of the network becomes

{\displaystyle {\frac {dx}{dt}}=-x(t)+\sigma (wx(t)+b)+w'u(t)}

Consider first the autonomous case, with {\displaystyle u=0} and {\displaystyle w=5.0}.
Since {\displaystyle |\sigma '|\leq 1}, the operator norm of the above multiplication is bounded above by {\displaystyle \|W_{rec}\|^{k}}. So if the spectral radius of {\displaystyle W_{rec}} is {\displaystyle \gamma <1}, then at large {\displaystyle k}, the above multiplication has operator norm bounded above by {\displaystyle \gamma ^{k}\to 0}. This is the prototypical vanishing gradient problem.
"Sepp Hochreiter's Fundamental Deep Learning Problem (1991)". people.idsia.ch.
To overcome this problem, several methods were proposed.
Continue using the above one-neuron network, fixing {\displaystyle w=5,x(0)=0.5,u(t)=0}, and consider a loss function defined by {\displaystyle L(x(T))=(0.855-x(T))^{2}}. This produces a rather pathological loss landscape: as {\displaystyle b} approaches {\displaystyle -2.5} from above, the loss approaches zero, but as soon as {\displaystyle b} crosses {\displaystyle -2.5}, the attractor basin changes, and the loss jumps to 0.50. This is because at {\displaystyle b=-2.5}, the two stable attractors are {\displaystyle x=0.145,0.855}, and the unstable attractor is {\displaystyle x=0.5}.
Behnke, Sven (2003). Hierarchical Neural Networks for Image Interpretation. Lecture Notes in Computer Science. Vol. 2766. Springer.
The gradient descent update is

{\displaystyle \Delta \theta =-\eta \cdot \left[\nabla _{\theta }L\right]^{T}}

where {\displaystyle \eta } is the learning rate.
Bengio, Y.; Simard, P.; Frasconi, P. (March 1994). "Learning long-term dependencies with gradient descent is difficult". IEEE Transactions on Neural Networks. 5 (2): 157–166. doi:10.1109/72.279181.
Yilmaz, Ahmet; Poli, Riccardo (1 September 2022). "Successfully and efficiently training deep multi-layer perceptrons with logistic activation function simply requires initializing the weights with an appropriate negative mean". Neural Networks. doi:10.1016/j.neunet.2022.05.030.
Hinton, G. E.; Osindero, S.; Teh, Y. (2006). "A fast learning algorithm for deep belief nets". Neural Computation. 18 (7): 1527–1554. doi:10.1162/neco.2006.18.7.1527.
Hochreiter, Sepp; Schmidhuber, Jürgen (1997). "Long Short-Term Memory". Neural Computation. 9 (8): 1735–1780. doi:10.1162/neco.1997.9.8.1735.
Hinton, G. (2009). "Deep belief networks". Scholarpedia. 4 (5): 5947. doi:10.4249/scholarpedia.5947.
7417:
7391:
7359:
7358:{\displaystyle b=-2.5}
7321:
7301:
7198:
7165:
7132:
7094:
7061:
7002:
6912:
6846:
6578:long short-term memory
6568:Long short-term memory
6562:Long short-term memory
6302:
6279:
6256:
6236:
6213:
6193:
6121:
6041:
5886:
5825:
5796:
5767:
5723:
5677:
5622:
5593:
5573:
5544:
5500:
5480:
5434:
5376:
5284:
5264:
5226:
5206:
5180:
5150:
5039:
5019:
4903:
4877:
4848:
4809:
4779:
4749:
4725:, and so the terms in
4719:
4718:{\displaystyle M>0}
4693:
4604:
4546:
4526:
4497:
4446:
4150:
4117:
4097:
4071:
4034:
3984:
3943:
3514:
3368:
3344:
3324:
3263:
3104:
2913:is the learning rate.
2907:
2887:
2604:
2313:
2239:
1557:
1483:
1443:
1394:
1367:
1340:
1247:
1227:
1175:
1123:
810:Bias–variance tradeoff
692:Reinforcement learning
668:Spiking neural network
78:Reinforcement learning
7418:
7416:{\displaystyle x=0.5}
7392:
7360:
7322:
7302:
7199:
7166:
7133:
7095:
7062:
7003:
6913:
6826:
6647:Weight initialization
6513:, fine-tuned through
6511:unsupervised learning
6501:Multi-level hierarchy
6303:
6280:
6257:
6237:
6214:
6194:
6122:
6042:
5887:
5826:
5797:
5768:
5724:
5678:
5623:
5594:
5574:
5545:
5501:
5481:
5435:
5377:
5285:
5265:
5227:
5207:
5205:{\displaystyle w=5.0}
5181:
5151:
5040:
5020:
4894:
4878:
4849:
4810:
4780:
4750:
4720:
4694:
4605:
4547:
4527:
4498:
4447:
4151:
4118:
4098:
4072:
4035:
3985:
3944:
3515:
3369:
3345:
3325:
3264:
3105:
2908:
2906:{\displaystyle \eta }
2888:
2605:
2314:
2240:
1558:
1484:
1444:
1395:
1393:{\displaystyle h_{t}}
1368:
1366:{\displaystyle x_{t}}
1341:
1248:
1228:
1176:
1124:
646:Neural radiance field
468:Structured prediction
191:Structured prediction
63:Unsupervised learning
8250:Sven Behnke (2003).
8117:10.1109/CVPR.2016.90
7401:
7369:
7340:
7311:
7208:
7175:
7171:has spectral radius
7142:
7104:
7071:
7012:
6940:
6738:
6691:) when training his
6292:
6278:{\displaystyle -2.5}
6266:
6246:
6235:{\displaystyle -2.5}
6223:
6203:
6131:
6063:
5899:
5835:
5824:{\displaystyle x(T)}
5806:
5795:{\displaystyle x(0)}
5777:
5736:
5687:
5632:
5621:{\displaystyle x(T)}
5603:
5583:
5572:{\displaystyle x(0)}
5554:
5513:
5490:
5444:
5389:
5294:
5274:
5236:
5216:
5190:
5164:
5049:
5029:
4910:
4861:
4819:
4789:
4759:
4729:
4703:
4614:
4556:
4536:
4507:
4456:
4169:
4127:
4107:
4081:
4048:
3998:
3953:
3524:
3381:
3374:is the bias vector.
3358:
3334:
3273:
3123:
2920:
2897:
2629:
2328:
2249:
1571:
1493:
1453:
1404:
1377:
1350:
1257:
1237:
1185:
1133:
1081:
1040:feedforward networks
994:activation functions
835:Statistical learning
733:Learning with humans
525:Local outlier factor
7920:Schmidhuber, Jürgen
7892:2009SchpJ...4.5947H
7874:Hinton, G. (2009).
7576:2017arXiv170104503G
7336:This is because at
6535:deep belief network
6495:Batch normalization
6490:Batch normalization
5179:{\displaystyle u=0}
5156:Consider first the
4896:Bifurcation diagram
1061:Prototypical models
678:Electrochemical RAM
585:reservoir computing
316:Logistic regression
235:Supervised learning
221:Multimodal learning
196:Feature engineering
141:Generative modeling
103:Rule-based learning
98:Curriculum learning
58:Supervised learning
33:Part of a series on
7924:Neural Computation
7822:Neural Computation
7794:Neural Computation
7413:
7387:
7355:
7317:
7297:
7194:
7161:
7128:
7090:
7057:
7051:
6998:
6992:
6908:
6507:Jürgen Schmidhuber
6298:
6275:
6252:
6232:
6209:
6189:
6117:
6037:
5882:
5821:
5792:
5763:
5719:
5673:
5618:
5589:
5569:
5540:
5496:
5476:
5430:
5372:
5280:
5260:
5222:
5202:
5176:
5146:
5035:
5015:
4904:
4873:
4854:terms in the sum.
4844:
4805:
4775:
4745:
4715:
4689:
4610:are bounded, then
4600:
4542:
4522:
4493:
4452:The components of
4442:
4146:
4113:
4093:
4067:
4030:
3980:
3939:
3937:
3510:
3364:
3340:
3320:
3259:
3100:
2903:
2883:
2600:
2309:
2235:
2233:
1553:
1479:
1439:
1390:
1363:
1346:Often, the output
1336:
1243:
1223:
1171:
1119:
1044:recurrent networks
998:hyperbolic tangent
985:partial derivative
246: •
161:Density estimation
8126:978-1-4673-8851-1
7749:. PMLR: 448–456.
7704:10.1109/72.279181
7641:Doya, K. (1992).
7584:10.1002/jcc.24764
7560:(16): 1291–1307.
7320:{\displaystyle c}
7186:
7088:
6704:genetic algorithm
6616:Residual networks
6527:feature detectors
6484:
6483:
6476:
6466:
6465:
6458:
6413:
6412:
6405:
6372:needs additional
6354:
6301:{\displaystyle b}
6255:{\displaystyle b}
6212:{\displaystyle b}
6015:
5961:
5929:
5874:
5717:
5671:
5592:{\displaystyle b}
5499:{\displaystyle T}
5474:
5428:
5352:
5283:{\displaystyle b}
5225:{\displaystyle b}
5070:
4545:{\displaystyle u}
4162:loss differential
4116:{\displaystyle k}
3367:{\displaystyle b}
2624:
2623:
2617:loss differential
1373:is a function of
961:
960:
766:Model diagnostics
749:Human-in-the-loop
592:Boltzmann machine
505:Anomaly detection
301:Linear regression
216:Ontology learning
211:Grammar induction
186:Semantic analysis
181:Association rules
166:Anomaly detection
108:Neuro-symbolic AI
16:(Redirected from
8310:
8286:
8285:
8283:
8281:
8267:
8261:
8260:
8258:
8247:
8241:
8240:
8192:
8186:
8177:
8171:
8170:
8158:
8152:
8151:
8149:
8137:
8131:
8130:
8110:
8088:
8079:
8078:
8052:
8032:
8026:
8025:
7999:
7979:
7973:
7962:
7956:
7955:
7930:(8): 1735–1780.
7916:Hochreiter, Sepp
7912:
7906:
7905:
7903:
7871:
7865:
7864:
7838:
7829:(7): 1527–1554.
7818:
7806:
7797:
7790:
7784:
7783:
7767:
7761:
7760:
7758:
7738:
7732:
7731:
7683:
7677:
7676:
7638:
7632:
7631:
7629:
7617:
7604:
7603:
7569:
7549:
7543:
7542:
7524:
7518:
7517:
7515:
7504:
7498:
7497:
7479:
7469:
7445:
7424:
7422:
7420:
7419:
7414:
7396:
7394:
7393:
7388:
7364:
7362:
7361:
7356:
7334:
7328:
7326:
7324:
7323:
7318:
7306:
7304:
7303:
7298:
7296:
7295:
7280:
7279:
7270:
7269:
7245:
7244:
7229:
7228:
7203:
7201:
7200:
7195:
7187:
7179:
7170:
7168:
7167:
7162:
7160:
7159:
7137:
7135:
7134:
7129:
7099:
7097:
7096:
7091:
7089:
7081:
7066:
7064:
7063:
7058:
7056:
7055:
7007:
7005:
7004:
6999:
6997:
6996:
6958:
6957:
6934:
6928:
6925:
6919:
6917:
6915:
6914:
6909:
6904:
6903:
6879:
6878:
6866:
6865:
6853:
6852:
6845:
6840:
6819:
6818:
6794:
6793:
6781:
6780:
6756:
6755:
6732:
6674:
6662:
6658:
6555:generative model
6539:latent variables
6521:Related approach
6479:
6472:
6461:
6454:
6450:
6447:
6441:
6423:
6415:
6408:
6401:
6397:
6394:
6388:
6365:
6364:
6357:
6346:
6324:
6323:
6316:
6307:
6305:
6304:
6299:
6284:
6282:
6281:
6276:
6261:
6259:
6258:
6253:
6241:
6239:
6238:
6233:
6218:
6216:
6215:
6210:
6198:
6196:
6195:
6190:
6188:
6187:
6126:
6124:
6123:
6118:
6046:
6044:
6043:
6038:
6036:
6035:
6027:
6023:
6016:
6014:
5973:
5962:
5960:
5952:
5935:
5930:
5928:
5920:
5903:
5891:
5889:
5888:
5883:
5875:
5873:
5856:
5839:
5830:
5828:
5827:
5822:
5801:
5799:
5798:
5793:
5772:
5770:
5769:
5764:
5728:
5726:
5725:
5720:
5718:
5716:
5708:
5691:
5682:
5680:
5679:
5674:
5672:
5670:
5653:
5636:
5627:
5625:
5624:
5619:
5598:
5596:
5595:
5590:
5578:
5576:
5575:
5570:
5549:
5547:
5546:
5541:
5505:
5503:
5502:
5497:
5485:
5483:
5482:
5477:
5475:
5473:
5465:
5448:
5439:
5437:
5436:
5431:
5429:
5427:
5410:
5393:
5381:
5379:
5378:
5373:
5371:
5367:
5357:
5353:
5351:
5337:
5289:
5287:
5286:
5281:
5269:
5267:
5266:
5263:{\displaystyle }
5261:
5231:
5229:
5228:
5223:
5211:
5209:
5208:
5203:
5185:
5183:
5182:
5177:
5155:
5153:
5152:
5147:
5133:
5071:
5069:
5061:
5053:
5044:
5042:
5041:
5036:
5024:
5022:
5021:
5016:
5014:
5013:
5004:
4981:
4980:
4956:
4955:
4928:
4927:
4882:
4880:
4879:
4874:
4853:
4851:
4850:
4845:
4840:
4839:
4814:
4812:
4811:
4806:
4801:
4800:
4784:
4782:
4781:
4776:
4774:
4773:
4754:
4752:
4751:
4746:
4741:
4740:
4724:
4722:
4721:
4716:
4698:
4696:
4695:
4690:
4676:
4675:
4657:
4656:
4629:
4628:
4609:
4607:
4606:
4601:
4587:
4586:
4568:
4567:
4551:
4549:
4548:
4543:
4531:
4529:
4528:
4523:
4502:
4500:
4499:
4494:
4468:
4467:
4451:
4449:
4448:
4443:
4441:
4437:
4421:
4420:
4402:
4401:
4380:
4379:
4361:
4360:
4348:
4347:
4326:
4325:
4304:
4303:
4291:
4290:
4269:
4268:
4251:
4250:
4226:
4225:
4213:
4212:
4197:
4196:
4181:
4180:
4155:
4153:
4152:
4147:
4139:
4138:
4122:
4120:
4119:
4114:
4103:, then at large
4102:
4100:
4099:
4094:
4076:
4074:
4073:
4068:
4066:
4065:
4039:
4037:
4036:
4031:
4029:
4028:
4019:
4018:
3989:
3987:
3986:
3981:
3973:
3968:
3960:
3948:
3946:
3945:
3940:
3938:
3928:
3927:
3909:
3895:
3881:
3880:
3856:
3855:
3837:
3823:
3809:
3808:
3785:
3784:
3766:
3752:
3738:
3737:
3706:
3705:
3681:
3680:
3659:
3658:
3637:
3636:
3618:
3617:
3596:
3595:
3575:
3574:
3562:
3561:
3540:
3539:
3519:
3517:
3516:
3511:
3503:
3502:
3484:
3470:
3456:
3455:
3428:
3427:
3415:
3414:
3393:
3392:
3373:
3371:
3370:
3365:
3349:
3347:
3346:
3341:
3329:
3327:
3326:
3321:
3316:
3315:
3300:
3299:
3268:
3266:
3265:
3260:
3252:
3251:
3242:
3241:
3223:
3222:
3201:
3200:
3173:
3172:
3160:
3159:
3135:
3134:
3109:
3107:
3106:
3101:
3087:
3086:
3068:
3067:
3046:
3045:
3027:
3026:
3008:
3007:
2986:
2985:
2967:
2966:
2954:
2953:
2932:
2931:
2912:
2910:
2909:
2904:
2892:
2890:
2889:
2884:
2882:
2881:
2876:
2872:
2871:
2867:
2851:
2850:
2832:
2831:
2810:
2809:
2791:
2790:
2778:
2777:
2756:
2755:
2734:
2733:
2721:
2720:
2699:
2698:
2681:
2680:
2665:
2664:
2618:
2609:
2607:
2606:
2601:
2593:
2589:
2573:
2572:
2554:
2553:
2532:
2531:
2513:
2512:
2500:
2499:
2478:
2477:
2456:
2455:
2443:
2442:
2421:
2420:
2403:
2402:
2378:
2377:
2365:
2364:
2349:
2348:
2322:
2318:
2316:
2315:
2310:
2305:
2304:
2280:
2279:
2267:
2266:
2244:
2242:
2241:
2236:
2234:
2224:
2220:
2204:
2203:
2185:
2184:
2163:
2162:
2144:
2143:
2131:
2130:
2109:
2108:
2087:
2086:
2074:
2073:
2052:
2051:
2031:
2018:
2011:
2010:
1983:
1982:
1964:
1963:
1942:
1941:
1914:
1913:
1895:
1894:
1873:
1872:
1851:
1850:
1838:
1837:
1816:
1815:
1788:
1787:
1775:
1774:
1753:
1752:
1737:
1733:
1732:
1705:
1704:
1692:
1691:
1670:
1669:
1642:
1641:
1629:
1628:
1607:
1606:
1590:
1589:
1562:
1560:
1559:
1554:
1543:
1542:
1530:
1529:
1505:
1504:
1488:
1486:
1485:
1480:
1478:
1477:
1465:
1464:
1448:
1446:
1445:
1440:
1435:
1434:
1416:
1415:
1399:
1397:
1396:
1391:
1389:
1388:
1372:
1370:
1369:
1364:
1362:
1361:
1345:
1343:
1342:
1337:
1326:
1325:
1313:
1312:
1285:
1284:
1272:
1271:
1252:
1250:
1249:
1244:
1232:
1230:
1229:
1224:
1210:
1209:
1197:
1196:
1180:
1178:
1177:
1172:
1158:
1157:
1145:
1144:
1128:
1126:
1125:
1120:
1106:
1105:
1093:
1092:
1019:
1015:
1011:
1003:
965:machine learning
953:
946:
939:
900:Related articles
777:Confusion matrix
530:Isolation forest
475:Graphical models
254:
253:
206:Learning to rank
201:Feature learning
39:Machine learning
30:
21:
8318:
8317:
8313:
8312:
8311:
8309:
8308:
8307:
8293:
8292:
8289:
8279:
8277:
8275:people.idsia.ch
8269:
8268:
8264:
8256:
8249:
8248:
8244:
8201:Neural Networks
8194:
8193:
8189:
8178:
8174:
8160:
8159:
8155:
8139:
8138:
8134:
8127:
8090:
8089:
8082:
8037:Neural Networks
8034:
8033:
8029:
7997:10.1.1.139.4502
7981:
7980:
7976:
7963:
7959:
7914:
7913:
7909:
7873:
7872:
7868:
7816:
7808:
7807:
7800:
7791:
7787:
7769:
7768:
7764:
7740:
7739:
7735:
7685:
7684:
7680:
7665:
7640:
7639:
7635:
7619:
7618:
7607:
7551:
7550:
7546:
7539:
7526:
7525:
7521:
7513:
7506:
7505:
7501:
7447:
7446:
7437:
7433:
7428:
7427:
7399:
7398:
7367:
7366:
7338:
7337:
7335:
7331:
7309:
7308:
7281:
7271:
7261:
7233:
7214:
7206:
7205:
7173:
7172:
7145:
7140:
7139:
7102:
7101:
7069:
7068:
7050:
7049:
7044:
7038:
7037:
7032:
7022:
7010:
7009:
6991:
6990:
6985:
6979:
6978:
6973:
6963:
6943:
6938:
6937:
6935:
6931:
6926:
6922:
6895:
6870:
6857:
6810:
6785:
6772:
6747:
6736:
6735:
6733:
6729:
6724:
6716:Spectral radius
6712:
6685:
6677:backpropagation
6672:
6660:
6656:
6649:
6634:
6618:
6602:
6600:Faster hardware
6570:
6564:
6531:backpropagation
6523:
6515:backpropagation
6503:
6492:
6480:
6469:
6468:
6467:
6462:
6451:
6445:
6442:
6439:
6424:
6409:
6398:
6392:
6389:
6382:
6366:
6362:
6325:
6321:
6314:
6290:
6289:
6264:
6263:
6244:
6243:
6221:
6220:
6201:
6200:
6179:
6129:
6128:
6061:
6060:
6057:
6055:Geometric model
5977:
5971:
5967:
5966:
5953:
5936:
5921:
5904:
5897:
5896:
5857:
5840:
5833:
5832:
5804:
5803:
5775:
5774:
5734:
5733:
5709:
5692:
5685:
5684:
5654:
5637:
5630:
5629:
5601:
5600:
5581:
5580:
5552:
5551:
5511:
5510:
5488:
5487:
5466:
5449:
5442:
5441:
5411:
5394:
5387:
5386:
5341:
5332:
5319:
5315:
5292:
5291:
5272:
5271:
5234:
5233:
5214:
5213:
5188:
5187:
5162:
5161:
5126:
5062:
5054:
5047:
5046:
5027:
5026:
5005:
4997:
4972:
4947:
4913:
4908:
4907:
4889:
4859:
4858:
4828:
4817:
4816:
4792:
4787:
4786:
4765:
4757:
4756:
4732:
4727:
4726:
4701:
4700:
4661:
4636:
4620:
4612:
4611:
4572:
4559:
4554:
4553:
4534:
4533:
4505:
4504:
4459:
4454:
4453:
4406:
4387:
4371:
4352:
4333:
4317:
4295:
4276:
4260:
4259:
4255:
4242:
4217:
4204:
4188:
4172:
4167:
4166:
4130:
4125:
4124:
4105:
4104:
4079:
4078:
4051:
4046:
4045:
4042:spectral radius
4020:
4004:
3996:
3995:
3961:
3951:
3950:
3936:
3935:
3913:
3902:
3866:
3841:
3830:
3794:
3792:
3770:
3759:
3723:
3717:
3716:
3685:
3666:
3650:
3622:
3603:
3587:
3585:
3566:
3547:
3531:
3522:
3521:
3488:
3477:
3441:
3419:
3400:
3384:
3379:
3378:
3356:
3355:
3332:
3331:
3304:
3285:
3271:
3270:
3243:
3230:
3208:
3186:
3164:
3145:
3126:
3121:
3120:
3115:
3072:
3053:
3037:
3012:
2993:
2977:
2958:
2939:
2923:
2918:
2917:
2895:
2894:
2836:
2817:
2801:
2782:
2763:
2747:
2725:
2706:
2690:
2689:
2685:
2672:
2656:
2655:
2651:
2650:
2627:
2626:
2616:
2558:
2539:
2523:
2504:
2485:
2469:
2447:
2428:
2412:
2411:
2407:
2394:
2369:
2356:
2340:
2326:
2325:
2296:
2271:
2258:
2247:
2246:
2232:
2231:
2189:
2170:
2154:
2135:
2116:
2100:
2078:
2059:
2043:
2042:
2038:
2029:
2028:
2016:
2015:
1996:
1968:
1949:
1933:
1899:
1880:
1864:
1842:
1823:
1807:
1779:
1760:
1744:
1735:
1734:
1718:
1696:
1677:
1661:
1633:
1614:
1598:
1591:
1581:
1569:
1568:
1534:
1515:
1496:
1491:
1490:
1469:
1456:
1451:
1450:
1426:
1407:
1402:
1401:
1380:
1375:
1374:
1353:
1348:
1347:
1317:
1298:
1276:
1263:
1255:
1254:
1235:
1234:
1201:
1188:
1183:
1182:
1149:
1136:
1131:
1130:
1097:
1084:
1079:
1078:
1075:
1063:
1017:
1013:
1009:
1001:
989:sequence length
981:backpropagation
973:neural networks
957:
928:
927:
901:
893:
892:
853:
845:
844:
805:Kernel machines
800:
792:
791:
767:
759:
758:
739:Active learning
734:
726:
725:
694:
684:
683:
609:Diffusion model
545:
535:
534:
507:
497:
496:
470:
460:
459:
415:Factor analysis
410:
400:
399:
383:
346:
336:
335:
256:
255:
239:
238:
237:
226:
225:
131:
123:
122:
88:Online learning
53:
41:
28:
23:
22:
15:
12:
11:
5:
8316:
8314:
8306:
8305:
8295:
8294:
8288:
8287:
8262:
8242:
8187:
8172:
8153:
8132:
8125:
8080:
8027:
7990:(5): 855–868.
7974:
7957:
7907:
7866:
7836:10.1.1.76.1541
7798:
7785:
7762:
7733:
7698:(2): 157–166.
7678:
7663:
7633:
7605:
7544:
7537:
7531:. IEEE Press.
7519:
7499:
7434:
7432:
7429:
7426:
7425:
7412:
7409:
7406:
7386:
7383:
7380:
7377:
7374:
7354:
7351:
7348:
7345:
7329:
7316:
7294:
7291:
7288:
7284:
7278:
7274:
7268:
7264:
7260:
7257:
7254:
7251:
7248:
7243:
7240:
7236:
7232:
7227:
7224:
7221:
7217:
7213:
7193:
7190:
7185:
7182:
7158:
7155:
7152:
7148:
7127:
7124:
7121:
7118:
7115:
7112:
7109:
7087:
7084:
7079:
7076:
7054:
7048:
7045:
7043:
7040:
7039:
7036:
7033:
7031:
7028:
7027:
7025:
7020:
7017:
6995:
6989:
6986:
6984:
6981:
6980:
6977:
6974:
6972:
6969:
6968:
6966:
6961:
6956:
6953:
6950:
6946:
6929:
6920:
6907:
6902:
6898:
6894:
6891:
6888:
6885:
6882:
6877:
6873:
6869:
6864:
6860:
6856:
6851:
6844:
6839:
6836:
6833:
6829:
6825:
6822:
6817:
6813:
6809:
6806:
6803:
6800:
6797:
6792:
6788:
6784:
6779:
6775:
6771:
6768:
6765:
6762:
6759:
6754:
6750:
6746:
6743:
6726:
6725:
6723:
6720:
6719:
6718:
6711:
6708:
6684:
6681:
6648:
6645:
6633:
6630:
6617:
6614:
6610:Xeon processor
6601:
6598:
6566:Main article:
6563:
6560:
6551:log likelihood
6522:
6519:
6502:
6499:
6491:
6488:
6482:
6481:
6464:
6463:
6430:listed sources
6427:
6425:
6418:
6411:
6410:
6369:
6367:
6360:
6355:
6329:
6328:
6326:
6319:
6313:
6310:
6297:
6274:
6271:
6251:
6231:
6228:
6208:
6186:
6182:
6178:
6175:
6172:
6169:
6166:
6163:
6160:
6157:
6154:
6151:
6148:
6145:
6142:
6139:
6136:
6116:
6113:
6110:
6107:
6104:
6101:
6098:
6095:
6092:
6089:
6086:
6083:
6080:
6077:
6074:
6071:
6068:
6056:
6053:
6034:
6031:
6026:
6022:
6019:
6013:
6010:
6007:
6004:
6001:
5998:
5995:
5992:
5989:
5986:
5983:
5980:
5976:
5970:
5965:
5959:
5956:
5951:
5948:
5945:
5942:
5939:
5933:
5927:
5924:
5919:
5916:
5913:
5910:
5907:
5881:
5878:
5872:
5869:
5866:
5863:
5860:
5855:
5852:
5849:
5846:
5843:
5820:
5817:
5814:
5811:
5791:
5788:
5785:
5782:
5762:
5759:
5756:
5753:
5750:
5747:
5744:
5741:
5715:
5712:
5707:
5704:
5701:
5698:
5695:
5669:
5666:
5663:
5660:
5657:
5652:
5649:
5646:
5643:
5640:
5617:
5614:
5611:
5608:
5588:
5568:
5565:
5562:
5559:
5539:
5536:
5533:
5530:
5527:
5524:
5521:
5518:
5495:
5472:
5469:
5464:
5461:
5458:
5455:
5452:
5426:
5423:
5420:
5417:
5414:
5409:
5406:
5403:
5400:
5397:
5370:
5366:
5363:
5360:
5356:
5350:
5347:
5344:
5340:
5335:
5331:
5328:
5325:
5322:
5318:
5314:
5311:
5308:
5305:
5302:
5299:
5279:
5259:
5256:
5253:
5250:
5247:
5244:
5241:
5221:
5201:
5198:
5195:
5175:
5172:
5169:
5145:
5142:
5139:
5136:
5132:
5129:
5125:
5122:
5119:
5116:
5113:
5110:
5107:
5104:
5101:
5098:
5095:
5092:
5089:
5086:
5083:
5080:
5077:
5074:
5068:
5065:
5060:
5057:
5034:
5012:
5008:
5003:
5000:
4996:
4993:
4990:
4987:
4984:
4979:
4975:
4971:
4968:
4965:
4962:
4959:
4954:
4950:
4946:
4943:
4940:
4937:
4934:
4931:
4926:
4923:
4920:
4916:
4888:
4885:
4872:
4869:
4866:
4843:
4838:
4835:
4831:
4827:
4824:
4804:
4799:
4795:
4772:
4768:
4764:
4744:
4739:
4735:
4714:
4711:
4708:
4688:
4685:
4682:
4679:
4674:
4671:
4668:
4664:
4660:
4655:
4652:
4649:
4646:
4643:
4639:
4635:
4632:
4627:
4623:
4619:
4599:
4596:
4593:
4590:
4585:
4582:
4579:
4575:
4571:
4566:
4562:
4541:
4521:
4518:
4515:
4512:
4492:
4489:
4486:
4483:
4480:
4477:
4474:
4471:
4466:
4462:
4440:
4436:
4433:
4430:
4427:
4424:
4419:
4416:
4413:
4409:
4405:
4400:
4397:
4394:
4390:
4386:
4383:
4378:
4374:
4370:
4367:
4364:
4359:
4355:
4351:
4346:
4343:
4340:
4336:
4332:
4329:
4324:
4320:
4316:
4313:
4310:
4307:
4302:
4298:
4294:
4289:
4286:
4283:
4279:
4275:
4272:
4267:
4263:
4258:
4254:
4249:
4245:
4241:
4238:
4235:
4232:
4229:
4224:
4220:
4216:
4211:
4207:
4203:
4200:
4195:
4191:
4187:
4184:
4179:
4175:
4145:
4142:
4137:
4133:
4112:
4092:
4089:
4086:
4064:
4061:
4058:
4054:
4027:
4023:
4017:
4014:
4011:
4007:
4003:
3979:
3976:
3972:
3967:
3964:
3959:
3934:
3931:
3926:
3923:
3920:
3916:
3912:
3908:
3905:
3901:
3898:
3894:
3891:
3888:
3885:
3879:
3876:
3873:
3869:
3865:
3862:
3859:
3854:
3851:
3848:
3844:
3840:
3836:
3833:
3829:
3826:
3822:
3819:
3816:
3813:
3807:
3804:
3801:
3797:
3793:
3791:
3788:
3783:
3780:
3777:
3773:
3769:
3765:
3762:
3758:
3755:
3751:
3748:
3745:
3742:
3736:
3733:
3730:
3726:
3722:
3719:
3718:
3715:
3712:
3709:
3704:
3701:
3698:
3695:
3692:
3688:
3684:
3679:
3676:
3673:
3669:
3665:
3662:
3657:
3653:
3649:
3646:
3643:
3640:
3635:
3632:
3629:
3625:
3621:
3616:
3613:
3610:
3606:
3602:
3599:
3594:
3590:
3586:
3584:
3581:
3578:
3573:
3569:
3565:
3560:
3557:
3554:
3550:
3546:
3543:
3538:
3534:
3530:
3529:
3509:
3506:
3501:
3498:
3495:
3491:
3487:
3483:
3480:
3476:
3473:
3469:
3466:
3463:
3460:
3454:
3451:
3448:
3444:
3440:
3437:
3434:
3431:
3426:
3422:
3418:
3413:
3410:
3407:
3403:
3399:
3396:
3391:
3387:
3363:
3339:
3319:
3314:
3311:
3307:
3303:
3298:
3295:
3292:
3288:
3284:
3281:
3278:
3258:
3255:
3250:
3246:
3240:
3237:
3233:
3229:
3226:
3221:
3218:
3215:
3211:
3207:
3204:
3199:
3196:
3193:
3189:
3185:
3182:
3179:
3176:
3171:
3167:
3163:
3158:
3155:
3152:
3148:
3144:
3141:
3138:
3133:
3129:
3114:
3111:
3099:
3096:
3093:
3090:
3085:
3082:
3079:
3075:
3071:
3066:
3063:
3060:
3056:
3052:
3049:
3044:
3040:
3036:
3033:
3030:
3025:
3022:
3019:
3015:
3011:
3006:
3003:
3000:
2996:
2992:
2989:
2984:
2980:
2976:
2973:
2970:
2965:
2961:
2957:
2952:
2949:
2946:
2942:
2938:
2935:
2930:
2926:
2902:
2880:
2875:
2870:
2866:
2863:
2860:
2857:
2854:
2849:
2846:
2843:
2839:
2835:
2830:
2827:
2824:
2820:
2816:
2813:
2808:
2804:
2800:
2797:
2794:
2789:
2785:
2781:
2776:
2773:
2770:
2766:
2762:
2759:
2754:
2750:
2746:
2743:
2740:
2737:
2732:
2728:
2724:
2719:
2716:
2713:
2709:
2705:
2702:
2697:
2693:
2688:
2684:
2679:
2675:
2671:
2668:
2663:
2659:
2654:
2649:
2646:
2643:
2640:
2637:
2634:
2622:
2621:
2612:
2610:
2599:
2596:
2592:
2588:
2585:
2582:
2579:
2576:
2571:
2568:
2565:
2561:
2557:
2552:
2549:
2546:
2542:
2538:
2535:
2530:
2526:
2522:
2519:
2516:
2511:
2507:
2503:
2498:
2495:
2492:
2488:
2484:
2481:
2476:
2472:
2468:
2465:
2462:
2459:
2454:
2450:
2446:
2441:
2438:
2435:
2431:
2427:
2424:
2419:
2415:
2410:
2406:
2401:
2397:
2393:
2390:
2387:
2384:
2381:
2376:
2372:
2368:
2363:
2359:
2355:
2352:
2347:
2343:
2339:
2336:
2333:
2308:
2303:
2299:
2295:
2292:
2289:
2286:
2283:
2278:
2274:
2270:
2265:
2261:
2257:
2254:
2230:
2227:
2223:
2219:
2216:
2213:
2210:
2207:
2202:
2199:
2196:
2192:
2188:
2183:
2180:
2177:
2173:
2169:
2166:
2161:
2157:
2153:
2150:
2147:
2142:
2138:
2134:
2129:
2126:
2123:
2119:
2115:
2112:
2107:
2103:
2099:
2096:
2093:
2090:
2085:
2081:
2077:
2072:
2069:
2066:
2062:
2058:
2055:
2050:
2046:
2041:
2037:
2034:
2032:
2030:
2027:
2024:
2021:
2019:
2017:
2014:
2009:
2006:
2003:
1999:
1995:
1992:
1989:
1986:
1981:
1978:
1975:
1971:
1967:
1962:
1959:
1956:
1952:
1948:
1945:
1940:
1936:
1932:
1929:
1926:
1923:
1920:
1917:
1912:
1909:
1906:
1902:
1898:
1893:
1890:
1887:
1883:
1879:
1876:
1871:
1867:
1863:
1860:
1857:
1854:
1849:
1845:
1841:
1836:
1833:
1830:
1826:
1822:
1819:
1814:
1810:
1806:
1803:
1800:
1797:
1794:
1791:
1786:
1782:
1778:
1773:
1770:
1767:
1763:
1759:
1756:
1751:
1747:
1743:
1740:
1738:
1736:
1731:
1728:
1725:
1721:
1717:
1714:
1711:
1708:
1703:
1699:
1695:
1690:
1687:
1684:
1680:
1676:
1673:
1668:
1664:
1660:
1657:
1654:
1651:
1648:
1645:
1640:
1636:
1632:
1627:
1624:
1621:
1617:
1613:
1610:
1605:
1601:
1597:
1594:
1592:
1588:
1584:
1580:
1577:
1576:
1563:Now, take its
1552:
1549:
1546:
1541:
1537:
1533:
1528:
1525:
1522:
1518:
1514:
1511:
1508:
1503:
1499:
1476:
1472:
1468:
1463:
1459:
1438:
1433:
1429:
1425:
1422:
1419:
1414:
1410:
1387:
1383:
1360:
1356:
1335:
1332:
1329:
1324:
1320:
1316:
1311:
1308:
1305:
1301:
1297:
1294:
1291:
1288:
1283:
1279:
1275:
1270:
1266:
1262:
1242:
1222:
1219:
1216:
1213:
1208:
1204:
1200:
1195:
1191:
1181:, and outputs
1170:
1167:
1164:
1161:
1156:
1152:
1148:
1143:
1139:
1118:
1115:
1112:
1109:
1104:
1100:
1096:
1091:
1087:
1074:
1071:
1062:
1059:
1002:[-1,1]
959:
958:
956:
955:
948:
941:
933:
930:
929:
926:
925:
920:
919:
918:
908:
902:
899:
898:
895:
894:
891:
890:
885:
880:
875:
870:
865:
860:
854:
851:
850:
847:
846:
843:
842:
837:
832:
827:
825:Occam learning
822:
817:
812:
807:
801:
798:
797:
794:
793:
790:
789:
784:
782:Learning curve
779:
774:
768:
765:
764:
761:
760:
757:
756:
751:
746:
741:
735:
732:
731:
728:
727:
724:
723:
722:
721:
711:
706:
701:
695:
690:
689:
686:
685:
682:
681:
675:
670:
665:
660:
659:
658:
648:
643:
642:
641:
636:
631:
626:
616:
611:
606:
601:
600:
599:
589:
588:
587:
582:
577:
572:
562:
557:
552:
546:
541:
540:
537:
536:
533:
532:
527:
522:
514:
508:
503:
502:
499:
498:
495:
494:
493:
492:
487:
482:
471:
466:
465:
462:
461:
458:
457:
452:
447:
442:
437:
432:
427:
422:
417:
411:
406:
405:
402:
401:
398:
397:
392:
387:
381:
376:
371:
363:
358:
353:
347:
342:
341:
338:
337:
334:
333:
328:
323:
318:
313:
308:
303:
298:
290:
289:
288:
283:
278:
268:
266:Decision trees
263:
257:
243:classification
233:
232:
231:
228:
227:
224:
223:
218:
213:
208:
203:
198:
193:
188:
183:
178:
173:
168:
163:
158:
153:
148:
143:
138:
136:Classification
132:
129:
128:
125:
124:
121:
120:
115:
110:
105:
100:
95:
93:Batch learning
90:
85:
80:
75:
70:
65:
60:
54:
51:
50:
47:
46:
35:
34:
26:
24:
14:
13:
10:
9:
6:
4:
3:
2:
8315:
8304:
8301:
8300:
8298:
8291:
8276:
8272:
8266:
8263:
8255:
8254:
8246:
8243:
8238:
8234:
8230:
8226:
8222:
8218:
8214:
8210:
8206:
8202:
8198:
8191:
8188:
8184:
8183:
8176:
8173:
8168:
8164:
8157:
8154:
8148:
8143:
8136:
8133:
8128:
8122:
8118:
8114:
8109:
8104:
8100:
8096:
8095:
8087:
8085:
8081:
8076:
8072:
8068:
8064:
8060:
8056:
8051:
8046:
8042:
8038:
8031:
8028:
8023:
8019:
8015:
8011:
8007:
8003:
7998:
7993:
7989:
7985:
7978:
7975:
7971:
7967:
7961:
7958:
7953:
7949:
7945:
7941:
7937:
7933:
7929:
7925:
7921:
7917:
7911:
7908:
7902:
7897:
7893:
7889:
7885:
7881:
7877:
7870:
7867:
7862:
7858:
7854:
7850:
7846:
7842:
7837:
7832:
7828:
7824:
7823:
7815:
7811:
7810:Hinton, G. E.
7805:
7803:
7799:
7795:
7789:
7786:
7781:
7777:
7773:
7766:
7763:
7757:
7752:
7748:
7744:
7737:
7734:
7729:
7725:
7721:
7717:
7713:
7709:
7705:
7701:
7697:
7693:
7689:
7682:
7679:
7674:
7670:
7666:
7664:0-7803-0593-0
7660:
7656:
7652:
7648:
7644:
7637:
7634:
7628:
7623:
7616:
7614:
7612:
7610:
7606:
7601:
7597:
7593:
7589:
7585:
7581:
7577:
7573:
7568:
7563:
7559:
7555:
7548:
7545:
7540:
7538:0-7803-5369-2
7534:
7530:
7523:
7520:
7512:
7511:
7503:
7500:
7495:
7491:
7487:
7483:
7478:
7473:
7468:
7463:
7459:
7455:
7451:
7444:
7442:
7440:
7436:
7430:
7410:
7407:
7404:
7384:
7381:
7378:
7375:
7372:
7352:
7349:
7346:
7343:
7333:
7330:
7314:
7292:
7289:
7286:
7282:
7276:
7266:
7262:
7258:
7255:
7252:
7246:
7241:
7238:
7230:
7225:
7222:
7219:
7215:
7191:
7188:
7183:
7180:
7156:
7153:
7150:
7146:
7122:
7119:
7116:
7110:
7107:
7085:
7082:
7077:
7074:
7052:
7046:
7041:
7034:
7029:
7023:
7018:
7015:
6993:
6987:
6982:
6975:
6970:
6964:
6959:
6954:
6951:
6948:
6944:
6933:
6930:
6924:
6921:
6900:
6896:
6892:
6889:
6886:
6883:
6880:
6875:
6871:
6867:
6862:
6858:
6842:
6837:
6834:
6831:
6827:
6823:
6815:
6811:
6807:
6804:
6801:
6798:
6795:
6790:
6786:
6782:
6777:
6773:
6769:
6766:
6763:
6760:
6757:
6752:
6748:
6741:
6731:
6728:
6721:
6717:
6714:
6713:
6709:
6707:
6705:
6701:
6696:
6694:
6690:
6682:
6680:
6678:
6670:
6664:
6652:
6646:
6644:
6642:
6638:
6631:
6629:
6625:
6623:
6615:
6613:
6611:
6607:
6599:
6597:
6595:
6591:
6587:
6583:
6579:
6575:
6569:
6561:
6559:
6556:
6552:
6548:
6544:
6540:
6536:
6532:
6528:
6520:
6518:
6516:
6512:
6508:
6500:
6498:
6496:
6489:
6487:
6478:
6475:
6460:
6457:
6449:
6446:December 2017
6437:
6436:
6431:
6426:
6422:
6417:
6416:
6407:
6404:
6396:
6393:December 2017
6386:
6380:
6379:
6375:
6370:This section
6368:
6359:
6358:
6353:
6351:
6344:
6343:
6338:
6337:
6332:
6327:
6318:
6317:
6311:
6309:
6295:
6286:
6272:
6269:
6249:
6229:
6226:
6206:
6184:
6173:
6167:
6164:
6161:
6155:
6146:
6140:
6134:
6114:
6111:
6105:
6099:
6096:
6093:
6090:
6084:
6078:
6075:
6072:
6069:
6066:
6054:
6052:
6049:
6032:
6029:
6024:
6020:
6017:
6005:
5999:
5996:
5993:
5984:
5978:
5974:
5968:
5963:
5957:
5946:
5940:
5931:
5925:
5914:
5908:
5893:
5879:
5876:
5867:
5861:
5850:
5844:
5815:
5809:
5786:
5780:
5757:
5754:
5748:
5742:
5730:
5713:
5702:
5696:
5664:
5658:
5647:
5641:
5612:
5606:
5586:
5563:
5557:
5534:
5531:
5525:
5519:
5507:
5493:
5470:
5459:
5453:
5421:
5415:
5404:
5398:
5385:Now consider
5383:
5368:
5364:
5361:
5358:
5354:
5348:
5345:
5342:
5338:
5333:
5329:
5326:
5323:
5320:
5316:
5312:
5306:
5303:
5300:
5277:
5254:
5251:
5248:
5245:
5242:
5219:
5199:
5196:
5193:
5173:
5170:
5167:
5159:
5140:
5134:
5130:
5127:
5123:
5117:
5114:
5108:
5102:
5099:
5093:
5090:
5084:
5078:
5075:
5072:
5066:
5063:
5058:
5055:
5032:
5025:At the small
5010:
5006:
5001:
4998:
4994:
4991:
4985:
4982:
4977:
4973:
4969:
4963:
4960:
4957:
4952:
4948:
4941:
4938:
4935:
4929:
4924:
4921:
4918:
4914:
4901:
4897:
4893:
4886:
4884:
4870:
4867:
4864:
4855:
4836:
4833:
4829:
4822:
4802:
4797:
4770:
4766:
4762:
4742:
4737:
4712:
4709:
4706:
4680:
4677:
4672:
4669:
4666:
4662:
4658:
4653:
4650:
4647:
4644:
4641:
4637:
4630:
4625:
4597:
4594:
4591:
4588:
4583:
4580:
4577:
4573:
4569:
4564:
4560:
4539:
4516:
4510:
4487:
4484:
4481:
4478:
4475:
4469:
4464:
4438:
4434:
4431:
4425:
4422:
4417:
4414:
4411:
4407:
4403:
4398:
4395:
4392:
4388:
4381:
4376:
4365:
4362:
4357:
4353:
4349:
4344:
4341:
4338:
4334:
4327:
4322:
4314:
4308:
4305:
4300:
4296:
4292:
4287:
4284:
4281:
4277:
4270:
4265:
4256:
4247:
4243:
4239:
4236:
4233:
4230:
4227:
4222:
4218:
4214:
4209:
4205:
4198:
4193:
4185:
4182:
4177:
4164:
4163:
4157:
4143:
4135:
4131:
4110:
4090:
4087:
4084:
4062:
4059:
4056:
4052:
4043:
4025:
4015:
4012:
4009:
4005:
3993:
3992:operator norm
3977:
3974:
3965:
3962:
Dynamical systems model

The vanishing gradient can also be understood by viewing the recurrent network as a dynamical system whose attractors appear and disappear, with hysteresis, as the parameters vary. Consider the one-neuron recurrent network with sigmoid activation:

{\displaystyle x_{t+1}=(1-\epsilon )x_{t}+\epsilon \sigma (wx_{t}+b)+\epsilon w'u_{t}}

At the small ε limit, the dynamics becomes

{\displaystyle {\frac {dx}{dt}}=-x(t)+\sigma (wx(t)+b)+w'u(t)}

Consider first the autonomous case, with u = 0. Set w = 5.0 and vary b in [−3, −2]. The equilibrium points satisfy x = σ(5x + b); equivalently, they lie on the curve

{\displaystyle (x,b)=\left(x,\ln {\frac {x}{1-x}}-5x\right)}

[Figure: bifurcation diagram of the autonomous system. The equilibria trace the curve (x, b) = (x, ln(x/(1 − x)) − 5x); the two outer branches are stable and the middle branch is unstable.]

In a middle range of b (roughly −2.65 < b < −2.35) the system has two stable equilibria separated by one unstable equilibrium, while outside that range there is a single stable equilibrium. Sweeping b back and forth across this range makes the state jump between the two stable branches at different values of b, so the system exhibits hysteresis.

Now consider Δx(T)/Δx(0) and Δx(T)/Δb, where T is a time much larger than the convergence time of the system. Within a basin of attraction, x(T) is simply the stable equilibrium that attracts x(0), so a small change of x(0) would make Δx(T) = 0: the gradient with respect to the initial state vanishes. The gradient with respect to b tracks the equilibrium:

{\displaystyle {\frac {\partial x(T)}{\partial b}}\approx {\frac {\Delta x(T)}{\Delta b}}=\left({\frac {1}{x(T)(1-x(T))}}-5\right)^{-1}}

which stays moderate away from the bifurcation. If, however, (x(0), b) sits on a basin boundary or at a bifurcation point, an arbitrarily small change flips x(T) from one stable equilibrium to the other, and Δx(T)/Δx(0) and Δx(T)/Δb blow up.

Now suppose w = 5, x(0) = 0.5, u(t) = 0, and the network is trained with the loss

{\displaystyle L(x(T))=(0.855-x(T))^{2}}

where 0.855 is the upper stable equilibrium at b = −2.5.[note 3] For b > −2.5 the state converges to the upper branch and the loss is nearly zero; for b < −2.5 it converges to the lower branch, where the loss is large. As b approaches −2.5 from below, the loss changes only slowly, but the moment b crosses −2.5 the final state x(T) jumps discontinuously to the other branch and the loss drops off a cliff. Gradient descent over such a landscape alternates between plateaus, where the gradient carries little information, and cliffs that the local gradient cannot anticipate.
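The plateau-and-cliff structure can be reproduced in a few lines. This sketch is our own (the step size eps = 0.1, horizon T = 2000, and finite-difference width h are arbitrary choices): it simulates the one-neuron network, then estimates Δx(T)/Δx(0) and Δx(T)/Δb by central differences across a scan of b.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def x_final(b, x0=0.5, w=5.0, eps=0.1, T=2000):
    """Iterate x_{t+1} = (1 - eps) x_t + eps * sigmoid(w x_t + b), with u_t = 0."""
    x = x0
    for _ in range(T):
        x = (1.0 - eps) * x + eps * sigmoid(w * x + b)
    return x

h = 1e-4
for b in (-2.8, -2.6, -2.51, -2.49, -2.4, -2.2):
    xT = x_final(b)
    d_db = (x_final(b + h) - x_final(b - h)) / (2 * h)              # ~ dx(T)/db
    d_dx0 = (x_final(b, 0.5 + h) - x_final(b, 0.5 - h)) / (2 * h)   # ~ dx(T)/dx(0)
    print(f"b = {b:+.2f}  x(T) = {xT:.3f}  dx(T)/dx(0) = {d_dx0:.1e}  "
          f"dx(T)/db = {d_db:.3f}  loss = {(0.855 - xT) ** 2:.3f}")
```

The output shows dx(T)/dx(0) collapsing to essentially zero everywhere (the state forgets its initial condition), dx(T)/db staying moderate on both branches, and x(T) jumping from about 0.14 to about 0.86 between b = −2.51 and b = −2.49, the cliff that no local gradient on the plateau foretells.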
Solutions

To overcome this problem, several methods have been proposed.

Multi-level hierarchy

One is Jürgen Schmidhuber's multi-level hierarchy of networks (1992), pre-trained one level at a time through unsupervised learning and fine-tuned through backpropagation. Each level learns a compressed representation of the observations that is fed to the next level.

Deep belief network

Similarly, the deep belief network of Hinton et al. (2006) involves learning the distribution of a high-level representation using successive layers of binary or real-valued latent variables. It uses a restricted Boltzmann machine to model each new layer of higher-level features. Each new layer guarantees an increase on the lower-bound of the log likelihood of the data, thus improving the model if trained properly. Once sufficiently many layers have been learned, the deep architecture may be used as a generative model by reproducing the data when sampling down from the top-level feature activations.
Long short-term memory

Another technique, particularly suited to recurrent neural networks, is the long short-term memory (LSTM) network proposed in 1997 by Hochreiter & Schmidhuber. In 2009, deep multidimensional LSTM networks demonstrated the power of deep learning with many nonlinear layers by winning three ICDAR 2009 competitions in connected handwriting recognition, without any prior knowledge about the three different languages to be learned.
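A back-of-envelope sketch of why the LSTM cell helps, under simplifying assumptions of our own (scalar gates held constant, unit recurrent weight, and the full gate equations omitted): the cell state is updated additively, c_t = f_t·c_{t-1} + i_t·g_t, so the factor multiplying the gradient along the cell path is the forget gate f_t itself rather than a weight-times-derivative term.

```python
# Gradient factor carried across T steps along two paths (toy numbers).
T = 100
f_gate = 0.99                                 # forget gate held near 1
print("LSTM cell path  :", f_gate ** T)       # ~0.37: the gradient survives
print("vanilla sigmoid :", (1.0 * 0.25) ** T) # ~6e-61: w=1 times sigmoid' <= 1/4
```

As long as the forget gate stays near 1, the product of factors stays near 1 for arbitrarily long spans, which is precisely the behavior the product of W_rec diag(σ′) terms cannot deliver.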
Faster hardware

Hardware has also helped: from 1991 to 2015, computer power, especially as delivered by GPUs, grew by roughly a factor of a million, which made standard backpropagation practical for networks several layers deeper than before. As discussed above, however, this mitigates the problem rather than resolving it fundamentally; notably, the original models that tackled it were trained on a Xeon processor, not GPUs.

Residual connection

One of the newest and most effective ways to resolve the vanishing gradient problem is the residual neural network described above: skip connections give the gradient a short, unattenuated path around each block.
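The effect of a skip connection on the backward pass is visible directly in the Jacobians. In this sketch (our own illustration; the depth, width, and 0.1 scale of the per-layer Jacobians are arbitrary), a residual block y = x + F(x) contributes a factor I + F′(x) instead of F′(x), so the stacked product stays on the order of one even when each ‖F′‖ is small.

```python
import numpy as np

rng = np.random.default_rng(1)
n, depth = 20, 50
J_plain = np.eye(n)                     # product of F' alone
J_res = np.eye(n)                       # product of (I + F')
for _ in range(depth):
    Fp = 0.1 * rng.normal(size=(n, n)) / np.sqrt(n)   # a small layer Jacobian
    J_plain = Fp @ J_plain              # plain stack: gradient shrinks ~0.2^depth
    J_res = (np.eye(n) + Fp) @ J_res    # residual stack: identity path preserved
print("plain    ||J||_2 =", np.linalg.norm(J_plain, 2))
print("residual ||J||_2 =", np.linalg.norm(J_res, 2))
```

The plain product underflows toward zero while the residual product remains of moderate size, mirroring the "highways" of gradient information that skip connections create.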
Other activation functions

Rectifiers such as ReLU suffer less from the vanishing gradient problem, because they only saturate in one direction; on the active side the derivative is exactly 1.
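A one-line comparison of the worst that k stacked activations can do to a gradient makes the point (our own illustration; k = 30 is arbitrary): the sigmoid's derivative never exceeds 1/4, tanh saturates on both sides, while a chain of active ReLUs passes the gradient through unchanged.

```python
import numpy as np

k = 30
print("sigmoid, best case  :", 0.25 ** k)                     # <= (1/4)^k
print("tanh saturated |z|=2:", (1 - np.tanh(2.0) ** 2) ** k)  # ~0.07^k
print("ReLU, active path   :", 1.0 ** k)                      # exactly 1
```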
Weight initialization

Careful weight initialization also reduces the vanishing gradient problem in deep networks. Kumar's proposal for networks with the logistic activation function, mentioned above, is a zero-mean Gaussian with standard deviation 3.6/sqrt(N), where N is the number of neurons in a layer. The analysis of Yilmaz and Poli described above concludes, further, that gradients do not vanish when the mean of the initial weights is set to max(−1, −8/N).
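Both prescriptions are a few lines of code. This sketch is ours, not the authors': we read N as the number of units feeding the layer, and for the Yilmaz and Poli scheme only the mean max(−1, −8/N) comes from their analysis, while the spread around that mean is our own choice.

```python
import numpy as np

def kumar_init(rng, n_in, n_out):
    """Zero-mean Gaussian, std 3.6/sqrt(N), for logistic activations (Kumar)."""
    N = n_in                                   # assumption: N = fan-in
    return rng.normal(0.0, 3.6 / np.sqrt(N), size=(n_out, n_in))

def negative_mean_init(rng, n_in, n_out, std=0.1):
    """Gaussian whose MEAN is max(-1, -8/N), after Yilmaz and Poli; std is ours."""
    N = n_in
    mean = max(-1.0, -8.0 / N)
    return rng.normal(mean, std, size=(n_out, n_in))

rng = np.random.default_rng(0)
W1 = kumar_init(rng, 256, 256)
W2 = negative_mean_init(rng, 256, 256)
print("Kumar std  ~", W1.std())                # ~ 3.6/16 = 0.225
print("Y&P mean   ~", W2.mean())               # ~ -8/256 = -0.03125
```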
Other

Update rules that use only the sign of the gradient, such as Rprop, keep each weight moving by an adaptively sized step even when the gradient magnitude has shrunk to almost nothing. Networks can also be trained without gradients at all, by searching the space of weights with a universal search method, from plain random guessing to, more systematically, genetic algorithms.
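A minimal Rprop-style update, sketched in our own code (the step bounds and the 1.2/0.5 growth factors are the conventional defaults, and the deliberately tiny gradient is contrived): because only the sign of the gradient is used, a gradient that has vanished to 1e-9 still drives full-sized weight updates.

```python
import numpy as np

def rprop_step(w, grad, prev_grad, step, up=1.2, down=0.5,
               step_min=1e-6, step_max=1.0):
    """One Rprop-style update: adapt per-weight steps, move by gradient sign."""
    same_sign = np.sign(grad) == np.sign(prev_grad)
    step = np.clip(np.where(same_sign, step * up, step * down),
                   step_min, step_max)
    return w - np.sign(grad) * step, step

w = np.zeros(4)
step = np.full(4, 0.1)
prev = np.zeros(4)
for _ in range(60):
    grad = 1e-9 * (w - 3.0)     # vanishingly small gradient, correct sign
    w, step = rprop_step(w, grad, prev, step)
    prev = grad
print(w)                        # close to the minimizer [3 3 3 3] regardless
```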
See also

Spectral radius

Notes

1. The loss could also be taken over the whole output sequence, as {\displaystyle L(x_{1},\dots ,x_{T},u_{1},\dots ,u_{T})=\sum _{t=1}^{T}E(x_{t},u_{1},\dots ,u_{t})}; the analysis is the same, applied to each term of the sum.
2. Consider {\displaystyle W_{rec}={\begin{bmatrix}0&2\\\epsilon &0\end{bmatrix}}} with ε > 1/2, and {\displaystyle D={\begin{bmatrix}c&0\\0&c\end{bmatrix}}} with c ∈ (0, 1) standing for diag(σ′). Then {\displaystyle (W_{rec}D)^{2N}=(2\epsilon \cdot c^{2})^{N}I_{2\times 2}}, which blows up as N → ∞ whenever c² > 1/(2ε) and vanishes whenever c² < 1/(2ε), even though every activation derivative c is below 1.
3. With w = 5 and b = −2.5 the equilibria are x ≈ 0.145 and x ≈ 0.855 (stable) and x = 0.5 (unstable).
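The matrix identity in note 2 is easy to verify numerically; this check is our own code, with eps = 0.7, c = 0.95, and N = 20 chosen arbitrarily so that 2εc² is slightly above 1.

```python
import numpy as np

eps, c, N = 0.7, 0.95, 20
W = np.array([[0.0, 2.0],
              [eps, 0.0]])              # W_rec from note 2
D = np.diag([c, c])                     # stands for diag(sigma'), entries < 1
P = np.linalg.matrix_power(W @ D, 2 * N)
print(P)                                # equals the closed form below
print((2 * eps * c ** 2) ** N * np.eye(2))
```

With these values 2εc² ≈ 1.26, so the product grows without bound as N increases even though each diagonal factor contracts, confirming that vanishing versus exploding is a joint property of weights and activation derivatives.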
References

Hochreiter, Sepp (1991). Untersuchungen zu dynamischen neuronalen Netzen (PDF) (Diplom thesis). Institut für Informatik, Technische Universität München.
Basodi, Sunitha; Ji, Chunyan; Zhang, Haiping; Pan, Yi (2020). "Gradient amplification: An efficient way to train deep neural networks". Big Data Mining and Analytics. 3 (3): 198. arXiv:2006.10560. ISSN 2096-0654. S2CID 219792172.
Pascanu, Razvan; Mikolov, Tomas; Bengio, Yoshua (2012). "On the difficulty of training recurrent neural networks". arXiv:1211.5063.
Bengio, Y.; Simard, P.; Frasconi, P. (1994). "Learning long-term dependencies with gradient descent is difficult". IEEE Transactions on Neural Networks. 5 (2): 157–166. doi:10.1109/72.279181. ISSN 1941-0093. PMID 18267787. S2CID 206457500.
Hochreiter, S.; Bengio, Y.; Frasconi, P.; Schmidhuber, J. (2001). "Gradient flow in recurrent nets: the difficulty of learning long-term dependencies". In Kremer, S. C.; Kolen, J. F. (eds.). A Field Guide to Dynamical Recurrent Neural Networks. IEEE Press. S2CID 15069221.
Hochreiter, Sepp; Schmidhuber, Jürgen (1997). "Long Short-Term Memory". Neural Computation. 9 (8): 1735–1780. doi:10.1162/neco.1997.9.8.1735. PMID 9377276. S2CID 1915014.
Graves, Alex; Liwicki, Marcus; Fernández, Santiago; Bertolami, Roman; Bunke, Horst; Schmidhuber, Jürgen (2009). "A Novel Connectionist System for Unconstrained Handwriting Recognition". IEEE Transactions on Pattern Analysis and Machine Intelligence. 31 (5): 855–868. PMID 19299860. S2CID 14635907.
Hinton, G. E.; Osindero, S.; Teh, Y. (2006). "A fast learning algorithm for deep belief nets" (PDF). Neural Computation. 18 (7): 1527–1554. doi:10.1162/neco.2006.18.7.1527. PMID 16764513. S2CID 2309950.
Hinton, G. (2009). "Deep belief networks". Scholarpedia. 4 (5): 5947. Bibcode:2009SchpJ...4.5947H. doi:10.4249/scholarpedia.5947.
Ioffe, Sergey; Szegedy, Christian (2015). "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift". arXiv:1502.03167.
Schmidhuber, Jürgen (2015). "Deep learning in neural networks: An overview". Neural Networks. 61: 85–117. arXiv:1404.7828. doi:10.1016/j.neunet.2014.09.003. PMID 25462637. S2CID 11715509.
He, Kaiming; Zhang, Xiangyu; Ren, Shaoqing; Sun, Jian (2016). "Deep Residual Learning for Image Recognition". 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). pp. 770–778. arXiv:1512.03385. doi:10.1109/CVPR.2016.90. ISBN 978-1-4673-8851-1.
Veit, Andreas; Wilber, Michael; Belongie, Serge (2016). "Residual Networks Behave Like Ensembles of Relatively Shallow Networks". arXiv:1605.06431.
Glorot, Xavier; Bordes, Antoine; Bengio, Yoshua (2011). "Deep Sparse Rectifier Neural Networks". Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics. PMLR: 315–323.
Kumar, Siddharth Krishna (2017). "On weight initialization in deep neural networks". arXiv:1704.08863.
Yilmaz, Ahmet; Poli, Riccardo (2022). "Successfully and efficiently training deep multi-layer perceptrons with logistic activation function simply requires initializing the weights with an appropriate negative mean". Neural Networks. 153: 87–103. ISSN 0893-6080. PMID 35714424. S2CID 249487697.