Rectifying activation functions were used to separate specific excitation from unspecific inhibition in the neural abstraction pyramid, which was trained in a supervised way to learn several computer vision tasks. In 2011, the rectifier non-linearity was shown to enable training deep neural networks without requiring unsupervised pre-training.
Dying ReLU problem: ReLU (rectified linear unit) neurons can sometimes be pushed into states in which they become inactive for essentially all inputs. In this state, no gradients flow backward through the neuron, so the neuron becomes stuck in a perpetually inactive state and "dies". This is a form of the vanishing gradient problem.
In some cases, large numbers of neurons in a network can become stuck in dead states, effectively decreasing the model capacity. This problem typically arises when the learning rate is set too high. It may be mitigated by using leaky ReLUs instead, which assign a small positive slope for x < 0; however, the performance is reduced.
Not zero-centered: ReLU outputs are always non-negative. This can make it harder for the network to learn during backpropagation, because gradient updates tend to push weights in one direction (positive or negative). Batch normalization can help address this.
{\displaystyle f(x)={\begin{cases}x&{\text{if }}x>0,\\a\left(e^{x}-1\right)&{\text{otherwise}}.\end{cases}}\qquad \qquad f'(x)={\begin{cases}1&{\text{if }}x>0,\\a\cdot e^{x}&{\text{otherwise}}.\end{cases}}}
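As an illustrative sketch (not from the article; the function names and the default a = 1.0 are my own choices), the ELU piecewise definition and its derivative translate directly into code:

```python
import math

def elu(x: float, a: float = 1.0) -> float:
    # Identity for x > 0; a * (e^x - 1) otherwise, per the piecewise definition
    return x if x > 0 else a * (math.exp(x) - 1.0)

def elu_prime(x: float, a: float = 1.0) -> float:
    # Derivative: 1 for x > 0; a * e^x otherwise
    return 1.0 if x > 0 else a * math.exp(x)
```

Note that ELU is continuous at zero, and for a = 1 its derivative is continuous there as well.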
{\displaystyle f(x)={\begin{cases}x&{\text{if }}x>0,\\a\cdot x&{\text{otherwise}}.\end{cases}}\qquad \qquad \qquad f'(x)={\begin{cases}1&{\text{if }}x>0,\\a&{\text{otherwise}}.\end{cases}}}
{\displaystyle f(x)={\begin{cases}x&{\text{if }}x>0,\\0.01x&{\text{otherwise}}.\end{cases}}\qquad \qquad f'(x)={\begin{cases}1&{\text{if }}x>0,\\0.01&{\text{otherwise}}.\end{cases}}}
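As a quick sketch (function names are mine, not from the article), the leaky ReLU above and its derivative in code:

```python
def leaky_relu(x: float) -> float:
    # Fixed small slope 0.01 on the negative side keeps the gradient nonzero
    return x if x > 0 else 0.01 * x

def leaky_relu_prime(x: float) -> float:
    # Derivative: 1 for positive inputs, 0.01 otherwise
    return 1.0 if x > 0 else 0.01
```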
Squareplus is thus well-suited for settings where computational resources or instruction sets are limited. Additionally, it requires no special consideration to ensure numerical stability when x is large.
In 2011, the rectifier was found to enable better training of deeper networks than the activation functions widely used before then, e.g., the logistic sigmoid (which is inspired by probability theory; see logistic regression) and its more practical counterpart, the hyperbolic tangent.
Exponential linear units (ELUs) try to make the mean activations closer to zero, which speeds up learning. ELUs have been shown to obtain higher classification accuracy than ReLUs.
Hahnloser, R.; Sarpeshkar, R.; Mahowald, M. A.; Douglas, R. J.; Seung, H. S. (2000). "Digital selection and analogue amplification coexist in a cortex-inspired silicon circuit". Nature. 405 (6789): 947–951. doi:10.1038/35016072.
The gradient of LogSumExp is the softmax; the softmax with the first argument set to zero is the multivariable generalization of the logistic function. Both LogSumExp and softmax are used in machine learning.
Parametric ReLUs (PReLUs) take this idea further by making the coefficient of leakage into a parameter that is learned along with the other neural-network parameters.
Clevert, Djork-Arné; Unterthiner, Thomas; Hochreiter, Sepp (2015). "Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)".
{\displaystyle \operatorname {LSE_{0}} ^{+}(x_{1},\dots ,x_{n}):=\operatorname {LSE} (0,x_{1},\dots ,x_{n})=\ln(1+e^{x_{1}}+\cdots +e^{x_{n}}).}
He, Kaiming; Zhang, Xiangyu; Ren, Shaoqing; Sun, Jian (2015). "Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification".
Sparse activation: For example, in a randomly initialized network, only about 50% of hidden units are activated (have a non-zero output).
Fukushima, K.; Miyake, S. (1982). "Neocognitron: A Self-Organizing Neural Network Model for a Mechanism of Visual Pattern Recognition". Competition and Cooperation in Neural Nets. Lecture Notes in Biomathematics. Vol. 45. Springer. pp. 267–285. doi:10.1007/978-3-642-46466-9_18. ISBN 978-3-540-11574-8.
{\displaystyle f(x)=x^{+}=\max(0,x)={\frac {x+|x|}{2}}={\begin{cases}x&{\text{if }}x>0,\\0&{\text{otherwise}},\end{cases}}}
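For illustration (function names are mine, not from the article), the equivalent forms above can be checked in a few lines:

```python
def relu(x: float) -> float:
    # max(0, x): the non-negative part of the argument
    return max(0.0, x)

def relu_closed_form(x: float) -> float:
    # Equivalent closed form (x + |x|) / 2
    return (x + abs(x)) / 2.0
```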
{\displaystyle \ln \left(1+e^{x}\right)\approx {\begin{cases}\ln 2,&x=0,\\{\frac {x}{1-e^{-x/\ln 2}}},&x\neq 0\end{cases}}}
Rectified linear units, compared to the sigmoid function or similar activation functions, allow faster and effective training of deep neural architectures on large and complex datasets.
Leaky ReLUs allow a small, positive gradient when the unit is not active, helping to mitigate the vanishing gradient problem.
The ReLU was introduced by Kunihiko Fukushima in 1969 in the context of visual feature extraction in hierarchical neural networks. It was later argued that it has strong biological motivations and mathematical justifications.
{\displaystyle f(x)={\frac {\ln(1+e^{kx})}{k}},\qquad \qquad f'(x)={\frac {e^{kx}}{1+e^{kx}}}={\frac {1}{1+e^{-kx}}}.}
{\displaystyle \log _{2}(1+2^{y})\approx {\begin{cases}1,&y=0,\\{\frac {y}{1-e^{-y}}},&y\neq 0.\end{cases}}}
Engelken, Rainer; Wolf, Fred; Abbott, L. F. (2020-06-03). "Lyapunov spectra of chaotic recurrent neural networks".
This activation function is illustrated in the figure at the start of this article; it has a "bump" to the left of x = 0.
Ramachandran, Prajit; Barret, Zoph; Quoc, V. Le (October 16, 2017). "Searching for Activation Functions".
Fukushima, K. (1969). "Visual feature extraction by a multilayered network of analog threshold elements". IEEE Transactions on Systems Science and Cybernetics. (4): 322–333. doi:10.1109/TSSC.1969.300225.
Kadmon, Jonathan; Sompolinsky, Haim (2015-11-19). "Transition to Chaos in Random Neuronal Networks". Physical Review X. 5 (4): 041030. doi:10.1103/PhysRevX.5.041030.
{\displaystyle f(x)=\ln(1+e^{x}),\qquad \qquad f'(x)={\frac {e^{x}}{1+e^{x}}}={\frac {1}{1+e^{-x}}},}
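A sketch of softplus and its derivative in code. The `max`/`log1p` rearrangement for numerical stability is a standard trick I am adding here, not something stated in the article:

```python
import math

def softplus(x: float) -> float:
    # ln(1 + e^x), rewritten so exp() never overflows for large |x|
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))

def softplus_prime(x: float) -> float:
    # The derivative is the logistic sigmoid 1 / (1 + e^{-x})
    return 1.0 / (1.0 + math.exp(-x))
```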
The mish function can also be used as a smooth approximation of the rectifier. It is defined as f(x) = x tanh(softplus(x)).
Dugas, Charles; Bengio, Yoshua; Bélisle, François; Nadeau, Claude; Garcia, René (2000-01-01). "Incorporating second-order functional knowledge for better option pricing". Proceedings of the 13th International Conference on Neural Information Processing Systems (NIPS'00). MIT Press: 451–457.
Non-differentiable at zero; however, it is differentiable everywhere else, and the value of the derivative at zero can be arbitrarily chosen to be 0 or 1.
Rectifier and softplus activation functions. The second one is a smooth version of the first.
Barron, Jonathan T. (22 December 2021). "Squareplus: A Softplus-Like Algebraic Rectifier".
The ELU can be viewed as a smoothed version of a shifted ReLU (SReLU), which has the form f(x) = max(−a, x), given the same interpretation of a.
{\displaystyle \operatorname {LSE} (x_{1},\dots ,x_{n})=\ln(e^{x_{1}}+\cdots +e^{x_{n}}),}
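As a sketch (the max-shift is a standard stability trick I am adding, not part of the definition), LogSumExp can be computed as:

```python
import math

def logsumexp(*xs: float) -> float:
    # ln(e^{x_1} + ... + e^{x_n}), shifted by the max so exp() cannot overflow
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))
```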
Hansel, D.; van Vreeswijk, C. (2002). "How noise contributes to contrast invariance of orientation tuning in cat visual cortex". The Journal of Neuroscience. 22 (12): 5118–5128. doi:10.1523/JNEUROSCI.22-12-05118.2002.
{\displaystyle f'(x)=x\cdot \operatorname {sigmoid} '(x)+\operatorname {sigmoid} (x),}
Better gradient propagation: fewer vanishing gradient problems compared to sigmoidal activation functions that saturate in both directions.
{\displaystyle \operatorname {squareplus} _{b}(x)={\frac {x+{\sqrt {x^{2}+b}}}{2}}}
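An illustrative sketch of squareplus (the function name and the default b = 4.0 are my own choices for the example):

```python
import math

def squareplus(x: float, b: float = 4.0) -> float:
    # (x + sqrt(x^2 + b)) / 2; only algebraic operations are needed
    return (x + math.sqrt(x * x + b)) / 2.0
```

With b = 0 this reduces exactly to ReLU.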
"Since the sigmoid has a positive first derivative, its primitive, which we call softplus, is convex."
Hendrycks, Dan; Gimpel, Kevin (2016). "Gaussian Error Linear Units (GELUs)".
Here b is a hyperparameter that determines the "size" of the curved region near x = 0.
{\displaystyle f(x)=x\tanh {\big (}\operatorname {softplus} (x){\big )},}
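A sketch of mish in code (the overflow-safe softplus form is my addition, not from the article):

```python
import math

def mish(x: float) -> float:
    # x * tanh(softplus(x)), with softplus in an overflow-safe form
    softplus_x = max(x, 0.0) + math.log1p(math.exp(-abs(x)))
    return x * math.tanh(softplus_x)
```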
The rectifier is, as of 2017, the most popular activation function for deep neural networks.
The multivariable generalization of single-variable softplus is the LogSumExp with the first argument set to zero:
Efficient computation: Only comparison, addition and multiplication.
Hahnloser, R.; Seung, H. S. (2001). "Permitted and Forbidden Sets in Symmetric Threshold-Linear Networks".
Squareplus shares many properties with softplus: it is monotonic, approaches 0 as x → −∞, approaches the identity as x → +∞, and is C∞ (smooth).
The logistic function is a smooth approximation of the derivative of the rectifier, the Heaviside step function.
Plot of the ReLU rectifier (blue) and GELU (green) functions near x = 0.
The SiLU (sigmoid linear unit), or swish function, is another smooth approximation, first coined in the GELU paper:
GELU serves as the default activation in several transformer-based models.
"Smooth Rectifier Linear Unit (SmoothReLU) Forward Layer". Developer Guide for Intel Data Analytics Acceleration Library.
László Tóth (2013). "Phone Recognition with Deep Sparse Rectifier Neural Networks".
Schmidhuber, Juergen (2022). "Annotated History of Modern AI and Deep Learning".
Brownlee, Jason (8 January 2019). "A Gentle Introduction to the Rectified Linear Unit (ReLU)". Machine Learning Mastery.
Diganta Misra (23 Aug 2019). "Mish: A Self Regularized Non-Monotonic Activation Function".
{\displaystyle \max(0,ax)=a\max(0,x){\text{ for }}a\geq 0}
{\displaystyle f(x)=x\cdot \operatorname {sigmoid} (x),}
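As a sketch (the function name is mine), the swish/SiLU formula above in code:

```python
import math

def silu(x: float) -> float:
    # swish / SiLU: x * sigmoid(x) = x / (1 + e^{-x})
    return x / (1.0 + math.exp(-x))
```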
Behnke, Sven (2003). Hierarchical Neural Networks for Image Interpretation. Lecture Notes in Computer Science. Vol. 2766. Springer. doi:10.1007/b11963. ISBN 978-3-540-40722-5.
Andrew L. Maas; Awni Y. Hannun; Andrew Y. Ng (2014). "Rectifier Nonlinearities Improve Neural Network Acoustic Models".
where x is the input to a neuron. This is also known as a ramp function and is analogous to half-wave rectification in electrical engineering.
In the context of artificial neural networks, the ReLU (rectified linear unit) activation function is defined as the non-negative part of its argument:
However, squareplus can be computed using only algebraic functions.
1830:GELU is a smooth approximation to the rectifier:
Shaw, Sweta (2020-05-10). "Activation Functions Compared with Experiments".
A smooth approximation to the rectifier is the analytic function ln(1 + e^x), which is called the softplus or SmoothReLU function:
For a ≤ 1, this is equivalent to max(x, ax), and thus has a relation to "maxout" networks.
where {\displaystyle \Phi (x)=P(X\leqslant x)} is the cumulative distribution function of the standard normal distribution.
This function can be approximated as:
Xavier Glorot; Antoine Bordes; Yoshua Bengio (2011). "Deep sparse rectifier neural networks".
{\displaystyle f(x)=x\cdot \Phi (x),}
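GELU's exact form x·Φ(x) can be sketched via the error function (the identity Φ(x) = (1 + erf(x/√2))/2 is standard; the function name is mine):

```python
import math

def gelu(x: float) -> float:
    # x * Phi(x), where Phi is the standard normal CDF written via erf
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
```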
The derivative of softplus is the logistic function.
By making the change of variables x = y ln(2), this is equivalent to:
Liu, Danqing (30 November 2017). "A Practical Guide to ReLU".
Gaussian-error linear unit (GELU)
In these formulas, a is a hyper-parameter to be tuned with the constraint a ≥ 0.
{\displaystyle f(x)=\max(-a,x)}
{\displaystyle f(x)=\max(x,ax)}
Squareplus is the function
The LogSumExp function is
Other non-linear variants
Piecewise-linear variants
(For example, letting b = 0 yields ReLU, and letting b = 4 yields the metallic mean function.)
A sharpness parameter k may be included:
4600:"Efficient BackProp"
Potential problems
Scale-invariant:
Parametric ReLU
2055:swish function
2049:Swish function
2047:Main article:
2044:
2041:
2009:
2006:
2003:
2000:
1997:
1994:
1991:
1988:
1985:
1982:
1979:
1968:
1967:
1956:
1953:
1950:
1947:
1944:
1941:
1938:
1935:
1932:
1928:
1925:
1921:
1918:
1915:
1912:
1909:
1906:
1902:
1899:
1887:
1886:
1875:
1872:
1869:
1866:
1863:
1860:
1857:
1854:
1851:
1848:
1845:
1842:
1827:
1824:
1822:
1819:
1815:
1814:
1803:
1800:
1797:
1794:
1791:
1788:
1785:
1782:
1779:
1776:
1773:
1770:
1754:Note that for
1752:
1751:
1738:
1733:
1725:
1723:
1720:
1719:
1716:
1713:
1710:
1707:
1699:
1697:
1694:
1693:
1691:
1686:
1683:
1680:
1677:
1673:
1670:
1661:
1656:
1648:
1646:
1643:
1640:
1637:
1636:
1633:
1630:
1627:
1624:
1616:
1614:
1611:
1610:
1608:
1603:
1600:
1597:
1594:
1591:
1576:
1573:
1572:
1571:
1558:
1553:
1545:
1543:
1540:
1539:
1536:
1533:
1530:
1527:
1519:
1517:
1514:
1513:
1511:
1506:
1503:
1500:
1497:
1493:
1490:
1482:
1477:
1469:
1467:
1464:
1461:
1460:
1457:
1454:
1451:
1448:
1440:
1438:
1435:
1434:
1432:
1427:
1424:
1421:
1418:
1415:
1400:
1397:
1395:
1392:
1390:
1387:
1386:
1385:
1373:
1370:
1366:
1357:
1354:
1337:
1336:
1324:
1321:
1318:
1310:
1307:
1304:
1301:
1298:
1295:
1292:
1289:
1286:
1283:
1280:
1277:
1274:
1271:
1268:
1253:
1250:
1243:
1238:
1235:
1159:
1148:
1147:
1134:
1129:
1121:
1119:
1116:
1115:
1112:
1109:
1106:
1103:
1095:
1093:
1090:
1089:
1087:
1082:
1077:
1072:
1068:
1064:
1060:
1057:
1051:
1048:
1045:
1042:
1039:
1036:
1033:
1030:
1025:
1021:
1017:
1014:
1011:
1008:
1005:
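The piecewise definitions of ReLU, leaky ReLU, and ELU can be sketched as plain scalar functions. This is a minimal illustration, not an implementation from any reference cited by the article:

```python
import math

def relu(x):
    # f(x) = max(0, x)
    return max(0.0, x)

def leaky_relu(x, a=0.01):
    # f(x) = x if x > 0, else a * x (a = 0.01 is the conventional leak)
    return x if x > 0 else a * x

def elu(x, a=1.0):
    # f(x) = x if x > 0, else a * (e^x - 1); a >= 0 is a hyperparameter
    return x if x > 0 else a * (math.exp(x) - 1.0)

print(relu(-2.0), relu(3.0))   # 0.0 3.0
print(leaky_relu(-2.0))        # -0.02
print(elu(-2.0))               # ≈ -0.8647
```

Note that all three agree exactly on positive inputs and differ only in how they treat the negative half-line.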
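The smooth approximations GELU and SiLU can be sketched using the standard normal CDF, which is expressible through the error function as Φ(x) = (1 + erf(x/√2))/2. A minimal sketch, not a library implementation:

```python
import math

def gelu(x):
    # f(x) = x * Phi(x), with Phi the standard normal CDF
    phi = 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return x * phi

def silu(x):
    # f(x) = x * sigmoid(x)
    return x / (1.0 + math.exp(-x))

print(round(gelu(1.0), 4))  # 0.8413
print(round(silu(1.0), 4))  # 0.7311
```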
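The limiting behaviour of softplus (roughly 0 for large negative x, roughly x for large positive x) and the fact that its derivative is the logistic function can be checked numerically; a small sketch under no particular library:

```python
import math

def softplus(x):
    # f(x) = ln(1 + e^x)
    return math.log1p(math.exp(x))

def logistic(x):
    # derivative of softplus: 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

print(softplus(-10))  # ≈ 4.54e-05, just above 0
print(softplus(10))   # ≈ 10.0000454, just above x

# central finite difference agrees with the logistic function
h = 1e-6
approx = (softplus(1.0 + h) - softplus(1.0 - h)) / (2 * h)
print(abs(approx - logistic(1.0)) < 1e-6)  # True
```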
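The LogSumExp generalization reduces to softplus in one variable, since LSE(0, x) = ln(1 + eˣ). A minimal sketch, with the standard max-shift trick (an implementation choice, not prescribed by the article) to keep large arguments finite:

```python
import math

def lse(*xs):
    # LogSumExp with a max-shift for numerical stability
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def softplus(x):
    return math.log1p(math.exp(x))  # ln(1 + e^x)

# LSE with the first argument set to zero is softplus
print(abs(lse(0.0, 2.5) - softplus(2.5)) < 1e-12)  # True

# the max-shift keeps large arguments from overflowing
print(lse(1000.0, 1000.0))  # 1000 + ln 2 ≈ 1000.6931
```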
Mish

The mish function can also be used as a smooth approximation of the rectifier. It is defined as

f(x) = x · tanh(softplus(x)),

where tanh(x) is the hyperbolic tangent and softplus(x) is the softplus function.

Mish is non-monotonic and self-gated. It was inspired by Swish, itself a variant of ReLU.

Squareplus

Squareplus is the function

squareplus(x) = (x + √(x² + b)) / 2,

where b ≥ 0 is a hyperparameter that determines the "size" of the curved region near x = 0. (For example, letting b = 0 yields ReLU, and letting b = 4 yields the metallic mean function.) Squareplus shares many properties with softplus: it is monotonic, strictly positive, approaches 0 as x → −∞, approaches the identity as x → +∞, and is smooth. However, squareplus can be computed using only algebraic functions, making it well-suited for settings where computational resources or instruction sets are limited. Additionally, squareplus requires no special consideration to ensure numerical stability when x is large.

See also

- Softmax function
- Sigmoid function
- Tobit model
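Squareplus's claim to numerical stability can be illustrated directly: it uses only algebraic operations, so it handles large inputs where a naive softplus overflows. A minimal sketch (the default b = 4 below is an arbitrary choice for illustration):

```python
import math

def squareplus(x, b=4.0):
    # (x + sqrt(x^2 + b)) / 2; b = 0 recovers ReLU exactly
    return (x + math.sqrt(x * x + b)) / 2.0

def naive_softplus(x):
    # ln(1 + e^x) without any rewriting; e^x overflows for large x
    return math.log(1.0 + math.exp(x))

print(squareplus(1000.0))       # ≈ 1000.001, no overflow
print(squareplus(-3.0, b=0.0))  # 0.0, matches ReLU
try:
    naive_softplus(1000.0)
except OverflowError:
    print("naive softplus overflowed")
```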