
Flow-based generative model


are modeled using deep neural networks, and are trained to minimize the negative log-likelihood of data samples from the target distribution. These architectures are usually designed such that only the forward pass of the neural network is required in both the inverse and the Jacobian determinant calculations. Examples of such architectures include NICE, RealNVP, and Glow. In the continuous case, this allows a "free-form" Jacobian: "free-form" means that there is no restriction on the Jacobian's form. It is contrasted with previous discrete models of normalizing flow, where the Jacobian is carefully designed to be only upper- or lower-triangular, so that its determinant can be evaluated efficiently.
Despite normalizing flows' success in estimating high-dimensional densities, some downsides remain in their design. First of all, their latent space, onto which input data is projected, is not a lower-dimensional space; therefore, flow-based models do not allow for compression of data by default and require a lot of computation.
where \(\lambda _{K},\lambda _{J}>0\) are hyperparameters. The first term punishes the model for oscillating the flow field over time, and the second term punishes it for oscillating the flow field over space. Both terms together guide the model toward a flow that is smooth (not "bumpy") over space and time.
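As a concrete illustration, both terms can be computed for a toy velocity field by discretizing the time integrals. The field `f`, the trajectory state, and the weights below are illustrative stand-ins (assumptions for this sketch), not part of any reference implementation:

```python
import numpy as np

# Toy stand-in for the learned velocity field f(z, t) of a continuous flow.
def f(z, t):
    return np.sin(z) * np.cos(t)

lam_K, lam_J = 0.1, 0.1     # the two regularization weights (hyperparameters)
T, steps = 1.0, 100
dt = T / steps
z = np.linspace(-1.0, 1.0, 8)   # one state vector, held fixed for simplicity

kinetic = 0.0    # lam_K * integral over time of ||f(z_t, t)||^2
jac_frob = 0.0   # lam_J * integral over time of ||grad_z f(z_t, t)||_F^2
eps = 1e-5
for t in np.linspace(0.0, T, steps):
    v = f(z, t)
    kinetic += lam_K * (v @ v) * dt
    # f is elementwise here, so grad_z f is diagonal; finite-difference it
    dv = (f(z + eps, t) - f(z - eps, t)) / (2 * eps)
    jac_frob += lam_J * (dv @ dv) * dt

reg_loss = kinetic + jac_frob
assert reg_loss > 0
```

In a real continuous normalizing flow, `z` would evolve along the ODE trajectory and `f` would be a neural network; only the structure of the two penalty terms is the point here.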
Flow-based models are also notorious for failing to estimate the likelihood of out-of-distribution samples (i.e., samples that were not drawn from the same distribution as the training set). Some hypotheses have been formulated to explain this phenomenon, among which the
By adding extra dimensions, the CNF gains enough freedom to reverse orientation and go beyond ambient isotopy (just as one can pick up a polygon from a desk and flip it over in 3-space, or unknot a knot in 4-space), yielding the "augmented neural ODE".
An autoregressive model of a distribution on \(\mathbb {R} ^{n}\) is defined as the following stochastic process:

$$\begin{aligned}
x_{1}&\sim N(\mu _{1},\sigma _{1}^{2})\\
x_{2}&\sim N(\mu _{2}(x_{1}),\sigma _{2}(x_{1})^{2})\\
&\cdots\\
x_{n}&\sim N(\mu _{n}(x_{1:n-1}),\sigma _{n}(x_{1:n-1})^{2})
\end{aligned}$$
map. This property is given by constraints in the design of the models (cf. RealNVP, Glow) which guarantee theoretical invertibility. The integrity of the inverse is important in order to ensure the applicability of the change-of-variable theorem, the computation of the Jacobian
By the reparametrization trick, the autoregressive model is generalized to a normalizing flow:

$$\begin{aligned}
x_{1}&=\mu _{1}+\sigma _{1}z_{1}\\
x_{2}&=\mu _{2}(x_{1})+\sigma _{2}(x_{1})z_{2}\\
&\cdots\\
x_{n}&=\mu _{n}(x_{1:n-1})+\sigma _{n}(x_{1:n-1})z_{n}
\end{aligned}$$
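A minimal numeric sketch of this flow, with hand-fixed affine stand-ins for the functions \(\mu_i\) and \(\sigma_i\) (assumed forms for illustration only; in MAF these would be masked neural networks), shows why one direction is sequential while the other is not:

```python
import numpy as np

def mu_fn(prefix):       # mu_i(x_{1:i-1}): depends only on earlier coordinates
    return 0.1 * prefix.sum()

def sigma_fn(prefix):    # sigma_i(x_{1:i-1}) > 0
    return np.exp(0.05 * prefix.sum())

def forward(z):
    """z -> x is sequential: computing x_i requires x_{1:i-1} first (slow)."""
    x = np.zeros_like(z)
    for i in range(len(z)):
        x[i] = mu_fn(x[:i]) + sigma_fn(x[:i]) * z[i]
    return x

def inverse(x):
    """x -> z reads only already-known entries of x, so it parallelizes (fast)."""
    return np.array([(x[i] - mu_fn(x[:i])) / sigma_fn(x[:i])
                     for i in range(len(x))])

z = np.array([0.5, -1.0, 2.0])
x = forward(z)
assert np.allclose(inverse(x), z)
```

The asymmetry in the two loops is exactly the forward-slow/backward-fast property of MAF described in the text.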
we want the model to learn, which only leaves the expectation of the negative log-likelihood to minimize under the target distribution. This intractable term can be approximated with a Monte Carlo method by importance sampling.
The Real Non-Volume Preserving (Real NVP) model generalizes NICE with the coupling map

$$x={\begin{bmatrix}x_{1}\\x_{2}\end{bmatrix}}=f_{\theta }(z)={\begin{bmatrix}z_{1}\\e^{s_{\theta }(z_{1})}\odot z_{2}\end{bmatrix}}+{\begin{bmatrix}0\\m_{\theta }(z_{1})\end{bmatrix}}$$
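A sketch of one such coupling map in code, with tiny stand-in networks for \(s_\theta\) and \(m_\theta\) (assumed toy forms, not the paper's architecture); note that the inverse and the log-determinant each need only forward passes of the two networks:

```python
import numpy as np

# Toy stand-ins for the scale and translation networks of one coupling layer.
rng = np.random.default_rng(0)
W_s, W_m = rng.normal(size=(2, 2)), rng.normal(size=(2, 2))

def s_theta(z1):
    return np.tanh(W_s @ z1)     # scale network

def m_theta(z1):
    return W_m @ z1              # translation network

def forward(z):
    z1, z2 = z[:2], z[2:]
    x2 = np.exp(s_theta(z1)) * z2 + m_theta(z1)
    return np.concatenate([z1, x2])

def inverse(x):
    x1, x2 = x[:2], x[2:]
    z2 = (x2 - m_theta(x1)) * np.exp(-s_theta(x1))
    return np.concatenate([x1, z2])

def log_det_jacobian(z):
    # The Jacobian is triangular, so its log-determinant is the sum of log-scales.
    return s_theta(z[:2]).sum()

z = rng.normal(size=4)
assert np.allclose(inverse(forward(z)), z)
```

Setting `s_theta` identically to zero recovers the NICE (volume-preserving) special case.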
Kumar, Manoj; Babaeizadeh, Mohammad; Erhan, Dumitru; Finn, Chelsea; Levine, Sergey; Dinh, Laurent; Kingma, Durk (2019). "VideoFlow: A Conditional Flow-Based Model for Stochastic Video Generation".
Grathwohl, Will; Chen, Ricky T. Q.; Bettencourt, Jesse; Sutskever, Ilya; Duvenaud, David (2018-10-22). "FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models".
Behrmann, Jens; Vicol, Paul; Wang, Kuan-Chieh; Grosse, Roger; Jacobsen, Jörn-Henrik (2020). "Understanding and Mitigating Exploding Inverses in Invertible Neural Networks".
Yang, Guandao; Huang, Xun; Hao, Zekun; Liu, Ming-Yu; Belongie, Serge; Hariharan, Bharath (2019). "PointFlow: 3D Point Cloud Generation with Continuous Normalizing Flows".
Shi, Chence; Xu, Minkai; Zhu, Zhaocheng; Zhang, Weinan; Zhang, Ming; Tang, Jian (2020). "GraphAF: A Flow-based Autoregressive Model for Molecular Graph Generation".
of the map as well as sampling with the model. However, in practice this invertibility is violated and the inverse map explodes because of numerical imprecision.
The idea of using an invertible 1x1 convolution is to apply a general learned permutation of all the channels, instead of merely permuting the first and second halves, as in Real NVP.
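A sketch of the idea, assuming a random well-conditioned mixing matrix `K` (an illustrative choice; Glow parameterizes and learns this matrix): applying the same invertible channel mix at every spatial position makes the log-determinant of the whole transformation simply \(H\cdot W\cdot \log |\det K|\):

```python
import numpy as np

# Invertible 1x1 convolution: one C x C channel-mixing matrix shared by
# every pixel of a C x H x W feature map.
rng = np.random.default_rng(1)
C, H, W = 3, 4, 4
K = rng.normal(size=(C, C)) + 3 * np.eye(C)   # assumed well-conditioned

y = rng.normal(size=(C, H, W))
z = np.einsum("cd,dhw->chw", K, y)            # 1x1 conv = per-pixel matmul

# Log-determinant of the full map: the per-pixel contribution repeats H*W times.
log_det = H * W * np.log(abs(np.linalg.det(K)))

# Invertibility: undo the mixing with K^{-1}
y_rec = np.einsum("cd,dhw->chw", np.linalg.inv(K), z)
assert np.allclose(y_rec, y)
```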
Nalisnick, Eric; Matsukawa, Akihiro; Teh, Yee Whye; Lakshminarayanan, Balaji (2019). "Detecting Out-of-Distribution Inputs to Deep Generative Models Using Typicality".
The direct modeling of likelihood provides many advantages. For example, the negative log-likelihood can be directly computed and minimized as the loss function.
The second term on the right-hand side of the equation corresponds to the entropy of the target distribution and is independent of the parameter \(\theta \).
$$\lambda _{K}\int _{0}^{T}\left\|f(z_{t},t)\right\|^{2}dt+\lambda _{J}\int _{0}^{T}\left\|\nabla _{z}f(z_{t},t)\right\|_{F}^{2}dt$$
Zhang, Han; Gao, Xi; Unterman, Jacob; Arodz, Tom (2019-07-30). "Approximation Capabilities of Neural ODEs and Invertible Residual Networks".
Rudolph, Marco; Wandt, Bastian; Rosenhahn, Bodo (2021). "Same Same But DifferNet: Semi-Supervised Defect Detection with Normalizing Flows".
Helminger, Leonhard; Djelouah, Abdelaziz; Gross, Markus; Schroers, Christopher (2020). "Lossy Image Compression with Normalizing Flows".
Lipman, Yaron; Chen, Ricky T. Q.; Ben-Hamu, Heli; Nickel, Maximilian; Le, Matt (2022-10-01). "Flow Matching for Generative Modeling".
Instead of constructing a flow by function composition, another approach is to formulate the flow as a continuous-time dynamic. Let \(z_{0}\) be the latent variable with distribution \(p(z_{0})\). Map this latent variable to data space with the flow function \(x=F(z_{0})=z_{T}=z_{0}+\int _{0}^{T}f(z_{t},t)\,dt\).
Additionally, novel samples can be generated by sampling from the initial distribution and applying the flow transformation.
Ping, Wei; Peng, Kainan; Gorur, Dilan; Lakshminarayanan, Balaji (2019). "WaveFlow: A Compact Flow-based Model for Raw Audio".
$$\log p_{K}(z_{K})=\log p_{0}(z_{0})-\sum _{i=1}^{K}\log \left|\det {\frac {df_{i}(z_{i-1})}{dz_{i-1}}}\right|$$
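A worked check of this identity on a toy flow of two 1-D affine maps (the coefficients are chosen arbitrarily for illustration):

```python
import numpy as np

# log p_K(z_K) = log p_0(z_0) - sum_i log|det df_i/dz_{i-1}|,
# checked on a composition of two affine maps f_i(z) = a_i*z + b_i
# (whose derivative, hence Jacobian determinant, is a_i).
a, b = np.array([2.0, 0.5]), np.array([1.0, -3.0])

def log_p0(z):  # standard normal base density
    return -0.5 * (z**2 + np.log(2 * np.pi))

z0 = 0.7
z = z0
log_p = log_p0(z0)
for ai, bi in zip(a, b):
    z = ai * z + bi              # push the sample forward through f_i
    log_p -= np.log(abs(ai))     # subtract log|det Jacobian| of f_i

# The composed map is affine with slope a1*a2, so the exact pushforward
# density is available in closed form for comparison.
scale = abs(a.prod())
const = a[1] * b[0] + b[1]
exact = log_p0((z - const) / scale) - np.log(scale)
assert np.allclose(log_p, exact)
```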
Nalisnick, Eric; Matsukawa, Akihiro; Teh, Yee Whye; Gorur, Dilan; Lakshminarayanan, Balaji (2018). "Do Deep Generative Models Know What They Don't Know?".
(for example, it's impossible to flip a left hand to a right hand by a continuous deformation of space, and it's impossible to turn a sphere inside out, or undo a knot).
Caterini, Anthony L.; Loaiza-Ganem, Gabriel (2022). "Entropic Issues in Likelihood-Based OOD Detection". pp. 21–26.
8337:
Papamakarios, George; Nalisnick, Eric; Rezende, Danilo Jimenez; Mohamed, Shakir; Lakshminarayanan, Balaji (March 2021).
8095: 8082:
hypothesis, estimation issues when training models, or fundamental issues due to the entropy of the data distributions.
5446: 1804: 973: 760: 697: 607: 585: 428: 418: 9019: 911: 823: 808: 269: 91: 798: 8021: 7371: 871: 548: 443: 231: 164: 124: 8230:
Papamakarios, George; Nalisnick, Eric; Jimenez Rezende, Danilo; Mohamed, Shakir; Lakshminarayanan, Balaji (2021).
9014: 7803: 6598:
The forward mapping is slow (because it's sequential), but the backward mapping is fast (because it's parallel).
However, it is still possible to perform image compression with flow-based models.
1618:
should be (1) easy to invert, and (2) have a Jacobian determinant that is easy to compute. In practice, the functions
6760:
Reversing the two maps of MAF results in Inverse Autoregressive Flow (IAF), which has a fast forward mapping and a slow backward mapping.
As is generally done when training a deep learning model, the goal with normalizing flows is to minimize the Kullback–Leibler divergence
Dinh, Laurent; Krueger, David; Bengio, Yoshua (2014). "NICE: Non-linear Independent Components Estimation".
7643: 2260:{\displaystyle p_{1}(z_{1})=p_{0}(z_{0})\left|\det \left({\frac {df_{1}(z_{0})}{dz_{0}}}\right)^{-1}\right|} 500: 349: 249: 76: 7412: 2783: 8316:
Kingma, Diederik P.; Dhariwal, Prafulla (2018). "Glow: Generative Flow with Invertible 1x1 Convolutions".
7591: 5617: 4406:{\displaystyle |\det(I+h'(\langle w,z\rangle +b)uw^{T})|=|1+h'(\langle w,z\rangle +b)\langle u,w\rangle |} 2842: 2661:{\displaystyle \log p_{1}(z_{1})=\log p_{0}(z_{0})-\log \left|\det {\frac {df_{1}(z_{0})}{dz_{0}}}\right|} 1134: 680: 656: 558: 319: 294: 254: 66: 7767: 3844:{\displaystyle {\underset {\theta }{\operatorname {arg\,max} }}\ \sum _{i=0}^{N}\log(p_{\theta }(x_{i}))} 7327: 3610:{\displaystyle -{\hat {\mathbb {E} }}_{p^{*}(x)}=-{\frac {1}{N}}\sum _{i=0}^{N}\log(p_{\theta }(x_{i}))} 634: 456: 408: 264: 179: 51: 7833:, one can impose regularization losses. The paper proposed the following regularization loss based on 3360: 2734: 7738: 6728: 6550: 5824: 4767: 4219: 4029: 4091: 1621: 1569: 1223: 563: 513: 7673:
methods would be needed. Indeed, CNF was first proposed in the same paper that proposed neural ODE.
5393: 3354: 666: 602: 573: 478: 304: 237: 223: 209: 184: 134: 86: 46: 8489:
Kingma, Durk P; Salimans, Tim; Jozefowicz, Rafal; Chen, Xi; Sutskever, Ilya; Welling, Max (2016).
5750: 1044: 8971: 8950: 8929: 8908: 8887: 8866: 8845: 8783: 8762: 8741: 8717: 8673: 8668:. In Bengio, S.; Wallach, H.; Larochelle, H.; Grauman, K.; Cesa-Bianchi, N.; Garnett, R. (eds.). 8602: 8570: 8549: 8528: 8502: 8469: 8436: 8417: 8383: 8350: 8317: 8291: 8290:
Dinh, Laurent; Sohl-Dickstein, Jascha; Bengio, Samy (2016). "Density estimation using Real NVP".
8267: 8243: 8212: 8079: 6701: 4718: 3052: 644: 568: 354: 149: 8435:
Rezende, Danilo Jimenez; Mohamed, Shakir (2015). "Variational Inference with Normalizing Flows".
2072: 1278: 8623: 7149:{\displaystyle z_{0}=F^{-1}(x)=z_{T}+\int _{T}^{0}f(z_{t},t)dt=z_{T}-\int _{0}^{T}f(z_{t},t)dt} 2499:{\displaystyle p_{1}(z_{1})=p_{0}(z_{0})\left|\det {\frac {df_{1}(z_{0})}{dz_{0}}}\right|^{-1}} 1973:{\displaystyle p_{1}(z_{1})=p_{0}(z_{0})\left|\det {\frac {df_{1}^{-1}(z_{1})}{dz_{1}}}\right|} 1090: 8826: 8643: 8409: 8401: 7834: 6798: 4419: 3412: 2355: 737: 580: 493: 289: 259: 204: 199: 154: 96: 8624:"A Stochastic Estimator of the Trace of the Influence Matrix for Laplacian Smoothing Splines" 7492: 4745: 3872: 3335: 2701: 8816: 8635: 8393: 8202: 8194: 8182: 8163: 8086: 7556: 1273: 961: 957: 765: 518: 468: 378: 362: 332: 194: 189: 139: 129: 27: 7708:
might be ill-behaved, due to degeneracy (that is, there are an infinite number of possible
6771: 4926: 3079: 2674: 1810: 1712: 1685: 1343: 1313: 1196: 1013: 8110:
Flow-based generative models have been applied to a variety of modeling tasks, including: audio generation, image generation, molecular graph generation, point-cloud modeling, video generation, lossy image compression, and anomaly detection.
7685: 7681: 4535:
Let \(x,z\in \mathbb {R} ^{2n}\) be even-dimensional, and split them in the middle. Then the normalizing flow functions are

$$x={\begin{bmatrix}x_{1}\\x_{2}\end{bmatrix}}=f_{\theta }(z)={\begin{bmatrix}z_{1}\\z_{2}\end{bmatrix}}+{\begin{bmatrix}0\\m_{\theta }(z_{1})\end{bmatrix}}$$

where \(m_{\theta }\) is any neural network with weights \(\theta \).
1038: 793: 597: 463: 403: 7447: 5309:{\displaystyle z_{1}=x_{1},z_{2}=e^{-s_{\theta }(x_{1})}\odot (x_{2}-m_{\theta }(x_{1}))} 4900: 8821: 7816: 7711: 7691: 7652: 7162: 6957: 5789: 5426: 4071: 2337: 813: 344: 81: 9008: 8999: 8805:"Understanding Failures in Out-of-Distribution Detection with Deep Generative Models" 8593:
Finlay, Chris; Jacobsen, Joern-Henrik; Nurbekyan, Levon; Oberman, Adam (2020-11-21).
8421: 8231: 7677: 3049:
between the model's likelihood and the target distribution to be estimated. Denoting by \(p_{\theta }\) the model's likelihood and by \(p^{*}\)
1671: 980: 732: 661: 543: 274: 159: 8594: 8216: 8804: 8661:
Chen, Ricky T. Q.; Rubanova, Yulia; Bettencourt, Jesse; Duvenaud, David K. (2018).
8595:"How to Train Your Neural ODE: the World of Jacobian and Kinetic Regularization" 7676:
There are two main deficiencies of CNF: one is that a continuous flow must be a homeomorphism, thus preserve orientation and ambient isotopy
4416:
For it to be invertible everywhere, it must be nonzero everywhere. For example, \(h=\tanh \) with \(\langle u,w\rangle >-1\) satisfies the requirement.
2062: 987: 538: 32: 8397: 8371: 8167: 1002: 8639: 7670: 687: 383: 309: 8647: 8405: 8090: 3858:
between the model's likelihood and the target distribution is equivalent to maximizing the model likelihood under observed samples of the target distribution.
976:
law of probabilities to transform a simple distribution into a complex one.
846: 627: 8830: 8413: 8338: 6831:. Map this latent variable to data space with the following flow function: 6688:{\displaystyle \sigma _{1}\sigma _{2}(x_{1})\cdots \sigma _{n}(x_{1:n-1})} 7314:{\displaystyle \log(p(x))=\log(p(z_{0}))-\int _{0}^{T}{\text{Tr}}\leftdt} 5423:. Since the Real NVP map keeps the first and second halves of the vector 8370:
Kobyzev, Ivan; Prince, Simon J.D.; Brubaker, Marcus A. (November 2021).
3731:{\displaystyle {\underset {\theta }{\operatorname {arg\,min} }}\ D_{KL}} 6974:
is an arbitrary function and can be modeled with e.g. neural networks.
3322:{\displaystyle D_{KL}=-\mathbb {E} _{p^{*}(x)}+\mathbb {E} _{p^{*}(x)}} 622: 8735: 8733: 8207: 8198: 4894:, and the Jacobian is just 1, that is, the flow is volume-preserving. 8767: 373: 8976: 8955: 8934: 8913: 8892: 8871: 8850: 8788: 8746: 8722: 8678: 8607: 8575: 8554: 8533: 8507: 8474: 8441: 8388: 8355: 8322: 8296: 8248: 8085:
One of the most interesting properties of normalizing flows is the invertibility of their learned bijective map.
8372:"Normalizing Flows: An Introduction and Review of Current Methods" 8272: 3103:
the target distribution to learn, the (forward) KL-divergence is:

$$D_{KL}\left[p^{*}(x)\,\|\,p_{\theta }(x)\right]=-\mathbb {E} _{p^{*}(x)}\left[\log p_{\theta }(x)\right]+\mathbb {E} _{p^{*}(x)}\left[\log p^{*}(x)\right]$$
1001: 986:
In contrast, many alternative generative modeling methods such as the variational autoencoder (VAE) and the generative adversarial network
617: 612: 339: 8491:"Improved Variational Inference with Inverse Autoregressive Flow" 3409:
of samples each independently drawn from the target distribution
6944:{\displaystyle x=F(z_{0})=z_{T}=z_{0}+\int _{0}^{T}f(z_{t},t)dt} 6272:, the autoregressive model is generalized to a normalizing flow: 4958:
The Real Non-Volume Preserving model generalizes NICE model by:
8376:
IEEE Transactions on Pattern Analysis and Machine Intelligence
4016:{\displaystyle \max _{\theta }\sum _{j}\ln p_{\theta }(x_{j})} 7324:
Since the trace depends only on the diagonal of the Jacobian
6258:{\displaystyle \sigma _{i}:\mathbb {R} ^{i-1}\to (0,\infty )} 8339:"Normalizing Flows for Probabilistic Modeling and Inference" 8232:"Normalizing flows for probabilistic modeling and inference" 4209:{\displaystyle x=f_{\theta }(z)=z+uh(\langle w,z\rangle +b)} 8456:
Papamakarios, George; Pavlakou, Theo; Murray, Iain (2017).
6192:{\displaystyle \mu _{i}:\mathbb {R} ^{i-1}\to \mathbb {R} } 3865:
A pseudocode for training normalizing flows is as follows:
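In outline: given a dataset \(x_{1:n}\), a flow model \(f_{\theta }(\cdot ),p_{0}\), solve \(\max _{\theta }\sum _{j}\ln p_{\theta }(x_{j})\) by gradient descent, and return \(\hat {\theta }\). A minimal runnable version of that loop, assuming a one-parameter affine flow \(x=\theta z\) with standard normal base \(p_{0}\) (so \(\ln p_{\theta }(x)=\ln p_{0}(x/\theta )-\ln |\theta |\)) and a finite-difference gradient:

```python
import numpy as np

# Fit the scale of a 1-D affine flow x = theta * z to data from N(0, 3^2)
# by gradient ascent on the exact log-likelihood.
rng = np.random.default_rng(0)
data = rng.normal(scale=3.0, size=1000)

def log_likelihood(theta):
    z = data / theta   # inverse flow
    # log p_0(z) minus log|det Jacobian| = log|theta|, summed over the data
    return np.sum(-0.5 * (z**2 + np.log(2 * np.pi)) - np.log(abs(theta)))

theta, lr, eps = 1.0, 0.1, 1e-6
for _ in range(500):
    grad = (log_likelihood(theta + eps) - log_likelihood(theta - eps)) / (2 * eps)
    theta += lr * grad / len(data)   # ascent on the mean log-likelihood

# The maximum-likelihood scale should match the empirical spread of the data.
assert abs(abs(theta) - data.std()) < 0.2
```

Real flows replace the scalar `theta` with neural-network parameters and the finite difference with backpropagation, but the objective is the same.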
7688:, or undo a knot), and the other is that the learned flow 6601:
The Jacobian matrix is lower-triangular, so the Jacobian determinant is \(\sigma _{1}\sigma _{2}(x_{1})\cdots \sigma _{n}(x_{1:n-1})\).
6265:
where \(\mu _{i}:\mathbb {R} ^{i-1}\to \mathbb {R} \) and \(\sigma _{i}:\mathbb {R} ^{i-1}\to (0,\infty )\) are fixed functions that define the autoregressive model.
4887:{\displaystyle z_{1}=x_{1},z_{2}=x_{2}-m_{\theta }(x_{1})} 8628:
Communications in Statistics - Simulation and Computation
8183:"A family of nonparametric density estimation algorithms" 8152:"Density estimation by dual ascent of the log-likelihood" 2054:{\displaystyle \det {\frac {df_{1}^{-1}(z_{1})}{dz_{1}}}} 1566:
To efficiently compute the log likelihood, the functions
8803:
Zhang, Lily; Goldstein, Mark; Ranganath, Rajesh (2021).
905:
8693:
Dupont, Emilien; Doucet, Arnaud; Teh, Yee Whye (2019).
5383:{\displaystyle \prod _{i=1}^{n}e^{s_{\theta }(z_{1,})}} 5120: 5041: 4976: 4661: 4618: 4553: 8024: 7843: 7819: 7770: 7741: 7714: 7694: 7655: 7594: 7559: 7495: 7450: 7415: 7374: 7330: 7188: 7165: 6986: 6960: 6840: 6801: 6774: 6731: 6704: 6607: 6553: 6278: 6205: 6149: 5858: 5827: 5792: 5753: 5668: 5620: 5539: 5449: 5443:
separate, it's usually required to add a permutation \((x_{1},x_{2})\mapsto (x_{2},x_{1})\) after every Real NVP layer.
5429: 5396: 5322: 5182: 4964: 4929: 4903: 4805: 4770: 4748: 4721: 4541: 4497: 4448: 4422: 4260: 4222: 4138: 4094: 4074: 4032: 3960: 3908: 3875: 3750: 3629: 3454: 3415: 3363: 3338: 3112: 3082: 3055: 2854: 2786: 2737: 2704: 2677: 2518: 2367: 2340: 2276: 2120: 2075: 1989: 1843: 1813: 1742: 1715: 1688: 1624: 1572: 1380: 1346: 1316: 1281: 1226: 1199: 1137: 1093: 1047: 1016: 994:
do not explicitly represent the likelihood function.
2841:
subtracted by a non-recursive term, we can infer by induction that
8588: 8586: 8458:"Masked Autoregressive Flow for Density Estimation" 7364:
The trace can be estimated by "Hutchinson's trick":
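A quick numeric check of the estimator \(\operatorname {tr} (W)=\mathbb {E} [u^{T}Wu]\) with Rademacher probe vectors (the matrix size and sample count below are arbitrary illustrative choices):

```python
import numpy as np

# Hutchinson's trick: tr(W) = E[u^T W u] for any random u with E[u u^T] = I.
rng = np.random.default_rng(2)
n = 5
Wm = rng.normal(size=(n, n))

# Rademacher probes (entries +-1 with equal probability) satisfy E[u u^T] = I.
samples = 200_000
u = rng.choice([-1.0, 1.0], size=(samples, n))
estimate = np.einsum("si,ij,sj->s", u, Wm, u).mean()

assert abs(estimate - np.trace(Wm)) < 0.2
```

In FFJORD-style models the matrix-vector products \(Wu\) are computed by automatic differentiation without ever materializing the Jacobian, which is what makes the "free-form" Jacobian tractable.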
4068:The earliest example. Fix some activation function 3862:under observed samples of the target distribution. 1193:be a sequence of random variables transformed from 8056: 8010: 7825: 7794: 7756: 7720: 7700: 7661: 7634: 7580: 7541: 7481: 7436: 7401: 7353: 7313: 7171: 7148: 6966: 6943: 6823: 6787: 6752: 6717: 6687: 6587: 6539: 6257: 6191: 6135: 5842: 5798: 5778: 5739: 5651: 5606: 5529:In generative flow model, each layer has 3 parts: 5514:{\displaystyle (x_{1},x_{2})\mapsto (x_{2},x_{1})} 5513: 5435: 5415: 5382: 5308: 5166: 4942: 4915: 4886: 4791: 4754: 4734: 4707: 4527: 4487:Nonlinear Independent Components Estimation (NICE) 4475: 4434: 4405: 4243: 4208: 4124: 4080: 4047: 4015: 3943: 3894: 3843: 3730: 3609: 3437: 3401: 3344: 3321: 3095: 3068: 3029: 2833: 2772: 2723: 2690: 2660: 2498: 2346: 2326: 2259: 2096: 2053: 1972: 1826: 1792: 1728: 1701: 1662: 1610: 1555: 1359: 1329: 1302: 1264: 1212: 1185: 1123: 1076: 1029: 8699:Advances in Neural Information Processing Systems 8670:Advances in Neural Information Processing Systems 8495:Advances in Neural Information Processing Systems 8462:Advances in Neural Information Processing Systems 7764:can be approximated by a neural ODE operating on 6547:The autoregressive model is recovered by setting 5809:Real NVP, with Jacobian as described in Real NVP. 1368: 8150:Tabak, Esteban G.; Vanden-Eijnden, Eric (2010). 5850:is defined as the following stochastic process: 5754: 5740:{\displaystyle z_{cij}=\sum _{c'}K_{cc'}y_{cij}} 4266: 3962: 2957: 2600: 2429: 2302: 2277: 2181: 1990: 1904: 1483: 8181:Tabak, Esteban G.; Turner, Cristina V. (2012). 
8187:Communications on Pure and Applied Mathematics 8057:{\displaystyle \lambda _{K},\lambda _{J}>0} 900:List of datasets for machine-learning research 7402:{\displaystyle W\in \mathbb {R} ^{n\times n}} 5821:An autoregressive model of a distribution on 4923:, this is seen as a curvy shearing along the 933: 8: 8599:International Conference on Machine Learning 7623: 7595: 5607:{\displaystyle y_{cij}=s_{c}(x_{cij}+b_{c})} 4461: 4449: 4395: 4383: 4371: 4359: 4301: 4289: 4194: 4182: 3378: 3364: 7553:Usually, the random vector is sampled from 7549:. (Proof: expand the expectation directly.) 972:, which is a statistical method using the 940: 926: 18: 16:Statistical model used in machine learning 8975: 8954: 8933: 8912: 8891: 8870: 8849: 8820: 8787: 8766: 8745: 8721: 8677: 8606: 8574: 8553: 8532: 8522: 8520: 8518: 8506: 8473: 8440: 8387: 8354: 8321: 8295: 8271: 8247: 8206: 8042: 8029: 8023: 7996: 7991: 7971: 7955: 7939: 7934: 7924: 7905: 7885: 7863: 7858: 7848: 7842: 7818: 7777: 7773: 7772: 7769: 7748: 7744: 7743: 7740: 7713: 7693: 7654: 7626: 7612: 7605: 7593: 7558: 7506: 7494: 7464: 7449: 7428: 7424: 7423: 7414: 7387: 7383: 7382: 7373: 7340: 7335: 7329: 7292: 7274: 7265: 7259: 7254: 7235: 7187: 7164: 7125: 7109: 7104: 7091: 7063: 7047: 7042: 7029: 7004: 6991: 6985: 6959: 6920: 6904: 6899: 6886: 6873: 6857: 6839: 6812: 6800: 6795:be the latent variable with distribution 6779: 6773: 6741: 6736: 6730: 6709: 6703: 6664: 6651: 6635: 6622: 6612: 6606: 6576: 6552: 6527: 6502: 6489: 6461: 6448: 6433: 6409: 6396: 6383: 6367: 6354: 6339: 6325: 6315: 6302: 6287: 6279: 6277: 6225: 6221: 6220: 6210: 6204: 6185: 6184: 6169: 6165: 6164: 6154: 6148: 6120: 6098: 6085: 6057: 6044: 6023: 5996: 5986: 5973: 5957: 5944: 5923: 5906: 5901: 5888: 5867: 5859: 5857: 5834: 5830: 5829: 5826: 5791: 5767: 5752: 5725: 5707: 5692: 5673: 5667: 5640: 5635: 5625: 5619: 5595: 5576: 5563: 5544: 5538: 5502: 5489: 5470: 5457: 5448: 5428: 5401: 5395: 5390:. 
The NICE model is recovered by setting \(s_{\theta }=0\).
8663:"Neural Ordinary Differential Equations" 6977:The inverse function is then naturally: 4528:{\displaystyle x,z\in \mathbb {R} ^{2n}} 4251:has no closed-form solution in general. 8156:Communications in Mathematical Sciences 8142: 1793:{\displaystyle z_{0}=f_{1}^{-1}(z_{1})} 26: 8311: 8309: 8307: 3445:, then this term can be estimated as: 7437:{\displaystyle u\in \mathbb {R} ^{n}} 4954:Real Non-Volume Preserving (Real NVP) 4132:with the appropriate dimensions, then 2834:{\displaystyle \log p_{i-1}(z_{i-1})} 2671:In general, the above applies to any 7: 8343:Journal of Machine Learning Research 8285: 8283: 8261: 8259: 8236:Journal of Machine Learning Research 7669:is implemented as a neural network, 7635:{\displaystyle \{\pm n^{-1/2}\}^{n}} 5652:{\displaystyle \prod _{c}s_{c}^{HW}} 1186:{\displaystyle z_{i}=f_{i}(z_{i-1})} 7795:{\displaystyle \mathbb {R} ^{2n+1}} 4742:is any neural network with weights 895:Glossary of artificial intelligence 7952: 7728:that all solve the same problem). 7354:{\displaystyle \partial _{z_{t}}f} 7332: 7285: 7277: 6249: 3770: 3767: 3764: 3760: 3757: 3754: 3649: 3646: 3643: 3639: 3636: 3633: 3620:Therefore, the learning objective 14: 8995:Flow-based Deep Generative Models 8622:Hutchinson, M.F. (January 1989). 6764:Continuous Normalizing Flow (CNF) 3402:{\displaystyle \{x_{i}\}_{i=1:N}} 2773:{\displaystyle \log p_{i}(z_{i})} 7757:{\displaystyle \mathbb {R} ^{n}} 7680:, thus preserve orientation and 6753:{\displaystyle f_{\theta }^{-1}} 6588:{\displaystyle z\sim N(0,I_{n})} 5843:{\displaystyle \mathbb {R} ^{n}} 5817:Masked autoregressive flow (MAF) 4792:{\displaystyle f_{\theta }^{-1}} 4244:{\displaystyle f_{\theta }^{-1}} 4048:{\displaystyle {\hat {\theta }}} 1337:models the target distribution. 7808:universal approximation theorem 4125:{\displaystyle \theta =(u,w,b)} 3860:maximizing the model likelihood 3854:In other words, minimizing the 3357:. 
Indeed, if we have a dataset \(\{x_{i}\}_{i=1:N}\) of samples each independently drawn from the target distribution \(p^{*}(x)\), then this term can be estimated as:

$$-{\hat {\mathbb {E} }}_{p^{*}(x)}\left[\log p_{\theta }(x)\right]=-{\frac {1}{N}}\sum _{i=1}^{N}\log(p_{\theta }(x_{i}))$$
2509:The log likelihood is thus: 1678:Derivation of log likelihood 1077:{\displaystyle p_{0}(z_{0})} 1006:Scheme for normalizing flows 761:Coefficient of determination 608:Convolutional neural network 320:Support vector machine (SVM) 6718:{\displaystyle f_{\theta }} 4735:{\displaystyle m_{\theta }} 4483:satisfies the requirement. 3856:Kullback–Leibler divergence 3076:the model's likelihood and 3069:{\displaystyle p_{\theta }} 3047:Kullback–Leibler divergence 954:flow-based generative model 912:Outline of machine learning 809:Empirical risk minimization 9041: 8501:. Curran Associates, Inc. 8468:. Curran Associates, Inc. 8398:10.1109/TPAMI.2020.2992934 8168:10.4310/CMS.2010.v8.n1.a11 8120:Molecular graph generation 8096:change-of-variable theorem 7159:And the log-likelihood of 5662:invertible 1x1 convolution 2097:{\displaystyle f_{1}^{-1}} 1303:{\displaystyle f_{i}^{-1}} 549:Feedforward neural network 300:Artificial neural networks 8705:. Curran Associates, Inc. 8640:10.1080/03610918908812806 8098:, the computation of the 7804:Whitney embedding theorem 7588:(normal distribution) or 5806:is any invertible matrix. 3902:, normalizing flow model 1310:exists. 
The final output \(z_{K}\) then models the target distribution.
7797: 7759: 7735:Any homeomorphism of 7723: 7703: 7664: 7637: 7583: 7544: 7484: 7439: 7404: 7366: 7356: 7316: 7174: 7151: 6969: 6946: 6826: 6790: 6788:{\displaystyle z_{0}} 6755: 6720: 6690: 6590: 6542: 6260: 6194: 6138: 5845: 5801: 5781: 5742: 5654: 5609: 5516: 5438: 5418: 5385: 5323: 5311: 5169: 4945: 4943:{\displaystyle x_{2}} 4918: 4889: 4794: 4757: 4737: 4710: 4530: 4478: 4437: 4408: 4246: 4211: 4127: 4083: 4050: 4018: 3946: 3897: 3846: 3782: 3733: 3612: 3548: 3440: 3404: 3347: 3324: 3098: 3096:{\displaystyle p^{*}} 3071: 3032: 2925: 2836: 2775: 2726: 2693: 2691:{\displaystyle z_{i}} 2663: 2501: 2349: 2329: 2262: 2099: 2056: 1975: 1829: 1827:{\displaystyle z_{1}} 1795: 1731: 1729:{\displaystyle z_{0}} 1704: 1702:{\displaystyle z_{1}} 1665: 1613: 1558: 1451: 1362: 1360:{\displaystyle z_{K}} 1332: 1330:{\displaystyle z_{K}} 1305: 1267: 1215: 1213:{\displaystyle z_{0}} 1188: 1126: 1079: 1032: 1030:{\displaystyle z_{0}} 1005: 635:Neural radiance field 457:Structured prediction 180:Structured prediction 52:Unsupervised learning 9025:Probabilistic models 8123:Point-cloud modeling 8022: 7841: 7817: 7768: 7739: 7712: 7692: 7653: 7592: 7557: 7493: 7448: 7413: 7372: 7328: 7186: 7163: 6984: 6958: 6838: 6799: 6772: 6729: 6702: 6605: 6551: 6276: 6203: 6147: 5856: 5825: 5790: 5751: 5666: 5618: 5537: 5447: 5427: 5394: 5320: 5180: 4962: 4927: 4901: 4803: 4768: 4746: 4719: 4539: 4495: 4446: 4420: 4258: 4220: 4136: 4092: 4072: 4030: 3958: 3906: 3873: 3748: 3627: 3452: 3413: 3361: 3336: 3110: 3080: 3053: 2852: 2784: 2735: 2702: 2675: 2516: 2365: 2338: 2274: 2118: 2073: 1987: 1841: 1811: 1740: 1713: 1686: 1672:deep neural networks 1622: 1570: 1378: 1344: 1314: 1279: 1224: 1197: 1135: 1091: 1045: 1014: 824:Statistical learning 722:Learning with humans 514:Local outlier factor 8601:. PMLR: 3154–3164. 
8001: 7944: 7868: 7482:{\displaystyle E=I} 7264: 7114: 7052: 6909: 6749: 5911: 5648: 4916:{\displaystyle n=1} 4788: 4240: 4023:by gradient descent 3355:importance sampling 2093: 2016: 1930: 1773: 1299: 667:Electrochemical RAM 574:reservoir computing 305:Logistic regression 224:Supervised learning 210:Multimodal learning 185:Feature engineering 130:Generative modeling 92:Rule-based learning 87:Curriculum learning 47:Supervised learning 22:Part of a series on 9020:Statistical models 8054: 8008: 7945: 7930: 7854: 7823: 7792: 7754: 7718: 7698: 7659: 7632: 7578: 7539: 7479: 7434: 7399: 7351: 7311: 7250: 7169: 7146: 7100: 7038: 6964: 6941: 6895: 6821: 6785: 6750: 6732: 6715: 6685: 6585: 6537: 6535: 6255: 6189: 6133: 6131: 5897: 5840: 5796: 5776: 5737: 5702: 5649: 5631: 5630: 5604: 5511: 5433: 5413: 5380: 5306: 5164: 5158: 5106: 5005: 4940: 4913: 4884: 4789: 4771: 4752: 4732: 4705: 4699: 4647: 4582: 4525: 4473: 4432: 4403: 4241: 4223: 4206: 4122: 4078: 4045: 4013: 3980: 3970: 3941: 3892: 3841: 3777: 3728: 3656: 3607: 3435: 3399: 3342: 3319: 3093: 3066: 3027: 2831: 2770: 2721: 2688: 2658: 2496: 2344: 2324: 2257: 2094: 2076: 2051: 1999: 1970: 1913: 1824: 1805:change of variable 1790: 1756: 1726: 1699: 1670:are modeled using 1660: 1608: 1553: 1357: 1327: 1300: 1282: 1262: 1210: 1183: 1121: 1074: 1041:with distribution 1027: 1008: 974:change-of-variable 235: • 150:Density estimation 8382:(11): 3964–3979. 
1385:⁡ 1293:− 1173:− 847:ECML PKDD 829:VC theory 776:ROC curve 708:Self-play 628:DeepDream 469:Bayes net 260:Ensembles 41:Paradigms 8831:35860036 8414:32396070 8217:17820269 8100:Jacobian 7988:‖ 7948:‖ 7902:‖ 7872:‖ 5716:′ 5698:′ 4799:is just 4353:′ 4283:′ 4059:Variants 4026:RETURN. 2731:. Since 1367:is (see 960:used in 270:Boosting 119:Problems 8822:9295254 6268:By the 5786:. Here 3954:SOLVE. 2334:(where 2107:By the 2065:of the 2061:is the 1803:By the 852:NeurIPS 669:(ECRAM) 623:AlexNet 265:Bagging 8829:  8819:  8646:  8420:  8412:  8404:  8215:  8018:where 6954:where 6143:where 4715:where 3780:  3659:  2845:that: 2354:is an 1983:Where 1131:, let 998:Method 645:Vision 501:RANSAC 379:OPTICS 374:DBSCAN 358:-means 165:AutoML 8972:arXiv 8951:arXiv 8930:arXiv 8909:arXiv 8888:arXiv 8867:arXiv 8846:arXiv 8784:arXiv 8763:arXiv 8742:arXiv 8718:arXiv 8674:arXiv 8666:(PDF) 8603:arXiv 8571:arXiv 8550:arXiv 8529:arXiv 8503:arXiv 8470:arXiv 8437:arXiv 8418:S2CID 8384:arXiv 8351:arXiv 8318:arXiv 8292:arXiv 8268:arXiv 8244:arXiv 8213:S2CID 7649:When 7444:with 4897:When 956:is a 867:IJCAI 693:SARSA 652:Mamba 618:LeNet 613:U-Net 439:t-SNE 363:Fuzzy 340:BIRCH 8827:PMID 8644:ISSN 8410:PMID 8402:ISSN 8049:> 6725:and 6199:and 4491:Let 4465:> 4442:and 4430:tanh 2698:and 1834:is: 1709:and 1087:For 1010:Let 990:and 877:JMLR 862:ICLR 857:ICML 743:RLHF 559:LSTM 345:CURE 31:and 8817:PMC 8813:139 8636:doi 8394:doi 8203:hdl 8195:doi 8164:doi 7646:). 7217:log 7190:log 5755:det 4267:det 3963:max 3804:log 3570:log 3499:log 3286:log 3218:log 2958:det 2947:log 2891:log 2856:log 2788:log 2739:log 2601:det 2590:log 2555:log 2520:log 2430:det 2303:det 2278:det 2182:det 2069:of 1991:det 1905:det 1484:det 1473:log 1417:log 1382:log 1371:): 603:SOM 593:GAN 569:ESN 564:GRU 509:-NN 444:SDL 434:PGD 429:PCA 424:NMF 419:LDA 414:ICA 409:CCA 285:-NN 9011:: 8825:. 8811:. 8807:. 8732:^ 8703:32 8701:. 8697:. 8642:. 8632:18 8630:. 8626:. 8597:. 8585:^ 8517:^ 8499:29 8497:. 8493:. 8466:30 8464:. 8460:. 8416:. 8408:. 
8400:. 8392:. 8380:43 8378:. 8374:. 8347:22 8345:. 8341:. 8306:^ 8282:^ 8258:^ 8240:22 8238:. 8234:. 8211:. 8201:. 8191:66 8189:. 8185:. 8158:. 8154:. 7267:Tr 6695:. 6595:. 4762:. 4413:. 3982:ln 2111:: 2104:. 1800:. 1084:. 952:A 872:ML 8980:. 8974:: 8959:. 8953:: 8938:. 8932:: 8917:. 8911:: 8896:. 8890:: 8875:. 8869:: 8854:. 8848:: 8833:. 8792:. 8786:: 8771:. 8765:: 8750:. 8744:: 8726:. 8720:: 8682:. 8676:: 8650:. 8638:: 8611:. 8605:: 8579:. 8573:: 8558:. 8552:: 8537:. 8531:: 8511:. 8505:: 8478:. 8472:: 8445:. 8439:: 8424:. 8396:: 8386:: 8359:. 8353:: 8326:. 8320:: 8300:. 8294:: 8276:. 8270:: 8252:. 8246:: 8219:. 8205:: 8197:: 8170:. 8166:: 8160:8 8052:0 8044:J 8036:, 8031:K 8006:t 8003:d 7998:2 7993:F 7984:) 7981:t 7978:, 7973:t 7969:z 7965:( 7962:f 7957:z 7941:T 7936:0 7926:J 7918:+ 7915:t 7912:d 7907:2 7898:) 7895:t 7892:, 7887:t 7883:z 7879:( 7876:f 7865:T 7860:0 7850:K 7837:: 7821:f 7788:1 7785:+ 7782:n 7779:2 7774:R 7750:n 7745:R 7716:f 7696:f 7657:f 7642:( 7628:n 7624:} 7618:2 7614:/ 7610:1 7603:n 7596:{ 7576:) 7573:I 7570:, 7567:0 7564:( 7561:N 7537:) 7534:W 7531:( 7528:r 7525:t 7522:= 7519:] 7516:u 7513:W 7508:T 7504:u 7500:[ 7497:E 7477:I 7474:= 7471:] 7466:T 7462:u 7458:u 7455:[ 7452:E 7430:n 7425:R 7417:u 7395:n 7389:n 7384:R 7376:W 7349:f 7342:t 7338:z 7309:t 7306:d 7302:] 7294:t 7290:z 7281:f 7272:[ 7261:T 7256:0 7245:) 7242:) 7237:0 7233:z 7229:( 7226:p 7223:( 7214:= 7211:) 7208:) 7205:x 7202:( 7199:p 7196:( 7167:x 7144:t 7141:d 7138:) 7135:t 7132:, 7127:t 7123:z 7119:( 7116:f 7111:T 7106:0 7093:T 7089:z 7085:= 7082:t 7079:d 7076:) 7073:t 7070:, 7065:t 7061:z 7057:( 7054:f 7049:0 7044:T 7036:+ 7031:T 7027:z 7023:= 7020:) 7017:x 7014:( 7009:1 7002:F 6998:= 6993:0 6989:z 6962:f 6939:t 6936:d 6933:) 6930:t 6927:, 6922:t 6918:z 6914:( 6911:f 6906:T 6901:0 6893:+ 6888:0 6884:z 6880:= 6875:T 6871:z 6867:= 6864:) 6859:0 6855:z 6851:( 6848:F 6845:= 6842:x 6819:) 6814:0 6810:z 6806:( 6803:p 6781:0 6777:z 6746:1 6734:f 6707:f 6683:) 6678:1 6672:n 6669:: 6666:1 
6662:x 6658:( 6653:n 6642:) 6637:1 6633:x 6629:( 6624:2 6614:1 6583:) 6578:n 6574:I 6570:, 6567:0 6564:( 6561:N 6555:z 6529:n 6525:z 6521:) 6516:1 6510:n 6507:: 6504:1 6500:x 6496:( 6491:n 6483:+ 6480:) 6475:1 6469:n 6466:: 6463:1 6459:x 6455:( 6450:n 6440:= 6435:n 6431:x 6411:2 6407:z 6403:) 6398:1 6394:x 6390:( 6385:2 6377:+ 6374:) 6369:1 6365:x 6361:( 6356:2 6346:= 6341:2 6337:x 6327:1 6323:z 6317:1 6309:+ 6304:1 6294:= 6289:1 6285:x 6253:) 6247:, 6244:0 6241:( 6233:1 6227:i 6222:R 6217:: 6212:i 6186:R 6177:1 6171:i 6166:R 6161:: 6156:i 6127:) 6122:2 6118:) 6112:1 6106:n 6103:: 6100:1 6096:x 6092:( 6087:n 6079:, 6076:) 6071:1 6065:n 6062:: 6059:1 6055:x 6051:( 6046:n 6038:( 6035:N 6025:n 6021:x 6003:) 5998:2 5994:) 5988:1 5984:x 5980:( 5975:2 5967:, 5964:) 5959:1 5955:x 5951:( 5946:2 5938:( 5935:N 5925:2 5921:x 5913:) 5908:2 5903:1 5895:, 5890:1 5882:( 5879:N 5869:1 5865:x 5836:n 5831:R 5794:K 5772:W 5769:H 5765:) 5761:K 5758:( 5733:j 5730:i 5727:c 5723:y 5713:c 5709:c 5705:K 5695:c 5686:= 5681:j 5678:i 5675:c 5671:z 5659:. 
5645:W 5642:H 5637:c 5633:s 5627:c 5602:) 5597:c 5593:b 5589:+ 5584:j 5581:i 5578:c 5574:x 5570:( 5565:c 5561:s 5557:= 5552:j 5549:i 5546:c 5542:y 5509:) 5504:1 5500:x 5496:, 5491:2 5487:x 5483:( 5477:) 5472:2 5468:x 5464:, 5459:1 5455:x 5451:( 5431:x 5411:0 5408:= 5399:s 5376:) 5371:, 5368:1 5364:z 5360:( 5351:s 5346:e 5340:n 5335:1 5332:= 5329:i 5304:) 5301:) 5296:1 5292:x 5288:( 5279:m 5270:2 5266:x 5262:( 5254:) 5249:1 5245:x 5241:( 5232:s 5224:e 5220:= 5215:2 5211:z 5207:, 5202:1 5198:x 5194:= 5189:1 5185:z 5160:] 5154:) 5149:1 5145:z 5141:( 5132:m 5124:0 5118:[ 5113:+ 5108:] 5100:2 5096:z 5087:) 5082:1 5078:z 5074:( 5065:s 5060:e 5050:1 5046:z 5039:[ 5034:= 5031:) 5028:z 5025:( 5016:f 5012:= 5007:] 4999:2 4995:x 4985:1 4981:x 4974:[ 4969:= 4966:x 4936:2 4932:x 4911:1 4908:= 4905:n 4882:) 4877:1 4873:x 4869:( 4860:m 4851:2 4847:x 4843:= 4838:2 4834:z 4830:, 4825:1 4821:x 4817:= 4812:1 4808:z 4785:1 4773:f 4724:m 4701:] 4695:) 4690:1 4686:z 4682:( 4673:m 4665:0 4659:[ 4654:+ 4649:] 4641:2 4637:z 4627:1 4623:z 4616:[ 4611:= 4608:) 4605:z 4602:( 4593:f 4589:= 4584:] 4576:2 4572:x 4562:1 4558:x 4551:[ 4546:= 4543:x 4521:n 4518:2 4513:R 4505:z 4502:, 4499:x 4471:1 4459:w 4456:, 4453:u 4427:= 4424:h 4400:| 4393:w 4390:, 4387:u 4381:) 4378:b 4375:+ 4369:z 4366:, 4363:w 4357:( 4350:h 4346:+ 4343:1 4339:| 4335:= 4331:| 4327:) 4322:T 4318:w 4314:u 4311:) 4308:b 4305:+ 4299:z 4296:, 4293:w 4287:( 4280:h 4276:+ 4273:I 4270:( 4263:| 4237:1 4225:f 4204:) 4201:b 4198:+ 4192:z 4189:, 4186:w 4180:( 4177:h 4174:u 4171:+ 4168:z 4165:= 4162:) 4159:z 4156:( 4147:f 4143:= 4140:x 4120:) 4117:b 4114:, 4111:w 4108:, 4105:u 4102:( 4099:= 4076:h 4011:) 4006:j 4002:x 3998:( 3989:p 3977:j 3951:. 
3937:0 3933:p 3929:, 3926:) 3920:( 3911:f 3888:n 3885:: 3882:1 3878:x 3839:) 3836:) 3831:i 3827:x 3823:( 3814:p 3810:( 3799:N 3794:0 3791:= 3788:i 3771:x 3768:a 3765:m 3761:g 3758:r 3755:a 3726:] 3723:) 3720:x 3717:( 3708:p 3703:| 3698:| 3694:) 3691:x 3688:( 3679:p 3675:[ 3670:L 3667:K 3663:D 3650:n 3647:i 3644:m 3640:g 3637:r 3634:a 3605:) 3602:) 3597:i 3593:x 3589:( 3580:p 3576:( 3565:N 3560:0 3557:= 3554:i 3544:N 3541:1 3533:= 3530:] 3527:) 3524:) 3521:x 3518:( 3509:p 3505:( 3496:[ 3491:) 3488:x 3485:( 3476:p 3464:E 3433:) 3430:x 3427:( 3418:p 3395:N 3392:: 3389:1 3386:= 3383:i 3379:} 3373:i 3369:x 3365:{ 3317:] 3314:) 3311:) 3308:x 3305:( 3296:p 3292:( 3283:[ 3278:) 3275:x 3272:( 3263:p 3257:E 3252:+ 3249:] 3246:) 3243:) 3240:x 3237:( 3228:p 3224:( 3215:[ 3210:) 3207:x 3204:( 3195:p 3189:E 3181:= 3178:] 3175:) 3172:x 3169:( 3160:p 3155:| 3150:| 3146:) 3143:x 3140:( 3131:p 3127:[ 3122:L 3119:K 3115:D 3085:p 3058:p 3024:| 3015:1 3009:i 3005:z 3001:d 2996:) 2991:1 2985:i 2981:z 2977:( 2972:i 2968:f 2964:d 2954:| 2942:K 2937:1 2934:= 2931:i 2920:) 2915:0 2911:z 2907:( 2902:0 2898:p 2888:= 2885:) 2880:K 2876:z 2872:( 2867:K 2863:p 2829:) 2824:1 2818:i 2814:z 2810:( 2805:1 2799:i 2795:p 2768:) 2763:i 2759:z 2755:( 2750:i 2746:p 2717:1 2711:i 2707:z 2684:i 2680:z 2655:| 2646:0 2642:z 2638:d 2633:) 2628:0 2624:z 2620:( 2615:1 2611:f 2607:d 2597:| 2584:) 2579:0 2575:z 2571:( 2566:0 2562:p 2552:= 2549:) 2544:1 2540:z 2536:( 2531:1 2527:p 2492:1 2484:| 2475:0 2471:z 2467:d 2462:) 2457:0 2453:z 2449:( 2444:1 2440:f 2436:d 2426:| 2421:) 2416:0 2412:z 2408:( 2403:0 2399:p 2395:= 2392:) 2387:1 2383:z 2379:( 2374:1 2370:p 2342:A 2320:1 2313:) 2309:A 2306:( 2300:= 2297:) 2292:1 2285:A 2281:( 2254:| 2248:1 2240:) 2232:0 2228:z 2224:d 2219:) 2214:0 2210:z 2206:( 2201:1 2197:f 2193:d 2187:( 2178:| 2174:) 2169:0 2165:z 2161:( 2156:0 2152:p 2148:= 2145:) 2140:1 2136:z 2132:( 2127:1 2123:p 2090:1 2082:1 2078:f 2044:1 2040:z 2036:d 2031:) 2026:1 2022:z 2018:( 2013:1 2005:1 2001:f 
1997:d 1967:| 1958:1 1954:z 1950:d 1945:) 1940:1 1936:z 1932:( 1927:1 1919:1 1915:f 1911:d 1901:| 1897:) 1892:0 1888:z 1884:( 1879:0 1875:p 1871:= 1868:) 1863:1 1859:z 1855:( 1850:1 1846:p 1820:1 1816:z 1788:) 1783:1 1779:z 1775:( 1770:1 1762:1 1758:f 1754:= 1749:0 1745:z 1722:0 1718:z 1695:1 1691:z 1656:K 1652:f 1648:, 1645:. 1642:. 1639:. 1636:, 1631:1 1627:f 1604:K 1600:f 1596:, 1593:. 1590:. 1587:. 1584:, 1579:1 1575:f 1550:| 1541:1 1535:i 1531:z 1527:d 1522:) 1517:1 1511:i 1507:z 1503:( 1498:i 1494:f 1490:d 1480:| 1468:K 1463:1 1460:= 1457:i 1446:) 1441:0 1437:z 1433:( 1428:0 1424:p 1414:= 1411:) 1406:K 1402:z 1398:( 1393:K 1389:p 1353:K 1349:z 1323:K 1319:z 1296:1 1288:i 1284:f 1258:K 1254:f 1250:, 1247:. 1244:. 1241:. 1238:, 1233:1 1229:f 1206:0 1202:z 1181:) 1176:1 1170:i 1166:z 1162:( 1157:i 1153:f 1149:= 1144:i 1140:z 1119:K 1116:, 1113:. 1110:. 1107:. 1104:, 1101:1 1098:= 1095:i 1072:) 1067:0 1063:z 1059:( 1054:0 1050:p 1023:0 1019:z 941:e 934:t 927:v 507:k 356:k 283:k 241:) 229:(

Text is available under the Creative Commons Attribution-ShareAlike License. Additional terms may apply.
