, an application of PCA, using the plot of eigenvalues. A typical choice of the number of components in PCA is based on the "elbow" point; a flat plateau there indicates that PCA is not capturing the data efficiently, and a sudden drop reflects the capture of random noise and the onset of overfitting. For sequential NMF, the plot of eigenvalues is approximated by the plot of the fractional residual variance curves, which decrease continuously and converge to a higher level than PCA's, indicating that sequential NMF overfits less.
. However, if the noise is non-stationary, classical denoising algorithms usually perform poorly because the statistical properties of non-stationary noise are difficult to estimate. Schmidt et al. use NMF for speech denoising under non-stationary noise, an approach completely different from classical statistical ones. The key idea is that a clean speech signal can be sparsely represented by a speech dictionary, but non-stationary noise cannot; similarly, non-stationary noise can be sparsely represented by a noise dictionary, but speech cannot.
, with the property that all three matrices have no negative elements. This non-negativity makes the resulting matrices easier to inspect. Also, in applications such as processing of audio spectrograms or muscular activity, non-negativity is inherent to the data being considered. Since the problem is not exactly solvable in general, it is commonly approximated numerically.
4012:
where forward modeling has to be adopted to recover the true flux. Forward modeling is currently optimized for point sources, however, not for extended sources, especially for irregularly shaped structures such as circumstellar disks. In this situation, NMF has been an excellent method, being less prone to over-fitting owing to the non-negativity and
3113:
2870:
, where there may be many users and many items to recommend, and it would be inefficient to recalculate everything when one user or one item is added to the system. The cost function for optimization in these cases may or may not be the same as for standard NMF, but the algorithms need to be rather different.
4249:
data and finding the genes most representative of the clusters. In the analysis of cancer mutations it has been used to identify common patterns of mutations that occur in many cancers and that probably have distinct causes. NMF techniques can identify sources of variation such as cell types, disease
4032:
The data imputation procedure with NMF can be composed of two steps. First, when the NMF components are known, Ren et al. (2020) proved that impact from missing data during data imputation ("target modeling" in their study) is a second order effect. Second, when the NMF components are unknown, the
4028:
in statistics. By first proving that the missing data are ignored in the cost function, then proving that the impact from missing data can be as small as a second order effect, Ren et al. (2020) studied and applied such an approach for the field of astronomy. Their work focuses on two-dimensional
4011:
In direct imaging, to reveal the faint exoplanets and circumstellar disks from the bright surrounding starlight, which has a typical contrast from 10⁴ to 10¹⁰, various statistical methods have been adopted; however, the light from the exoplanets or circumstellar disks is usually over-fitted,
3976:
in the sense that astrophysical signals are non-negative. NMF has been applied to the spectroscopic observations and the direct imaging observations as a method to study the common properties of astronomical objects and post-process the astronomical observations. The advances in the spectroscopic
6616:
Wahhaj, Zahed; Cieza, Lucas A.; Mawet, Dimitri; Yang, Bin; Canovas, Hector; de Boer, Jozua; Casassus, Simon; Ménard, François; Schreiber, Matthias R.; Liu, Michael C.; Biller, Beth A.; Nielsen, Eric L.; Hayward, Thomas L. (2015). "Improving signal-to-noise in the direct imaging of exoplanets and
3424:
Fractional residual variance (FRV) plots for PCA and sequential NMF; for PCA, the theoretical values are the contribution from the residual eigenvalues. In comparison, the FRV curves for PCA reach a flat plateau where no signal is captured effectively, while the NMF FRV curves decline
3393:
Current algorithms are sub-optimal in that they only guarantee finding a local minimum, rather than a global minimum of the cost function. A provably optimal algorithm is unlikely in the near future as the problem has been shown to generalize the k-means clustering problem which is known to be
1094:
non-negative matrix factorization has a long history under the name "self modeling curve resolution". In this framework the vectors in the right matrix are continuous curves rather than discrete vectors. Also early work on non-negative matrix factorizations was performed by a
Finnish group of
4211:
The algorithm for NMF denoising goes as follows. Two dictionaries, one for speech and one for noise, need to be trained offline. Once noisy speech is given, we first calculate the magnitude of its short-time Fourier transform. Second, we separate it into two parts via NMF, such that one part can be sparsely
964:
4036:
Depending on the way the NMF components are obtained, the former step above can be either independent of or dependent on the latter. In addition, the imputation quality can be increased when more NMF components are used; see Figure 4 of Ren et al. (2020) for an illustration.
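One generic way to realize "missing data are ignored in the cost function" is a binary observation mask with weighted multiplicative updates. The sketch below is illustrative (the sizes, rank, and iteration count are assumptions, and this is not the exact procedure of Ren et al.): unobserved entries get zero weight, and are then filled in from the model reconstruction.

```python
import numpy as np

rng = np.random.default_rng(0)
eps = 1e-9

V = rng.random((30, 40))          # data matrix
M = rng.random(V.shape) > 0.2     # mask: True where entries are observed
Vm = np.where(M, V, 0.0)          # unobserved entries zeroed out

# Weighted multiplicative updates: masked entries carry zero weight,
# so they drop out of the squared-error cost function
r = 4
W = rng.random((30, r))
H = rng.random((r, 40))
for _ in range(300):
    H *= (W.T @ Vm) / (W.T @ (M * (W @ H)) + eps)
    W *= (Vm @ H.T) / ((M * (W @ H)) @ H.T + eps)

V_imputed = np.where(M, V, W @ H)  # keep observed data, impute the rest
```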
1310:
When multiplying matrices, the dimensions of the factor matrices may be significantly lower than those of the product matrix and it is this property that forms the basis of NMF. NMF generates factors with significantly reduced dimensions compared to the original matrix. For example, if
4187:
measurements. This kind of method was first introduced in the
Internet Distance Estimation Service (IDES). Afterwards, the Phoenix network coordinate system was proposed as a fully decentralized approach. It achieves better overall prediction accuracy by introducing the concept of weight.
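The matrix-factorization model behind such coordinate systems can be sketched as follows. The sizes are illustrative, and a real system such as IDES or Phoenix estimates the two non-negative factors from measured round-trip times rather than generating them:

```python
import numpy as np

rng = np.random.default_rng(0)
n_hosts, dim = 8, 2   # illustrative network size and coordinate dimension

# Each host i holds an outgoing vector X[i] and an incoming vector Y[i];
# the predicted distance from host i to host j is their inner product
X = rng.random((n_hosts, dim))
Y = rng.random((n_hosts, dim))
D = X @ Y.T           # full predicted distance matrix (can be asymmetric)

i, j = 2, 5
assert np.isclose(X[i] @ Y[j], D[i, j])
assert (D >= 0).all()  # non-negative factors give non-negative distances
```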
3286:
4324:
Scalability: how to factorize million-by-billion matrices, which are commonplace in Web-scale data mining, e.g., see
Distributed Nonnegative Matrix Factorization (DNMF), Scalable Nonnegative Matrix Factorization (ScalableNMF), Distributed Stochastic Singular Value
4090:
Arora, Ge, Halpern, Mimno, Moitra, Sontag, Wu, & Zhu (2013) have given polynomial-time algorithms to learn topic models using NMF. The algorithm assumes that the topic matrix satisfies a separability condition that is often found to hold in these settings.
in sampled genomes. In human genetic clustering, NMF algorithms provide estimates similar to those of the computer program STRUCTURE, but the algorithms are more efficient computationally and allow analysis of large population genomic data sets.
(PCA) in astronomy. The contributions from the PCA components are ranked by the magnitude of their corresponding eigenvalues; for NMF, its components can be ranked empirically when they are constructed one by one (sequentially), i.e., learn the
2524:
Many standard NMF algorithms analyze all the data together; i.e., the whole matrix is available from the start. This may be unsatisfactory in applications where there are too many data to fit into memory or where the data are provided in
4094:
Hassani, Iranmanesh and
Mansouri (2019) proposed a feature agglomeration method for term-document matrices which operates using NMF. The algorithm reduces the term-document matrix into a smaller matrix more suitable for text clustering.
2488:
3684:
contains cluster membership indicators. This provides a theoretical foundation for using NMF for data clustering. However, k-means does not enforce non-negativity on its centroids, so the closest analogy is in fact with "semi-NMF".
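As a toy illustration of reading cluster memberships off the coefficients matrix (the data and factors here are invented for the example):

```python
import numpy as np

# Toy data whose columns form two obvious groups
W = np.array([[1.0, 0.0],
              [1.0, 0.0],
              [0.0, 1.0],
              [0.0, 1.0]])            # two non-negative parts
H = np.array([[1, 1, 1, 0, 0, 0],
              [0, 0, 0, 1, 1, 1]], dtype=float)  # cluster memberships
V = W @ H                             # input data matrix

# Given factors from any NMF solver (here the true ones), column j of V is
# assigned to the cluster with the largest coefficient in H[:, j]
labels = H.argmax(axis=0)
print(labels)  # [0 0 0 1 1 1]
```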
1484:
as a document archetype comprising a set of words where each word's cell value defines the word's rank in the feature: The higher a word's cell value the higher the word's rank in the feature. A column in the coefficients matrix
1491:
represents an original document with a cell value defining the document's rank for a feature. We can now reconstruct a document (column vector) from our input matrix by a linear combination of our features (column vectors in
3539:
time in the dense case. Arora, Ge, Halpern, Mimno, Moitra, Sontag, Wu, & Zhu (2013) give a polynomial time algorithm for exact NMF that works for the case where one of the factors W satisfies a separability condition.
2551:
represent data sampled over spatial or temporal dimensions, e.g. time signals, images, or video, features that are equivariant w.r.t. shifts along these dimensions can be learned by
Convolutional NMF. In this case,
(SVM). However, SVM and NMF are related at a more intimate level than that of NQP, which allows direct application of the solution algorithms developed for either of the two methods to problems in both domains.
3875:
4212:
represented by the speech dictionary, and the other part can be sparsely represented by the noise dictionary. Third, the part that is represented by the speech dictionary will be the estimated clean speech.
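A sketch of these three steps, with random stand-ins for the trained dictionaries and the noisy-speech spectrogram (all sizes, names, and iteration counts are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
eps = 1e-9

# Stand-ins for dictionaries trained offline (step 1): 64 frequency bins,
# 8 atoms each for speech and for noise
W_speech = rng.random((64, 8))
W_noise = rng.random((64, 8))
W = np.hstack([W_speech, W_noise])   # held fixed during separation

# Stand-in for the STFT magnitude of the noisy speech
V = rng.random((64, 100))

# Step 2: infer activations H with W fixed (multiplicative updates)
H = rng.random((W.shape[1], V.shape[1]))
for _ in range(100):
    H *= (W.T @ V) / (W.T @ W @ H + eps)

# Step 3: the part explained by the speech dictionary is the clean estimate
V_speech_est = W_speech @ H[:8]
V_noise_est = W_noise @ H[8:]
```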
1256:
7247:
Stein-O'Brien, Genevieve L.; Arora, Raman; Culhane, Aedin C.; Favorov, Alexander V.; Garmire, Lana X.; Greene, Casey S.; Goff, Loyal A.; Li, Yifeng; Ngom, Alioune; Ochs, Michael F.; Xu, Yanxun (2018-10-01).
3780:
8224:
Andrzej
Cichocki, Rafal Zdunek, Anh Huy Phan and Shun-ichi Amari: "Nonnegative Matrix and Tensor Factorizations: Applications to Exploratory Multi-way Data Analysis and Blind Source Separation", Wiley,
1630:
4024:
To impute missing data in statistics, NMF can take missing data while minimizing its cost function, rather than treating these missing data as zeros. This makes it a mathematically proven method for
observations by
Blanton & Roweis (2007) takes into account the uncertainties of astronomical observations, and was later improved by Zhu (2016), where missing data are also considered and
3219:
6832:
Berry, Michael W.; Browne, Murray; Langville, Amy N.; Pauca, V. Paul; Plemmons, Robert J. (15 September 2007). "Algorithms and
Applications for Approximate Nonnegative Matrix Factorization".
5178:
Zhang, T.; Fang, B.; Liu, W.; Tang, Y. Y.; He, G.; Wen, J. (2008). "Total variation norm-based nonnegative matrix factorization for identifying discriminant representation of image patterns".
1570:
4334:
Cohen and
Rothblum 1993 problem: whether a rational matrix always has an NMF of minimal inner dimension whose factors are also rational. Recently, this problem has been answered negatively.
2063:
1828:
3529:
contains a monomial sub matrix of rank equal to its rank was given by
Campbell and Poole in 1981. Kalofolias and Gallopoulos (2012) solved the symmetric counterpart of this problem, where
877:
3702:
Other extensions of NMF include joint factorization of several data matrices and tensors where some factors are shared. Such models are useful for sensor fusion and relational learning.
4016:
of the NMF modeling coefficients, therefore forward modeling can be performed with a few scaling factors, rather than a computationally intensive data re-reduction on generated models.
3823:
915:
{\displaystyle \mathbf {W} _{[i,j]}^{n+1}\leftarrow \mathbf {W} _{[i,j]}^{n}{\frac {(\mathbf {V} (\mathbf {H} ^{n+1})^{T})_{[i,j]}}{(\mathbf {W} ^{n}\mathbf {H} ^{n+1}(\mathbf {H} ^{n+1})^{T})_{[i,j]}}}}
This last point is the basis of NMF because we can consider each original document in our example as being built from a small set of hidden features. NMF generates these features.
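A minimal NumPy sketch of the running example; the rank of 10 features is an arbitrary illustrative choice, and random matrices stand in for factors an NMF solver would learn:

```python
import numpy as np

rng = np.random.default_rng(0)
n_words, n_docs, n_feats = 10000, 500, 10

V = rng.random((n_words, n_docs))      # input term-document matrix
W = rng.random((n_words, n_feats))     # features matrix
H = rng.random((n_feats, n_docs))      # coefficients matrix

# A document (column of W @ H) is a linear combination of the feature
# columns of W, weighted by that document's column in H
j = 0
doc = sum(H[k, j] * W[:, k] for k in range(n_feats))
assert np.allclose(doc, (W @ H)[:, j])

# The factors store far fewer numbers than the original matrix
assert W.size + H.size == 105_000 and V.size == 5_000_000
```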
3643:
2283:
{\displaystyle \mathbf {H} _{[i,j]}^{n+1}\leftarrow \mathbf {H} _{[i,j]}^{n}{\frac {((\mathbf {W} ^{n})^{T}\mathbf {V} )_{[i,j]}}{((\mathbf {W} ^{n})^{T}\mathbf {W} ^{n}\mathbf {H} ^{n})_{[i,j]}}}}
872:
1763:
3425:
continuously, indicating a better ability to capture signal. The FRV curves for NMF also converge to higher levels than those of PCA, indicating the less-overfitting property of NMF.
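For concreteness, one simple form of the fractional residual variance is the residual power divided by the total power; this toy sketch uses that definition, which may differ in details from the cited studies:

```python
import numpy as np

def frv(V, model):
    """Fractional residual variance: share of power the model leaves unexplained."""
    return np.sum((V - model) ** 2) / np.sum(V ** 2)

rng = np.random.default_rng(0)
V = rng.random((50, 40))

assert frv(V, np.zeros_like(V)) == 1.0  # trivial model explains nothing
assert frv(V, V) == 0.0                 # perfect model leaves no residual
```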
862:
1390:
with 10000 rows and 500 columns where words are in rows and documents are in columns. That is, we have 500 documents indexed by 10000 words. It follows that a column vector
CĂ©dric FĂ©votte; Nancy Bertin & Jean-Louis Durrieu (2009). "Nonnegative Matrix Factorization with the Itakura-Saito Divergence: With Application to Music Analysis".
It was later shown that some types of NMF are an instance of a more general probabilistic model called "multinomial PCA". When NMF is obtained by minimizing the
7901:
Chistikov, Dmitry; Kiefer, Stefan; Marušić, Ines; Shirmohammadi, Mahsa; Worrell, James (2016-05-22). "Nonnegative Matrix Factorization Requires Irrationality".
Soummer, Rémi; Pueyo, Laurent; Larkin, James (2012). "Detection and Characterization of Exoplanets and Disks Using Projections on Karhunen-Loève Eigenimages".
, and shows that although the three techniques may be written as factorizations, they implement different constraints and therefore produce different results.
6240:(1999). "The Multilinear Engine: A Table-Driven, Least Squares Program for Solving Multilinear Problems, including the n-Way Parallel Factor Analysis Model".
2382:
is defined on probability distributions). Each divergence leads to a different NMF algorithm, usually minimizing the divergence using iterative update rules.
867:
718:
5054:
C Ding, T Li, MI Jordan, Convex and semi-nonnegative matrix factorizations, IEEE Transactions on Pattern Analysis and Machine Intelligence, 32, 45-55, 2010
8173:
7736:
4331:
Collective (joint) factorization: factorizing multiple interrelated matrices for multiple-view learning, e.g. multi-view clustering, see CoNMF and MultiNMF
449:
4056:
is constructed with the weights of various terms (typically weighted word frequency information) from a set of documents. This matrix is factored into a
950:
753:
3413:
may affect not only the rate of convergence, but also the overall error at convergence. Some options for initialization include complete randomization,
6170:
5364:
Naiyang Guan; Dacheng Tao; Zhigang Luo & Bo Yuan (July 2012). "Online Nonnegative Matrix Factorization With Robust Stochastic Approximation".
829:
6811:
Hassani, Ali; Iranmanesh, Amir; Mansouri, Najme (2019-11-12). "Text Mining using Nonnegative Matrix Factorization and Latent Semantic Analysis".
. Proc. 28th international ACM SIGIR conference on Research and development in information retrieval (SIGIR-05). pp. 601–602. Archived from
3992:
Ren et al. (2018) are able to prove the stability of NMF components when they are constructed sequentially (i.e., one by one), which enables the
378:
8269:
Julian Becker: "Nonnegative Matrix Factorization with Adaptive Elements for Monaural Audio Source Separation: 1 ", Shaker Verlag GmbH, Germany,
4257:
tasks in order to predict novel protein targets and therapeutic indications for approved drugs and to infer pairs of synergistic anticancer drugs.
4079:
email dataset with 65,033 messages and 91,133 terms into 50 clusters. NMF has also been applied to citations data, with one example clustering
7141:"DNA methylation profiling of medulloblastoma allows robust sub-classification and improved outcome prediction using formalin-fixed biopsies"
Ren, Bin; Pueyo, Laurent; Chen, Christine; Choquet, Elodie; Debes, John H.; Duchêne, Gaspard; Ménard, François; Perrin, Marshall D. (2020).
8363:
5781:
Hafshejani, Sajad Fathi; Moaberfard, Zahra (November 2022). "Initialization for Nonnegative Matrix Factorization: a Comprehensive Review".
Naiyang Guan; Dacheng Tao; Zhigang Luo; Bo Yuan (June 2012). "NeNMF: An Optimal Gradient Method for Nonnegative Matrix Factorization".
Pentti Paatero; Unto Tapper; Pasi Aalto; Markku Kulmala (1991). "Matrix factorization methods for analysing diffusion battery data".
8236:
Andri Mirzal: "Nonnegative Matrix Factorizations for Clustering and LSI: Theory and Programming", LAP LAMBERT Academic Publishing,
7932:
7477:
Sitek; Gullberg; Huesman (2002). "Correction for ambiguous solutions in factor analysis using a penalized least squares objective".
6385:. Proceedings of the 26th annual international ACM SIGIR conference on Research and development in information retrieval. New York:
Blanton, Michael R.; Roweis, Sam (2007). "K-corrections and filter transformations in the ultraviolet, optical, and near infrared".
3695:
NMF extends beyond matrices to tensors of arbitrary order. This extension may be viewed as a non-negative counterpart to, e.g., the
2306:
786:
781:
3731:
8213:
Andrzej Cichocki, Morten Mørup, et al.: "Advances in Nonnegative Matrix and Tensor Factorization", Hindawi Publishing Corporation,
5012:
2558:
is sparse with columns having local non-zero weight windows that are shared across shifts along the spatio-temporal dimensions of
7098:"Sparse non-negative matrix factorizations via alternating non-negativity-constrained least squares for microarray data analysis"
4869:; Unto Tapper; Olli JĂ€rvinen (1995). "Source identification of bulk wet deposition in Finland by positive matrix factorization".
4224:
for estimating individual admixture coefficients, detecting genetic clusters of individuals in a population sample or evaluating
Liu, W.X.; Zheng, N.N. & You, Q.B. (2006). "Nonnegative Matrix Factorization and its applications in pattern recognition".
6447:"Analysis of the emission of very small dust particles from Spitzer spectro-imagery data using blind signal separation methods"
4413:
3982:
5308:. Proceedings of the 17th ACM SIGKDD international conference on Knowledge discovery and data mining - KDD '11. p. 1064.
{\textstyle {\frac {\mathbf {V} \mathbf {H} ^{\mathsf {T}}}{\mathbf {W} \mathbf {H} \mathbf {H} ^{\mathsf {T}}}}}
1139:
1107:
investigated the properties of the algorithm and published some simple and useful algorithms for two types of factorizations.
839:
7575:"Reconstruction of 4-D Dynamic SPECT Images From Inconsistent Projections Using a Spline Initialized FADS Algorithm (SIFADS)"
7322:
5829:
Zhu, Guangtun B. (2016-12-19). "Nonnegative Matrix Factorization (NMF) with Heteroscedastic Uncertainties and Missing data".
5748:
Ding, C.; He, X. & Simon, H.D. (2005). "On the equivalence of nonnegative matrix factorization and spectral clustering".
Jialu Liu; Chi Wang; Jing Gao & Jiawei Han (2013). "Multi-View Clustering via Joint Nonnegative Matrix Factorization".
6859:
Yun Mao; Lawrence Saul & Jonathan M. Smith (2006). "IDES: An Internet Distance Estimation Service for Large Networks".
3506:
3295:
7698:
Proceedings of the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases
4349:
2574:
and repeatedly using the resulting representation as input to convolutional NMF, deep feature hierarchies can be learned.
C. Boutsidis & E. Gallopoulos (2008). "SVD based initialization: A head start for nonnegative matrix factorization".
5704:"Algorithms for nonnegative matrix and tensor factorizations: A unified view based on block coordinate descent framework"
5240:"A framework for regularized non-negative matrix factorization, with application to the analysis of gene expression data"
3517:
Exact solutions for the variants of NMF can be expected (in polynomial time) when additional constraints hold for matrix
8130:
7991:
5655:
Jingu Kim & Haesun Park (2011). "Fast Nonnegative Matrix Factorization: An Active-set-like Method and Comparisons".
matrices, specifically, it includes mathematical derivation, simulated data imputation, and application to on-sky data.
Pinoli; Ceddia; Ceri; Masseroli (2021). "Predicting drug synergism by means of non-negative matrix tri-factorization".
6504:
Lafrenière, David; Marois, Christian; Doyon, René; Barman, Travis (2009). "HST/NICMOS Detection of HR 8799 b in 1998".
4817:"Positive matrix factorization: A non-negative factor model with optimal utilization of error estimates of data values"
1515:
7432:
DiPaola; Bazin; Aubry; Aurengo; Cavailloles; Herry; Kahn (1982). "Handling of dynamic sequences in nuclear medicine".
6279:
6040:
Arora, Sanjeev; Ge, Rong; Halpern, Yoni; Mimno, David; Moitra, Ankur; Sontag, David; Wu, Yichen; Zhu, Michael (2013).
4328:
Online: how to update the factorization when new data comes in without recomputing from scratch, e.g., see online CNSC
5568:"Nonnegative Matrix Factorization Based on Alternating Nonnegativity Constrained Least Squares and Active Set Method"
5517:
Lin, Chih-Jen (2007). "On the Convergence of Multiplicative Update Algorithms for Nonnegative Matrix Factorization".
4265:
NMF, also referred to in this field as factor analysis, has been used since the 1980s to analyze sequences of images in
809:
8258:
Ganesh R. Naik(Ed.): "Non-negative Matrix Factorization Techniques: Advances in Theory and Applications", Springer,
8067:
Ngoc-Diep Ho; Paul Van Dooren & Vincent Blondel (2008). "Descent Methods for Nonnegative Matrix Factorization".
5711:
3401:
In addition to the optimization step, initialization has a significant effect on NMF. The initial values chosen for
2028:
1793:
7190:
Alexandrov, Ludmil B.; Nik-Zainal, Serena; Wedge, David C.; Campbell, Peter J.; Stratton, Michael R. (2013-01-31).
authors proved that the impact from missing data during component construction is a first-to-second order effect.
7346:
Ceddia; Pinoli; Ceri; Masseroli (2020). "Matrix factorization-based technique for drug repurposing predictions".
{\textstyle {\frac {\mathbf {W} ^{\mathsf {T}}\mathbf {V} }{\mathbf {W} ^{\mathsf {T}}\mathbf {W} \mathbf {H} }}}
5913:"Detection and Characterization of Exoplanets using Projections on Karhunen Loeve Eigenimages: Forward Modeling"
3788:
2339:
There are different types of non-negative matrix factorizations. The different types arise from using different
matrix. The features are derived from the contents of the documents, and the feature-document matrix describes
NMF is also used to analyze spectral data; one such use is in the classification of space objects and debris.
3661:
estimation. That method is commonly used for analyzing and clustering textual data and is also related to the
2065:
is not explicitly imposed, the orthogonality holds to a large extent, and the clustering property holds too.
1681:
6445:; Deville, Y.; Smith, J. D.; Rapacioli, M.; Bernard, J. P.; Thomas, J.; Reach, W.; Abergel, A. (2007-07-01).
3916:
3887:
6446:
5013:"On the equivalence between non-negative matrix factorization and probabilistic latent semantic indexing"
4730:
7524:"Clustering Initiated Factor Analysis (CIFA) Application for Tissue Classification in Dynamic Brain PET"
7145:
6743:
Berry, Michael W.; Browne, Murray (2005). "Email Surveillance Using Non-negative Matrix Factorization".
is enabled. Their method was then adopted by Ren et al. (2018) in the direct imaging field as one of the
4726:
1730:
3398:. However, as in many other data mining applications, a local minimum may still prove to be useful.
1453:
From the treatment of matrix multiplication above it follows that each column in the product matrix
Current research (since 2010) in nonnegative matrix factorization includes, but is not limited to,
4253:
A particular variant of NMF, namely Non-Negative Matrix Tri-Factorization (NMTF), has been used for
method, the optimal gradient method, and the block principal pivoting method among several others.
1512:
NMF has an inherent clustering property, i.e., it automatically clusters the columns of input data
Proceedings of the 12th ACM SIGKDD international conference on Knowledge discovery and data mining
. High-Performance Scientific Computing: Algorithms and Applications. Springer. pp. 311–326.
) is added to NMF with the mean squared error cost function, the resulting problem may be called
7037:"Nonnegative Matrix Factorization: An Analytical and Interpretive Tool in Computational Biology"
. Proc. European Conference on Machine Learning (ECML-02). LNAI. Vol. 2430. pp. 23–34.
6041:
5409:"Discovering hierarchical speech features using convolutional non-negative matrix factorization"
4651:
17:
7690:
4669:"Non-Negative Matrix Factorization for Learning Alignment-Specific Models of Protein Evolution"
3563:
3552:
Lee and Seung proposed NMF mainly for parts-based decomposition of images. It compares NMF to
2385:
The factorization problem in the squared error version of NMF may be stated as: Given a matrix
Raul Kompass (2007). "A Generalized Divergence Measure for Nonnegative Matrix Factorization".
7873:
7669:"Distributed Nonnegative Matrix Factorization for Web-Scale Dyadic Data Analysis on MapReduce"
Ding; Li; Peng; Park (2006). "Orthogonal nonnegative matrix t-factorizations for clustering".
NMF is applied in scalable Internet distance (round-trip time) prediction. For a network with
4071:
One specific application used hierarchical NMF on a small subset of scientific abstracts from
{\displaystyle F(\mathbf {W} ,\mathbf {H} )=\left\|\mathbf {V} -\mathbf {WH} \right\|_{F}^{2}}
7928:"A receptor model using a specific non-negative transformation technique for ambient aerosol"
6345:
5302:
Fast coordinate descent methods with variable selection for non-negative matrix factorization
3535:
is symmetric and contains a diagonal principal sub matrix of rank r. Their algorithm runs in
3330:
More recently other algorithms have been developed. Some approaches are based on alternating
4978:. Advances in Neural Information Processing Systems 13: Proceedings of the 2000 Conference.
Vamsi K. Potluru; Sergey M. Plis; Morten Morup; Vince D. Calhoun & Terran Lane (2009).
1498:) where each feature is weighted by the feature's cell value from the document's column in
dynamic medical imaging. Non-uniqueness of NMF was addressed using sparsity constraints.
3133:
Note that the updates are applied element by element, not by matrix multiplication.
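A minimal NumPy sketch of these multiplicative update rules (the function name, matrix sizes, and iteration count are illustrative choices, not from the source):

```python
import numpy as np

def nmf_multiplicative(V, r, n_iter=200, eps=1e-9, seed=0):
    """Lee-Seung multiplicative updates minimizing ||V - W H||_F^2.

    The * and / below act element-wise, matching the element-by-element
    updates described in the text.
    """
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, r)) + eps
    H = rng.random((r, m)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # H update
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # W update
    return W, H

# Exact rank-3 non-negative matrix, so a good factorization exists
rng = np.random.default_rng(42)
V = rng.random((20, 3)) @ rng.random((3, 30))
W, H = nmf_multiplicative(V, r=3)
rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Because each factor is multiplied by a non-negative ratio, the iterates stay non-negative automatically, which is the main appeal of this scheme.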
) and an extension of the Kullback–Leibler divergence to positive matrices (the original
-th cluster. This centroid's representation can be significantly enhanced by convex NMF.
. There are many algorithms for denoising if the noise is stationary. For example, the
4114:
3722:
3485:
3417:, k-means clustering, and more advanced strategies based on these and other paradigms.
3289:
2632:
2598:
has been a popular method due to the simplicity of implementation. This algorithm is:
and, if the factorization worked, it is a reasonable approximation to the input matrix
. Proc. ACM SIGKDD Int'l Conf. on Knowledge discovery and data mining. pp. 69–77.
2513:
2340:
2196:
by significantly less data, then one has to infer some latent structure in the data.
6402:
2004 IEEE International Joint Conference on Neural Networks (IEEE Cat. No.04CH37541)
4419:(Report). Max Planck Institute for Biological Cybernetics. Technical Report No. 193.
2325:
is called a nonnegative rank factorization (NRF). The problem of finding the NRF of
. Proceedings of the 2009 SIAM Conference on Data Mining (SDM). pp. 1218–1229.
More control over the non-uniqueness of NMF is obtained with sparsity constraints.
2374:
Two simple divergence functions studied by Lee and Seung are the squared error (or
". International Conference on Computer Vision (ICCV) Beijing, China, Oct., 2005.
5340:
Online Discussion Participation Prediction Using Non-negative Matrix Factorization
5300:
5106:
Thomas, L.B. (1974). "Problem 73-14, Rank factorization of nonnegative matrices".
4321:
Algorithmic: searching for global minima of the factors and factor initialization.
1830:, then the above minimization is mathematically equivalent to the minimization of
6672:"Mining the posterior cingulate: segregation between memory and pain components"
5563:
4999:"On the Equivalence of Nonnegative Matrix Factorization and Spectral Clustering"
4046:
4000:
property is used to separate the stellar light and the light scattered from the
is a matrix with 10000 rows and 500 columns, the same shape as the input matrix
549:
43:
8247:
Yong Xiang: "Blind Source Separation: Dependent Component Analysis", Springer,
8004:
7869:
6845:
6409:
6380:
5804:
5757:
5420:
5408:
5377:
5133:
Vavasis, S.A. (2009). "On the complexity of nonnegative matrix factorization".
5031:
4605:
4570:
4537:
4502:
8051:
7402:
7359:
7265:
7158:
6908:"Phoenix: A Weight-based Network Coordinate System Using Matrix Factorization"
6758:
6679:
6023:
5725:
5077:
4913:(1999). "Learning the parts of objects by non-negative matrix factorization".
4250:
subtypes, population stratification, tissue composition, and tumor clonality.
4001:
3387:
698:
394:
320:
8112:
7773:
7592:
7455:
7273:
7215:
6884:
6490:
5812:
5633:
5233:
5231:
4840:
4793:
4637:
Large-scale matrix factorization with distributed stochastic gradient descent
4503:"Non-negative Matrix Factorization: Robust Extraction of Extended Structures"
1478:
Matrix multiplication can be implemented as computing the column vectors of V as linear combinations of the column vectors in W, using coefficients supplied by the columns of H. That is, each column of V can be computed as v_i = W h_i, where v_i is the i-th column vector of the product matrix V and h_i is the i-th column vector of the matrix H.
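A quick numpy check of this column-wise view of the product (purely illustrative; the sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.random((4, 2))   # features matrix
H = rng.random((2, 3))   # coefficients matrix
V = W @ H                # full matrix product

# Each column v_i of V equals the linear combination W @ h_i.
for i in range(H.shape[1]):
    assert np.allclose(V[:, i], W @ H[:, i])
print("column-wise view verified")
```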
The contribution of the sequential NMF components can be compared with the Karhunen–Loève theorem, an application of PCA, using the plot of eigenvalues.

Since W and H are smaller than V, they become easier to store and manipulate. Another reason for factorizing V into smaller matrices W and H is that, if one's goal is to approximately represent the elements of V by significantly less data, then one has to infer some latent structure in the data.
The full decomposition of V then amounts to the two non-negative matrices W and H, as well as a residual U, such that V = WH + U. The elements of the residual matrix can either be negative or positive.
Illustration of approximate non-negative matrix factorization: the matrix V is represented by the two smaller matrices W and H, which, when multiplied, approximately reconstruct V.
NMF with the least-squares objective is equivalent to a relaxed form of K-means clustering: the matrix factor W contains cluster centroids and H contains cluster membership indicators.
The approximate factorization problem can be stated as: given a non-negative matrix V, find nonnegative matrices W and H that minimize the function F(W, H) = ||V - WH||_F^2.
If we furthermore impose an orthogonality constraint on H, i.e. H H^T = I, then the above minimization is mathematically equivalent to the minimization of K-means clustering.
NMF as a probabilistic graphical model: the visible units (V) are connected to the hidden units (H) through the weights W, so that V is generated from a probability distribution with mean Σ_a W_{ia} h_a.
In a network with N hosts, with the help of NMF, the distances of all the N^2 end-to-end links can be predicted after conducting only O(N) measurements.
Furthermore, the computed H gives the cluster membership: if H_{kj} > H_{ij} for all i ≠ k, this suggests that the input data v_j belongs to the k-th cluster. The computed W gives the cluster centroids: the k-th column of W gives the cluster centroid of the k-th cluster.
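Read as a clustering rule, each column of H is assigned to the row with the largest coefficient. A small numpy sketch of that assignment step — the factors here are hand-picked for illustration, not the output of an NMF run:

```python
import numpy as np

# Hand-picked factors: W's columns act as cluster centroids and each
# column of H weights those centroids for one data point.
W = np.array([[1.0, 0.0],
              [1.0, 0.2],
              [0.0, 1.0]])
H = np.array([[0.9, 0.1, 0.8],   # weights on centroid 0
              [0.1, 0.9, 0.2]])  # weights on centroid 1

# Cluster membership: column j belongs to cluster k when H[k, j] is
# the largest entry in column j.
labels = H.argmax(axis=0)
print(labels)  # [0 1 0]: columns 0 and 2 follow centroid 0, column 1 follows centroid 1
```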
Speech denoising has been a long-standing problem in audio signal processing.
Here is an example based on a text-mining application. Let the input matrix (the matrix to be factored) be V with 10000 rows and 500 columns, where words are in rows and documents are in columns. That is, we have 500 documents indexed by 10000 words, so a column vector v in V represents a document. Assume we ask the algorithm to find 10 features in order to generate a features matrix W with 10000 rows and 10 columns and a coefficients matrix H with 10 rows and 500 columns. The product of W and H is a matrix with 10000 rows and 500 columns, the same shape as the input matrix V. Each column of the product WH is a linear combination of the 10 column vectors in the features matrix W, with coefficients supplied by the coefficients matrix H. It is useful to think of each feature (column vector) in the features matrix W as a document archetype, and of each column of H as representing an original document in terms of those archetypes.
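The shapes in this example can be checked directly (a sketch with random data standing in for a real term-document matrix):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.random((10000, 10))   # features matrix: 10 "document archetypes"
H = rng.random((10, 500))     # coefficients matrix: one column per document
V_approx = W @ H              # same shape as the 10000 x 500 input matrix

print(V_approx.shape)  # (10000, 500)
```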
W and H may also be found in an alternating fashion with non-negative least squares: in each step of such an algorithm, first H is fixed and W is found by a non-negative least squares solver, then W is fixed and H is found analogously. The procedures used to solve for W and H may be the same or different, as some NMF variants regularize one of W and H. Specific approaches include projected gradient descent, the active set method, the optimal gradient method, and block principal pivoting.
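Alternating non-negative least squares can be sketched with scipy's NNLS solver. This is an illustrative implementation only — fixed iteration count, no regularization, column-by-column solves:

```python
import numpy as np
from scipy.optimize import nnls

def nmf_als_nnls(V, r, n_iter=20, seed=0):
    """Alternating NNLS: fix W and solve each column of H by
    non-negative least squares, then fix H and solve each row of W."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r))
    H = np.zeros((r, n))
    for _ in range(n_iter):
        for j in range(n):                 # H columns: min ||W h - v_j||, h >= 0
            H[:, j] = nnls(W, V[:, j])[0]
        for i in range(m):                 # W rows: min ||H^T w - v_i^T||, w >= 0
            W[i, :] = nnls(H.T, V[i, :])[0]
    return W, H

rng = np.random.default_rng(2)
V = rng.random((15, 3)) @ rng.random((3, 12))   # exactly rank-3, nonnegative
W, H = nmf_als_nnls(V, r=3)
print(np.linalg.norm(V - W @ H) / np.linalg.norm(V))
```

Solving column-by-column is simple but slow; practical NNLS-based NMF codes solve the subproblems in blocks.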
The factorization is not unique: a matrix B and its inverse can be used to transform the two factorization matrices by, e.g., WH = WB B^{-1}H. If the two new matrices W' = WB and H' = B^{-1}H are non-negative, they form another parametrization of the factorization. The non-negativity of W' and H' applies at least if B is a non-negative monomial matrix; in this simple case the transformation corresponds to a scaling and a permutation.
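A numpy check of this invariance, using a positive diagonal B (a special case of a non-negative monomial matrix, so the transformation is a pure rescaling):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.random((5, 3))
H = rng.random((3, 4))

B = np.diag([2.0, 0.5, 4.0])          # positive diagonal: scaling only
W2 = W @ B                            # rescaled features
H2 = np.linalg.inv(B) @ H             # inversely rescaled coefficients

# Both factor pairs are non-negative and give the same product.
assert W2.min() >= 0 and H2.min() >= 0
print(np.allclose(W @ H, W2 @ H2))  # True
```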
The sequential construction of NMF components (W and H) was first used to relate NMF with principal component analysis in astronomy.

Another type of NMF for images is based on the total variation norm.
2085:Approximate non-negative matrix factorization
In astronomy, NMF is a promising method for dimension reduction, in the sense that astrophysical signals are non-negative; it has been applied in particular to the direct imaging of circumstellar disks.
We note that the multiplicative factors for W and H, i.e., the terms W^T V / (W^T W H) and V H^T / (W H H^T), are matrices of ones when V = WH.
Different cost functions and regularizations
NMF finds applications in such fields as astronomy, computer vision, document clustering, missing data imputation, chemometrics, audio signal processing, recommender systems, and bioinformatics.
Convex non-negative matrix factorization

In standard NMF, the matrix factor W can be any non-negative matrix. Convex NMF restricts the columns of W to convex combinations of the input data vectors (v_1, …, v_n). This greatly improves the quality of the data representation of W. Furthermore, the resulting matrix factor H becomes more sparse and orthogonal.
More specifically, the approximation of V by V ≃ WH is achieved by finding W and H that minimize the error function ||V - WH||_F subject to W ≥ 0, H ≥ 0.

NMF was studied by chemometrics researchers in the 1990s under the name positive matrix factorization. It became more widely known as non-negative matrix factorization after Lee and Seung investigated the properties of the algorithm and published some simple and useful algorithms for two types of factorizations.
When the error function to be used is the Kullback–Leibler divergence, NMF is identical to probabilistic latent semantic analysis (PLSA), a popular document clustering method.
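For the Kullback–Leibler objective the multiplicative updates take a slightly different form than in the squared-error case. A minimal numpy sketch — fixed iteration count and a small epsilon for numerical safety are illustrative choices:

```python
import numpy as np

def nmf_kl(V, r, n_iter=200, eps=1e-9, seed=0):
    """Multiplicative updates for the generalized Kullback-Leibler
    divergence D(V || WH) = sum(V*log(V/(WH)) - V + WH)."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        # Numerators reweight by the ratio V / (WH); denominators are row/column sums.
        H *= (W.T @ (V / (W @ H + eps))) / (W.sum(axis=0)[:, None] + eps)
        W *= ((V / (W @ H + eps)) @ H.T) / (H.sum(axis=1)[None, :] + eps)
    return W, H

rng = np.random.default_rng(4)
V = rng.random((25, 3)) @ rng.random((3, 15))   # nonnegative, rank <= 3
W, H = nmf_kl(V, r=3)
print(np.linalg.norm(V - W @ H) / np.linalg.norm(V))
```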
NMF has been successfully applied in bioinformatics, for example for clustering gene expression and DNA methylation data, and for identifying mutational signatures, molecular subtypes, population stratification, tissue composition, and tumor clonality.
4107:Scalable Internet distance prediction
Different types of NMF arise from using different cost functions for measuring the divergence between V and WH, and possibly by regularization of the W and/or H matrices.
NMF is an instance of nonnegative quadratic programming, just like the support vector machine (SVM); the two methods are related at a deeper level, which allows solution algorithms developed for either method to be applied to problems in both domains.
When the orthogonality constraint H H^T = I is not explicitly imposed, the orthogonality holds to a large extent, and the clustering property holds too.
NMF can be used for text mining applications. In this process, a document-term matrix is constructed with the weights of various terms (typically weighted word frequency information) from a set of documents. This matrix is factored into a term-feature and a feature-document matrix. The features are derived from the contents of the documents, and the feature-document matrix describes data clusters of related documents.
Usually the number of columns of W and the number of rows of H in NMF are selected so that the product WH will become an approximation to V.
4192:Non-stationary speech denoising
NMF can be seen as a two-layer directed graphical model, consisting of one layer of observed random variables and one layer of hidden random variables.
There are several ways in which W and H may be found: Lee and Seung's multiplicative update rule has been a popular method due to its simplicity of implementation.
Nonnegative rank factorization

In case the nonnegative rank of V is equal to its actual rank, V = WH is called a nonnegative rank factorization (NRF). The problem of finding the NRF of V, if it exists, is known to be NP-hard.
3544:Relation to other techniques
Then update the values in W and H by computing the following, with n as an index of the iteration:

H^{n+1}_{[i,j]} = H^{n}_{[i,j]} ((W^{n})^T V)_{[i,j]} / ((W^{n})^T W^{n} H^{n})_{[i,j]}

W^{n+1}_{[i,j]} = W^{n}_{[i,j]} (V (H^{n+1})^T)_{[i,j]} / (W^{n} H^{n+1} (H^{n+1})^T)_{[i,j]}
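These element-wise updates can be sketched in a few lines of numpy. This is a minimal illustration of the multiplicative update scheme for the squared (Frobenius) error — the fixed iteration count and the epsilon added for numerical safety are my choices; a practical implementation would monitor convergence:

```python
import numpy as np

def nmf_multiplicative(V, r, n_iter=200, eps=1e-9, seed=0):
    """Factor V (m x n, nonnegative) into W (m x r) and H (r x n)
    with multiplicative updates for ||V - WH||_F^2. Because each
    update multiplies by a nonnegative ratio, W and H stay nonnegative."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy usage: a random nonnegative matrix of rank <= 2 is recovered closely.
rng = np.random.default_rng(1)
V = rng.random((30, 2)) @ rng.random((2, 20))
W, H = nmf_multiplicative(V, r=2)
print(np.linalg.norm(V - W @ H) / np.linalg.norm(V))  # small relative error
```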
Sparse NMF is used in population genetics for estimating individual admixture coefficients, detecting genetic clusters of individuals in a population sample, and evaluating genetic admixture in sampled genomes.
4099:Spectral data analysis
3936:
3907:
3884:The non-negativity of
3871:
3819:
3776:
3711:support vector machine
3646:
3639:
3507:KarhunenâLoĂšve theorem
3496:
3476:
3426:
3321:
3282:
3210:
3109:
2866:
2643:
2535:recommendation systems
2484:
2401:
2279:
2178:into smaller matrices
2131:as well as a residual
2059:
2016:
1996:
1976:
1956:
1936:
1901:
1851:
1824:
1784:
1759:
1721:
1666:
1646:
1626:
1591:
1566:
1402:represents a document.
1252:
1169:
992:
810:Biasâvariance tradeoff
692:Reinforcement learning
668:Spiking neural network
78:Reinforcement learning
8085:; Rafal Zdunek &
7542:10.1038/jcbfm.2015.69
7146:Acta Neuropathologica
7139:Schwalbe, E. (2013).
6724:"Enron Email Dataset"
6149:Wray Buntine (2002).
5095:. Philadelphia: SIAM.
4182:
4153:
4151:{\displaystyle N^{2}}
4126:
3937:
3908:
3872:
3820:
3777:
3707:quadratic programming
3640:
3566:
3497:
3477:
3475:{\displaystyle (n+1)}
3423:
3322:
3283:
3211:
3110:
2867:
2644:
2485:
2402:
2280:
2060:
2017:
1997:
1977:
1957:
1937:
1935:{\displaystyle v_{j}}
1902:
1852:
1825:
1785:
1760:
1722:
1667:
1647:
1627:
1592:
1567:
1253:
1170:
1016:multivariate analysis
966:
646:Neural radiance field
468:Structured prediction
191:Structured prediction
63:Unsupervised learning
7854:. pp. 252â260.
7309:. pp. 126â135.
5188:(10â12): 1824â1831.
4909:Daniel D. Lee &
4360:Tensor decomposition
4180:{\displaystyle O(N)}
4162:
4135:
4115:
4045:NMF can be used for
3942:applies at least if
3917:
3888:
3829:
3789:
3732:
3672:: the matrix factor
3599:
3486:
3454:
3296:
3220:
3152:
2880:
2655:
2633:
2495:total variation norm
2414:
2389:
2237:
2029:
2006:
1986:
1966:
1946:
1919:
1861:
1841:
1794:
1772:
1731:
1682:
1656:
1636:
1601:
1579:
1516:
1209:
1140:
835:Statistical learning
733:Learning with humans
525:Local outlier factor
8188:10.1155/2009/785152
8105:2008ISPM...25R.142C
8044:2006ChSBu..51....7L
7946:1989AtmEn..23.2289S
7756:2013ITSP...61...44W
7636:2008PatRe..41.1350B
7624:Pattern Recognition
7448:1982ITNS...29.1310D
7435:IEEE Trans Nucl Sci
7055:2008PLSCB...4E0029D
6585:2012MNRAS.427..948A
6528:2009ApJ...694L.148L
6389:. pp. 267â273.
6293:2001PaReL..22.1255W
6098:1999Natur.401..788L
6062:2012arXiv1212.4777A
6012:Linear Algebra Appl
5976:Linear Algebra Appl
5939:2016ApJ...824..117P
5877:2012ApJ...755L..28S
5671:2011SJSC...33.3261K
5626:2012ITSP...60.2882G
5407:Behnke, S. (2003).
5258:2012PLoSO...746331T
4982:. pp. 556â562.
4929:1999Natur.401..788L
4885:1995AtmEn..29.1705A
4731:Edward A. Sylvestre
4685:2011PLoSO...628898M
4597:2020ApJ...892...74R
4529:2018ApJ...852..104R
4459:2007AJ....133..734B
4401:. pp. 283â290.
4345:Multilinear algebra
4222:Population genetics
4216:Population genetics
4085:scientific journals
4006:circumstellar disks
3987:circumstellar disks
3974:dimension reduction
3554:vector quantization
2949:
2917:
2724:
2692:
2566:convolution kernels
2479:
2231:convex combinations
1508:Clustering property
1416:coefficients matrix
1076:recommender systems
1060:document clustering
678:Electrochemical RAM
585:reservoir computing
316:Logistic regression
235:Supervised learning
221:Multimodal learning
196:Feature engineering
141:Generative modeling
103:Rule-based learning
98:Curriculum learning
58:Supervised learning
33:Part of a series on
8131:Neural Computation
7992:Neural Computation
7254:Trends in Genetics
5562:Hyunsoo Kim &
5465:Neural Computation
4911:H. Sebastian Seung
4177:
4148:
4121:
3979:parallel computing
3948:is a non-negative
3932:
3903:
3867:
3815:
3772:
3690:directed graphical
3670:K-means clustering
3663:latent class model
3659:maximum likelihood
3647:
3635:
3611:
3579:) through weights
3492:
3472:
3427:
3317:
3278:
3276:
3206:
3105:
2921:
2883:
2862:
2696:
2658:
2639:
2545:If the columns of
2480:
2442:
2397:
2275:
2055:
2012:
1992:
1972:
1952:
1932:
1897:
1847:
1832:K-means clustering
1820:
1780:
1755:
1717:
1662:
1642:
1622:
1587:
1562:
1248:
1165:
993:
246: •
161:Density estimation
8308:978-1-611976-40-3
7940:(10): 2289â2298.
7879:978-1-61197-262-7
7716:mahout.apache.org
7493:10.1109/42.996340
7354:(11): 3162â3172.
7110:(12): 1495â1502.
6869:(12): 2273â2284.
6419:978-0-7803-8359-3
6287:(12): 1255â1261.
6092:(6755): 788â791.
5767:978-0-89871-593-4
5689:10.1137/110821172
5599:10.1137/07069239x
5472:(10): 2756â2779.
5430:978-0-7803-7898-8
5157:10.1137/070709967
4923:(6755): 788â791.
4879:(14): 1705â1718.
4727:William H. Lawton
4315:
4314:
4226:genetic admixture
4124:{\displaystyle N}
3929:
3900:
3841:
3802:
3602:
3495:{\displaystyle n}
3274:
3204:
3103:
2860:
2642:{\displaystyle n}
2541:Convolutional NMF
2502:L1 regularization
2166:are smaller than
2015:{\displaystyle k}
1995:{\displaystyle k}
1975:{\displaystyle W}
1955:{\displaystyle k}
1850:{\displaystyle H}
1665:{\displaystyle H}
1645:{\displaystyle W}
961:
960:
766:Model diagnostics
749:Human-in-the-loop
592:Boltzmann machine
505:Anomaly detection
301:Linear regression
216:Ontology learning
211:Grammar induction
186:Semantic analysis
181:Association rules
166:Anomaly detection
108:Neuro-symbolic AI
16:(Redirected from
8376:
8321:
8320:
8210:
8200:
8190:
8163:
8124:
8083:Andrzej Cichocki
8078:
8076:
8063:
8024:
7985:
7959:
7957:
7913:
7912:
7910:
7898:
7892:
7891:
7863:
7853:
7842:
7836:
7835:
7833:
7832:
7826:
7820:. Archived from
7815:
7806:
7800:
7799:
7797:
7796:
7790:
7784:. Archived from
7767:
7741:
7732:
7726:
7725:
7723:
7722:
7708:
7702:
7701:
7695:
7686:
7680:
7679:
7673:
7664:
7658:
7657:
7647:
7630:(4): 1350â1362.
7619:
7613:
7612:
7570:
7564:
7563:
7553:
7519:
7513:
7512:
7474:
7468:
7467:
7429:
7423:
7422:
7397:(4): 1956â1967.
7386:
7380:
7379:
7343:
7337:
7336:
7302:
7296:
7295:
7285:
7244:
7238:
7237:
7227:
7187:
7181:
7180:
7170:
7136:
7130:
7129:
7119:
7093:
7087:
7086:
7076:
7066:
7032:
7026:
7025:
7015:
6981:
6975:
6964:
6958:
6957:
6955:
6949:. Archived from
6932:
6912:
6903:
6897:
6896:
6878:
6856:
6850:
6849:
6829:
6823:
6822:
6820:
6808:
6802:
6801:
6799:
6777:
6771:
6770:
6740:
6734:
6733:
6731:
6730:
6719:
6713:
6712:
6676:
6667:
6661:
6660:
6634:
6613:
6607:
6606:
6596:
6578:
6554:
6548:
6547:
6521:
6501:
6495:
6494:
6484:
6466:
6464:astro-ph/0703072
6438:
6432:
6431:
6397:
6391:
6390:
6376:
6370:
6369:
6361:
6355:
6354:
6352:
6341:
6335:
6334:
6332:
6321:
6315:
6314:
6304:
6274:
6268:
6267:
6234:
6228:
6227:
6217:
6211:
6200:
6194:
6193:
6191:
6190:
6184:
6177:
6166:
6160:
6159:
6157:
6146:
6140:
6139:
6133:
6125:
6081:
6072:
6066:
6065:
6055:
6037:
6028:
6027:
6009:
6000:
5994:
5993:
5991:
5967:
5961:
5960:
5950:
5932:
5908:
5897:
5896:
5870:
5850:
5841:
5840:
5838:
5826:
5817:
5816:
5798:
5778:
5772:
5771:
5745:
5739:
5738:
5728:
5708:
5699:
5693:
5692:
5682:
5665:(6): 3261â3281.
5652:
5646:
5645:
5620:(6): 2882â2898.
5609:
5603:
5602:
5592:
5572:
5559:
5553:
5552:
5534:
5525:(6): 1589â1596.
5514:
5508:
5507:
5481:
5461:
5452:
5443:
5442:
5404:
5398:
5397:
5372:(7): 1087â1099.
5361:
5355:
5354:
5334:
5328:
5327:
5307:
5296:
5290:
5289:
5279:
5269:
5235:
5226:
5225:
5223:
5207:
5198:
5197:
5175:
5169:
5168:
5150:
5141:(3): 1364â1377.
5130:
5124:
5123:
5103:
5097:
5096:
5088:
5082:
5081:
5061:
5055:
5052:
5043:
5042:
5040:
5034:. Archived from
5026:(8): 3913â3927.
5017:
5008:
5002:
4995:
4984:
4983:
4977:
4966:
4957:
4956:
4906:
4897:
4896:
4859:
4853:
4852:
4812:
4806:
4805:
4767:
4761:
4760:
4723:
4717:
4716:
4706:
4696:
4664:
4658:
4657:
4647:
4641:
4640:
4628:
4619:
4618:
4608:
4590:
4566:
4551:
4550:
4540:
4522:
4498:
4479:
4478:
4452:
4450:astro-ph/0606170
4432:
4421:
4420:
4418:
4409:
4403:
4402:
4390:
4310:
4307:
4301:
4289:
4288:
4281:
4277:Current research
4255:drug repurposing
4186:
4184:
4183:
4178:
4157:
4155:
4154:
4149:
4147:
4146:
4130:
4128:
4127:
4122:
4062:feature-document
3947:
3941:
3939:
3938:
3933:
3931:
3930:
3922:
3912:
3910:
3909:
3904:
3902:
3901:
3893:
3876:
3874:
3873:
3868:
3866:
3861:
3860:
3852:
3843:
3842:
3834:
3824:
3822:
3821:
3816:
3814:
3804:
3803:
3795:
3781:
3779:
3778:
3773:
3771:
3766:
3765:
3757:
3742:
3709:, just like the
3683:
3677:
3644:
3642:
3641:
3636:
3634:
3633:
3624:
3623:
3610:
3590:
3584:
3578:
3572:
3538:
3534:
3528:
3522:
3501:
3499:
3498:
3493:
3481:
3479:
3478:
3473:
3445:
3439:
3412:
3406:
3384:gradient descent
3381:
3375:
3369:
3363:
3357:
3351:
3345:
3339:
3326:
3324:
3323:
3318:
3316:
3311:
3303:
3290:matrices of ones
3287:
3285:
3284:
3279:
3277:
3275:
3273:
3272:
3271:
3270:
3264:
3258:
3253:
3247:
3246:
3245:
3244:
3238:
3232:
3226:
3215:
3213:
3212:
3207:
3205:
3203:
3202:
3197:
3192:
3191:
3190:
3184:
3177:
3176:
3171:
3170:
3169:
3163:
3156:
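The dictionary-based denoising idea described above (clean speech is sparsely representable by a speech dictionary but non-stationary noise is not, and vice versa) can be sketched with a fixed-dictionary NMF decomposition: concatenate the two learned dictionaries, fit only the activation matrix H to the noisy spectrogram, and keep the speech part. This is a minimal illustrative sketch, not Schmidt et al.'s exact method; the dictionaries, iteration count, and tolerance here are assumptions.

```python
import numpy as np

def separate_speech(V, W_speech, W_noise, n_iter=500, eps=1e-9, seed=0):
    """Decompose a non-negative spectrogram V on the fixed concatenated
    dictionary [W_speech | W_noise] by multiplicative updates on H only,
    then split the reconstruction into speech and noise estimates."""
    rng = np.random.default_rng(seed)
    W = np.hstack([W_speech, W_noise])
    H = rng.random((W.shape[1], V.shape[1])) + eps
    for _ in range(n_iter):
        # Lee-Seung multiplicative update for H with W held fixed;
        # non-increasing for ||V - WH||_F and preserves non-negativity.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
    k = W_speech.shape[1]
    return W_speech @ H[:k], W_noise @ H[k:]
```

In practice the speech dictionary would be trained offline on clean speech and the noise dictionary on noise-only frames; here both are taken as given.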
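The fractional-residual-variance curve used above to choose the number of components for sequential NMF can be sketched by fitting NMF at each candidate rank and recording the residual fraction of the total variance. This is a minimal sketch, assuming a plain multiplicative-update fit at every rank rather than a true sequential construction.

```python
import numpy as np

def fractional_residual_variance(V, ranks, n_iter=300, eps=1e-9, seed=0):
    """For each candidate rank r, fit an NMF V ~ W @ H by multiplicative
    updates and report ||V - WH||_F^2 / ||V||_F^2, giving a scree-like
    curve whose flattening suggests a suitable number of components."""
    rng = np.random.default_rng(seed)
    curve = []
    for r in ranks:
        W = rng.random((V.shape[0], r)) + eps
        H = rng.random((r, V.shape[1])) + eps
        for _ in range(n_iter):
            H *= (W.T @ V) / (W.T @ W @ H + eps)
            W *= (V @ H.T) / (W @ H @ H.T + eps)
        curve.append(np.linalg.norm(V - W @ H) ** 2 / np.linalg.norm(V) ** 2)
    return curve
```

For data that is genuinely low-rank, the curve drops steeply up to the true rank and then flattens; continued decrease past that point signals the fitting of noise.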
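The factorization sketched above, V ≈ WH with all entries of W and H non-negative, is most commonly computed with Lee and Seung's multiplicative update rules for the Frobenius-norm objective. A minimal NumPy sketch follows; the rank, iteration count, and stabilizing epsilon are illustrative assumptions, not prescribed values.

```python
import numpy as np

def nmf_multiplicative(V, r, n_iter=200, eps=1e-9, seed=0):
    """Approximate a non-negative V (m x n) as W @ H with W (m x r) and
    H (r x n) non-negative, using Lee & Seung's multiplicative updates."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        # Each update is non-increasing for ||V - WH||_F and keeps
        # every entry non-negative (element-wise multiply of ratios).
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

V = np.abs(np.random.default_rng(1).random((20, 12)))
W, H = nmf_multiplicative(V, r=4)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Because the objective is non-convex in (W, H) jointly, different initializations generally reach different local minima, which is why the problem is only approximated numerically.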
1045:
1039:
1034:
1029:
1025:
1021:
1017:
1013:
1009:
1005:
1001:
997:
989:
983:
977:
971:
965:
954:
949:
947:
942:
940:
935:
934:
932:
931:
924:
921:
917:
914:
913:
912:
909:
907:
904:
903:
897:
896:
889:
886:
884:
881:
879:
876:
874:
871:
869:
866:
864:
861:
859:
856:
855:
849:
848:
841:
838:
836:
833:
831:
828:
826:
823:
821:
818:
816:
813:
811:
808:
806:
803:
802:
796:
795:
788:
785:
783:
780:
778:
775:
773:
770:
769:
763:
762:
755:
752:
750:
747:
745:
744:Crowdsourcing
742:
740:
737:
736:
730:
729:
720:
717:
716:
715:
712:
710:
707:
705:
702:
700:
697:
696:
693:
688:
687:
679:
676:
674:
673:Memtransistor
671:
669:
666:
664:
661:
657:
654:
653:
652:
649:
647:
644:
640:
637:
635:
632:
630:
627:
625:
622:
621:
620:
617:
615:
612:
610:
607:
605:
602:
598:
595:
594:
593:
590:
586:
583:
581:
578:
576:
573:
571:
568:
567:
566:
563:
561:
558:
556:
555:Deep learning
553:
551:
548:
547:
544:
539:
538:
531:
528:
526:
523:
521:
519:
515:
513:
510:
509:
506:
501:
500:
491:
490:Hidden Markov
488:
486:
483:
481:
478:
477:
476:
473:
472:
469:
464:
463:
456:
453:
451:
448:
446:
443:
441:
438:
436:
433:
431:
428:
426:
423:
421:
418:
416:
413:
412:
409:
404:
403:
396:
393:
391:
388:
386:
382:
380:
377:
375:
372:
370:
368:
364:
362:
359:
357:
354:
352:
349:
348:
345:
340:
339:
332:
329:
327:
324:
322:
319:
317:
314:
312:
309:
307:
304:
302:
299:
297:
295:
291:
287:
286:Random forest
284:
282:
279:
277:
274:
273:
272:
269:
267:
264:
262:
259:
258:
251:
250:
245:
244:
236:
230:
229:
222:
219:
217:
214:
212:
209:
207:
204:
202:
199:
197:
194:
192:
189:
187:
184:
182:
179:
177:
174:
172:
171:Data cleaning
169:
167:
164:
162:
159:
157:
154:
152:
149:
147:
144:
142:
139:
137:
134:
133:
127:
126:
119:
116:
114:
111:
109:
106:
104:
101:
99:
96:
94:
91:
89:
86:
84:
83:Meta-learning
81:
79:
76:
74:
71:
69:
66:
64:
61:
59:
56:
55:
49:
48:
45:
40:
36:
32:
31:
19:
8333:
8332:
8331:profile for
8328:
8178:
8172:
8135:
8129:
8096:
8090:
8035:
8029:
7996:
7990:
7976:(1): 23â35.
7973:
7967:
7937:
7931:
7896:
7847:
7840:
7829:. Retrieved
7822:the original
7817:
7804:
7793:. Retrieved
7786:the original
7750:(1): 44â56.
7747:
7743:
7730:
7719:. Retrieved
7715:
7706:
7697:
7684:
7675:
7662:
7627:
7623:
7617:
7584:
7578:
7568:
7533:
7527:
7517:
7484:
7478:
7472:
7439:
7433:
7427:
7394:
7390:
7384:
7351:
7347:
7341:
7306:
7300:
7257:
7253:
7242:
7199:
7196:Cell Reports
7195:
7185:
7150:
7144:
7134:
7107:
7101:
7091:
7046:
7040:
7030:
6995:
6989:
6979:
6971:
6962:
6951:the original
6920:
6914:
6901:
6866:
6860:
6854:
6837:
6833:
6827:
6806:
6782:
6775:
6750:
6744:
6738:
6727:. Retrieved
6717:
6684:
6678:
6665:
6622:
6618:
6611:
6566:
6562:
6552:
6509:
6505:
6499:
6454:
6450:
6436:
6401:
6395:
6381:
6374:
6365:
6359:
6346:
6339:
6326:
6319:
6302:10.1.1.21.24
6284:
6278:
6272:
6247:
6241:
6232:
6222:
6215:
6198:
6187:. Retrieved
6180:the original
6171:
6164:
6151:
6144:
6130:cite journal
6089:
6083:
6070:
6042:
6015:
6011:
5998:
5979:
5975:
5965:
5920:
5916:
5858:
5854:
5786:
5782:
5776:
5749:
5743:
5716:
5710:
5697:
5662:
5656:
5650:
5617:
5613:
5607:
5580:
5574:
5557:
5522:
5518:
5512:
5469:
5463:
5412:
5402:
5369:
5365:
5359:
5339:
5332:
5301:
5294:
5249:
5243:
5211:
5185:
5179:
5173:
5138:
5134:
5128:
5111:
5107:
5101:
5092:
5086:
5069:
5065:
5059:
5036:the original
5023:
5019:
5006:
4971:
4920:
4914:
4876:
4870:
4857:
4824:
4820:
4810:
4777:
4771:
4765:
4740:
4734:
4721:
4676:
4672:
4662:
4652:
4645:
4636:
4578:
4574:
4510:
4506:
4440:
4436:
4407:
4398:
4316:
4303:
4295:
4264:
4252:
4236:
4219:
4210:
4195:
4110:
4102:
4093:
4089:
4070:
4061:
4058:term-feature
4057:
4051:
4044:
4035:
4031:
4023:
4010:
3991:
3971:
3963:Applications
3958:
3944:
3883:
3879:non-negative
3784:
3720:
3704:
3701:
3694:
3687:
3680:
3674:
3667:
3648:
3587:
3581:
3575:
3569:
3549:
3547:
3531:
3525:
3519:
3516:
3504:
3442:
3436:
3433:
3409:
3403:
3400:
3392:
3378:
3372:
3366:
3360:
3354:
3348:
3342:
3336:
3329:
3144:
3138:
3135:
3132:
3125:
3119:
2625:
2619:
2610:
2604:
2602:initialize:
2590:
2584:
2581:
2570:
2560:
2554:
2547:
2544:
2523:
2509:
2499:
2492:
2384:
2373:
2367:
2361:
2351:
2345:
2338:
2327:
2321:
2317:
2311:
2305:In case the
2304:
2293:
2287:
2225:
2219:
2210:
2206:
2203:
2192:
2186:
2180:
2174:
2168:
2162:
2156:
2153:
2147:
2143:
2139:
2133:
2127:
2121:
2115:
2109:
2103:
2097:
2091:
2088:
2067:
2024:
1912:
1908:
1836:
1767:
1678:
1574:
1511:
1500:
1494:
1487:
1480:
1477:
1474:
1467:
1461:
1455:
1446:
1440:
1434:
1428:
1419:
1415:
1410:
1406:
1398:
1392:
1386:
1379:
1373:
1367:
1361:
1359:matrix then
1355:
1351:
1345:
1343:matrix, and
1339:
1335:
1329:
1323:
1319:
1313:
1309:
1303:
1297:
1290:
1286:
1280:
1274:
1267:
1263:
1260:
1198:
1192:
1186:
1180:
1177:
1129:
1123:
1117:
1114:
1100:
1096:
1092:chemometrics
1089:
1068:chemometrics
1049:
1043:
1037:
1027:
1007:
1003:
999:
995:
994:
987:
981:
975:
969:
830:PAC learning
517:
434:
366:
361:Hierarchical
293:
247:
241:
8181:(2): 1â17.
6625:(24): A24.
6512:(2): L148.
6441:Berné, O.;
5982:: 175â182.
5564:Haesun Park
4863:Pia Anttila
4047:text mining
4041:Text mining
3954:permutation
3396:NP-complete
3288:terms, are
3148:, i.e. the
3129:are stable.
1942:belongs to
1727:subject to
1115:Let matrix
714:Multi-agent
651:Transformer
550:Autoencoder
306:Naive Bayes
44:data mining
8348:Categories
7908:1605.06848
7831:2015-03-22
7795:2015-04-19
7721:2019-12-14
7324:1595933395
6818:1911.04705
6729:2008-08-26
6680:NeuroImage
6632:1502.03092
6569:(2): 948.
6443:Joblin, C.
6189:2007-01-29
5930:1604.06097
5923:(2): 117.
5861:(2): L28.
5836:1612.06037
5796:2109.03874
5221:cs/0202009
4588:2001.00563
4520:1712.10317
4513:(2): 104.
4002:exoplanets
3717:Uniqueness
3585:, so that
3388:active set
2578:Algorithms
2520:Online NMF
2371:matrices.
1111:Background
1033:factorized
1012:algorithms
699:Q-learning
597:Restricted
395:Mean shift
344:Clustering
321:Perceptron
249:regression
151:Clustering
146:Regression
8074:0801.3199
7856:CiteSeerX
7760:CiteSeerX
7640:CiteSeerX
7419:235634059
7376:218504587
7274:0168-9525
7216:2211-1247
6974:, 431â436
6925:CiteSeerX
6871:CiteSeerX
6797:0805.1154
6788:Wikimania
6603:119200505
6576:1207.6637
6519:0902.3247
6491:0004-6361
6297:CiteSeerX
6206:(2005). "
6053:1212.4777
5957:118349503
5868:1207.4197
5813:2364-415X
5675:CiteSeerX
5585:CiteSeerX
5527:CiteSeerX
5474:CiteSeerX
5148:0708.4149
4980:MIT Press
4849:Q29308406
4841:1180-4009
4802:Q58065673
4794:0021-8502
4615:209531731
4581:(2): 74.
3998:linearity
3994:linearity
3968:Astronomy
3927:~
3898:~
3855:−
3839:~
3800:~
3760:−
3604:∑
3593:generated
3513:Exact NMF
2919:←
2694:←
2527:streaming
2504:(akin to
2454:−
2257:…
1750:≥
1738:≥
1695:−
1610:≃
1544:…
1052:astronomy
858:ECML PKDD
840:VC theory
787:ROC curve
719:Self-play
639:DeepDream
480:Bayes net
271:Ensembles
52:Paradigms
8207:19536273
8160:13208611
8152:18785855
8060:15445516
8013:17298233
7782:12530378
7609:11060831
7601:25167546
7560:25899294
7501:11989846
7464:37186516
7411:34166199
7368:32365039
7292:30143323
7234:23318258
7177:23291781
7126:17483501
7083:18654623
7022:24496008
6991:Genetics
6893:12931155
6767:16249147
6709:18509039
6701:15946864
6657:20174209
6428:17923083
6114:10548103
5893:51088743
5735:11197117
5566:(2008).
5496:17716011
5386:24807135
5286:23133590
5245:PLOS One
5108:SIAM Rev
4945:10548103
4845:Wikidata
4798:Wikidata
4713:22216138
4673:PLOS ONE
4475:18561804
4339:See also
4014:sparsity
2466:‖
2445:‖
2217:ïŒ i.e.,
1907:for all
1790:, i.e.
1705:‖
1688:‖
1327:matrix,
1022:where a
1006:), also
281:Boosting
130:Problems
8325:Scholia
8310:(2020).
8299:(2019).
8288:(2018).
8277:(2016).
8266:(2016).
8255:(2014).
8244:(2011).
8233:(2009).
8221:(2008).
8198:2688815
8121:9997603
8101:Bibcode
8040:Bibcode
8021:5337451
7942:Bibcode
7752:Bibcode
7632:Bibcode
7551:4640278
7509:6553527
7444:Bibcode
7283:6309559
7225:3588146
7168:4313078
7074:2447881
7051:Bibcode
7013:3982712
6947:8079061
6637:Bibcode
6581:Bibcode
6544:7332750
6524:Bibcode
6469:Bibcode
6353:. NIPS.
6289:Bibcode
6264:1390831
6226:. NIPS.
6122:4428232
6094:Bibcode
6058:Bibcode
5935:Bibcode
5873:Bibcode
5667:Bibcode
5642:8143231
5622:Bibcode
5549:2183630
5504:2295736
5439:3109867
5394:8755408
5277:3487913
5254:Bibcode
5165:7150400
4953:4428232
4925:Bibcode
4881:Bibcode
4757:1267173
4704:3245233
4681:Bibcode
4656:. AAAI.
4593:Bibcode
4547:3966513
4525:Bibcode
4455:Bibcode
4296:updated
3723:inverse
3699:model.
3697:PARAFAC
2365:and/or
2359:of the
1295:is the
1272:is the
1086:History
863:NeurIPS
680:(ECRAM)
634:AlexNet
276:Bagging
8327:has a
8306:
8295:
8284:
8273:
8262:
8251:
8240:
8229:
8217:
8205:
8195:
8158:
8150:
8119:
8058:
8019:
8011:
7919:Others
7886:
7876:
7858:
7780:
7762:
7642:
7607:
7599:
7558:
7548:
7507:
7499:
7462:
7417:
7409:
7374:
7366:
7333:165018
7331:
7321:
7290:
7280:
7272:
7232:
7222:
7214:
7175:
7165:
7124:
7081:
7071:
7020:
7010:
6945:
6927:
6891:
6873:
6765:
6707:
6699:
6655:
6601:
6542:
6489:
6426:
6416:
6299:
6262:
6120:
6112:
6085:Nature
5955:
5891:
5811:
5764:
5733:
5677:
5640:
5587:
5547:
5529:
5502:
5494:
5476:
5437:
5427:
5392:
5384:
5347:
5320:
5284:
5274:
5163:
4951:
4943:
4916:Nature
4847:
4839:
4800:
4792:
4755:
4711:
4701:
4613:
4545:
4473:
4355:Tensor
4073:PubMed
4060:and a
4054:matrix
3117:Until
1333:is an
1317:is an
1261:where
1078:, and
1024:matrix
656:Vision
512:RANSAC
390:OPTICS
385:DBSCAN
369:-means
176:AutoML
8329:topic
8156:S2CID
8117:S2CID
8069:arXiv
8056:S2CID
8017:S2CID
7903:arXiv
7884:S2CID
7852:(PDF)
7825:(PDF)
7814:(PDF)
7789:(PDF)
7778:S2CID
7740:(PDF)
7694:(PDF)
7672:(PDF)
7605:S2CID
7505:S2CID
7460:S2CID
7415:S2CID
7372:S2CID
7329:S2CID
6954:(PDF)
6943:S2CID
6911:(PDF)
6889:S2CID
6813:arXiv
6792:arXiv
6763:S2CID
6705:S2CID
6675:(PDF)
6653:S2CID
6627:arXiv
6599:S2CID
6571:arXiv
6540:S2CID
6514:arXiv
6459:arXiv
6424:S2CID
6351:(PDF)
6331:(PDF)
6260:JSTOR
6183:(PDF)
6176:(PDF)
6156:(PDF)
6118:S2CID
6080:(PDF)
6048:arXiv
6008:(PDF)
5953:S2CID
5925:arXiv
5889:S2CID
5863:arXiv
5831:arXiv
5791:arXiv
5731:S2CID
5707:(PDF)
5638:S2CID
5571:(PDF)
5545:S2CID
5500:S2CID
5460:(PDF)
5435:S2CID
5390:S2CID
5306:(PDF)
5216:arXiv
5161:S2CID
5143:arXiv
5039:(PDF)
5016:(PDF)
4976:(PDF)
4949:S2CID
4753:JSTOR
4611:S2CID
4583:arXiv
4543:S2CID
4515:arXiv
4471:S2CID
4445:arXiv
4417:(PDF)
4376:Notes
4267:SPECT
4077:Enron
3537:O(rm)
3292:when
2506:Lasso
2500:When
2154:When
2080:Types
1349:is a
1105:Seung
878:IJCAI
704:SARSA
663:Mamba
629:LeNet
624:U-Net
450:t-SNE
374:Fuzzy
351:BIRCH
8304:ISBN
8293:ISBN
8282:ISBN
8271:ISBN
8260:ISBN
8249:ISBN
8238:ISBN
8227:ISBN
8215:ISBN
8203:PMID
8179:2009
8148:PMID
8009:PMID
7888:4968
7874:ISBN
7597:PMID
7556:PMID
7497:PMID
7407:PMID
7364:PMID
7319:ISBN
7288:PMID
7270:ISSN
7230:PMID
7212:ISSN
7173:PMID
7122:PMID
7079:PMID
7018:PMID
6697:PMID
6487:ISSN
6414:ISBN
6136:link
6110:PMID
5809:ISSN
5762:ISBN
5492:PMID
5425:ISBN
5382:PMID
5345:ISBN
5318:ISBN
5282:PMID
4941:PMID
4837:ISSN
4790:ISSN
4709:PMID
4269:and
4245:and
4004:and
3913:and
3877:are
3825:and
3556:and
3440:and
3407:and
3376:and
3364:and
3216:and
3142:and
3123:and
2623:and
2608:and
2588:and
2349:and
2184:and
2160:and
2125:and
1880:>
1652:and
1432:and
1371:and
1284:and
1127:and
1041:and
1018:and
1004:NNMF
979:and
888:JMLR
873:ICLR
868:ICML
754:RLHF
570:LSTM
356:CURE
42:and
8193:PMC
8183:doi
8140:doi
8109:doi
8048:doi
8001:doi
7978:doi
7950:doi
7866:doi
7770:doi
7650:doi
7589:doi
7546:PMC
7538:doi
7489:doi
7452:doi
7399:doi
7356:doi
7311:doi
7278:PMC
7262:doi
7220:PMC
7204:doi
7163:PMC
7155:doi
7151:125
7112:doi
7069:PMC
7059:doi
7008:PMC
7000:doi
6996:196
6970:",
6935:doi
6881:doi
6842:doi
6755:doi
6689:doi
6645:doi
6623:581
6589:doi
6567:427
6532:doi
6510:694
6477:doi
6455:469
6406:doi
6307:doi
6252:doi
6102:doi
6090:401
6020:doi
6016:436
5984:doi
5943:doi
5921:824
5881:doi
5859:755
5801:doi
5754:doi
5721:doi
5685:doi
5630:doi
5595:doi
5537:doi
5484:doi
5417:doi
5374:doi
5310:doi
5272:PMC
5262:doi
5190:doi
5153:doi
5116:doi
5074:doi
5028:doi
4933:doi
4921:401
4889:doi
4829:doi
4782:doi
4745:doi
4699:PMC
4689:doi
4601:doi
4579:892
4533:doi
4511:852
4463:doi
4441:133
4271:PET
3591:is
3548:In
3415:SVD
2874:and
2533:in
2309:of
2229:to
1765:,
1597:by
1396:in
1090:In
1031:is
1014:in
1002:or
1000:NMF
614:SOM
604:GAN
580:ESN
575:GRU
520:-NN
455:SDL
445:PGD
440:PCA
435:NMF
430:LDA
425:ICA
420:CCA
296:-NN
NMF factorizes a matrix V into two matrices W and H:

    V = W H

Matrix multiplication can be implemented as computing the column vectors of V as linear combinations of the column vectors in W, using coefficients supplied by the columns of H. That is, each column of V can be computed as

    v_i = W h_i

where v_i is the i-th column vector of the product matrix V and h_i is the i-th column vector of the matrix H.

When multiplying matrices, the dimensions of the factor matrices may be significantly lower than those of the product matrix, and it is this property that forms the basis of NMF: if V is an m × n matrix, W is an m × p matrix, and H is a p × n matrix, then p can be significantly less than both m and n.

Usually the number of columns of W and the number of rows of H are selected so that the product W H becomes an approximation to V,

    V ≈ W H

The full decomposition of V = (v_1, …, v_n) then amounts to the two non-negative matrices W and H together with a residual U, such that V = W H + U; the elements of the residual matrix can be either negative or positive.

The approximate factorization is usually found numerically by minimizing a cost function, for example the squared Frobenius norm of the residual,

    F(W, H) = ||V − W H||_F^2,    subject to W ≥ 0, H ≥ 0.

A popular method for this problem is the multiplicative update rule of Lee and Seung: starting from non-negative initial guesses W^0 and H^0, iterate

    H^{n+1}[i,j] = H^n[i,j] · ((W^n)^T V)[i,j] / ((W^n)^T W^n H^n)[i,j]

    W^{n+1}[i,j] = W^n[i,j] · (V (H^{n+1})^T)[i,j] / (W^n H^{n+1} (H^{n+1})^T)[i,j]

Each update keeps the factors non-negative and does not increase the cost function.

The factorization is not unique: for any invertible p × p matrix B for which W̃ = W B and H̃ = B^{-1} H remain non-negative,

    V ≈ W H = (W B)(B^{-1} H) = W̃ H̃

is an equally valid factorization, so additional constraints are often imposed to select among solutions. For instance, imposing the orthogonality constraint H H^T = I together with non-negativity relates NMF to a relaxed form of k-means clustering: the matrix H then encodes cluster membership, i.e. if H[k,j] > H[i,j] for all i ≠ k, this suggests that the data point v_j belongs to cluster k.
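The multiplicative update iteration described above can be sketched in a few lines of NumPy. This is a minimal illustration of the Lee–Seung rules under the assumption of a small dense non-negative matrix (the function name, parameters, and the small stabilizing constant `eps` are our own, not part of any standard API):

```python
import numpy as np

def nmf_multiplicative(V, p, n_iter=200, eps=1e-9, seed=0):
    """Approximate a non-negative m x n matrix V as W @ H, with W (m x p)
    and H (p x n), using the Lee-Seung multiplicative update rules."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    # Non-negative random initial guesses W^0 and H^0.
    W = rng.random((m, p)) + eps
    H = rng.random((p, n)) + eps
    for _ in range(n_iter):
        # H update: H <- H * (W^T V) / (W^T W H), elementwise.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        # W update: W <- W * (V H^T) / (W H H^T), elementwise.
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Usage: factor a small rank-2 non-negative matrix with p = 2 and check
# that the factors stay non-negative and the residual shrinks.
V = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 0.5, 0.2]])
W, H = nmf_multiplicative(V, p=2)
assert W.min() >= 0 and H.min() >= 0
assert np.linalg.norm(V - W @ H) < 0.5 * np.linalg.norm(V)
```

Because each update multiplies by a non-negative ratio, non-negativity is preserved automatically, which is why no explicit projection step is needed.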