
Sparse matrix


{\displaystyle {\begin{bmatrix}X&X&X&\cdot &\cdot &\cdot &\cdot \\X&X&\cdot &X&X&\cdot &\cdot \\X&\cdot &X&\cdot &X&\cdot &\cdot \\\cdot &X&\cdot &X&\cdot &X&\cdot \\\cdot &X&X&\cdot &X&X&X\\\cdot &\cdot &\cdot &X&X&X&\cdot \\\cdot &\cdot &\cdot &\cdot &X&\cdot &X\\\end{bmatrix}}}
In the case of a sparse matrix, substantial memory requirement reductions can be realized by storing only the non-zero entries. Depending on the number and distribution of the non-zero entries, different data structures can be used and yield huge savings in memory when compared to the basic approach.
The WSE is the largest chip ever made, at 46,225 square millimeters in area; it is 56.7 times larger than the largest graphics processing unit. It contains 78 times more AI-optimized compute cores, 3,000 times more high-speed, on-chip memory, 10,000 times more memory bandwidth, and 33,000 times more communication bandwidth.
indexes where each column starts. The name is based on the fact that column index information is compressed relative to the COO format. One typically uses another format (LIL, DOK, COO) for construction. This format is efficient for arithmetic operations, column slicing, and matrix-vector products.
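For concreteness, the column-oriented layout can be sketched in plain Python (an illustrative sketch, not the API of any particular library), using the same 4 × 4 example matrix with values 5, 8, 3, 6 that appears later on this page; the array names val, row_ind and col_ptr are illustrative:

# Column-major (CSC) view of the 4x4 matrix
# [[5 0 0 0]
#  [0 8 0 0]
#  [0 0 3 0]
#  [0 6 0 0]]
val     = [5, 8, 6, 3]        # non-zero values read column by column
row_ind = [0, 1, 3, 2]        # row index of each stored value
col_ptr = [0, 1, 3, 4, 4]     # index in val where each column starts; last entry equals NNZ

def csc_column(j):
    """Return the (row, value) pairs of column j."""
    start, end = col_ptr[j], col_ptr[j + 1]
    return list(zip(row_ind[start:end], val[start:end]))

print(csc_column(1))          # [(1, 8), (3, 6)]

Column slicing only touches the contiguous run val[col_ptr[j]:col_ptr[j+1]], which is why the format is efficient for column access.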
to the value of the elements. Elements that are missing from the dictionary are taken to be zero. The format is good for incrementally constructing a sparse matrix in random order, but poor for iterating over non-zero values in lexicographical order. One typically constructs a matrix in this format and then converts to another, more efficient format for processing.
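A minimal dictionary-of-keys sketch in plain Python (illustrative only, not the interface of any particular library):

# Dictionary of keys (DOK): map (row, column) -> value; absent keys are zero.
dok = {}
dok[(0, 0)] = 11.0            # entries can be added in any order
dok[(3, 5)] = 88.0
dok[(1, 2)] = 44.0

def get(i, j):
    return dok.get((i, j), 0.0)   # missing entries are treated as zero

print(get(3, 5))   # 88.0
print(get(2, 2))   # 0.0 -- never stored

# Iterating in lexicographical order requires an explicit sort of the keys:
for (i, j) in sorted(dok):
    print(i, j, dok[(i, j)])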
Conceptually, sparsity corresponds to systems with few pairwise interactions. For example, consider a line of balls connected by springs from one to the next: this is a sparse system as only adjacent balls are coupled. By contrast, if the same line of balls were to have springs connecting each ball
that take advantage of the sparse structure of the matrix. Specialized computers have been made for sparse matrices, as they are common in the machine learning field. Operations using standard dense-matrix structures and algorithms are slow and inefficient when applied to large sparse matrices as
The computation kernel of DNN is large sparse-dense matrix multiplication. In the field of numerical analysis, a sparse matrix is a matrix populated primarily with zeros as elements of the table. By contrast, if the number of non-zero elements in a matrix is relatively large, then it is commonly
in use. Orthogonalization methods (such as QR factorization) are common, for example, when solving problems by least squares methods. While the theoretical fill-in is still the same, in practical terms the "false non-zeros" can be different for different methods. And symbolic versions of those
of a matrix are those entries that change from an initial zero to a non-zero value during the execution of an algorithm. To reduce the memory requirements and the number of arithmetic operations used during an algorithm, it is useful to minimize the fill-in by switching rows and columns in the
Matrices with reasonably small upper and lower bandwidth are known as band matrices and often lend themselves to simpler algorithms than general sparse matrices; or one can sometimes apply dense matrix algorithms and gain efficiency simply by looping over a reduced number of indices.
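As a sketch of this idea for the extreme case, a diagonal matrix can be kept as just its main diagonal, and a tridiagonal matrix (lower and upper bandwidth 1) as one array per diagonal. The snippet below is illustrative plain Python under those assumptions:

# A diagonal n-by-n matrix needs only its n diagonal entries.
main_diag = [4.0, 1.0, 7.0, 2.0]             # represents diag(4, 1, 7, 2)

# A tridiagonal matrix additionally stores its two off-diagonals.
lower = [3.0, 3.0, 3.0]                      # entries a[i+1][i]
upper = [5.0, 5.0, 5.0]                      # entries a[i][i+1]

def tridiag_entry(i, j):
    """Entry (i, j) of the tridiagonal matrix; zero outside the band."""
    if i == j:
        return main_diag[i]
    if i == j + 1:
        return lower[j]
    if j == i + 1:
        return upper[i]
    return 0.0

print(tridiag_entry(2, 1), tridiag_entry(0, 3))   # 3.0 0.0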
considered a dense matrix. The fraction of zero elements (non-zero elements) in a matrix is called the sparsity (density). Operations using standard dense-matrix structures and algorithms are relatively slow and consume large amounts of memory when applied to large sparse matrices.
{\displaystyle \left({\begin{smallmatrix}11&22&0&0&0&0&0\\0&33&44&0&0&0&0\\0&0&55&66&77&0&0\\0&0&0&0&0&88&0\\0&0&0&0&0&0&99\\\end{smallmatrix}}\right)}
by three (one-dimensional) arrays that respectively contain nonzero values, the extents of rows, and column indices. It is similar to COO, but compresses the row indices, hence the name. This format allows fast row access and matrix-vector multiplications
(This book, by a professor at the State University of New York at Stony Brook, was the first book exclusively dedicated to sparse matrices. Graduate courses using it as a textbook were offered at that university in the early 1980s.)
LIL stores one list per row, with each entry containing the column index and the value. Typically, these entries are kept sorted by column index for faster lookup. This is another format good for incremental matrix construction.
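A minimal list-of-lists sketch (plain Python, illustrative only) keeps one list of (column, value) pairs per row, sorted by column index:

import bisect

# LIL: one list per row of (column, value) pairs, kept sorted by column.
rows = [[] for _ in range(4)]            # a 4-row matrix, initially all zero

def set_entry(i, j, v):
    cols = [c for c, _ in rows[i]]
    pos = bisect.bisect_left(cols, j)
    if pos < len(cols) and cols[pos] == j:
        rows[i][pos] = (j, v)            # overwrite an existing entry
    else:
        rows[i].insert(pos, (j, v))      # keep the row sorted by column index

set_entry(0, 0, 5.0)
set_entry(1, 1, 8.0)
set_entry(3, 1, 6.0)
set_entry(2, 2, 3.0)
print(rows[3])                           # [(1, 6.0)]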
{\displaystyle \mathbf {A} ={\begin{bmatrix}\mathbf {A} _{1}&0&\cdots &0\\0&\mathbf {A} _{2}&\cdots &0\\\vdots &\vdots &\ddots &\vdots \\0&0&\cdots &\mathbf {A} _{n}\end{bmatrix}},}
The WSE contains 400,000 AI-optimized compute cores. Called SLAC™ for Sparse Linear Algebra Cores, the compute cores are flexible, programmable, and optimized for the sparse linear algebra that underpins all neural network computation.
tuples. Ideally, the entries are sorted first by row index and then by column index, to improve random access times. This is another format that is good for incremental matrix construction.
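A coordinate-list sketch (again plain Python, not a specific library's API), using the non-zero values 10 through 80 of the 4 × 6 example matrix shown elsewhere on this page:

# COO: a list of (row, column, value) tuples.
coo = [
    (1, 1, 30), (0, 0, 10), (2, 4, 70),
    (0, 1, 20), (1, 3, 40), (2, 2, 50),
    (2, 3, 60), (3, 5, 80),
]

# Entries may be appended in any order during construction;
# sorting by (row, column) afterwards speeds up row-wise access.
coo.sort(key=lambda t: (t[0], t[1]))
print(coo[:3])    # [(0, 0, 10), (0, 1, 20), (1, 1, 30)]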
but a common criterion is that the number of non-zero elements is roughly equal to the number of rows or columns. By contrast, if most of the elements are non-zero, the matrix is considered
{\displaystyle {\begin{pmatrix}10&20&0&0&0&0\\0&30&0&40&0&0\\0&0&50&60&70&0\\0&0&0&0&0&80\\\end{pmatrix}}}
The (old and new) Yale sparse matrix formats are instances of the CSR scheme. The old Yale format works exactly as described above, with three arrays; the new format combines
Those that support efficient modification, such as DOK (Dictionary of keys), LIL (List of lists), or COO (Coordinate list). These are typically used to construct the matrices.
The trade-off is that accessing the individual elements becomes more complex, and additional structures are needed to recover the original matrix unambiguously.
It is likely known as the Yale format because it was proposed in the 1977 Yale Sparse Matrix Package report from the Department of Computer Science at Yale University.
would not be redundant). Nonetheless, this does avoid the need to handle an exceptional case when computing the length of each row, as it guarantees that the formula ROW_INDEX[i + 1] − ROW_INDEX[i] works for any row i.
CSC is similar to CSR except that values are read first by column, a row index is stored for each value, and column pointers are stored. For example, CSC is
As another example, the following sparse matrix has lower and upper bandwidth both equal to 3. Notice that zeros are represented with dots for clarity.
in which most of the elements are zero. There is no strict definition regarding the proportion of zero-value elements for a matrix to qualify as
Trilinos, a large C++ library, with sub-libraries dedicated to the storage of dense and sparse matrices and solution of corresponding linear systems.
In this case the CSR representation contains 13 entries, compared to 16 in the original matrix. The CSR format saves on memory only when NNZ < (m(n − 1) − 1) / 2, where the matrix is m × n with NNZ non-zero entries.
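A quick check of that bound (a sketch; the count 2·NNZ + m + 1 assumes the three-array CSR layout described above):

def csr_storage(m, n, nnz):
    """Stored values for an m-by-n matrix with nnz non-zeros: dense vs. CSR."""
    dense = m * n
    csr = 2 * nnz + m + 1     # V and COL_INDEX hold nnz each, ROW_INDEX holds m + 1
    return dense, csr

print(csr_storage(4, 4, 4))   # (16, 13): the 4x4 example above
print(csr_storage(4, 6, 8))   # (24, 21): the 4x6 example with entries 10..80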
"Argonne National Laboratory Deploys Cerebras CS-1, the World's Fastest Artificial Intelligence Computer | Argonne National Laboratory"
For logical adjacency matrices, the data array can be omitted, as the existence of an entry in the row array is sufficient to model a binary adjacency relation.
{\displaystyle {\begin{pmatrix}5&0&0&0\\0&8&0&0\\0&0&3&0\\0&6&0&0\\\end{pmatrix}}}
Many software libraries support sparse matrices, and provide solvers for sparse matrix equations. The following are open-source:
Those that support efficient access and matrix operations, such as CSR (Compressed Sparse Row) or CSC (Compressed Sparse Column).
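A short usage sketch of the two groups, assuming SciPy (which this page mentions) is available; scipy.sparse.lil_matrix and the tocsr conversion are standard SciPy, but treat the snippet as illustrative rather than as the page's prescribed workflow:

import numpy as np
import scipy.sparse as sp

# Build incrementally in a modification-friendly format (LIL) ...
a = sp.lil_matrix((1000, 1000))
a[0, 1] = 2.0
a[42, 999] = -1.5

# ... then convert once to an access/arithmetic-friendly format (CSR).
b = a.tocsr()
print(b.nnz)              # 2 stored non-zero entries
x = np.ones(1000)
print(b.dot(x)[:3])       # fast sparse matrix-vector product: [2. 0. 0.]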
, so they are in some sense redundant (although in programming languages where the array length needs to be explicitly stored,
The above sparse matrix contains only 9 non-zero elements, with 26 zero elements. Its sparsity is 74%, and its density is 26%.
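The percentages quoted above can be checked directly (a trivial sketch):

nonzero, zero = 9, 26
total = nonzero + zero                     # 35 elements in the 5x7 example
sparsity = zero / total                    # 26/35
density = nonzero / total                  # 9/35
print(f"sparsity {sparsity:.0%}, density {density:.0%}")   # sparsity 74%, density 26%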
Yan, Di; Wu, Tao; Liu, Ying; Gao, Yang (2017). "An efficient sparse-dense matrix multiplication on a multicore system".
1357:). The CSR format has been in use since at least the mid-1960s, with the first complete description appearing in 1967. 3609: 2915: 2898: 2000: 416: 4179: 3765: 3287: 3114: 2881: 2876: 2257: 2243: 2229: 2105: 2006: 4230: 3869: 3740: 3654: 2871: 2535: 4302: 4144: 3974: 3864: 3572: 3252: 2910: 2905: 2864: 2794: 1877:. Moreover, the memory cost of this redundant storage is likely insignificant for a sufficiently large matrix. 352:, which typically have a low density of significant data or connections. Large sparse matrices often appear in 4009: 3938: 3820: 3680: 3277: 3164: 3145: 3122: 31: 3879: 3462: 3267: 3127: 2927: 2660: 2285: 2003:, another finite element library that also has a sub-library for sparse linear systems and their solution. 1117: 1110: 4189: 3825: 3562: 3412: 3407: 3242: 3217: 3212: 3053: 3008: 2970: 2070: 1958: 900: 852: 389: 340:
to all other balls, the system would correspond to a dense matrix. The concept of sparsity is useful in
281: 4086: 2278:
Parallel sparse matrix-vector and matrix-transpose-vector multiplication using compressed sparse blocks
1206:
A matrix is typically stored as a two-dimensional array. Each entry in the array represents an element
392:. Some very large sparse matrices are infeasible to manipulate using standard dense-matrix algorithms. 1949:, a large C library, containing many different matrix solvers for a variety of matrix storage formats. 4184: 4019: 3377: 3207: 3187: 2993: 2544: 2272: 1391: 1222: 305: 293: 2665: 2290: 1997:, a finite element library that also has a sub-library for sparse linear systems and their solution. 4163: 4040: 4014: 3592: 3397: 3387: 2411: 2085: 3036: 2027:
Fortran 77 library for sparse matrix diagonalization and manipulation, using the Arnoldi algorithm
1121:
algorithms can be used in the same manner as the symbolic Cholesky to compute worst case fill-in.
4091: 4045: 4035: 3989: 3984: 3913: 3849: 3715: 3452: 3447: 3382: 3372: 3237: 3021: 2886: 2846: 2635: 2570: 2340: 1962: 1137: 504: 349: 289: 2579:
Also NOAA Technical Memorandum NOS NGS-4, National Geodetic Survey, Rockville, MD. Referencing
276: 4102: 3889: 3884: 3874: 3854: 3815: 3810: 3639: 3634: 3619: 3614: 3605: 3600: 3547: 3442: 3392: 3337: 3307: 3302: 3282: 3272: 3232: 2955: 2854: 2774: 2764: 2739: 2729: 2695:
An EU-funded project on sparse models, algorithms and dictionary learning for large-scale data.
2521: 2511: 2477: 2467: 2446: 2419: 2147: 2095: 1292: 4199: 4097: 4065: 3994: 3933: 3928: 3908: 3844: 3750: 3720: 3705: 3690: 3685: 3624: 3577: 3552: 3542: 3513: 3432: 3427: 3402: 3332: 3312: 3222: 3202: 2978: 2756: 2721: 2704: 2670: 2627: 2592: 2560: 2552: 2438: 2332: 2139: 2075: 2040: 1968: 1895: 1258:
matrix, the amount of memory required to store the matrix in this format is proportional to
1147: 1130: 884: 880: 385: 2461: 4307: 4240: 3795: 3730: 3710: 3695: 3675: 3659: 3557: 3488: 3478: 3437: 3322: 3292: 2998: 2940: 2057: 843: 830: 381: 2548: 2358: 1961:
is a C++ library that contains several sparse matrix solvers. However, none of them are
4158: 4055: 3999: 3979: 3964: 3923: 3800: 3760: 3725: 3649: 3588: 3567: 3508: 3498: 3483: 3417: 3362: 3352: 3347: 3257: 3090: 3068: 2893: 2817: 2504: 2090: 1988: 1892: 1195: 1177: 888: 376: 345: 2491: 2317: 2276: 1917:
is an array of the (top-to-bottom, then left-to-right) non-zero values of the matrix;
1602:
Then we take slices from V and COL_INDEX starting at row_start and ending at row_end.
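The row-extraction procedure just described can be written out as a short sketch (plain Python, zero-indexed, using the V, COL_INDEX and ROW_INDEX arrays of the 4 × 4 example with values 5, 8, 3, 6):

# CSR arrays for the 4x4 matrix
# [[5 0 0 0]
#  [0 8 0 0]
#  [0 0 3 0]
#  [0 6 0 0]]
V         = [5, 8, 3, 6]
COL_INDEX = [0, 1, 2, 1]
ROW_INDEX = [0, 1, 2, 3, 4]     # ROW_INDEX[i+1] - ROW_INDEX[i] = non-zeros in row i

def row(i):
    row_start, row_end = ROW_INDEX[i], ROW_INDEX[i + 1]
    return list(zip(COL_INDEX[row_start:row_end], V[row_start:row_end]))

print(row(1))   # [(1, 8)]: row 1 has one element, value 8 at column 1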
4296: 4060: 3918: 3859: 3790: 3780: 3775: 3700: 3629: 3503: 3493: 3342: 3327: 3262: 3063: 2945: 2574: 2407: 2080: 1410:, and contain the non-zero values and the column indices of those values respectively 848: 341: 2639: 316:. The number of zero-valued elements divided by the total number of elements (e.g., 3943: 3900: 3805: 3518: 3457: 3367: 3247: 2648: 2565: 2344: 2036: 1300: 2681: 1268:(disregarding the fact that the dimensions of the matrix also need to be stored). 1930:
This is the traditional format for specifying a sparse matrix in MATLAB (via the
3785: 3755: 3523: 3357: 3227: 3058: 2983: 2100: 2015:
provides support for several sparse matrix formats, linear algebra, and solvers.
460: 412: 406: 357: 58: 2615: 3836: 3297: 3046: 2950: 2708: 2596: 2533:
Snay, Richard A. (1976). "Reducing the profile of sparse symmetric matrices".
2442: 2357:
Eisenstat, S. C.; Gursky, M. C.; Schultz, M. H.; Sherman, A. H. (April 1977).
2143: 353: 2778: 2743: 2525: 2481: 4220: 4070: 3644: 2988: 2935: 2760: 2725: 1109:
can be used to calculate the worst possible fill-in before doing the actual
903:
consists of sub-matrices along its diagonal blocks. A block-diagonal matrix
372: 4113: 3085: 2631: 2136:
2017 IEEE 17th International Conference on Communication Technology (ICCT)
1849:(10, 20, ...) (0, 30, 0, 40, ...)(0, 0, 50, 60, 70, 0) (0, 0, 0, 0, 0, 80) 1621:. We now know that in row 1 we have one element at column 1 with value 8. 4004: 3031: 2859: 1952: 368: 2466:. Mathematics in science and engineering. Vol. 99. Academic Press. 2172:"Cerebras Systems Unveils the Industry's First Trillion Transistor Chip" 17: 3080: 3026: 2556: 2336: 1994: 1888:
into a single array and handles the diagonal of the matrix separately.
2033:
Library for solution of large scale linear systems and sparse matrices
4271: 4261: 3075: 3016: 2024: 2018: 841:
A very efficient structure for an extreme case of band matrices, the
2674: 2616:"A comparison of several bandwidth and profile reduction algorithms" 1198:
can significantly accelerate convergence of such iterative methods.
36: 2271:
Buluç, Aydın; Fineman, Jeremy T.; Frigo, Matteo; Gilbert, John R.;
1304:
and then converts to another more efficient format for processing.
2786: 2030: 2012: 1141: 275: 3156: 2614:
Gibbs, Norman E.; Poole, William G.; Stockmeyer, Paul K. (1976).
3097: 829:
with a lower bandwidth. A number of algorithms are designed for
4117: 3160: 2790: 2692: 384:
are wasted on the zeros. Sparse data is by nature more easily
2389: 2284:. ACM Symp. on Parallelism in Algorithms and Architectures. 2060:
who initiated some pioneering work but then left the field.
1605:
To extract the row 1 (the second row) of this matrix we set
284:
in two dimensions. The non-zero elements are shown in black.
2687: 2647:
Gilbert, John R.; Moler, Cleve; Schreiber, Robert (1992).
2021:
is a C++ and C# library with sparse linear algebra support
371:, it is beneficial and often necessary to use specialized 2009:
provides a user-friendly C++ wrapper for BLAS and LAPACK.
1248:
is the column index, numbered from left to right. For an
2129: 2127: 2701:
Iterative Solution of Large Sparse Systems of Equations
2649:"Sparse matrices in MATLAB: Design and Implementation" 1651: 1478: 929: 527: 1921:
is the row indices corresponding to the values; and,
1645: 1472: 1416:
contains the column in which the corresponding entry
1180: 1150: 915: 521: 51: 1144:
utilize fast computations of matrix-vector products
1133:
and direct methods exist for sparse matrix solving.
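Iterative solvers of this kind spend most of their time in the sparse matrix-vector product, which is straightforward over CSR arrays; a sketch reusing the V/COL_INDEX/ROW_INDEX naming from the CSR discussion:

def csr_matvec(V, COL_INDEX, ROW_INDEX, x):
    """y = A @ x for a CSR matrix A; the cost is proportional to the non-zeros only."""
    n_rows = len(ROW_INDEX) - 1
    y = [0.0] * n_rows
    for i in range(n_rows):
        for k in range(ROW_INDEX[i], ROW_INDEX[i + 1]):
            y[i] += V[k] * x[COL_INDEX[k]]
    return y

# 4x4 example from the CSR section, multiplied by the all-ones vector:
print(csr_matvec([5, 8, 3, 6], [0, 1, 2, 1], [0, 1, 2, 3, 4], [1, 1, 1, 1]))
# [5.0, 8.0, 3.0, 6.0]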
4249: 4213: 4172: 4151: 4028: 3952: 3898: 3834: 3668: 3586: 3532: 3471: 3195: 3113: 3007: 2969: 2926: 2845: 2824: 2043:, provides support for sparse matrices and solvers. 1242:is the row index, numbered from top to bottom, and 367:When storing and manipulating sparse matrices on a 2503: 1788: 1575: 1442:where the given row starts. This is equivalent to 1186: 1166: 1056: 802: 253: 1376:in row form using three (one-dimensional) arrays 2653:SIAM Journal on Matrix Analysis and Applications 1800:matrix (24 entries) with 8 nonzero elements, so 1446:encoding the total number of nonzeros above row 816:By rearranging the rows and columns of a matrix 411:An important special type of sparse matrices is 1329:Compressed sparse row (CSR, CRS or Yale format) 30:"Sparsity" redirects here. For other uses, see 2390:Oral history interview with Harry M. Markowitz 4129: 3172: 2802: 2318:"Sparse Matrix Multiplication Package (SMMP)" 2316:Bank, Randolph E.; Douglas, Craig C. (1993), 1855:Note that in this format, the first value of 500: 27:Matrix in which most of the elements are zero 8: 1599:row_start = ROW_INDEX row_end = ROW_INDEX 2718:Iterative Methods for Sparse Linear Systems 4136: 4122: 4114: 3746:Fundamental (linear differential equation) 3179: 3165: 3157: 2809: 2795: 2787: 2418:(3rd ed.). Baltimore: Johns Hopkins. 1803:V = COL_INDEX = ROW_INDEX = 2664: 2620:ACM Transactions on Mathematical Software 2564: 2289: 1646: 1644: 1473: 1471: 1341:(CRS) or Yale format represents a matrix 1221:of the matrix and is accessed by the two 1179: 1158: 1149: 1037: 1032: 974: 969: 938: 933: 924: 916: 914: 522: 520: 56: 50: 2753:Direct Methods for Sparse Linear Systems 2587:Scott, Jennifer; Tuma, Miroslav (2023). 1806:The whole is stored as 21 entries: 8 in 1590:V = COL_INDEX = ROW_INDEX = 1384:denote the number of nonzero entries in 1275:Formats can be divided into two groups: 879:A symmetric sparse matrix arises as the 332:matrix) is sometimes referred to as the 280:A sparse matrix obtained when solving a 4267:Basic Linear Algebra Subprograms (BLAS) 4051:Matrix representation of conic sections 2433:Stoer, Josef; Bulirsch, Roland (2002). 2123: 1458:immediately after the last valid index 2492:"Sparse Matrix Multiplication Package" 1859:is always zero and the last is always 1587:matrix with 4 nonzero elements, hence 847:, is to store just the entries in the 822:it may be possible to obtain a matrix 2490:Bank, Randolph E.; Douglas, Craig C. 2325:Advances in Computational Mathematics 1905:Compressed sparse column (CSC or CCS) 887:; it can be stored efficiently as an 388:and thus requires significantly less 7: 2589:Algorithms for Sparse Linear Systems 2580: 2304: 1987:olver), written in Fortran90, is a 1831:(10, 20) (30, 40) (50, 60, 70) (80) 1596:To extract a row, we first define: 2435:Introduction to Numerical Analysis 2371:from the original on April 6, 2019 1593:assuming a zero-indexed language. 25: 2682:Sparse Matrix Algorithms Research 1841:) where each row starts and ends; 1116:There are other methods than the 57: 4085: 2684:at the Texas A&M University. 1454:, i.e., the fictitious index in 1033: 970: 934: 917: 3953:Used in science and engineering 1360:The CSR format stores a sparse 1125:Solving sparse matrix equations 1107:symbolic Cholesky decomposition 3196:Explicitly constrained entries 2460:Tewarson, Reginald P. (1973). 
2111:Matrix Market exchange formats 362:partial differential equations 344:and application areas such as 1: 3970:Fundamental (computer vision) 2688:SuiteSparse Matrix Collection 2699:Hackbusch, Wolfgang (2016). 2502:Pissanetzky, Sergio (1984). 2359:"Yale Sparse Matrix Package" 1639:Another example, the matrix 3736:Duplication and elimination 3535:eigenvalues or eigenvectors 3133:Directed acyclic word graph 2899:Double-ended priority queue 1136:Iterative methods, such as 1077:is a square matrix for all 417:lower bandwidth of a matrix 4324: 4180:System of linear equations 3669:With specific applications 3298:Discrete Fourier Transform 2751:Davis, Timothy A. (2006). 2703:(2nd ed.). Springer. 2437:(3rd ed.). Springer. 2106:Harwell-Boeing file format 1847:aligns values in columns: 1833:, indicating the index of 1613:. Then we make the slices 503:, §1.2.1). For example, a 415:, defined as follows. The 404: 360:applications when solving 29: 4231:Cache-oblivious algorithm 4079: 3960:Cabibbo–Kobayashi–Maskawa 3587:Satisfying conditions on 3141: 2709:10.1007/978-3-319-28483-5 2597:10.1007/978-3-031-25820-6 2443:10.1007/978-0-387-21738-3 2144:10.1109/icct.2017.8359956 2138:. IEEE. pp. 1880–3. 1434:and encodes the index in 1378:(V, COL_INDEX, ROW_INDEX) 501:Golub & Van Loan 1996 4282:General purpose software 4145:Numerical linear algebra 2865:Retrieval Data Structure 2506:Sparse Matrix Technology 2217:communication bandwidth. 1466:For example, the matrix 1287:Dictionary of keys (DOK) 42:Example of sparse matrix 3318:Generalized permutation 3146:List of data structures 3123:Binary decision diagram 2761:10.1137/1.9780898718881 2726:10.1137/1.9780898718003 2566:2027/uc1.31210024848523 2259:scipy.sparse.coo_matrix 2245:scipy.sparse.lil_matrix 2231:scipy.sparse.dok_matrix 2056:was possibly coined by 2039:, a Python library for 1911:(val, row_ind, col_ptr) 1450:. The last element is 463:is the smallest number 424:is the smallest number 32:Sparse (disambiguation) 4092:Mathematics portal 3128:Directed acyclic graph 1983:arallel sparse direct 1790: 1577: 1339:compressed row storage 1194:is sparse. The use of 1188: 1168: 1167:{\displaystyle Ax_{i}} 1118:Cholesky decomposition 1111:Cholesky decomposition 1058: 831:bandwidth minimization 804: 285: 282:finite element problem 255: 4277:Specialized libraries 4190:Matrix multiplication 4185:Matrix decompositions 2716:Saad, Yousef (2003). 2632:10.1145/355705.355707 2273:Leiserson, Charles E. 2071:Matrix representation 1869:ROW_INDEX − ROW_INDEX 1791: 1578: 1394:shall be used here.) 1335:compressed sparse row 1321:COO stores a list of 1317:Coordinate list (COO) 1189: 1169: 1059: 901:block-diagonal matrix 865:matrix requires only 853:one-dimensional array 805: 279: 256: 2994:Unrolled linked list 2412:Van Loan, Charles F. 2176:www.businesswire.com 1643: 1470: 1323:(row, column, value) 1178: 1148: 913: 519: 511:and upper bandwidth 507:has lower bandwidth 430:such that the entry 294:scientific computing 49: 4164:Numerical stability 4041:Linear independence 3288:Diagonally dominant 3042:Self-balancing tree 2549:1976BGeod..50..341S 2536:Bulletin Géodésique 2416:Matrix Computations 2086:Single-entry matrix 1308:List of lists (LIL) 4046:Matrix exponential 4036:Jordan normal form 3870:Fisher information 3741:Euclidean distance 3655:Totally unimodular 3022:Binary search tree 2887:Double-ended queue 2557:10.1007/BF02521587 2510:. Academic Press. 
2337:10.1007/BF02070824 1896:adjacency matrices 1871:works for any row 1786: 1780: 1573: 1567: 1392:zero-based indices 1291:DOK consists of a 1236:. Conventionally, 1184: 1164: 1138:conjugate gradient 1054: 1045: 800: 794: 505:tridiagonal matrix 445:vanishes whenever 350:numerical analysis 290:numerical analysis 286: 251: 245: 244: 4290: 4289: 4111: 4110: 4103:Category:Matrices 3975:Fuzzy associative 3865:Doubly stochastic 3573:Positive-definite 3253:Block tridiagonal 3154: 3153: 2956:Hashed array tree 2855:Associative array 2770:978-0-89871-613-9 2735:978-0-89871-534-7 2517:978-0-12-557580-5 2452:978-0-387-95452-3 2425:978-0-8018-5414-9 2392:, pp. 9, 10. 2153:978-1-5090-3944-9 2096:Sparse graph code 1825:splits the array 1187:{\displaystyle A} 459:. Similarly, the 274: 273: 16:(Redirected from 4315: 4200:Matrix splitting 4138: 4131: 4124: 4115: 4098:List of matrices 4090: 4089: 4066:Row echelon form 4010:State transition 3939:Seidel adjacency 3821:Totally positive 3681:Alternating sign 3278:Complex Hadamard 3181: 3174: 3167: 3158: 2979:Association list 2811: 2804: 2797: 2788: 2782: 2747: 2712: 2678: 2668: 2643: 2600: 2578: 2568: 2529: 2509: 2498: 2496: 2485: 2456: 2429: 2393: 2387: 2381: 2380: 2378: 2376: 2370: 2363: 2354: 2348: 2347: 2322: 2313: 2307: 2302: 2296: 2295: 2293: 2283: 2268: 2262: 2260: 2254: 2248: 2246: 2240: 2234: 2232: 2226: 2220: 2219: 2213: 2212: 2197: 2191: 2190: 2184: 2183: 2168: 2162: 2161: 2131: 2076:Pareto principle 2041:machine learning 1933: 1928: 1924: 1920: 1916: 1912: 1887: 1883: 1876: 1870: 1866: 1862: 1858: 1850: 1846: 1840: 1836: 1832: 1828: 1824: 1817: 1813: 1809: 1799: 1795: 1793: 1792: 1787: 1785: 1784: 1635: 1620: 1616: 1612: 1608: 1586: 1582: 1580: 1579: 1574: 1572: 1571: 1461: 1457: 1453: 1449: 1445: 1441: 1437: 1433: 1426: 1419: 1415: 1409: 1405: 1401: 1389: 1383: 1379: 1375: 1369: 1356: 1346: 1324: 1298: 1267: 1257: 1247: 1241: 1235: 1229: 1220: 1193: 1191: 1190: 1185: 1173: 1171: 1170: 1165: 1163: 1162: 1096:Reducing fill-in 1086: 1076: 1063: 1061: 1060: 1055: 1050: 1049: 1042: 1041: 1036: 979: 978: 973: 943: 942: 937: 920: 908: 885:undirected graph 881:adjacency matrix 870: 864: 855:, so a diagonal 828: 821: 809: 807: 806: 801: 799: 798: 792: 754: 716: 678: 640: 602: 564: 514: 510: 498: 484: 468: 458: 444: 429: 423: 269: 260: 258: 257: 252: 250: 246: 37: 21: 4323: 4322: 4318: 4317: 4316: 4314: 4313: 4312: 4303:Sparse matrices 4293: 4292: 4291: 4286: 4245: 4241:Multiprocessing 4209: 4205:Sparse problems 4168: 4147: 4142: 4112: 4107: 4084: 4075: 4024: 3948: 3894: 3830: 3664: 3582: 3528: 3467: 3268:Centrosymmetric 3191: 3185: 3155: 3150: 3137: 3109: 3003: 2999:XOR linked list 2965: 2941:Circular buffer 2922: 2841: 2820: 2818:Data structures 2815: 2785: 2771: 2750: 2736: 2715: 2698: 2675:10.1137/0613024 2666:10.1.1.470.1054 2646: 2613: 2609: 2607:Further reading 2604: 2586: 2532: 2518: 2501: 2494: 2489: 2474: 2463:Sparse Matrices 2459: 2453: 2432: 2426: 2406: 2402: 2397: 2396: 2388: 2384: 2374: 2372: 2368: 2361: 2356: 2355: 2351: 2320: 2315: 2314: 2310: 2303: 2299: 2291:10.1.1.211.5256 2281: 2270: 2269: 2265: 2258: 2255: 2251: 2244: 2241: 2237: 2230: 2227: 2223: 2210: 2208: 2207:(Press release) 2199: 2198: 2194: 2181: 2179: 2170: 2169: 2165: 2154: 2133: 2132: 2125: 2120: 2115: 2066: 2058:Harry Markowitz 2050: 1940: 1931: 1926: 1925:is the list of 1922: 1918: 1914: 1910: 1907: 1885: 1881: 1872: 1868: 1864: 1860: 1856: 1848: 1844: 1838: 1834: 1830: 1826: 1822: 1815: 1811: 1807: 1804: 1797: 1779: 1778: 1773: 1768: 1763: 1758: 1753: 1747: 1746: 1741: 1736: 
1731: 1726: 1721: 1715: 1714: 1709: 1704: 1699: 1694: 1689: 1683: 1682: 1677: 1672: 1667: 1662: 1657: 1647: 1641: 1640: 1625: 1618: 1614: 1610: 1606: 1600: 1591: 1584: 1566: 1565: 1560: 1555: 1550: 1544: 1543: 1538: 1533: 1528: 1522: 1521: 1516: 1511: 1506: 1500: 1499: 1494: 1489: 1484: 1474: 1468: 1467: 1459: 1455: 1451: 1447: 1443: 1439: 1435: 1428: 1424: 1417: 1413: 1407: 1403: 1399: 1385: 1381: 1377: 1371: 1361: 1349: 1342: 1331: 1322: 1319: 1310: 1296: 1289: 1259: 1249: 1243: 1237: 1231: 1225: 1219: 1207: 1204: 1196:preconditioners 1176: 1175: 1174:, where matrix 1154: 1146: 1145: 1127: 1098: 1093: 1078: 1075: 1067: 1044: 1043: 1031: 1029: 1024: 1019: 1013: 1012: 1007: 1002: 997: 991: 990: 985: 980: 968: 966: 960: 959: 954: 949: 944: 932: 925: 911: 910: 904: 897: 877: 866: 856: 844:diagonal matrix 839: 823: 817: 793: 791: 786: 781: 776: 771: 766: 761: 755: 753: 748: 743: 738: 733: 728: 723: 717: 715: 710: 705: 700: 695: 690: 685: 679: 677: 672: 667: 662: 657: 652: 647: 641: 639: 634: 629: 624: 619: 614: 609: 603: 601: 596: 591: 586: 581: 576: 571: 565: 563: 558: 553: 548: 543: 538: 533: 523: 517: 516: 512: 508: 486: 482: 470: 464: 461:upper bandwidth 446: 443: 431: 425: 419: 409: 403: 398: 380:processing and 377:data structures 336:of the matrix. 270: 267: 261: 243: 242: 237: 232: 227: 222: 217: 212: 206: 205: 200: 195: 190: 185: 180: 175: 169: 168: 163: 158: 153: 148: 143: 138: 132: 131: 126: 121: 116: 111: 106: 101: 95: 94: 89: 84: 79: 74: 69: 64: 52: 47: 46: 44: 35: 28: 23: 22: 15: 12: 11: 5: 4321: 4319: 4311: 4310: 4305: 4295: 4294: 4288: 4287: 4285: 4284: 4279: 4274: 4269: 4264: 4259: 4253: 4251: 4247: 4246: 4244: 4243: 4238: 4233: 4228: 4223: 4217: 4215: 4211: 4210: 4208: 4207: 4202: 4197: 4187: 4182: 4176: 4174: 4170: 4169: 4167: 4166: 4161: 4159:Floating point 4155: 4153: 4149: 4148: 4143: 4141: 4140: 4133: 4126: 4118: 4109: 4108: 4106: 4105: 4100: 4095: 4080: 4077: 4076: 4074: 4073: 4068: 4063: 4058: 4056:Perfect matrix 4053: 4048: 4043: 4038: 4032: 4030: 4026: 4025: 4023: 4022: 4017: 4012: 4007: 4002: 3997: 3992: 3987: 3982: 3977: 3972: 3967: 3962: 3956: 3954: 3950: 3949: 3947: 3946: 3941: 3936: 3931: 3926: 3921: 3916: 3911: 3905: 3903: 3896: 3895: 3893: 3892: 3887: 3882: 3877: 3872: 3867: 3862: 3857: 3852: 3847: 3841: 3839: 3832: 3831: 3829: 3828: 3826:Transformation 3823: 3818: 3813: 3808: 3803: 3798: 3793: 3788: 3783: 3778: 3773: 3768: 3763: 3758: 3753: 3748: 3743: 3738: 3733: 3728: 3723: 3718: 3713: 3708: 3703: 3698: 3693: 3688: 3683: 3678: 3672: 3670: 3666: 3665: 3663: 3662: 3657: 3652: 3647: 3642: 3637: 3632: 3627: 3622: 3617: 3612: 3603: 3597: 3595: 3584: 3583: 3581: 3580: 3575: 3570: 3565: 3563:Diagonalizable 3560: 3555: 3550: 3545: 3539: 3537: 3533:Conditions on 3530: 3529: 3527: 3526: 3521: 3516: 3511: 3506: 3501: 3496: 3491: 3486: 3481: 3475: 3473: 3469: 3468: 3466: 3465: 3460: 3455: 3450: 3445: 3440: 3435: 3430: 3425: 3420: 3415: 3413:Skew-symmetric 3410: 3408:Skew-Hermitian 3405: 3400: 3395: 3390: 3385: 3380: 3375: 3370: 3365: 3360: 3355: 3350: 3345: 3340: 3335: 3330: 3325: 3320: 3315: 3310: 3305: 3300: 3295: 3290: 3285: 3280: 3275: 3270: 3265: 3260: 3255: 3250: 3245: 3243:Block-diagonal 3240: 3235: 3230: 3225: 3220: 3218:Anti-symmetric 3215: 3213:Anti-Hermitian 3210: 3205: 3199: 3197: 3193: 3192: 3186: 3184: 3183: 3176: 3169: 3161: 3152: 3151: 3149: 3148: 3142: 3139: 3138: 3136: 3135: 3130: 3125: 3119: 3117: 3111: 3110: 3108: 3107: 3106: 3105: 3095: 3094: 3093: 3091:Hilbert R-tree 3088: 3083: 3073: 3072: 3071: 3069:Fibonacci heap 3066: 3061: 3051: 3050: 3049: 
3044: 3039: 3037:Red–black tree 3034: 3029: 3019: 3013: 3011: 3005: 3004: 3002: 3001: 2996: 2991: 2986: 2981: 2975: 2973: 2967: 2966: 2964: 2963: 2958: 2953: 2948: 2943: 2938: 2932: 2930: 2924: 2923: 2921: 2920: 2919: 2918: 2913: 2903: 2902: 2901: 2894:Priority queue 2891: 2890: 2889: 2879: 2874: 2869: 2868: 2867: 2862: 2851: 2849: 2843: 2842: 2840: 2839: 2834: 2828: 2826: 2822: 2821: 2816: 2814: 2813: 2806: 2799: 2791: 2784: 2783: 2769: 2748: 2734: 2713: 2696: 2690: 2685: 2679: 2659:(1): 333–356. 2644: 2626:(4): 322–330. 2610: 2608: 2605: 2603: 2602: 2591:. Birkhauser. 2584: 2543:(4): 341–352. 2530: 2516: 2499: 2487: 2472: 2457: 2451: 2430: 2424: 2408:Golub, Gene H. 2403: 2401: 2398: 2395: 2394: 2382: 2349: 2308: 2297: 2263: 2249: 2235: 2221: 2192: 2163: 2152: 2122: 2121: 2119: 2116: 2114: 2113: 2108: 2103: 2098: 2093: 2091:Skyline matrix 2088: 2083: 2078: 2073: 2067: 2065: 2062: 2049: 2046: 2045: 2044: 2034: 2028: 2022: 2016: 2010: 2004: 1998: 1992: 1989:frontal solver 1966: 1956: 1950: 1939: 1936: 1906: 1903: 1853: 1852: 1842: 1802: 1783: 1777: 1774: 1772: 1769: 1767: 1764: 1762: 1759: 1757: 1754: 1752: 1749: 1748: 1745: 1742: 1740: 1737: 1735: 1732: 1730: 1727: 1725: 1722: 1720: 1717: 1716: 1713: 1710: 1708: 1705: 1703: 1700: 1698: 1695: 1693: 1690: 1688: 1685: 1684: 1681: 1678: 1676: 1673: 1671: 1668: 1666: 1663: 1661: 1658: 1656: 1653: 1652: 1650: 1598: 1589: 1570: 1564: 1561: 1559: 1556: 1554: 1551: 1549: 1546: 1545: 1542: 1539: 1537: 1534: 1532: 1529: 1527: 1524: 1523: 1520: 1517: 1515: 1512: 1510: 1507: 1505: 1502: 1501: 1498: 1495: 1493: 1490: 1488: 1485: 1483: 1480: 1479: 1477: 1464: 1463: 1421: 1411: 1406:are of length 1330: 1327: 1318: 1315: 1309: 1306: 1288: 1285: 1284: 1283: 1280: 1211: 1203: 1200: 1183: 1161: 1157: 1153: 1126: 1123: 1097: 1094: 1092: 1089: 1071: 1053: 1048: 1040: 1035: 1030: 1028: 1025: 1023: 1020: 1018: 1015: 1014: 1011: 1008: 1006: 1003: 1001: 998: 996: 993: 992: 989: 986: 984: 981: 977: 972: 967: 965: 962: 961: 958: 955: 953: 950: 948: 945: 941: 936: 931: 930: 928: 923: 919: 896: 895:Block diagonal 893: 889:adjacency list 876: 873: 838: 835: 797: 790: 787: 785: 782: 780: 777: 775: 772: 770: 767: 765: 762: 760: 757: 756: 752: 749: 747: 744: 742: 739: 737: 734: 732: 729: 727: 724: 722: 719: 718: 714: 711: 709: 706: 704: 701: 699: 696: 694: 691: 689: 686: 684: 681: 680: 676: 673: 671: 668: 666: 663: 661: 658: 656: 653: 651: 648: 646: 643: 642: 638: 635: 633: 630: 628: 625: 623: 620: 618: 615: 613: 610: 608: 605: 604: 600: 597: 595: 592: 590: 587: 585: 582: 580: 577: 575: 572: 570: 567: 566: 562: 559: 557: 554: 552: 549: 547: 544: 542: 539: 537: 534: 532: 529: 528: 526: 474: 435: 405:Main article: 402: 399: 397: 394: 346:network theory 272: 271: 266: 263: 262: 249: 241: 238: 236: 233: 231: 228: 226: 223: 221: 218: 216: 213: 211: 208: 207: 204: 201: 199: 196: 194: 191: 189: 186: 184: 181: 179: 176: 174: 171: 170: 167: 164: 162: 159: 157: 154: 152: 149: 147: 144: 142: 139: 137: 134: 133: 130: 127: 125: 122: 120: 117: 115: 112: 110: 107: 105: 102: 100: 97: 96: 93: 90: 88: 85: 83: 80: 78: 75: 73: 70: 68: 65: 63: 60: 59: 55: 45: 40: 26: 24: 14: 13: 10: 9: 6: 4: 3: 2: 4320: 4309: 4306: 4304: 4301: 4300: 4298: 4283: 4280: 4278: 4275: 4273: 4270: 4268: 4265: 4263: 4260: 4258: 4255: 4254: 4252: 4248: 4242: 4239: 4237: 4234: 4232: 4229: 4227: 4224: 4222: 4219: 4218: 4216: 4212: 4206: 4203: 4201: 4198: 4195: 4191: 4188: 4186: 4183: 4181: 4178: 4177: 4175: 4171: 4165: 4162: 4160: 4157: 4156: 4154: 4150: 4146: 4139: 4134: 4132: 4127: 4125: 4120: 4119: 4116: 4104: 4101: 
4099: 4096: 4094: 4093: 4088: 4082: 4081: 4078: 4072: 4069: 4067: 4064: 4062: 4061:Pseudoinverse 4059: 4057: 4054: 4052: 4049: 4047: 4044: 4042: 4039: 4037: 4034: 4033: 4031: 4029:Related terms 4027: 4021: 4020:Z (chemistry) 4018: 4016: 4013: 4011: 4008: 4006: 4003: 4001: 3998: 3996: 3993: 3991: 3988: 3986: 3983: 3981: 3978: 3976: 3973: 3971: 3968: 3966: 3963: 3961: 3958: 3957: 3955: 3951: 3945: 3942: 3940: 3937: 3935: 3932: 3930: 3927: 3925: 3922: 3920: 3917: 3915: 3912: 3910: 3907: 3906: 3904: 3902: 3897: 3891: 3888: 3886: 3883: 3881: 3878: 3876: 3873: 3871: 3868: 3866: 3863: 3861: 3858: 3856: 3853: 3851: 3848: 3846: 3843: 3842: 3840: 3838: 3833: 3827: 3824: 3822: 3819: 3817: 3814: 3812: 3809: 3807: 3804: 3802: 3799: 3797: 3794: 3792: 3789: 3787: 3784: 3782: 3779: 3777: 3774: 3772: 3769: 3767: 3764: 3762: 3759: 3757: 3754: 3752: 3749: 3747: 3744: 3742: 3739: 3737: 3734: 3732: 3729: 3727: 3724: 3722: 3719: 3717: 3714: 3712: 3709: 3707: 3704: 3702: 3699: 3697: 3694: 3692: 3689: 3687: 3684: 3682: 3679: 3677: 3674: 3673: 3671: 3667: 3661: 3658: 3656: 3653: 3651: 3648: 3646: 3643: 3641: 3638: 3636: 3633: 3631: 3628: 3626: 3623: 3621: 3618: 3616: 3613: 3611: 3607: 3604: 3602: 3599: 3598: 3596: 3594: 3590: 3585: 3579: 3576: 3574: 3571: 3569: 3566: 3564: 3561: 3559: 3556: 3554: 3551: 3549: 3546: 3544: 3541: 3540: 3538: 3536: 3531: 3525: 3522: 3520: 3517: 3515: 3512: 3510: 3507: 3505: 3502: 3500: 3497: 3495: 3492: 3490: 3487: 3485: 3482: 3480: 3477: 3476: 3474: 3470: 3464: 3461: 3459: 3456: 3454: 3451: 3449: 3446: 3444: 3441: 3439: 3436: 3434: 3431: 3429: 3426: 3424: 3421: 3419: 3416: 3414: 3411: 3409: 3406: 3404: 3401: 3399: 3396: 3394: 3391: 3389: 3386: 3384: 3381: 3379: 3378:Pentadiagonal 3376: 3374: 3371: 3369: 3366: 3364: 3361: 3359: 3356: 3354: 3351: 3349: 3346: 3344: 3341: 3339: 3336: 3334: 3331: 3329: 3326: 3324: 3321: 3319: 3316: 3314: 3311: 3309: 3306: 3304: 3301: 3299: 3296: 3294: 3291: 3289: 3286: 3284: 3281: 3279: 3276: 3274: 3271: 3269: 3266: 3264: 3261: 3259: 3256: 3254: 3251: 3249: 3246: 3244: 3241: 3239: 3236: 3234: 3231: 3229: 3226: 3224: 3221: 3219: 3216: 3214: 3211: 3209: 3208:Anti-diagonal 3206: 3204: 3201: 3200: 3198: 3194: 3189: 3182: 3177: 3175: 3170: 3168: 3163: 3162: 3159: 3147: 3144: 3143: 3140: 3134: 3131: 3129: 3126: 3124: 3121: 3120: 3118: 3116: 3112: 3104: 3101: 3100: 3099: 3096: 3092: 3089: 3087: 3084: 3082: 3079: 3078: 3077: 3074: 3070: 3067: 3065: 3064:Binomial heap 3062: 3060: 3057: 3056: 3055: 3052: 3048: 3045: 3043: 3040: 3038: 3035: 3033: 3030: 3028: 3025: 3024: 3023: 3020: 3018: 3015: 3014: 3012: 3010: 3006: 3000: 2997: 2995: 2992: 2990: 2987: 2985: 2982: 2980: 2977: 2976: 2974: 2972: 2968: 2962: 2961:Sparse matrix 2959: 2957: 2954: 2952: 2949: 2947: 2946:Dynamic array 2944: 2942: 2939: 2937: 2934: 2933: 2931: 2929: 2925: 2917: 2914: 2912: 2909: 2908: 2907: 2904: 2900: 2897: 2896: 2895: 2892: 2888: 2885: 2884: 2883: 2880: 2878: 2875: 2873: 2870: 2866: 2863: 2861: 2858: 2857: 2856: 2853: 2852: 2850: 2848: 2844: 2838: 2835: 2833: 2830: 2829: 2827: 2823: 2819: 2812: 2807: 2805: 2800: 2798: 2793: 2792: 2789: 2780: 2776: 2772: 2766: 2762: 2758: 2754: 2749: 2745: 2741: 2737: 2731: 2727: 2723: 2719: 2714: 2710: 2706: 2702: 2697: 2694: 2693:SMALL project 2691: 2689: 2686: 2683: 2680: 2676: 2672: 2667: 2662: 2658: 2654: 2650: 2645: 2641: 2637: 2633: 2629: 2625: 2621: 2617: 2612: 2611: 2606: 2601:(Open Access) 2598: 2594: 2590: 2585: 2582: 2576: 2572: 2567: 2562: 2558: 2554: 2550: 2546: 2542: 2538: 2537: 2531: 2527: 2523: 2519: 2513: 2508: 2507: 2500: 2493: 2488: 
2483: 2479: 2475: 2473:0-12-685650-8 2469: 2465: 2464: 2458: 2454: 2448: 2444: 2440: 2436: 2431: 2427: 2421: 2417: 2413: 2409: 2405: 2404: 2399: 2391: 2386: 2383: 2367: 2360: 2353: 2350: 2346: 2342: 2338: 2334: 2330: 2326: 2319: 2312: 2309: 2306: 2301: 2298: 2292: 2287: 2280: 2279: 2274: 2267: 2264: 2261: 2253: 2250: 2247: 2239: 2236: 2233: 2225: 2222: 2218: 2206: 2202: 2196: 2193: 2189: 2177: 2173: 2167: 2164: 2160: 2155: 2149: 2145: 2141: 2137: 2130: 2128: 2124: 2117: 2112: 2109: 2107: 2104: 2102: 2099: 2097: 2094: 2092: 2089: 2087: 2084: 2082: 2081:Ragged matrix 2079: 2077: 2074: 2072: 2069: 2068: 2063: 2061: 2059: 2055: 2054:sparse matrix 2047: 2042: 2038: 2035: 2032: 2029: 2026: 2023: 2020: 2017: 2014: 2011: 2008: 2005: 2002: 1999: 1996: 1993: 1990: 1986: 1982: 1978: 1974: 1970: 1967: 1964: 1960: 1957: 1954: 1951: 1948: 1945: 1944: 1943: 1937: 1935: 1904: 1902: 1899: 1897: 1894: 1889: 1878: 1875: 1843: 1821: 1820: 1819: 1801: 1781: 1775: 1770: 1765: 1760: 1755: 1750: 1743: 1738: 1733: 1728: 1723: 1718: 1711: 1706: 1701: 1696: 1691: 1686: 1679: 1674: 1669: 1664: 1659: 1654: 1648: 1637: 1634:− 1) − 1) / 2 1633: 1629: 1622: 1603: 1597: 1594: 1588: 1568: 1562: 1557: 1552: 1547: 1540: 1535: 1530: 1525: 1518: 1513: 1508: 1503: 1496: 1491: 1486: 1481: 1475: 1431: 1427:is of length 1422: 1412: 1397: 1396: 1395: 1393: 1390:. (Note that 1388: 1374: 1368: 1364: 1358: 1355: 1352: 1345: 1340: 1336: 1328: 1326: 1316: 1314: 1307: 1305: 1302: 1297:(row, column) 1294: 1286: 1281: 1278: 1277: 1276: 1273: 1269: 1266: 1262: 1256: 1252: 1246: 1240: 1234: 1228: 1224: 1218: 1214: 1210: 1201: 1199: 1197: 1181: 1159: 1155: 1151: 1143: 1139: 1134: 1132: 1124: 1122: 1119: 1114: 1112: 1108: 1103: 1095: 1090: 1088: 1085: 1081: 1074: 1070: 1064: 1051: 1046: 1038: 1026: 1021: 1016: 1009: 1004: 999: 994: 987: 982: 975: 963: 956: 951: 946: 939: 926: 921: 909:has the form 907: 902: 894: 892: 890: 886: 882: 874: 872: 869: 863: 859: 854: 850: 849:main diagonal 846: 845: 836: 834: 832: 826: 820: 814: 810: 795: 788: 783: 778: 773: 768: 763: 758: 750: 745: 740: 735: 730: 725: 720: 712: 707: 702: 697: 692: 687: 682: 674: 669: 664: 659: 654: 649: 644: 636: 631: 626: 621: 616: 611: 606: 598: 593: 588: 583: 578: 573: 568: 560: 555: 550: 545: 540: 535: 530: 524: 506: 502: 497: 493: 489: 481: 477: 473: 467: 462: 457: 453: 449: 442: 438: 434: 428: 422: 418: 414: 408: 400: 396:Special cases 395: 393: 391: 387: 383: 378: 374: 370: 365: 363: 359: 355: 351: 347: 343: 342:combinatorics 337: 335: 331: 327: 323: 319: 315: 311: 307: 303: 299: 298:sparse matrix 295: 291: 283: 278: 265: 264: 247: 239: 234: 229: 224: 219: 214: 209: 202: 197: 192: 187: 182: 177: 172: 165: 160: 155: 150: 145: 140: 135: 128: 123: 118: 113: 108: 103: 98: 91: 86: 81: 76: 71: 66: 61: 53: 43: 39: 38: 33: 19: 4204: 4152:Key concepts 4083: 4015:Substitution 3901:graph theory 3422: 3398:Quaternionic 3388:Persymmetric 2960: 2916:Disjoint-set 2752: 2717: 2700: 2656: 2652: 2623: 2619: 2588: 2540: 2534: 2505: 2462: 2434: 2415: 2385: 2373:. Retrieved 2352: 2328: 2324: 2311: 2300: 2277: 2266: 2252: 2238: 2224: 2215: 2209:. Retrieved 2204: 2195: 2186: 2180:. Retrieved 2178:. 
2019-08-19 2175: 2166: 2157: 2135: 2053: 2051: 2037:scikit-learn 1984: 1980: 1976: 1972: 1963:parallelized 1941: 1908: 1900: 1890: 1879: 1873: 1854: 1805: 1638: 1631: 1627: 1623: 1619:COL_INDEX = 1604: 1601: 1595: 1592: 1465: 1429: 1386: 1372: 1366: 1362: 1359: 1353: 1350: 1343: 1338: 1334: 1332: 1320: 1311: 1290: 1274: 1270: 1264: 1260: 1254: 1250: 1244: 1238: 1232: 1226: 1216: 1212: 1208: 1205: 1135: 1128: 1115: 1105:matrix. The 1101: 1099: 1083: 1079: 1072: 1068: 1065: 905: 898: 878: 867: 861: 857: 842: 840: 824: 818: 815: 811: 495: 491: 487: 479: 475: 471: 465: 455: 451: 447: 440: 436: 432: 426: 420: 410: 366: 338: 333: 329: 325: 321: 317: 313: 309: 302:sparse array 301: 297: 287: 41: 3990:Hamiltonian 3914:Biadjacency 3850:Correlation 3766:Householder 3716:Commutation 3453:Vandermonde 3448:Tridiagonal 3383:Permutation 3373:Nonnegative 3358:Matrix unit 3238:Bisymmetric 3059:Binary heap 2984:Linked list 2331:: 127–137, 2205:www.anl.gov 2188:computation 2101:Sparse file 1975:ltifrontal 1934:function). 1829:into rows: 1814:, and 5 in 1607:row_start=1 1420:is located. 1398:The arrays 1140:method and 413:band matrix 407:Band matrix 358:engineering 4297:Categories 4194:algorithms 3890:Transition 3885:Stochastic 3855:Covariance 3837:statistics 3816:Symplectic 3811:Similarity 3640:Unimodular 3635:Orthogonal 3620:Involutory 3615:Invertible 3610:Projection 3606:Idempotent 3548:Convergent 3443:Triangular 3393:Polynomial 3338:Hessenberg 3308:Equivalent 3303:Elementary 3283:Copositive 3273:Conference 3233:Bidiagonal 3047:Splay tree 2951:Hash table 2832:Collection 2400:References 2211:2019-12-02 2182:2019-12-02 1626:NNZ < ( 1423:The array 1295:that maps 1293:dictionary 1082:= 1, ..., 469:such that 386:compressed 373:algorithms 354:scientific 4221:CPU cache 4071:Wronskian 3995:Irregular 3985:Gell-Mann 3934:Laplacian 3929:Incidence 3909:Adjacency 3880:Precision 3845:Centering 3751:Generator 3721:Confusion 3706:Circulant 3686:Augmented 3645:Unipotent 3625:Nilpotent 3601:Congruent 3578:Stieltjes 3553:Defective 3543:Companion 3514:Redheffer 3433:Symmetric 3428:Sylvester 3403:Signature 3333:Hermitian 3313:Frobenius 3223:Arrowhead 3203:Alternant 3103:Hash tree 2989:Skip list 2936:Bit array 2837:Container 2779:694087302 2744:693784152 2661:CiteSeerX 2581:Saad 2003 2575:123079384 2526:680489638 2482:316552948 2305:Saad 2003 2286:CiteSeerX 2052:The term 2007:Armadillo 1979:assively 1886:COL_INDEX 1882:ROW_INDEX 1857:ROW_INDEX 1845:COL_INDEX 1839:COL_INDEX 1823:ROW_INDEX 1816:ROW_INDEX 1812:COL_INDEX 1611:row_end=2 1444:ROW_INDEX 1440:COL_INDEX 1425:ROW_INDEX 1414:COL_INDEX 1404:COL_INDEX 1337:(CSR) or 1131:iterative 1027:⋯ 1010:⋮ 1005:⋱ 1000:⋮ 995:⋮ 983:⋯ 952:⋯ 875:Symmetric 871:entries. 784:⋅ 774:⋅ 769:⋅ 764:⋅ 759:⋅ 751:⋅ 731:⋅ 726:⋅ 721:⋅ 698:⋅ 683:⋅ 675:⋅ 665:⋅ 655:⋅ 645:⋅ 637:⋅ 632:⋅ 622:⋅ 612:⋅ 599:⋅ 594:⋅ 579:⋅ 561:⋅ 556:⋅ 551:⋅ 546:⋅ 485:whenever 4250:Software 4214:Hardware 4173:Problems 3899:Used in 3835:Used in 3796:Rotation 3771:Jacobian 3731:Distance 3711:Cofactor 3696:Carleman 3676:Adjugate 3660:Weighing 3593:inverses 3589:products 3558:Definite 3489:Identity 3479:Exchange 3472:Constant 3438:Toeplitz 3323:Hadamard 3293:Diagonal 3032:AVL tree 2911:Multiset 2860:Multimap 2847:Abstract 2755:. SIAM. 2720:. SIAM. 2640:14494429 2414:(1996). 2366:Archived 2275:(2009). 
2064:See also 1953:Trilinos 1938:Software 1913:, where 837:Diagonal 369:computer 334:sparsity 18:Sparsity 4000:Overlap 3965:Density 3924:Edmonds 3801:Seifert 3761:Hessian 3726:Coxeter 3650:Unitary 3568:Hurwitz 3499:Of ones 3484:Hilbert 3418:Skyline 3363:Metzler 3353:Logical 3348:Integer 3258:Boolean 3190:classes 3086:R+ tree 3081:R* tree 3027:AA tree 2545:Bibcode 2375:6 April 2345:6412241 2048:History 1995:deal.II 1923:col_ptr 1919:row_ind 1893:logical 1810:, 8 in 1460:NNZ − 1 1370:matrix 1223:indices 1202:Storage 1102:fill-in 390:storage 324:for an 4308:Arrays 4272:LAPACK 4262:MATLAB 3919:Degree 3860:Design 3791:Random 3781:Payoff 3776:Moment 3701:Cartan 3691:Bézout 3630:Normal 3504:Pascal 3494:Lehmer 3423:Sparse 3343:Hollow 3328:Hankel 3263:Cauchy 3188:Matrix 3115:Graphs 3076:R-tree 3017:B-tree 2971:Linked 2928:Arrays 2777:  2767:  2742:  2732:  2663:  2638:  2573:  2524:  2514:  2480:  2470:  2449:  2422:  2343:  2288:  2150:  2025:ARPACK 2019:ALGLIB 1959:Eigen3 1932:sparse 1380:. Let 1066:where 883:of an 401:Banded 382:memory 310:sparse 306:matrix 4257:ATLAS 3980:Gamma 3944:Tutte 3806:Shear 3519:Shift 3509:Pauli 3458:Walsh 3368:Moore 3248:Block 3009:Trees 2882:Queue 2877:Stack 2825:Types 2636:S2CID 2571:S2CID 2495:(PDF) 2369:(PDF) 2362:(PDF) 2341:S2CID 2321:(PDF) 2282:(PDF) 2118:Notes 2031:SLEPc 2013:SciPy 1969:MUMPS 1947:PETSc 1837:(and 1798:4 × 6 1796:is a 1585:4 × 4 1583:is a 1301:pairs 1142:GMRES 1129:Both 851:as a 490:< 450:> 314:dense 304:is a 4236:SIMD 3786:Pick 3756:Gram 3524:Zero 3228:Band 3098:Trie 3054:Heap 2872:List 2775:OCLC 2765:ISBN 2740:OCLC 2730:ISBN 2522:OCLC 2512:ISBN 2478:OCLC 2468:ISBN 2447:ISBN 2420:ISBN 2377:2019 2256:See 2242:See 2228:See 2148:ISBN 2001:DUNE 1891:For 1884:and 1617:and 1615:V = 1609:and 1438:and 1402:and 1333:The 1230:and 1100:The 375:and 348:and 296:, a 292:and 4226:TLB 3875:Hat 3608:or 3591:or 2906:Set 2757:doi 2722:doi 2705:doi 2671:doi 2628:doi 2593:doi 2561:hdl 2553:doi 2439:doi 2333:doi 2140:doi 1927:val 1915:val 1865:NNZ 1861:NNZ 1452:NNZ 1432:+ 1 1408:NNZ 1382:NNZ 1091:Use 483:= 0 356:or 300:or 288:In 4299:: 2773:. 2763:. 2738:. 2728:. 2669:. 2657:13 2655:. 2651:. 2634:. 2622:. 2618:. 2569:. 2559:. 2551:. 2541:50 2539:. 2520:. 2476:. 2445:. 2410:; 2364:. 2339:, 2327:, 2323:, 2214:. 2203:. 2185:. 2174:. 2156:. 2146:. 2126:^ 1973:MU 1818:. 1776:80 1739:70 1734:60 1729:50 1702:40 1692:30 1660:20 1655:10 1636:. 1365:× 1263:× 1253:× 1113:. 1087:. 899:A 891:. 860:× 833:. 494:− 454:+ 364:. 328:× 320:× 240:99 198:88 156:77 151:66 146:55 109:44 104:33 67:22 62:11 4196:) 4192:( 4137:e 4130:t 4123:v 4005:S 3463:Z 3180:e 3173:t 3166:v 2810:e 2803:t 2796:v 2781:. 2759:: 2746:. 2724:: 2711:. 2707:: 2677:. 2673:: 2642:. 2630:: 2624:2 2599:. 2595:: 2583:. 2577:. 2563:: 2555:: 2547:: 2528:. 2497:. 2484:. 2455:. 2441:: 2428:. 2379:. 2335:: 2329:1 2294:. 2142:: 1991:. 1985:S 1981:P 1977:M 1971:( 1965:. 1874:i 1851:. 1835:V 1827:V 1808:V 1782:) 1771:0 1766:0 1761:0 1756:0 1751:0 1744:0 1724:0 1719:0 1712:0 1707:0 1697:0 1687:0 1680:0 1675:0 1670:0 1665:0 1649:( 1632:n 1630:( 1628:m 1569:) 1563:0 1558:0 1553:6 1548:0 1541:0 1536:3 1531:0 1526:0 1519:0 1514:0 1509:8 1504:0 1497:0 1492:0 1487:0 1482:5 1476:( 1462:. 
