iteration, the kernel is shifted to the centroid, or the mean, of the points within it. The method of calculating this mean depends on the choice of the kernel: if a Gaussian kernel is chosen instead of a flat kernel, every point is first assigned a weight that decays exponentially with its distance from the kernel's center. At convergence, there is no direction in which a shift can accommodate more points inside the kernel.
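The iterative shift just described can be sketched in a few lines. This is an illustrative sketch, not a library implementation, assuming a flat kernel of radius `bandwidth` (all names are ours):

```python
import numpy as np

def mean_shift_point(x, points, bandwidth, max_iter=100, tol=1e-6):
    """Shift x to the mean of the points inside a flat kernel of the
    given bandwidth, repeating until the shift is negligible."""
    for _ in range(max_iter):
        # points inside the window centered at the current estimate
        inside = points[np.linalg.norm(points - x, axis=1) <= bandwidth]
        new_x = inside.mean(axis=0)          # centroid of the window
        if np.linalg.norm(new_x - x) < tol:  # converged: no useful shift left
            break
        x = new_x
    return x

# two well-separated groups; a start near either one climbs to its centroid
pts = np.array([[0.0, 0.0], [0.1, 0.2], [-0.1, 0.1],
                [5.0, 5.0], [5.1, 4.9], [4.9, 5.2]])
mode = mean_shift_point(np.array([0.3, 0.3]), pts, bandwidth=1.0)
```

Starting inside the left group, the window never captures the right group, so the estimate converges to the left group's centroid.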
differentiable, convex, and strictly decreasing profile function. The one-dimensional case, however, has limited real-world applications. Convergence of the algorithm in higher dimensions has also been proved when the number of stationary (or isolated) points is finite, but sufficient conditions for a general kernel function to have finitely many stationary (or isolated) points have not been provided.
confidence map is a probability density function on the new image, assigning each pixel the probability that its color occurs in the object in the previous image. A few algorithms, such as kernel-based object tracking, ensemble tracking, and CAMShift, expand on this idea.
as the kernel. Mean-shift is a hill-climbing algorithm that involves shifting this kernel iteratively to a higher-density region until convergence. Every shift is defined by a mean shift vector, which always points toward the direction of maximum increase in density. At every
The mean shift algorithm can be used for visual tracking. The simplest such algorithm would create a confidence map in the new image based on the color histogram of the object in the previous image, and use mean shift to find the peak of the confidence map near the object's old position. The
Although the mean shift algorithm has been widely used in many applications, a rigorous proof of its convergence using a general kernel in a high-dimensional space is still not known. Aliyari Ghassabeh showed the convergence of the mean shift algorithm in one dimension with a
from the equation above, we can find its local maxima using gradient ascent or some other optimization technique. The problem with this "brute force" approach is that, for higher dimensions, it becomes computationally prohibitive to evaluate
. The superscripts s and r denote the spatial and range components of a vector, respectively. The assignment specifies that the filtered data at the spatial location x_i^s will have the range component of the point of convergence
simultaneously. The first question, then, is how to estimate the density function given a sparse set of samples. One of the simplest approaches is to just smooth the data, e.g., by convolving it with a fixed kernel of width
Fukunaga, Keinosuke; Hostetler, Larry D. (January 1975). "The Estimation of the Gradient of a Density Function, with Applications in Pattern Recognition".
The mean shift procedure is usually credited to work by Fukunaga and Hostetler in 1975. It is, however, reminiscent of earlier work by Schnell in 1964.
—of a density function given discrete data sampled from that function. This is an iterative method, and we start with an initial estimate
{\displaystyle K(x)={\begin{cases}1&{\text{if}}\ \|x\|\leq \lambda \\0&{\text{if}}\ \|x\|>\lambda \\\end{cases}}}
Aliyari Ghassabeh, Youness (2013-09-01). "On the convergence of the mean shift algorithm in the one-dimensional space".
over the complete search space. Instead, mean shift uses a variant of what is known in the optimization literature as
{\displaystyle k(x)={\begin{cases}1&{\text{if}}\ x\leq \lambda \\0&{\text{if}}\ x>\lambda \\\end{cases}}}
be given. This function determines the weight of nearby points for re-estimation of the mean. Typically a
Comaniciu, Dorin; Meer, Peter (May 2002). "Mean Shift: A Robust Approach Toward Feature Space Analysis".
{\displaystyle m(x)={\frac {\sum _{x_{i}\in N(x)}K(x_{i}-x)x_{i}}{\sum _{x_{i}\in N(x)}K(x_{i}-x)}}}
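A direct transcription of this weighted mean, using the Gaussian kernel K(x_i − x) = e^{−c‖x_i − x‖²} described in the text and, for simplicity, taking the neighborhood N(x) to be all points (the Gaussian weights of distant points are negligible; the constant `c` is illustrative):

```python
import numpy as np

def m(x, points, c=1.0):
    """One mean-shift update: the K-weighted mean of the neighbors of x,
    with Gaussian weights K(x_i - x) = exp(-c * ||x_i - x||^2)."""
    w = np.exp(-c * np.sum((points - x) ** 2, axis=1))  # K(x_i - x)
    return (w[:, None] * points).sum(axis=0) / w.sum()

pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
x_new = m(np.array([0.0, 0.0]), pts)  # pulled slightly toward the neighbors
```

Because the Gaussian downweights far points, the update moves less than the unweighted centroid would.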
Emami, Ebrahim (2013). "Online failure detection and correction for CAMShift tracking algorithm".
Li, Xiangru; Hu, Zhanyi; Wu, Fuchao (2007-06-01). "A note on the convergence of the mean shift".
-dimensional input and filtered image pixels in the joint spatial-range domain. For each pixel,
{\displaystyle f(x)=\sum _{i}K(x-x_{i})=\sum _{i}k\left({\frac {\|x-x_{i}\|^{2}}{h^{2}}}\right)}
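The estimate can be evaluated pointwise; the sketch below uses the (unnormalized) Gaussian profile k(r) = e^{−r/2} as an illustrative choice:

```python
import numpy as np

def kde(x, samples, h):
    """Unnormalized kernel density estimate f(x) = sum_i k(||x - x_i||^2 / h^2)
    with the Gaussian profile k(r) = exp(-r / 2)."""
    r = np.sum((samples - x) ** 2, axis=1) / h**2
    return np.exp(-r / 2).sum()

samples = np.array([[0.0, 0.0], [0.2, 0.0], [4.0, 4.0]])
# the estimate is larger near the cluster of samples than far from it
dense = kde(np.array([0.1, 0.0]), samples, h=1.0)
sparse = kde(np.array([2.0, 2.0]), samples, h=1.0)
```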
Inappropriate window size can cause modes to be merged, or generate additional “shallow” modes.
is the only parameter in the algorithm and is called the bandwidth. This approach is known as
Comaniciu, Dorin; Ramesh, Visvanathan; Meer, Peter (May 2003). "Kernel-based Object Tracking".
"A sufficient condition for the convergence of the mean shift algorithm with Gaussian kernel"
2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05)
Variants of the algorithm can be found in machine learning and image processing packages:
Consider a set of points in two-dimensional space. Assume a circular window centered at
Carreira-Perpinan, Miguel A. (May 2007). "Gaussian Mean-Shift Is an EM Algorithm".
Numpy/Python implementation uses ball tree for efficient neighboring points lookup
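For example, the scikit-learn variant can be called as follows (the bandwidth is picked by hand here purely for illustration):

```python
import numpy as np
from sklearn.cluster import MeanShift

# two tight, well-separated blobs
X = np.array([[1.0, 1.0], [1.1, 0.9], [0.9, 1.1],
              [8.0, 8.0], [8.1, 7.9], [7.9, 8.1]])

ms = MeanShift(bandwidth=2.0).fit(X)
n_clusters = len(ms.cluster_centers_)  # expect one mode per blob
```

In practice `sklearn.cluster.estimate_bandwidth` can be used instead of a hand-chosen value.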
Richard Szeliski, Computer Vision: Algorithms and Applications, Springer, 2011
Mean shift is an application-independent tool suitable for real data analysis.
2013 8th Iranian Conference on Machine Vision and Image Processing (MVIP)
Cheng, Yizong (August 1995). "Mean Shift, Mode Seeking, and Clustering".
Computer Vision Face Tracking For Use in a Perceptual User Interface
The procedure relies on choice of a single parameter: bandwidth.
. The weighted mean of the density in the window determined by
. Vol. 2. San Diego, California: IEEE. pp. 494–501.
IEEE Transactions on Pattern Analysis and Machine Intelligence
IEEE Transactions on Pattern Analysis and Machine Intelligence
IEEE Transactions on Pattern Analysis and Machine Intelligence
IEEE Transactions on Pattern Analysis and Machine Intelligence
The bandwidth/window size 'h' has a physical meaning, unlike
mathematical analysis technique for locating the maxima of a
be a flat kernel that is the characteristic function of the
, mean shift computes the gradient of the density estimate
contains mean-shift implementation via cvMeanShift Method
. Java data mining tool with many clustering algorithms.
{\displaystyle \int _{0}^{\infty }k(r)\,dr<\infty \ }
Does not assume any predefined shape on data clusters.
or the Parzen window technique. Once we have computed
Mean shift is a procedure for locating the maxima—the
. Efficient dual-tree algorithm-based implementation.
{\displaystyle k(x)=e^{-{\frac {x}{2\sigma ^{2}}}},}
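Written as code, the two profiles are straightforward (σ and λ as in the formulas; function names are illustrative):

```python
import numpy as np

def flat_profile(x, lam=1.0):
    """Flat profile: 1 inside the window (x <= lambda), 0 outside."""
    return np.where(x <= lam, 1.0, 0.0)

def gaussian_profile(x, sigma=1.0):
    """Gaussian profile: exponential decay controlled by the bandwidth sigma."""
    return np.exp(-x / (2.0 * sigma**2))
```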
It is capable of handling arbitrary feature spaces.
on the distance to the current estimate is used,
{\displaystyle K(x_{i}-x)=e^{-c||x_{i}-x||^{2}}}
-seeking algorithm. Application domains include
. Image filtering using the mean shift filter.
The selection of a window size is not trivial.
. Starting at some guess for a local maximum,
{\displaystyle z_{i}=(x_{i}^{s},y_{i,c}^{r})}
and takes an uphill step in that direction.
{\displaystyle K:X\rightarrow \mathbb {R} }
Avidan, Shai (2005). "Ensemble Tracking".
Often requires using adaptive window size.
{\displaystyle k:[0,\infty ]\rightarrow \mathbb {R} }
{\displaystyle \|x\|^{2}=x^{\top }x\geq 0}
Aliyari Ghassabeh, Youness (2015-03-01).
"Eine Methode zur Auffindung von Gruppen" [A method for finding groups]
is said to be a kernel if there exists a
, which can be a random input data point
. Vol. 2. IEEE. pp. 180–183.
IEEE Transactions on Information Theory
where the standard deviation parameter
In each iteration of the algorithm,
, Intel Technology Journal, No. Q2.
, and repeats the estimation until
works as the bandwidth parameter,
Expectation–maximization algorithm
{\displaystyle K(x)=k(\|x\|^{2})}
multiple restart gradient descent
10.1109/IranianMVIP.2013.6779974
Journal of Multivariate Analysis
{\displaystyle z_{i},i=1,...,n,}
{\displaystyle \mathbb {R} ^{n}}
{\displaystyle s\leftarrow m(s)}
{\displaystyle x\leftarrow m(x)}
in Fukunaga and Hostetler. The
{\displaystyle K(x_{i}-x)\neq 0}
k is piecewise continuous and
-dimensional Euclidean space,
-dimensional Euclidean space,
{\displaystyle y_{i,1}=x_{i}}
{\displaystyle k(a)\geq k(b)}
10.1016/j.patcog.2006.10.016
10.1016/j.patrec.2013.05.004
, a set of points for which
Pattern Recognition Letters
{\displaystyle y_{i,c}^{r}}
is the kernel function (or
10.1109/tpami.2003.1195991
10.1016/j.jmva.2014.11.009
is a non-negative number,
are the input samples and
Gaussian Mean-Shift is an
{\displaystyle K(x_{i}-x)}
Kernel density estimation
{\displaystyle y=y_{i,c}}
{\displaystyle m(\cdot )}
{\displaystyle y_{i,j+1}}
kernel density estimation
Let data be a finite set
10.1002/bimj.19640060105
Biometrische Zeitschrift
10.1109/TIT.1975.1055330
{\displaystyle \lambda }
10.1109/tpami.2007.1057
. A C++ implementation.
{\displaystyle \sigma }
Kernel definition: Let
is the neighborhood of
{\displaystyle a<b}
{\displaystyle s\in S}
{\displaystyle m(x)-x}
Mathematical technique
10.1109/CVPR.2005.144
{\displaystyle x_{i}}
k is non-increasing:
{\displaystyle y_{k}}
{\displaystyle x_{1}}
{\displaystyle y_{k}}
{\displaystyle x_{i}}
is performed for all
Schnell, P. (1964).
{\displaystyle f(x)}
{\displaystyle f(x)}
{\displaystyle f(x)}
{\displaystyle k(r)}
{\displaystyle m(x)}
mean-shift algorithm
{\displaystyle N(x)}
2007PatRe..40.1756L
Pattern Recognition
2013PaReL..34.1423A
Kernel (statistics)
until convergence,
{\displaystyle j=1}
2012-04-17 at the
10.1109/34.1000236
and having radius
k is non-negative.
978-1-4673-6184-2
978-0-7695-2372-9
(12): 1423–1427.
10.1109/34.400568
{\displaystyle d}
{\displaystyle r}
{\displaystyle C}
{\displaystyle h}
{\displaystyle x}
{\displaystyle n}
{\displaystyle X}
{\displaystyle h}
{\displaystyle h}
{\displaystyle X}
{\displaystyle K}
{\displaystyle X}
{\displaystyle n}
{\displaystyle S}
{\displaystyle x}
{\displaystyle K}
{\displaystyle x}
(6): 1756–1762.
OPTICS algorithm
Types of kernels
embedded in the
image processing
cluster analysis
density function
Wayback Machine
10.1.1.160.3832
10.1.1.510.1222
Gaussian kernel
The difference
Gaussian kernel
kernel function
computer vision
(5): 564–575.
(5): 767–776.
(5): 603–619.
(8): 790–799.
. The norm of
10.1.1.8.7474
(in German).
Orfeo toolbox
according to
. A function
Parzen window
feature-space
Gary Bradski
(1): 47–48.
(1): 32–40.
scikit-learn
Availability
Applications
, such that
Initialize
Flat kernel
converges.
3575:References
3485:Weaknesses
2981:Clustering
1510:mean shift
1508:is called
964:Mean shift
699:Q-learning
597:Restricted
395:Mean shift
344:Clustering
321:Perceptron
249:regression
151:Clustering
146:Regression
18:Mean-shift
Overview

Mean shift is a procedure for locating the maxima (the modes) of a density function given discrete data sampled from that function. It is an iterative method, and we start with an initial estimate x. Let a kernel function K(x_i − x) be given. This function determines the weight of nearby points for re-estimation of the mean. Typically a Gaussian kernel on the distance to the current estimate is used,

    K(x_i − x) = e^{−c ‖x_i − x‖²}.

The weighted mean of the density in the window determined by K is

    m(x) = ( Σ_{x_i ∈ N(x)} K(x_i − x) x_i ) / ( Σ_{x_i ∈ N(x)} K(x_i − x) ),

where N(x) is the neighborhood of x, the set of points for which K(x_i − x) ≠ 0. The difference m(x) − x is called the mean shift. The mean-shift algorithm now sets x ← m(x), and repeats the estimation until m(x) converges.
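The iteration x ← m(x) with a Gaussian kernel can be sketched in a few lines of NumPy. This is a minimal sketch: the constant c (acting as an inverse squared bandwidth), the stopping tolerance, and the iteration cap are illustrative choices, not part of the original description.

```python
import numpy as np

def mean_shift_update(x, points, c=1.0):
    """One mean-shift step: the weighted mean m(x) under a Gaussian
    kernel K(x_i - x) = exp(-c * ||x_i - x||^2)."""
    d2 = np.sum((points - x) ** 2, axis=1)              # ||x_i - x||^2
    w = np.exp(-c * d2)                                 # kernel weights
    return (w[:, None] * points).sum(axis=0) / w.sum()  # m(x)

def mean_shift(x, points, c=1.0, tol=1e-6, max_iter=500):
    """Iterate x <- m(x) until the shift ||m(x) - x|| falls below tol."""
    for _ in range(max_iter):
        m = mean_shift_update(x, points, c)
        if np.linalg.norm(m - x) < tol:
            break
        x = m
    return x
```

Started near a concentration of points, the estimate drifts uphill and settles on that concentration's mode.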
Details

Let data be a finite set S embedded in the n-dimensional Euclidean space, X. Let K be a flat kernel that is the characteristic function of the λ-ball in X,

    K(x) = { 1  if ‖x‖ ≤ λ
           { 0  if ‖x‖ > λ.

In each iteration of the algorithm, s ← m(s) is performed for all s ∈ S simultaneously. The first question, then, is how to estimate the density function given a sparse set of samples. One of the simplest approaches is to just smooth the data, e.g., by convolving it with a fixed kernel of width h,

    f(x) = Σ_i K(x − x_i) = Σ_i k(‖x − x_i‖² / h²),

where x_i are the input samples and k(r) is the kernel function (or Parzen window). h is the only parameter in the algorithm and is called the bandwidth. This approach is known as kernel density estimation or the Parzen window technique. Once we have computed f(x) from the equation above, we can find its local maxima using gradient ascent or some other optimization technique. The problem with this "brute force" approach is that, for higher dimensions, it becomes computationally prohibitive to evaluate f(x) over the complete search space. Instead, mean shift uses a variant of what is known in the optimization literature as multiple restart gradient descent: starting at some guess for a local maximum, y_k, which can be a random input data point x_1, mean shift computes the gradient of the density estimate f(x) at y_k and takes an uphill step in that direction.
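The density estimate above can be evaluated directly. The sketch below assumes an (unnormalized) Gaussian profile k(r) = e^{−r/2} purely for illustration; any valid profile works.

```python
import numpy as np

def density_estimate(x, samples, h=1.0):
    """Kernel density estimate f(x) = sum_i k(||x - x_i||^2 / h^2),
    using the illustrative Gaussian profile k(r) = exp(-r / 2)."""
    r = np.sum((samples - x) ** 2, axis=1) / h**2
    return np.exp(-r / 2).sum()
```

f(x) is large where samples concentrate and decays away from them, which is exactly what the mode-seeking iteration climbs.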
Types of kernels

Kernel definition: Let X be the n-dimensional Euclidean space, Rⁿ. The norm of x ∈ X is a non-negative number, ‖x‖² = x⊤x ≥ 0. A function K: X → R is said to be a kernel if there exists a profile, k: [0, ∞] → R, such that

    K(x) = k(‖x‖²)

and

- k is non-negative.
- k is non-increasing: k(a) ≥ k(b) if a < b.
- k is piecewise continuous and ∫₀^∞ k(r) dr < ∞.

Two simple kernel profiles used in mean shift are:

Flat kernel

    k(x) = { 1  if x ≤ λ
           { 0  if x > λ.

Gaussian kernel

    k(x) = e^{−x² / (2σ²)},

where the standard deviation parameter σ works as the bandwidth parameter, h.
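The two profiles, and the profile-to-kernel relation K(x) = k(‖x‖²), translate directly to code. In this sketch lam and sigma stand in for λ and σ; the default values are illustrative.

```python
import numpy as np

def flat_profile(x, lam=1.0):
    """Flat kernel profile: k(x) = 1 if x <= lam, else 0."""
    return np.where(x <= lam, 1.0, 0.0)

def gaussian_profile(x, sigma=1.0):
    """Gaussian kernel profile: k(x) = exp(-x**2 / (2 * sigma**2))."""
    return np.exp(-x**2 / (2 * sigma**2))

def kernel(vec, profile=gaussian_profile):
    """K(x) = k(||x||^2), per the profile/kernel relation above."""
    return profile(np.dot(vec, vec))
```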
Applications

Clustering

Mean shift is used as a non-parametric clustering technique. Consider a set of points in two-dimensional space, and assume a circular window centered at C and having radius r as the kernel: shifting the window to the mean of the points inside it, and repeating, moves the window toward regions of higher density until it converges on a mode. Points whose windows converge to the same mode are assigned to the same cluster.

Tracking

The mean shift algorithm can also be used for visual tracking, by treating a confidence map derived from the color histogram of the tracked object as a density and seeking its peak near the object's previous position.
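A toy clustering sketch along the lines described above: every point is shifted to its mode, and converged modes closer than a merge tolerance are treated as one cluster. The Gaussian kernel constant c and merge_tol are illustrative choices, not part of the original description.

```python
import numpy as np

def mean_shift_cluster(points, c=1.0, tol=1e-5, max_iter=300, merge_tol=0.5):
    """Shift each point to its mode with a Gaussian kernel
    K(x_i - x) = exp(-c * ||x_i - x||^2), then merge nearby modes."""
    modes, labels = [], []
    for x in points:
        y = x.astype(float)
        for _ in range(max_iter):                       # y <- m(y)
            d2 = np.sum((points - y) ** 2, axis=1)
            w = np.exp(-c * d2)
            m = (w[:, None] * points).sum(axis=0) / w.sum()
            if np.linalg.norm(m - y) < tol:
                break
            y = m
        for k, mode in enumerate(modes):                # assign to a known mode...
            if np.linalg.norm(y - mode) < merge_tol:
                labels.append(k)
                break
        else:                                           # ...or open a new one
            modes.append(y)
            labels.append(len(modes) - 1)
    return np.array(labels), modes
```

Note that, unlike k-means, the number of clusters is not specified in advance: it falls out of how many distinct modes the points converge to.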
Smoothing

Mean shift can be used for discontinuity-preserving smoothing (filtering) of an image. Let x_i and z_i, i = 1, ..., n, be the d-dimensional input and filtered image pixels in the joint spatial-range domain. For each pixel:

1. Initialize j = 1 and y_{i,1} = x_i.
2. Compute y_{i,j+1} according to m(·) until convergence, y = y_{i,c}.
3. Assign z_i = (x_i^s, y_{i,c}^r).

The superscripts s and r denote the spatial and range components of a vector, respectively. The assignment specifies that the filtered data at the spatial location x_i^s will have the range components of the point of convergence y_{i,c}^r.
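The three steps can be sketched as follows, assuming the image has been flattened to an (n, d) array whose first two components are spatial and the rest are range. The Gaussian weights and the constants c_s and c_r are illustrative assumptions; the procedure itself leaves the kernel choice open.

```python
import numpy as np

def mean_shift_filter(pixels, c_s=0.05, c_r=0.05, tol=1e-3, max_iter=100):
    """Discontinuity-preserving smoothing sketch in the joint
    spatial-range domain. `pixels` is (n, d): rows like (row, col, gray)."""
    out = pixels.astype(float).copy()
    for i in range(len(pixels)):
        y = pixels[i].astype(float)                     # y_{i,1} = x_i
        for _ in range(max_iter):                       # y_{i,j+1} via m(.)
            d2_s = np.sum((pixels[:, :2] - y[:2]) ** 2, axis=1)
            d2_r = np.sum((pixels[:, 2:] - y[2:]) ** 2, axis=1)
            w = np.exp(-c_s * d2_s - c_r * d2_r)
            m = (w[:, None] * pixels).sum(axis=0) / w.sum()
            if np.linalg.norm(m - y) < tol:             # converged: y = y_{i,c}
                break
            y = m
        out[i, 2:] = y[2:]                              # z_i = (x_i^s, y_{i,c}^r)
    return out
```

The spatial components of the output are those of the input pixel; only the range components are replaced by those of the convergence point.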
Strengths

1. Mean shift is an application-independent tool suitable for real data analysis.
2. It does not assume any predefined shape on data clusters.
3. It is capable of handling arbitrary feature spaces.
4. The procedure relies on the choice of a single parameter: the bandwidth.
5. The bandwidth/window size h has a physical meaning, unlike k-means.

Weaknesses

1. The selection of a window size is not trivial.
2. An inappropriate window size can cause modes to be merged, or generate additional "shallow" modes.
3. It often requires using an adaptive window size.

Availability

Mean shift implementations are available in, among others:

- ELKI
- ImageJ
- mlpack
- OpenCV
- scikit-learn
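As a usage example for one of the implementations listed above, scikit-learn exposes the algorithm as sklearn.cluster.MeanShift; its bandwidth parameter corresponds to the window size h. The sample data here is illustrative.

```python
import numpy as np
from sklearn.cluster import MeanShift

# Two well-separated blobs of three points each.
X = np.array([[1.0, 1.0], [1.2, 0.9], [0.9, 1.1],
              [8.0, 8.0], [8.1, 7.9], [7.9, 8.2]])

ms = MeanShift(bandwidth=2.0).fit(X)  # bandwidth = window size h
print(ms.labels_)                     # cluster assignment per point
print(ms.cluster_centers_)            # one center (mode) per cluster
```

The number of clusters is discovered from the data rather than supplied up front.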
See also

- DBSCAN
- Kernel density estimation (KDE)