Siamese neural network

A Siamese neural network (sometimes called a twin neural network) is an artificial neural network that uses the same weights while working in tandem on two different input vectors to compute comparable output vectors. Often one of the output vectors is precomputed, thus forming a baseline against which the other output vector is compared. This is similar to comparing fingerprints, but it can be described more technically as a distance function for locality-sensitive hashing.

It is possible to build an architecture that is functionally similar to a twin network but implements a slightly different function. This is typically used for comparing similar instances in different type sets.

Uses of similarity measures where a twin network might be applied include recognizing handwritten checks, automatic detection of faces in camera images, and matching queries with indexed documents. Perhaps the best-known application of twin networks is face recognition, where known images of people are precomputed and compared to an image from a turnstile or similar. It is not obvious at first, but this covers two slightly different problems. One is recognizing a person among a large number of other persons; that is the facial recognition problem, and DeepFace is an example of such a system. In its most extreme form this means recognizing a single person at a train station or airport. The other is face verification, that is, verifying whether the photo in a pass matches the person presenting it. The twin network might be the same, but the implementation can be quite different.

Learning

Learning in twin networks can be done with triplet loss or contrastive loss. For learning by triplet loss, a baseline vector (the anchor image) is compared against a positive vector (a truthy image) and a negative vector (a falsy image). The negative vector forces learning in the network, while the positive vector acts like a regularizer. For learning by contrastive loss there must be a weight decay to regularize the weights, or some similar operation such as a normalization.
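As a concrete illustration of the two training objectives above, the following is a minimal NumPy sketch of a triplet loss and a contrastive loss computed on embedding vectors. The toy embeddings, the margin value, and the function names are illustrative assumptions, not anything prescribed by the article.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Hinge-style triplet loss on embedding vectors: pull the anchor toward the
    positive example and push it at least `margin` away from the negative.
    The margin value is an illustrative choice."""
    d_pos = np.sum((anchor - positive) ** 2)  # squared Euclidean distance
    d_neg = np.sum((anchor - negative) ** 2)
    return max(d_pos - d_neg + margin, 0.0)

def contrastive_loss(x1, x2, same, margin=1.0):
    """Contrastive loss: matching pairs are pulled together, non-matching pairs
    are pushed apart until they are at least `margin` away."""
    d = np.linalg.norm(x1 - x2)
    return d ** 2 if same else max(margin - d, 0.0) ** 2

# Toy embeddings standing in for the twin network's outputs.
anchor   = np.array([0.1, 0.9])
positive = np.array([0.2, 0.8])
negative = np.array([0.9, 0.1])

print(triplet_loss(anchor, positive, negative))        # 0.0: the negative is already margin-far away
print(contrastive_loss(anchor, positive, same=True))   # 0.02: matching pair, penalized by its distance
```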
A distance metric for a loss function may have the following properties:

Non-negativity: $\delta(x, y) \geq 0$
Identity of indiscernibles: $\delta(x, y) = 0 \iff x = y$
Symmetry (commutativity): $\delta(x, y) = \delta(y, x)$
Triangle inequality: $\delta(x, z) \leq \delta(x, y) + \delta(y, z)$

In particular, the triplet loss algorithm is often defined with the squared Euclidean distance at its core, which, unlike the plain Euclidean distance, does not satisfy the triangle inequality.
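A quick numerical check of that last point, using three collinear points chosen purely for illustration: the plain Euclidean distance satisfies the triangle inequality, while its square does not.

```python
import numpy as np

# Three collinear points chosen so the effect is easy to see.
x, y, z = np.array([0.0]), np.array([1.0]), np.array([2.0])

def euclidean(a, b):
    return np.linalg.norm(a - b)

def squared_euclidean(a, b):
    return np.sum((a - b) ** 2)

# Euclidean distance: 2.0 <= 1.0 + 1.0, so the triangle inequality holds.
print(euclidean(x, z) <= euclidean(x, y) + euclidean(y, z))                          # True
# Squared Euclidean distance: 4.0 <= 1.0 + 1.0 is false, so it is not a true metric.
print(squared_euclidean(x, z) <= squared_euclidean(x, y) + squared_euclidean(y, z))  # False
```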
Predefined metrics, Euclidean distance metric

The common learning goal is to minimize a distance metric for similar objects and to maximize it for distinct ones. This gives a loss function like

$$\delta(x^{(i)}, x^{(j)}) = \begin{cases} \min \ \left\| \operatorname{f}\left(x^{(i)}\right) - \operatorname{f}\left(x^{(j)}\right) \right\|, & i = j \\ \max \ \left\| \operatorname{f}\left(x^{(i)}\right) - \operatorname{f}\left(x^{(j)}\right) \right\|, & i \neq j \end{cases}$$

where $i, j$ are indexes into a set of vectors and $\operatorname{f}(\cdot)$ is the function implemented by the twin network.

The most common distance metric used is the Euclidean distance, in which case the loss function can be rewritten in matrix form as

$$\delta(\mathbf{x}^{(i)}, \mathbf{x}^{(j)}) \approx (\mathbf{x}^{(i)} - \mathbf{x}^{(j)})^{T} (\mathbf{x}^{(i)} - \mathbf{x}^{(j)})$$
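The following NumPy sketch makes the two formulas above concrete. The single shared weight matrix standing in for the twin network is an illustrative assumption; in practice $\operatorname{f}$ would be a trained deep network applied identically to both inputs.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((3, 6))   # one shared weight matrix used for both inputs

def f(x):
    """Stand-in for the twin network: a single shared linear layer."""
    return W @ x

def delta(x_i, x_j):
    """Euclidean distance between the two twin outputs, ||f(x_i) - f(x_j)||."""
    return np.linalg.norm(f(x_i) - f(x_j))

x_i = rng.standard_normal(6)
x_j = rng.standard_normal(6)
print(delta(x_i, x_j))            # minimized for matching pairs, maximized otherwise

# Matrix form on the raw vectors: (x_i - x_j)^T (x_i - x_j) is simply the
# squared Euclidean distance between the inputs themselves.
d = x_i - x_j
print(d @ d, np.sum(d ** 2))      # the two expressions agree
```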
Learned metrics, nonlinear distance metric

A more general case is where the output vector from the twin network is passed through additional network layers implementing non-linear distance metrics:

$$\begin{aligned} \text{if } i = j \text{ then } & \operatorname{\delta}\left[\operatorname{f}\left(x^{(i)}\right), \operatorname{f}\left(x^{(j)}\right)\right] \text{ is small} \\ \text{otherwise } & \operatorname{\delta}\left[\operatorname{f}\left(x^{(i)}\right), \operatorname{f}\left(x^{(j)}\right)\right] \text{ is large} \end{aligned}$$

where $i, j$ are indexes into a set of vectors, $\operatorname{f}(\cdot)$ is the function implemented by the twin network, and $\operatorname{\delta}(\cdot)$ is the function implemented by the network joining the outputs from the twin network.

In matrix form, the above is often approximated as a Mahalanobis distance for a linear space:

$$\operatorname{\delta}(\mathbf{x}^{(i)}, \mathbf{x}^{(j)}) \approx (\mathbf{x}^{(i)} - \mathbf{x}^{(j)})^{T} \mathbf{M} \, (\mathbf{x}^{(i)} - \mathbf{x}^{(j)})$$

This can be further subdivided into at least unsupervised learning and supervised learning.
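A minimal sketch of such a learned linear metric, assuming the Mahalanobis form above. Parameterizing the matrix as $\mathbf{M} = A^{T} A$ is an illustrative choice to keep it positive semidefinite; the values here are random stand-ins for learned parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Parameterize M = A^T A so the learned matrix is always positive semidefinite.
A = rng.standard_normal((4, 4))
M = A.T @ A

def learned_distance(x_i, x_j, M):
    """delta(x_i, x_j) ~ (x_i - x_j)^T M (x_i - x_j)."""
    d = x_i - x_j
    return d @ M @ d

x_i = rng.standard_normal(4)
x_j = rng.standard_normal(4)
print(learned_distance(x_i, x_j, M))

# With M set to the identity, this reduces to the squared Euclidean distance
# of the previous section.
print(learned_distance(x_i, x_j, np.eye(4)), np.sum((x_i - x_j) ** 2))
```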
Learned metrics, half-twin networks

This form also allows the twin network to be more of a half-twin, implementing slightly different functions:

$$\begin{aligned} \text{if } i = j \text{ then } & \operatorname{\delta}\left[\operatorname{f}\left(x^{(i)}\right), \operatorname{g}\left(x^{(j)}\right)\right] \text{ is small} \\ \text{otherwise } & \operatorname{\delta}\left[\operatorname{f}\left(x^{(i)}\right), \operatorname{g}\left(x^{(j)}\right)\right] \text{ is large} \end{aligned}$$

where $i, j$ are indexes into a set of vectors, $\operatorname{f}(\cdot), \operatorname{g}(\cdot)$ are the functions implemented by the half-twin network, and $\operatorname{\delta}(\cdot)$ is the function implemented by the network joining the outputs from the twin network.
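A minimal sketch of the half-twin arrangement under illustrative assumptions: two stand-in encoders $\operatorname{f}$ and $\operatorname{g}$ with their own weights and even different input sizes (as when the two type sets differ), joined by a simple shared distance function.

```python
import numpy as np

rng = np.random.default_rng(0)
W_f = rng.standard_normal((3, 5))   # weights of f: encodes 5-dimensional inputs
W_g = rng.standard_normal((3, 8))   # weights of g: encodes 8-dimensional inputs

def f(x):
    """One half of the half-twin, with its own weights."""
    return np.tanh(W_f @ x)

def g(x):
    """The other half: different weights and a different input size."""
    return np.tanh(W_g @ x)

def delta(a, b):
    """Joining function on the two embeddings; here plain Euclidean distance."""
    return np.linalg.norm(a - b)

x_i = rng.standard_normal(5)   # instance from the first type set
x_j = rng.standard_normal(8)   # instance from the second type set
print(delta(f(x_i), g(x_j)))   # trained to be small for matching pairs, large otherwise
```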
Twin networks for object tracking

Twin networks have been used in object tracking because of their two tandem inputs and similarity measurement. In object tracking, one input of the twin network is a user pre-selected exemplar image, and the other input is a larger search image; the twin network's job is to locate the exemplar inside the search image. By measuring the similarity between the exemplar and each part of the search image, the twin network can produce a map of similarity scores. Furthermore, by using a fully convolutional network, the process of computing each sector's similarity score can be replaced with a single cross-correlation layer.
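The sketch below illustrates that last step under simplified assumptions: given feature maps for the exemplar and the search image (random stand-ins here), sliding the exemplar over the search features and taking inner products yields the similarity map in one cross-correlation pass. A real tracker would obtain both feature maps from the shared convolutional backbone.

```python
import numpy as np

def cross_correlation_map(exemplar, search):
    """Slide the exemplar feature map over the search feature map and record
    the inner product at each position (one cross-correlation 'layer').
    Shapes: exemplar (C, h, w), search (C, H, W)."""
    C, h, w = exemplar.shape
    _, H, W = search.shape
    scores = np.empty((H - h + 1, W - w + 1))
    for y in range(H - h + 1):
        for x in range(W - w + 1):
            window = search[:, y:y + h, x:x + w]
            scores[y, x] = np.sum(window * exemplar)
    return scores

rng = np.random.default_rng(0)
exemplar_feat = rng.standard_normal((8, 6, 6))   # features of the pre-selected exemplar
search_feat = rng.standard_normal((8, 22, 22))   # features of the larger search image

score_map = cross_correlation_map(exemplar_feat, search_feat)
peak = np.unravel_index(np.argmax(score_map), score_map.shape)
print(score_map.shape, peak)   # (17, 17); the peak marks the best-matching location
```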
1874: 1811: 33: 2090: 1858: 1795: 1892: 1829: 2091:"End-to-end representation learning for Correlation Filter based tracking" 1992: 1948: 1753:
After being first introduced in 2016, the twin fully convolutional network has been used in many high-performance, real-time object tracking networks, such as CFNet, StructSiam, SiamFC-tri, DSiam, SA-Siam, SiamRPN, DaSiamRPN, Cascaded SiamRPN, SiamMask, SiamRPN++, and Deeper and Wider SiamRPN.

See also

Artificial neural network
Triplet loss

References

Bromley, Jane; Guyon, Isabelle; LeCun, Yann; Säckinger, Eduard; Shah, Roopak (1994). "Signature verification using a 'Siamese' time delay neural network". Advances in Neural Information Processing Systems. 6: 737–744.
Chopra, S.; Hadsell, R.; LeCun, Y. (June 2005). "Learning a Similarity Metric Discriminatively, with Application to Face Verification". 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05). Vol. 1. pp. 539–546. doi:10.1109/CVPR.2005.202. ISBN 0-7695-2372-2.
Taigman, Y.; Yang, M.; Ranzato, M.; Wolf, L. (June 2014). "DeepFace: Closing the Gap to Human-Level Performance in Face Verification". 2014 IEEE Conference on Computer Vision and Pattern Recognition. pp. 1701–1708. doi:10.1109/CVPR.2014.220. ISBN 978-1-4799-5118-5.
Chicco, Davide (2020). "Siamese neural networks: an overview". Artificial Neural Networks. Methods in Molecular Biology. Vol. 2190 (3rd ed.). New York City, New York, USA: Springer Protocols, Humana Press. pp. 73–94. doi:10.1007/978-1-0716-0826-5_3. ISBN 978-1-0716-0826-5. PMID 32804361. S2CID 221144012.
Chandra, M.P. (1936). "On the generalized distance in statistics". Proceedings of the National Institute of Sciences of India. 2: 49–55.
Chatterjee, Moitreya; Luo, Yunan. "Similarity Learning with (or without) Convolutional Neural Network". Retrieved 2018-12-07.
Bertinetto, Luca; Valmadre, Jack; Henriques, João F.; Vedaldi, Andrea; Torr, Philip H. S. (2016). "Fully-Convolutional Siamese Networks for Object Tracking". arXiv:1606.09549.
"End-to-end representation learning for Correlation Filter based tracking" (CFNet).
"Structured Siamese Network for Real-Time Visual Tracking" (StructSiam).
"Triplet Loss in Siamese Network for Object Tracking" (SiamFC-tri).
"Learning Dynamic Siamese Network for Visual Object Tracking" (DSiam).
"A Twofold Siamese Network for Real-Time Object Tracking" (SA-Siam).
"High Performance Visual Tracking with Siamese Region Proposal Network" (SiamRPN).
Zhu, Zheng; Wang, Qiang; Li, Bo; Wu, Wei; Yan, Junjie; Hu, Weiming (2018). "Distractor-aware Siamese Networks for Visual Object Tracking". arXiv:1808.06048.
Fan, Heng; Ling, Haibin (2018). "Siamese Cascaded Region Proposal Networks for Real-Time Visual Tracking". arXiv:1812.06148.
Wang, Qiang; Zhang, Li; Bertinetto, Luca; Hu, Weiming; Torr, Philip H. S. (2018). "Fast Online Object Tracking and Segmentation: A Unifying Approach". arXiv:1812.05050.
Li, Bo; Wu, Wei; Wang, Qiang; Zhang, Fangyi; Xing, Junliang; Yan, Junjie (2018). "SiamRPN++: Evolution of Siamese Visual Tracking with Very Deep Networks". arXiv:1812.11703.
Zhang, Zhipeng; Peng, Houwen (2019). "Deeper and Wider Siamese Networks for Real-Time Visual Tracking". arXiv:1901.01660.
