
Torch (machine learning)

Torch is an open-source machine learning library, a scientific computing framework, and a scripting language based on the Lua programming language. It provides a wide range of algorithms for deep learning, using the scripting language LuaJIT and an underlying C implementation. It was created by the Idiap Research Institute at EPFL. Torch development moved in 2017 to PyTorch, a port of the library to Python.

torch

The core package of Torch is torch. It provides a flexible N-dimensional array or Tensor, which supports basic routines for indexing, slicing, transposing, type-casting, resizing, sharing storage and cloning. This object is used by most other packages and thus forms the core object of the library. The Tensor also supports mathematical operations like max, min and sum, statistical distributions like uniform, normal and multinomial, and BLAS operations like dot product, matrix–vector multiplication, matrix–matrix multiplication and matrix product.

The following exemplifies using torch via its REPL interpreter:

```lua
> a = torch.randn(3, 4)

> =a
-0.2381 -0.3401 -1.7844 -0.2615
 0.1411  1.6249  0.1708  0.8299
-1.0434  2.2291  1.0525  0.8465
[torch.DoubleTensor of dimension 3x4]

> a[1][2]
-0.34010116549482

> a:narrow(1, 1, 2)
-0.2381 -0.3401 -1.7844 -0.2615
 0.1411  1.6249  0.1708  0.8299
[torch.DoubleTensor of dimension 2x4]

> a:index(1, torch.LongTensor{1, 2})
-0.2381 -0.3401 -1.7844 -0.2615
 0.1411  1.6249  0.1708  0.8299
[torch.DoubleTensor of dimension 2x4]

> a:min()
-1.7844365427828
```
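The storage-sharing and cloning routines mentioned above differ in whether the result aliases the original Tensor. A minimal sketch, assuming a Torch7 installation; the variable names are illustrative:

```lua
-- narrow() returns a view that shares storage with the original,
-- while clone() allocates an independent copy.
local a = torch.randn(3, 4)
local b = a:narrow(1, 1, 2)  -- first two rows; shares a's storage
local c = a:clone()          -- independent copy of a's storage
a:fill(0)                    -- b now also reads as zeros; c is unchanged
```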
The torch package also simplifies object-oriented programming and serialization by providing various convenience functions which are used throughout its packages. The torch.class(classname, parentclass) function can be used to create object factories (classes). When the constructor is called, torch initializes and sets a Lua table with the user-defined metatable, which makes the table an object.

Objects created with the torch factory can also be serialized, as long as they do not contain references to objects that cannot be serialized, such as Lua coroutines and Lua userdata. However, userdata can be serialized if it is wrapped by a table (or metatable) that provides read() and write() methods.
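A minimal sketch of how torch.class and serialization fit together, assuming a Torch7 installation; the class name Dog and the file name dog.t7 are hypothetical:

```lua
-- Define a class; torch.class also registers the name globally.
local Dog = torch.class('Dog')

function Dog:__init(name)
  self.name = name  -- plain Lua values serialize without trouble
end

function Dog:bark()
  print(self.name .. ' says woof')
end

-- Instances round-trip through torch.save/torch.load because the
-- object contains no coroutines or unwrapped userdata.
local d = Dog('Rex')
torch.save('dog.t7', d)
local d2 = torch.load('dog.t7')
d2:bark()
```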
nn

The nn package is used for building neural networks. It is divided into modular objects that share a common Module interface. Modules have a forward() and a backward() method that allow them to feedforward and backpropagate, respectively. Modules can be joined using module composites, like Sequential, Parallel and Concat, to create complex task-tailored graphs. Simpler modules like Linear, Tanh and Max make up the basic component modules. This modular interface provides first-order automatic gradient differentiation. What follows is an example use-case for building a multilayer perceptron using Modules:

```lua
> mlp = nn.Sequential()
> mlp:add(nn.Linear(10, 25)) -- 10 input, 25 hidden units
> mlp:add(nn.Tanh()) -- some hyperbolic tangent transfer function
> mlp:add(nn.Linear(25, 1)) -- 1 output
> =mlp:forward(torch.randn(10))
-0.1815
[torch.Tensor of dimension 1]
```
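As a sketch of the composites mentioned above, a Concat container applies several branches to the same input and joins their outputs along a given dimension. This assumes a Torch7 installation; the layer sizes are illustrative:

```lua
-- Two linear branches over the same 10-element input; their 5- and
-- 3-element outputs are concatenated along dimension 1.
local branches = nn.Concat(1)
branches:add(nn.Linear(10, 5))
branches:add(nn.Linear(10, 3))

local net = nn.Sequential()
net:add(branches)   -- produces an 8-element Tensor
net:add(nn.Tanh())

local out = net:forward(torch.randn(10))
```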
Loss functions are implemented as sub-classes of Criterion, which has a similar interface to Module. It also has forward() and backward() methods for computing the loss and backpropagating gradients, respectively. Criteria are helpful to train neural networks on classical tasks. Common criteria are the mean squared error criterion implemented in MSECriterion and the cross-entropy criterion implemented in ClassNLLCriterion. What follows is an example of a Lua function that can be iteratively called to train an mlp Module on input Tensor x, target Tensor y with a scalar learningRate:

```lua
function gradUpdate(mlp, x, y, learningRate)
  local criterion = nn.ClassNLLCriterion()
  local pred = mlp:forward(x)
  local err = criterion:forward(pred, y)
  mlp:zeroGradParameters()
  local t = criterion:backward(pred, y)
  mlp:backward(x, t)
  mlp:updateParameters(learningRate)
end
```

It also has a StochasticGradient class for training a neural network using stochastic gradient descent, although the optim package provides many more options in this respect, like momentum and weight-decay regularization.
Other packages

Many packages other than the above official packages are used with Torch. These are listed in the torch cheatsheet. These extra packages provide a wide range of utilities such as parallelism, asynchronous input/output, image processing, and so on. They can be installed with LuaRocks, the Lua package manager which is also included with the Torch distribution.

Applications

Torch is used by the Facebook AI Research Group, IBM, Yandex and the Idiap Research Institute. Torch has been extended for use on Android and iOS. It has been used to build hardware implementations for data flows like those found in neural networks. Facebook has released a set of extension modules as open source software.

See also

Comparison of deep learning software
PyTorch

Torch

Original author(s): Ronan Collobert, Samy Bengio, Johnny Mariéthoz
Initial release: October 2002
Final release: 7.0 / February 27, 2017
Repository: github.com/torch/torch7
Written in: Lua, C, C++
Operating system: Linux, Android, Mac OS X, iOS
Type: machine learning / deep learning library
License: BSD License
Website: torch.ch

Text is available under the Creative Commons Attribution-ShareAlike License. Additional terms may apply.
