Feature (machine learning)

Not to be confused with Feature (computer vision).

In machine learning and pattern recognition, a feature is an individual measurable property or characteristic of a phenomenon. Choosing informative, discriminating, and independent features is a crucial element of effective algorithms in pattern recognition, classification, and regression. Features are usually numeric, but structural features such as strings and graphs are used in syntactic pattern recognition. The concept of "feature" is related to that of explanatory variable used in statistical techniques such as linear regression.
Feature types

In feature engineering, two types of features are commonly used: numerical and categorical.

Numerical features are continuous values that can be measured on a scale. Examples of numerical features include age, height, weight, and income. Numerical features can be used in machine learning algorithms directly.

Categorical features are discrete values that can be grouped into categories. Examples of categorical features include gender, color, and zip code. Categorical features typically need to be converted to numerical features before they can be used in machine learning algorithms. This can be done using a variety of techniques, such as one-hot encoding, label encoding, and ordinal encoding, as in the sketch below.
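The following is a minimal sketch of two such encodings in plain Python; the color data is a hypothetical example, and a label encoding becomes an ordinal encoding when the integer codes follow a meaningful category order rather than an arbitrary one.

    # Hypothetical categorical feature.
    colors = ["red", "green", "blue", "green", "red"]

    # Label encoding: map each category to an integer code.
    categories = sorted(set(colors))                   # ['blue', 'green', 'red']
    label_code = {c: i for i, c in enumerate(categories)}
    labels = [label_code[c] for c in colors]           # [2, 1, 0, 1, 2]

    # One-hot encoding: one binary indicator feature per category.
    one_hot = [[1 if c == cat else 0 for cat in categories] for c in colors]
    print(labels)
    print(one_hot)                                     # 'red' -> [0, 0, 1]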
The type of feature that is used in feature engineering depends on the specific machine learning algorithm that is being used. Some machine learning algorithms, such as decision trees, can handle both numerical and categorical features. Other machine learning algorithms, such as linear regression, can only handle numerical features.

Classification

A numeric feature can be conveniently described by a feature vector. One way to achieve binary classification is using a linear predictor function (related to the perceptron) with a feature vector as input. The method consists of calculating the scalar product between the feature vector and a vector of weights, qualifying those observations whose result exceeds a threshold.
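A minimal sketch of such a threshold classifier follows; the feature vector, weights, and threshold here are hypothetical rather than learned.

    # Linear predictor function used as a binary classifier.
    def classify(features, weights, threshold):
        # Scalar (dot) product of the feature vector and the weight vector.
        score = sum(x * w for x, w in zip(features, weights))
        # Qualify the observation if its score exceeds the threshold.
        return score > threshold

    x = [0.5, 1.2, -0.3]   # feature vector
    w = [0.8, -0.1, 0.4]   # weight vector
    print(classify(x, w, threshold=0.0))   # True, since the score 0.16 > 0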
Algorithms for classification from a feature vector include nearest neighbor classification, neural networks, and statistical techniques such as Bayesian approaches.

Examples

In character recognition, features may include histograms counting the number of black pixels along horizontal and vertical directions, the number of internal holes, stroke detection, and many others.

In speech recognition, features for recognizing phonemes can include noise ratios, lengths of sounds, relative power, filter matches, and many others.

In spam detection algorithms, features may include the presence or absence of certain email headers, the email structure, the language, the frequency of specific terms, and the grammatical correctness of the text.

In computer vision, there are a large number of possible features, such as edges and objects.
Feature vectors

"Feature space" redirects here. For feature spaces in kernel machines, see Kernel method.

In pattern recognition and machine learning, a feature vector is an n-dimensional vector of numerical features that represents some object. Many algorithms in machine learning require a numerical representation of objects, since such representations facilitate processing and statistical analysis. When representing images, the feature values might correspond to the pixels of an image, while when representing texts, the features might be the frequencies of occurrence of textual terms. Feature vectors are equivalent to the vectors of explanatory variables used in statistical procedures such as linear regression. Feature vectors are often combined with weights using a dot product in order to construct a linear predictor function that is used to determine a score for making a prediction.
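For example, a text can be mapped to a term-frequency feature vector over a fixed vocabulary, as in the following minimal sketch (the vocabulary and document are hypothetical):

    # Represent a text as a term-frequency feature vector.
    vocabulary = ["feature", "vector", "learning", "machine"]
    document = "machine learning maps each object to a feature vector"

    tokens = document.lower().split()
    feature_vector = [tokens.count(term) for term in vocabulary]
    print(feature_vector)   # [1, 1, 1, 1]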
1073: 885: 638: 588: 498: 482: 452: 314: 309: 259: 249: 147: 1231: 913: 717: 583: 523: 1510: 1151:
A numeric feature can be conveniently described by a feature vector. One way to achieve
1511:
Syntactic learning for ESEDA.1, tool for enhanced speech emotion detection and analysis
1379:, where a machine not only uses features for learning, but learns the features itself. 1247: 933: 464: 201: 1550: 1403: 1372: 1253: 852: 781: 663: 394: 279: 1304: 1524: 1459: 1293: 1164: 658: 152: 42: 1285: 1280:
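One widely used such technique is principal component analysis (PCA); the following is a minimal sketch using NumPy on a hypothetical data matrix, not a full implementation.

    import numpy as np

    # Project hypothetical 3-D feature vectors onto the 2 directions of
    # largest variance, reducing the feature space from 3-D to 2-D.
    X = np.array([[2.5, 2.4, 0.5],
                  [0.5, 0.7, 1.1],
                  [2.2, 2.9, 0.4],
                  [1.9, 2.2, 0.8]])

    X_centered = X - X.mean(axis=0)            # center each feature
    # The right singular vectors are the principal axes of the data.
    _, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
    X_reduced = X_centered @ Vt[:2].T          # keep the top 2 components
    print(X_reduced.shape)                     # (4, 2)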
Higher-level features can be obtained from already available features and added to the feature vector; for example, for the study of diseases the feature 'Age' is useful and is defined as Age = 'Year of death' minus 'Year of birth'. This process is referred to as feature construction. Feature construction is the application of a set of constructive operators to a set of existing features, resulting in the construction of new features. Examples of such constructive operators include checking the equality conditions {=, ≠}, the arithmetic operators {+, −, ×, /}, and the array operators {max(S), min(S), average(S)}, as well as other, more sophisticated operators, for example count(S, C), which counts the number of features in the feature vector S satisfying some condition C, or distances to other recognition classes generalized by some accepting device. Feature construction has long been considered a powerful tool for increasing both accuracy and understanding of structure, particularly in high-dimensional problems. Applications include studies of disease and emotion recognition from speech.
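A minimal sketch of a few such constructive operators applied to a hypothetical record:

    # Construct higher-level features from existing ones.
    record = {"year_of_birth": 1921, "year_of_death": 1988,
              "weights": [64.0, 66.5, 63.2]}   # hypothetical raw features

    constructed = {
        # Arithmetic operator: Age = 'Year of death' minus 'Year of birth'.
        "age": record["year_of_death"] - record["year_of_birth"],
        # Array operators over the feature set S = weights.
        "max_weight": max(record["weights"]),
        "avg_weight": sum(record["weights"]) / len(record["weights"]),
        # count(S, C): features in S satisfying condition C (here, > 65).
        "n_heavy": sum(1 for w in record["weights"] if w > 65.0),
    }
    print(constructed)   # {'age': 67, 'max_weight': 66.5, ...}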
Selection and extraction

Main articles: Feature selection and Feature extraction

The initial set of raw features can be redundant and large enough that estimation and optimization are made difficult or ineffective. Therefore, a preliminary step in many applications of machine learning and pattern recognition consists of selecting a subset of features, or constructing a new and reduced set of features, to facilitate learning and to improve generalization and interpretability.

Extracting or selecting features is a combination of art and science; developing systems to do so is known as feature engineering. It requires experimenting with multiple possibilities and combining automated techniques with the intuition and knowledge of the domain expert. Automating this process is feature learning, where a machine not only uses features for learning, but learns the features itself.
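One of the simplest automated selection heuristics is a variance filter, sketched below on a hypothetical data matrix: near-constant features carry little discriminating information and are dropped.

    import numpy as np

    # Keep only the feature columns whose variance exceeds a threshold.
    X = np.array([[1.0, 0.0, 3.1],
                  [0.9, 0.0, 2.7],
                  [1.1, 0.0, 3.4]])

    keep = X.var(axis=0) > 1e-3    # boolean mask of informative columns
    X_selected = X[:, keep]
    print(X_selected.shape)        # (3, 2) -- the constant column is dropped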
See also

Covariate
Dimensionality reduction
Feature engineering
Hashing trick
Statistical classification
Explainable artificial intelligence
References

Bishop, Christopher (2006). Pattern Recognition and Machine Learning. Berlin: Springer. ISBN 0-387-31073-8.
Liu, H.; Motoda, H. (1998). Feature Selection for Knowledge Discovery and Data Mining. Norwell, MA: Kluwer Academic Publishers.
Piramuthu, S.; Sikora, R. T. (2009). "Iterative feature construction for improving inductive learning algorithms". Expert Systems with Applications 36 (2): 3401–3406.
Bloedorn, E.; Michalski, R. (1998). "Data-driven constructive induction: a methodology and its applications". IEEE Intelligent Systems, special issue on feature transformation and subset selection, March/April: 30–37.
Breiman, L.; Friedman, J. H.; Olshen, R.; Stone, C. (1984). Classification and Regression Trees. Wadsworth.
Sidorova, J.; Badia, T. (2009). "Syntactic learning for ESEDA.1, tool for enhanced speech emotion detection and analysis". Internet Technology and Secured Transactions Conference 2009 (ICITST-2009), London, November 9–12. IEEE.
Hastie, Trevor; Tibshirani, Robert; Friedman, Jerome H. (2009). The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer. ISBN 978-0-387-84884-6.
