Interval predictor model

In regression analysis, an interval predictor model (IPM) is an approach to regression where bounds on the function to be approximated are obtained. This differs from other techniques in machine learning, where usually one wishes to estimate point values or an entire probability distribution. Interval predictor models are sometimes referred to as a nonparametric regression technique, because a potentially infinite set of functions is contained by the IPM, and no specific distribution is implied for the regressed variables.

As a consequence of the theory of scenario optimization, in many cases rigorous predictions can be made regarding the performance of the model at test time. Hence an interval predictor model can be seen as a guaranteed bound on quantile regression. Interval predictor models can also be seen as a way to prescribe the support of random predictor models, of which a Gaussian process is a specific case.

Convex interval predictor models

Typically the interval predictor model is created by specifying a parametric function, which is usually chosen to be the product of a parameter vector and a basis. Usually the basis is made up of polynomial features, although a radial basis is sometimes used. Then a convex set is assigned to the parameter vector, and the size of the convex set is minimized such that every possible data point can be predicted by one possible value of the parameters. Ellipsoidal parameter sets were used by Campi (2009), which yield a convex optimization program to train the IPM. Crespo (2016) proposed the use of a hyperrectangular parameter set, which results in a convenient, linear form for the bounds of the IPM. Hence the IPM can be trained with a linear optimization program:

\operatorname{arg\,min}_{p} \left\{ \mathbb{E}_{x}\left(\bar{y}_{p}(x) - \underline{y}_{p}(x)\right) : \bar{y}_{p}(x^{(i)}) > y^{(i)} > \underline{y}_{p}(x^{(i)}),\ i = 1, \ldots, N \right\}

where the training data examples are y^{(i)} and x^{(i)}, and the interval predictor model bounds \underline{y}_{p}(x) and \bar{y}_{p}(x) are parameterised by the parameter vector p.
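
The optimisation program above can be set up with an off-the-shelf linear-programming solver. The following Python sketch is illustrative only: the function name train_linear_ipm, the one-dimensional polynomial basis, and the use of scipy.optimize.linprog are assumptions made for the example; the expectation over x is replaced by the sample average, and each bound is given its own coefficient vector rather than the hyperrectangular parameter set of Crespo (2016).

    import numpy as np
    from scipy.optimize import linprog

    def train_linear_ipm(x, y, degree=3):
        """Fit polynomial upper/lower bounds containing every training point,
        minimising the average interval width by linear programming."""
        phi = np.vander(x, degree + 1)                 # basis evaluations, shape (N, d)
        n, d = phi.shape
        # Decision vector z = [p_upper, p_lower]; objective = mean predicted width.
        c = np.concatenate([phi.mean(axis=0), -phi.mean(axis=0)])
        # Containment constraints written as A_ub @ z <= b_ub:
        #   phi @ p_lower <= y   and   -(phi @ p_upper) <= -y
        a_ub = np.block([[np.zeros((n, d)), phi],
                         [-phi, np.zeros((n, d))]])
        b_ub = np.concatenate([y, -y])
        res = linprog(c, A_ub=a_ub, b_ub=b_ub, bounds=[(None, None)] * (2 * d))
        p_upper, p_lower = res.x[:d], res.x[d:]
        def predict(x_new):
            basis = np.vander(np.atleast_1d(x_new), degree + 1)
            return basis @ p_lower, basis @ p_upper    # (lower bound, upper bound)
        return predict

Calling the returned predictor on new inputs gives the lower and upper bound of the interval prediction at each point; by construction every training observation lies between the two fitted polynomials.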

The reliability of such an IPM is obtained by noting that, for a convex IPM, the number of support constraints is less than the dimensionality of the trainable parameters, and hence the scenario approach can be applied.
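
The following sketch shows how such a reliability statement is typically evaluated numerically. It assumes the standard confidence bound from the convex scenario approach of Campi and Garatti, which is not spelled out in this article, and the function name is hypothetical: given N training points and an upper bound d on the number of support constraints, it returns a violation level eps such that, with confidence at least 1 - beta, a future data point falls outside the predicted interval with probability at most eps.

    import numpy as np
    from scipy.special import comb

    def scenario_epsilon(n_samples, n_support, beta=1e-6):
        """Solve sum_{i<d} C(N, i) eps^i (1-eps)^(N-i) <= beta for eps by bisection."""
        def confidence_gap(eps):
            i = np.arange(n_support)
            tail = comb(n_samples, i) * eps ** i * (1.0 - eps) ** (n_samples - i)
            return tail.sum() - beta

        lo, hi = 0.0, 1.0                      # confidence_gap is decreasing in eps
        for _ in range(100):
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if confidence_gap(mid) > 0 else (lo, mid)
        return hi

    # Example: eps = scenario_epsilon(1000, 5) bounds the probability that a new
    # point falls outside an IPM trained on 1000 points with 5 free parameters.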

Lacerda (2017) demonstrated that this approach can be extended to situations where the training data is interval valued rather than point valued.

Multiple-input multiple-output IPMs for multi-point data, commonly used to represent functions, have been recently developed. These IPMs prescribe the parameters of the model as a path-connected, semi-algebraic set using sliced-normal or sliced-exponential distributions. A key advantage of this approach is its ability to characterize complex parameter dependencies to varying fidelity levels. This enables the analyst to adjust the desired level of conservatism in the prediction.

Non-convex interval predictor models

In Campi (2015) a non-convex theory of scenario optimization was proposed. This involves measuring the number of support constraints, S, for the interval predictor model after training, and hence making predictions about the reliability of the model. This enables non-convex IPMs to be created, such as a single-layer neural network. Campi (2015) demonstrates an algorithm in which the scenario optimization program is solved only S times, and which can determine the reliability of the model at test time without a prior evaluation on a validation set. This is achieved by solving the optimisation program

\operatorname{arg\,min}_{p} \left\{ h : |\hat{y}_{p}(x^{(i)}) - y^{(i)}| < h,\ i = 1, \ldots, N \right\},

where the interval predictor model center line \hat{y}_{p}(x) = (\bar{y}_{p}(x) + \underline{y}_{p}(x))/2, and the model width h = (\bar{y}_{p}(x) - \underline{y}_{p}(x))/2. This results in an IPM which makes predictions with homoscedastic uncertainty.

Sadeghi (2019) demonstrates that the non-convex scenario approach from Campi (2015) can be extended to train deeper neural networks which predict intervals with heteroscedastic uncertainty on datasets with imprecision. This is achieved by proposing generalizations to the max-error loss function given by

\mathcal{L}_{\text{max-error}} = \max_{i} |y^{(i)} - \hat{y}_{p}(x^{(i)})|,

which is equivalent to solving the optimisation program proposed by Campi (2015).
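
As a concrete illustration of the homoscedastic case, the sketch below fits a small single-hidden-layer network by minimising the max-error loss directly and returns the interval given by the center line plus or minus the optimal max error h. The architecture, the derivative-free optimiser, and the function names are assumptions made for the example rather than details taken from the cited works.

    import numpy as np
    from scipy.optimize import minimize

    def fit_nonconvex_ipm(x, y, n_hidden=5, seed=0):
        """Train a single-hidden-layer IPM center line by minimising max_i |y_i - yhat(x_i)|."""
        rng = np.random.default_rng(seed)

        def predict(params, x_in):
            w1 = params[:n_hidden]
            b1 = params[n_hidden:2 * n_hidden]
            w2 = params[2 * n_hidden:3 * n_hidden]
            b2 = params[-1]
            return np.tanh(np.outer(x_in, w1) + b1) @ w2 + b2   # center line yhat_p(x)

        def max_error(params):
            return np.max(np.abs(y - predict(params, x)))        # the max-error loss

        p0 = rng.normal(scale=0.5, size=3 * n_hidden + 1)
        res = minimize(max_error, p0, method="Nelder-Mead",
                       options={"maxiter": 20000, "fatol": 1e-8, "xatol": 1e-8})
        h = max_error(res.x)                                      # half-width of the interval
        return lambda x_new: (predict(res.x, x_new) - h,
                              predict(res.x, x_new) + h)

Because the loss is the maximum absolute residual, minimising it corresponds to the program displayed above (with non-strict inequalities), although a local optimiser such as Nelder-Mead is not guaranteed to find the global solution of this non-convex problem.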

Garatti (2019) proved that Chebyshev layers (i.e., the minimax layers around functions fitted by linear \ell_{\infty}-regression) belong to a particular class of interval predictor models, for which the reliability is invariant with respect to the distribution of the data.

Applications

Initially, scenario optimization was applied to robust control problems.

Crespo (2015) and (2021) applied interval predictor models to the design of space radiation shielding and to system identification.

In Patelli (2017), Faes (2019), and Crespo (2018), interval predictor models were applied to the structural reliability analysis problem. Brandt (2017) applies interval predictor models to fatigue damage estimation of offshore wind turbine jacket substructures.

Software implementations

OpenCOSSAN provides a Matlab implementation of the work of Crespo (2015).

References

Campi, M.C.; Calafiore, G.; Garatti, S. (2009). "Interval predictor models: Identification and reliability". Automatica. 45 (2): 382–392. doi:10.1016/j.automatica.2008.09.004.
Campi, Marco C.; Garatti, Simone; Prandini, Maria (2009). "The scenario approach for systems and control design". Annual Reviews in Control. 33 (2): 149–157. doi:10.1016/j.arcontrol.2009.07.001.
Campi, Marco C.; Garatti, Simone; Ramponi, Federico A. (2015). "Non-convex scenario optimization with application to system identification". 2015 54th IEEE Conference on Decision and Control (CDC). pp. 4023–4028. doi:10.1109/CDC.2015.7402845. ISBN 978-1-4799-7886-1.
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P. (2016). "Interval Predictor Models With a Linear Parameter Dependency". Journal of Verification, Validation and Uncertainty Quantification. 1 (2): 021007. doi:10.1115/1.4032070.
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.; Norman, Ryan B.; Blattnig, Steve (2016). "Application of Interval Predictor Models to Space Radiation Shielding". 18th AIAA Non-Deterministic Approaches Conference. doi:10.2514/6.2016-0431. ISBN 978-1-62410-397-1.
Lacerda, Marcio J.; Crespo, Luis G. (2017). "Interval predictor models for data with measurement uncertainty". 2017 American Control Conference (ACC). pp. 1487–1492. doi:10.23919/ACC.2017.7963163. ISBN 978-1-5090-5992-8.
Patelli, Edoardo; Broggi, Matteo; Tolo, Silvia; Sadeghi, Jonathan (2017). "Cossan Software: A Multidisciplinary and Collaborative Software for Uncertainty Quantification". Proceedings of the 2nd International Conference on Uncertainty Quantification in Computational Sciences and Engineering (UNCECOMP 2017). pp. 212–224. doi:10.7712/120217.5364.16982. ISBN 978-618-82844-4-9.
Brandt, Sebastian; Broggi, Matteo; Hafele, Jan; Guillermo Gebhardt, Cristian; Rolfes, Raimund; Beer, Michael (2017). "Meta-models for fatigue damage estimation of offshore wind turbines jacket substructures". Procedia Engineering. 199: 1158–1163. doi:10.1016/j.proeng.2017.09.292.
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P. (2018). "Staircase predictor models for reliability and risk analysis". Structural Safety. 75: 35–44. doi:10.1016/j.strusafe.2018.05.002.
Crespo, Luis; Colbert, Brendon; Kenny, Sean; Giesy, Daniel (2019). "On the quantification of aleatory and epistemic uncertainty using Sliced-Normal distributions". Systems and Control Letters. 134: 104560. doi:10.1016/j.sysconle.2019.104560.
Faes, Matthias; Sadeghi, Jonathan; Broggi, Matteo; De Angelis, Marco; Patelli, Edoardo; Beer, Michael; Moens, David (2019). "On the robust estimation of small failure probabilities for strong non-linear models". ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part B: Mechanical Engineering. 5 (4). doi:10.1115/1.4044044.
Garatti, S.; Campi, M.C.; Carè, A. (2019). "On a class of Interval Predictor Models with universal reliability". Automatica. 110: 108542. doi:10.1016/j.automatica.2019.108542.
Sadeghi, Jonathan C.; De Angelis, Marco; Patelli, Edoardo (2019). "Efficient Training of Interval Neural Networks for Imprecise Training Data". Neural Networks. 118: 338–351. doi:10.1016/j.neunet.2019.07.005. PMID 31369950.
Crespo, Luis G.; Kenny, Sean P.; Colbert, Brendon K.; Slagel, Tanner (2021). "Interval Predictor Models for Robust System Identification". 2021 60th IEEE Conference on Decision and Control (CDC). pp. 872–879. doi:10.1109/CDC45484.2021.9683582. ISBN 978-1-6654-3659-5.
Crespo, Luis G.; Colbert, Brendon K.; Slagel, Tanner; Kenny, Sean P. (2021). "Robust Estimation of Sliced-Exponential Distributions". 2021 60th IEEE Conference on Decision and Control (CDC). pp. 6742–6748. doi:10.1109/CDC45484.2021.9683584. ISBN 978-1-6654-3659-5.
