137:
means that a symbolic regression algorithm may take longer than traditional regression techniques to find an appropriate model and parametrization. This can be attenuated by limiting the set of building blocks provided to the algorithm, based on existing knowledge of the system that produced the data; but in the end, using symbolic regression is a decision that has to be balanced with how much is known about the underlying system.
(to ensure the models accurately predict the data), but also special complexity measures, thus ensuring that the resulting models reveal the data's underlying structure in a way that is understandable from a human perspective. This facilitates reasoning and improves the odds of gaining insight into the data-generating system, as well as improving generalisability and extrapolation behaviour by preventing
213:
In the real-world track, methods were trained to build interpretable predictive models for 14-day forecast counts of COVID-19 cases, hospitalizations, and deaths in New York State. These models were reviewed by a subject-matter expert, assigned trust ratings, and evaluated for accuracy and simplicity. The
136:
This approach has the disadvantage of a much larger space to search, because not only is the search space in symbolic regression infinite, but there is an infinite number of models that will perfectly fit a finite data set (provided that the model complexity isn't artificially limited). This
144:
requires diversity in order to effectively explore the search space, the result is likely to be a selection of high-scoring models (and their corresponding sets of parameters). Examining this collection could provide better insight into the underlying process, and allow the user to identify an
132:
While conventional regression techniques seek to optimize the parameters for a pre-specified model structure, symbolic regression avoids imposing prior assumptions, and instead infers the model from the data. In other words, it attempts to discover both model structures and model parameters.
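This contrast can be sketched in a few lines, assuming a toy dataset and a pre-specified linear form y = a*x + b (the function name `fit_line` and the data values are illustrative, not from any particular method): conventional regression only fits the parameters of that fixed structure.

```python
# Conventional regression: the model structure y = a*x + b is chosen in
# advance, and only the parameters a and b are optimized (closed-form
# least squares). The structure itself is never in question.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx  # slope, intercept

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # generated by y = 2x + 1
a, b = fit_line(xs, ys)
print(a, b)  # recovers a=2, b=1, but only because the assumed
             # structure happened to match the data-generating law
```

Symbolic regression, by contrast, would have to discover the linear structure itself before (or while) fitting its parameters.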
Usually, a subset of these primitives is specified by the person operating the algorithm, but that is not a requirement of the technique. The symbolic regression problem for mathematical functions has been tackled with a variety of methods, including recombining equations, most commonly using
178:
in Boston, MA. The competition pitted nine leading symbolic regression algorithms against each other on a novel set of data problems and considered different evaluation criteria. The competition was organized in two tracks, a synthetic track and a real-world data track.
22:
Nevertheless, if the sought-for equation is not too complex, it is possible to solve the symbolic regression problem exactly by generating every possible function (built from some predefined set of operators) and evaluating them on the dataset in question.
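A minimal sketch of that exhaustive approach, under illustrative assumptions (a tiny primitive set of `neg`, `add`, `mul`, terminals `x`, 1.0 and 2.0, and a fixed depth limit; none of these choices come from a specific published method):

```python
import itertools
import math

# Exhaustive symbolic regression: enumerate every expression tree up to
# a fixed depth over a small primitive set, then keep whichever one
# best fits the data.

UNARY = {"neg": lambda a: -a}
BINARY = {"add": lambda a, b: a + b, "mul": lambda a, b: a * b}
TERMINALS = ["x", 1.0, 2.0]

def enumerate_exprs(depth):
    """Yield (expression-string, callable) pairs up to the given depth."""
    if depth == 0:
        for t in TERMINALS:
            if t == "x":
                yield "x", lambda x: x
            else:
                yield str(t), (lambda c: (lambda x: c))(t)
        return
    yield from enumerate_exprs(depth - 1)
    subs = list(enumerate_exprs(depth - 1))
    for name, f in UNARY.items():
        for s_str, s_fn in subs:
            yield f"{name}({s_str})", (lambda u, s: (lambda x: u(s(x))))(f, s_fn)
    for name, f in BINARY.items():
        for (a_str, a_fn), (b_str, b_fn) in itertools.product(subs, repeat=2):
            yield (f"{name}({a_str},{b_str})",
                   (lambda g2, fa, fb: (lambda x: g2(fa(x), fb(x))))(f, a_fn, b_fn))

def best_fit(xs, ys, depth=2):
    """Return (squared error, expression) of the best candidate found."""
    best = (math.inf, None)
    for expr, fn in enumerate_exprs(depth):
        err = sum((fn(x) - y) ** 2 for x, y in zip(xs, ys))
        if err < best[0]:
            best = (err, expr)
    return best

xs = [0.0, 1.0, 2.0, 3.0]
ys = [2 * x + 1 for x in xs]          # hidden law: y = 2x + 1
err, expr = best_fit(xs, ys)
print(expr, err)                      # finds an exact fit, e.g. add(mul(2.0,x),1.0)
```

Even this toy version shows why the approach only scales to simple equations: the number of candidates grows combinatorially with depth and with the size of the primitive set.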
Another non-classical alternative method to SR is called Universal Functions Originator (UFO), which has a different mechanism, search-space, and building strategy. Further methods such as Exact Learning attempt to transform the fitting problem into a
It attempts to uncover the intrinsic relationships of the dataset by letting the patterns in the data itself reveal the appropriate models, rather than imposing a model structure that is deemed mathematically tractable from a human perspective. The
254:
developed the "AI Feynman" algorithm, which attempts symbolic regression by training a neural network to represent the mystery function, then runs tests against the neural network to attempt to break up the problem into smaller parts. For example, if
, solved only 71. AI Feynman, in contrast to classic symbolic regression methods, requires a very large dataset in order to first train the neural network, and is naturally biased towards equations that are common in elementary physics.
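The divide-and-conquer test that AI Feynman performs can be illustrated for additive separability. Here an ordinary Python function stands in for the trained neural network, and the probe points are arbitrary illustrative values: if f(x1, x2) = g(x1) + h(x2), the mixed difference f(a,c) - f(a,d) - f(b,c) + f(b,d) vanishes for any probe points, so the problem can be split and solved per variable.

```python
# A sketch of an additive-separability test. AI Feynman probes a trained
# neural network; here a plain Python function stands in for that network.

def additively_separable(f, probes, tol=1e-9):
    """True if f(x1, x2) behaves like g(x1) + h(x2) at the probe points."""
    (a, b), (c, d) = probes  # two distinct values per variable
    mixed = f(a, c) - f(a, d) - f(b, c) + f(b, d)
    return abs(mixed) < tol

separable   = lambda x1, x2: x1**2 + 3*x2   # g(x1) + h(x2)
inseparable = lambda x1, x2: x1 * x2        # no additive split exists

probes = ((0.5, 1.5), (2.0, -1.0))
print(additively_separable(separable, probes))    # True
print(additively_separable(inseparable, probes))  # False
```

A positive test lets the algorithm recurse on two strictly smaller sub-problems, one per group of variables.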
458:
187:
In the synthetic track, methods were compared according to five properties: re-discovery of exact expressions; feature selection; resistance to local optima; extrapolation; and sensitivity to noise. Rankings of the methods were:
, which reduces the problem to a more manageable size. AI Feynman also transforms the inputs and outputs of the mystery function in order to produce a new function which can be solved with other techniques, and performs
1201:
Kevin René Broløs; Meera Vieira
Machado; Chris Cave; Jaan Kasak; Valdemar Stentoft-Hansen; Victor Galindo Batanero; Tom Jelen; Casper Wilstrup (2021-04-12). "An Approach to Symbolic Regression Using Feyn".
1166:
Udrescu, Silviu-Marian; Tan, Andrew; Feng, Jiahai; Neto, Orisvaldo; Wu, Tailin; Tegmark, Max (2020-12-16). "AI Feynman 2.0: Pareto-optimal symbolic regression exploiting graph modularity".
243:
by implementing evolutionary algorithms that iteratively improve the best-fit expression over many generations. Recently, researchers have proposed algorithms utilizing other tactics in
1401:
46:
No particular model is provided as a starting point for symbolic regression. Instead, initial expressions are formed by randomly combining mathematical building blocks such as
The benchmark is intended to be a living project: it encourages the submission of improvements, new datasets, and new methods, to keep track of the state of the art in SR.
1452:
162:
was proposed as a large benchmark for symbolic regression. In its inception, SRBench featured 14 symbolic regression methods, 7 other ML methods, and 252 datasets from
175:
258:
533:
is a quantum-inspired simulation and machine learning technology that searches through an infinite list of potential mathematical models to solve a problem.
498:
478:
1044:
1442:"Performance improvement of machine learning via automatic discovery of facilitating functions as applied to a problem of symbolic system identification"
1043:
La Cava, William; Orzechowski, Patryk; Burlacu, Bogdan; de Franca, Fabricio; Virgolin, Marco; Jin, Ying; Kommenda, Michael; Moore, Jason (2021).
43:
Symbolic regression (SR) is a type of regression analysis that searches the space of mathematical expressions to find the model that best fits a given dataset, both in terms of accuracy and simplicity.
1421:
1412:
1376:
1482:(Java applet) — approximates a function by evolving combinations of simple arithmetic operators, using algorithms developed by
509:
Accuracy and simplicity may be left as two separate objectives of the regression—in which case the optimum solutions form a
694:
75:
943:"Order of nonlinearity as a complexity measure for models generated by symbolic regression via pareto genetic programming"
629:
599:
98:
120:
problem, in the sense that one cannot always find the best possible mathematical expression to fit a given dataset in
1517:
1512:
679:
603:
574:
570:
501:
659:
564:
560:
557:, a software environment for heuristic and evolutionary algorithms, including symbolic regression (free, open source)
174:
In 2022, SRBench announced the competition
Interpretable Symbolic Regression for Data Science, which was held at the
669:
635:-compatible interface, achieved one of the best trade-offs between accuracy and simplicity of discovered models on
110:
1522:
674:
71:
908:
508:
to reduce the number of independent variables involved. The algorithm was able to "discover" 100 equations from
Michael
Kommenda; William La Cava; Maimuna Majumder; Fabricio Olivetti de França; Marco Virgolin.
—or they may be combined into a single objective by means of a model selection principle such as
is a Genetic Programming-based automated feature construction algorithm for symbolic regression.
Izzo, Dario; Biscani, Francesco; Mereta, Alessio (2016). "Differentiable genetic programming".
89:
specification of a model, symbolic regression is not affected by human bias or unknown gaps in
Ying Jin; Weilin Fu; Jian Kang; Jiadong Guo; Jian Guo (2019). "Bayesian
Symbolic Regression".
Bartlett, Deaglan; Desmond, Harry; Ferreira, Pedro (2023). "Exhaustive
Symbolic Regression".
f(x_1, ..., x_i, x_{i+1}, ..., x_n) = g(x_1, ..., x_i) + h(x_{i+1}, ..., x_n)
Nevertheless, this characteristic of symbolic regression also has advantages: because the
Proceedings of the Neural
Information Processing Systems Track on Datasets and Benchmarks
, tests against the neural network can recognize the separation and proceed to solve for g and h
Mark J. Willis; Hugo G. Hiden; Ben McKay; Gary A. Montague; Peter
Marenbach (1997).
approximation that better fits their needs in terms of accuracy and simplicity.
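A minimal sketch of such an evolutionary loop, returning a small collection of high-scoring models rather than a single winner (the primitive set, mutation scheme, population size, and rates below are illustrative choices, not a specific published algorithm):

```python
import random

# Expressions are nested tuples over a tiny primitive set.
OPS = {"add": lambda a, b: a + b, "sub": lambda a, b: a - b,
       "mul": lambda a, b: a * b}

def random_expr(depth=2):
    # grow a random expression tree
    if depth == 0 or random.random() < 0.3:
        return random.choice(["x", 1.0, 2.0, 3.0])
    op = random.choice(sorted(OPS))
    return (op, random_expr(depth - 1), random_expr(depth - 1))

def evaluate(expr, x):
    if expr == "x":
        return x
    if isinstance(expr, float):
        return expr
    op, a, b = expr
    return OPS[op](evaluate(a, x), evaluate(b, x))

def error(expr, data):
    # sum of squared errors; treat overflow/NaN as infinitely bad
    try:
        e = sum((evaluate(expr, x) - y) ** 2 for x, y in data)
    except OverflowError:
        return float("inf")
    return e if e == e else float("inf")

def mutate(expr):
    # replace a randomly chosen subtree with a fresh random expression
    if not isinstance(expr, tuple) or random.random() < 0.3:
        return random_expr(1)
    op, a, b = expr
    return (op, mutate(a), b) if random.random() < 0.5 else (op, a, mutate(b))

def evolve(data, pop_size=60, generations=40, keep=5):
    random.seed(0)  # deterministic for the sketch
    pop = [random_expr() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda e: error(e, data))
        survivors = pop[:pop_size // 2]
        pop = survivors + [mutate(random.choice(survivors))
                           for _ in range(pop_size - len(survivors))]
    pop.sort(key=lambda e: error(e, data))
    return pop[:keep]  # a collection of high-scoring models, not one winner

data = [(x, 2 * x + 1) for x in [0.0, 1.0, 2.0, 3.0]]
for model in evolve(data):
    print(model, error(model, data))
```

Real systems add crossover, parsimony pressure, and constant optimization, but the shape of the loop (score, select, vary, repeat) and the final population of candidate models is the same.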
1264:"SR-Forest: A Genetic Programming based Heterogeneous Ensemble Learning Method"
1078:"SRBench Competition 2022: Interpretable Symbolic Regression for Data Science"
551:, differentiable Cartesian Genetic Programming in python (free, open source)
Zhang, Hengzhe; Zhou, Aimin; Chen, Qi; Xue, Bing; Zhang, Mengjie (2023).
1045:"Contemporary Symbolic Regression Methods and their Relative Performance"
technique for various problems including symbolic regression (commercial)
separately and with different variables as inputs. This is an example of
in a natural function space, usually built around generalizations of the
592:, symbolic regression software based on simulated annealing (commercial)
815:(16). American Association for the Advancement of Science: eaay2631.
Ekaterina J. Vladislavleva; Guido F. Smits; Dick Den Hertog (2009).
726:(5923). American Association for the Advancement of Science: 81–85.
97:
that drives the evolution of the models takes into account not only
1318:"Differentiable Cartesian Genetic Programming, v1.6 Documentation"
20:
1402:"Genetic programming: An introduction and survey of applications"
589:
28:
as it can be used in symbolic regression to represent a function.
1377:"'Machine Scientists' Distill the Laws of Physics From Raw Data"
1093:"AI Feynman: A physics-inspired method for symbolic regression"
805:"AI Feynman: A physics-inspired method for symbolic regression"
636:
577:
for symbolic regression and classification (free, open source)
583:, evolutionary symbolic regression software (commercial), and
159:
1333:
Proceedings of the
European Conference on Genetic Programming
545:
is a deep learning framework for symbolic optimization tasks
909:"A Natural Representation of Functions for Exact Learning"
716:"Distilling free-form natural laws from experimental data"
512:, while a leading software using evolutionary algorithms,
650:
Closed-form expression § Conversion from numerical forms
1187:
1478:"Simple Symbolic Regression Using Genetic Programming"
1429:
Empowering
Knowledge Computing with Variable Selection
1223:
Zhang, Hengzhe; Zhou, Aimin; Zhang, Hu (August 2022).
1440:
John R. Koza; Martin A. Keane; James P. Rice (1993).
1492:"Symbolic Regression: Function Discovery & More"
1091:Udrescu, Silviu-Marian; Tegmark, Max (2020-04-17).
1188:"Feyn is a Python module for running the QLattice"
It has been proven that symbolic regression is an
1449:IEEE International Conference on Neural Networks
1357:"High-Performance Symbolic Regression in Python"
860:Ali R. Al-Roomi; Mohamed E. El-Hawary (2020).
1268:IEEE Transactions on Evolutionary Computation
1229:IEEE Transactions on Evolutionary Computation
1004:IEEE Transactions on Evolutionary Computation
950:IEEE Transactions on Evolutionary Computation
598:, symbolic regression environment written in
8:
239:Most symbolic regression algorithms prevent
803:Silviu-Marian Udrescu; Max Tegmark (2020).
798:
796:
66:, as well as more recent methods utilizing
983:Virgolin, Marco; Pissis, Solon P. (2022).
989:Transactions on Machine Learning Research
1225:"An Evolutionary Forest for Regression"
706:
614:-free optimization (free, open source)
1420:Wouter Minnebo; Sean Stijven (2011).
7:
714:Michael Schmidt; Hod Lipson (2009).
Difference from classical regression
1469:"Symbolic regression — an overview"
14:
219:uDSR (Deep Symbolic Optimization)
203:uDSR (Deep Symbolic Optimization)
198:PySR (Python Symbolic Regression)
1422:"Chapter 4: Symbolic Regression"
985:"Symbolic Regression is NP-hard"
862:"Universal Functions Originator"
606:, using regularized evolution,
510:The Feynman Lectures on Physics
695:Discovery system (AI research)
571:Multi Expression Programming X
229:geneticengine (Genetic Engine)
1:
907:Benedict W. J. Irwin (2021).
1298:"Deep symbolic optimization"
680:Multi expression programming
575:Multi expression programming
214:ranking of the methods was:
1409:IEE Conference Publications
660:Gene expression programming
639:in 2021 (free, open source)
565:Gene expression programming
16:Type of regression analysis
1539:
920:10.21203/rs.3.rs-149856/v1
878:10.1016/j.asoc.2020.106417
670:Linear genetic programming
250:Silviu-Marian Udrescu and
111:minimum description length
1276:10.1109/TEVC.2023.3243172
1241:10.1109/TEVC.2021.3136667
1022:10.1109/TEVC.2023.3280250
872:. Elsevier B.V.: 106417.
675:Mathematical optimization
628:symbolic regression with
563:, - an implementation of
1476:Hansueli Gerber (1998).
962:10.1109/tevc.2008.926486
SRBench Competition 2022
750:10.1126/science.1165893
573:, an implementation of
241:combinatorial explosion
1127:10.1126/sciadv.aay2631
866:Applied Soft Computing
829:10.1126/sciadv.aay2631
142:evolutionary algorithm
48:mathematical operators
29:
1490:Katya Vladislavleva.
1467:Ivan Zelinka (2004).
1433:University of Antwerp
665:Kolmogorov complexity
506:dimensional analysis
Non-standard methods
1518:Genetic programming
1513:Regression analysis
1455:. pp. 191–198.
1415:. pp. 314–319.
1119:2020SciA....6.2631U
821:2020SciA....6.2631U
732:2009Sci...324...81S
690:Reverse mathematics
685:Regression analysis
655:Genetic programming
608:simulated annealing
537:Evolutionary Forest
64:genetic programming
regression analysis
Symbolic regression
502:divide and conquer
52:analytic functions
30:
1451:. San Francisco:
1365:. 18 August 2022.
End-user software
85:By not requiring
80:Meijer-G function
1530:
1523:Computer algebra
1499:
1494:. Archived from
1431:(M.Sc. thesis).
1320:. June 10, 2022.
1306:. June 22, 2022.
1190:. June 22, 2022.
1103:(16): eaay2631.
1097:Science Advances
585:software library
Real-world Track
176:GECCO conference
95:fitness function
91:domain knowledge
68:Bayesian methods
Further reading
1385:. May 10, 2022.
1382:Quanta Magazine
Synthetic Track
122:polynomial time
76:moments problem
72:neural networks
60:state variables
39:) is a type of
26:Expression tree
1498:on 2014-12-18.
External links
1235:(4): 735–749.
956:(2): 333–349.
561:GeneXProTools
99:error metrics
1496:the original
914:(Preprint).
633:scikit-learn
626:evolutionary
555:HeuristicLab
Benchmarking
107:Pareto front
252:Max Tegmark
103:overfitting
1342:1611.04766
1209:2104.05417
1173:2006.10782
1110:1905.11481
1062:2107.14351
1013:2211.11461
786:1910.08892
References
624:back-end)
1484:John Koza
1284:1089-778X
1249:1089-778X
1135:2375-2548
1030:253735380
928:234014141
894:219743405
886:1568-4946
736:CiteSeerX
590:TuringBot
158:In 2021,
56:constants
1153:32426452
970:12072764
847:32426452
758:19342586
See also
620:, fast (
618:GP-GOMEA
612:gradient
531:QLattice
Software
224:QLattice
193:QLattice
87:a priori
1270:: 1–1.
1144:7159912
1115:Bibcode
838:7159912
817:Bibcode
766:7366016
728:Bibcode
720:Science
637:SRBench
160:SRBench
154:SRBench
118:NP-hard
1362:GitHub
1303:GitHub
630:Python
610:, and
600:Python
581:Eureqa
514:Eureqa
58:, and
1445:(PDF)
1425:(PDF)
1405:(PDF)
1337:arXiv
1204:arXiv
1168:arXiv
1105:arXiv
1057:arXiv
1026:S2CID
1008:arXiv
1006:: 1.
966:S2CID
946:(PDF)
924:S2CID
912:(PDF)
890:S2CID
781:arXiv
762:S2CID
604:Julia
1453:IEEE
1280:ISSN
1245:ISSN
1149:PMID
1131:ISSN
882:ISSN
843:PMID
754:PMID
602:and
596:PySR
549:dCGP
543:uDSR
480:and
164:PMLB
70:and
1413:IEE
1272:doi
1237:doi
1139:PMC
1123:doi
1018:doi
958:doi
916:doi
874:doi
833:PMC
825:doi
746:doi
724:324
622:C++
245:AI
113:.
82:.
54:,
50:,
37:SR