In these sub-variants, while the individual neurons do not store any information from one step to the next, the network as a whole can still have persistent memory because of the implicit one-step delay between the synaptic inputs and the resulting firing of the neuron. In other words, the state of
(and hence the firing probability) depends only on the inputs in the previous time step; all earlier firings of the network, including of the same neuron, are ignored. That is, the neuron does not have any internal state, and is essentially a (stochastic) function block.
Prior integrate-and-fire models with stochastic characteristics relied on adding a noise term to simulate stochasticity. The Galves–Löcherbach model distinguishes itself because it is inherently stochastic, incorporating probabilistic measures directly in the calculation of spikes. It is also a model
that may be applied relatively easily, from a computational standpoint, with a good ratio between cost and efficiency. It remains a non-Markovian model, since the probability of a given neuronal spike depends on the accumulated activity of the system since the last spike.
{\displaystyle \mathop {\mathrm {Prob} } {\biggl (}\,\bigcap _{k\in K}{\bigl \{}X_{k}[t]=a_{k}{\bigr \}}\;{\mathrel {\bigg |}}\;X{\bigl [}\,{\mathrel {:}}\,t-1{\bigr ]}\,{\biggr )}\;\;=\;\;\prod _{k\in K}\mathop {\mathrm {Prob} } {\biggl (}\,X_{k}[t]=a_{k}\;{\mathrel {\bigg |}}\;X{\bigl [}\,{\mathrel {:}}\,t-1{\bigr ]}\,{\biggr )}}
These formulas imply that the potential decays towards zero with time, when there are no external or synaptic inputs and the neuron itself does not fire. Under these conditions, the membrane potential of a biological neuron will tend towards some negative value, the
limit of the interacting neuronal system, the long-range behavior and aspects pertaining to the process in the sense of predicting and classifying behaviors according to a function of parameters, and the generalization of the model to continuous time.
{\displaystyle V_{i}\;\;=\;\;\left\{{\begin{array}{ll}V_{i}^{\mathsf {R}}&\quad \mathrm {if} \;X_{i}=1\\\displaystyle \mu _{i}\,V_{i}\;+\;E_{i}\;+\;\sum _{j\in I\setminus \{i\}}w_{j\to i}\,X_{j}&\quad \mathrm {if} \;X_{i}=0\end{array}}\right.}
{\displaystyle V_{i}\;\;=\;\;\left\{{\begin{array}{ll}V_{i}^{\mathsf {R}}&\quad \mathrm {if} \;X_{i}=1\\\displaystyle E_{i}\;+\;\sum _{j\in I\setminus \{i\}}w_{j\to i}\,X_{j}&\quad \mathrm {if} \;X_{i}=0\end{array}}\right.}
{\displaystyle V_{i}\;\;=\;\;\left\{{\begin{array}{ll}V_{i}^{\mathsf {R}}&\mathrm {if} \;X_{i}=1\\\mu _{i}\,V_{i}&\mathrm {if} \;X_{i}=0\end{array}}\right\}\;+\;E_{i}\;+\;\sum _{j\in I\setminus \{i\}}w_{j\to i}\,X_{j}}
{\displaystyle V_{i}\;\;=\;\;\left\{{\begin{array}{ll}V_{i}^{\mathsf {R}}&\mathrm {if} \;X_{i}=1\\0&\mathrm {if} \;X_{i}=0\end{array}}\right\}\;+\;E_{i}\;+\;\sum _{j\in I\setminus \{i\}}w_{j\to i}\,X_{j}}
of those spikes. The potential may include the spikes of only a finite subset of other neurons, thus modeling arbitrary synapse topologies. In particular, the GL model includes as a special case the general
3D visualization of the Galves–Löcherbach model simulating the spiking of 4000 neurons (4 layers with one population of inhibitory neurons and one population of excitatory neurons each) in 180 time intervals.
{\displaystyle V_{i}\;\;=\;\;\left\{{\begin{array}{ll}0&\mathrm {if} \;X_{i}=1\\\mu _{i}\,V_{i}&\mathrm {if} \;X_{i}=0\end{array}}\right\}\;+\;E_{i}\;+\;\sum _{j\in I}w_{j\to i}\,X_{j}}
In the GL model, all neurons are assumed to evolve synchronously and atomically between successive sampling times. In particular, within each time step, each neuron may fire at most once. A
{\displaystyle \mathop {\mathrm {Prob} } {\biggl (}\,X_{i}[t]=1\;{\mathrel {\bigg |}}\;X{\bigl [}\,{\mathrel {:}}\,t-1{\bigr ]}\,{\biggr )}\;\;=\;\;\Phi _{i}{\biggl (}X{\bigl [}\,{\mathrel {:}}\,t-1{\bigr ]}{\biggr )}}
{\displaystyle V_{i}\;\;=\;\;X_{i}\,V_{i}^{\mathsf {R}}\;\;+\;\;(1-X_{i})\,{\biggl (}\mu _{i}\,V_{i}\;+\;E_{i}\;+\;\sum _{j\in I\setminus \{i\}}w_{j\to i}\,X_{j}{\biggr )}}
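The compact update above can be sketched numerically. The following is a minimal illustration, not the authors' implementation; the network size, weights, leak factors, and external inputs are all invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5                                # number of neurons (illustrative)
W = rng.normal(0.0, 0.3, (n, n))     # synaptic weights, W[j, i] = w_{j->i} (assumed values)
np.fill_diagonal(W, 0.0)             # the sum in the formula excludes the self-synapse
mu = np.full(n, 0.8)                 # recharge (leak) factors mu_i (assumed)
E = np.full(n, 0.1)                  # external inputs E_i (assumed)
V_R = 0.0                            # reset potential V_i^R

def step_potential(V, X):
    """V_i = X_i V^R + (1 - X_i) (mu_i V_i + E_i + sum_{j != i} w_{j->i} X_j)."""
    return X * V_R + (1 - X) * (mu * V + E + X @ W)

V = np.zeros(n)
X = np.array([1, 0, 0, 1, 0])        # spike indicators from the previous step
V = step_potential(V, X)             # neurons 0 and 3 are reset to V_R
```

Neurons that fired are reset to the reset potential; the others accumulate the leaked previous potential, the external input, and the weighted spikes of the other neurons.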
Moreover, the firings in the same time step are conditionally independent, given the past network history, with the above probabilities. That is, for each finite subset
Yaginuma, K. (2015). "A Stochastic System with Infinite Interacting Components to Model the Time Evolution of the Membrane Potentials of a Population of Neurons".
{\displaystyle V_{i}\;\;=\;\;\sum _{t'=\tau _{i}}^{t-1}{\biggl (}E_{i}+\sum _{j\in I}w_{j\rightarrow i}\,X_{j}[t']{\biggr )}\,\alpha _{i}{\bigl [}\,t-1-t'\,{\bigr ]}}
Galves, A.; Löcherbach, E. (2013). "Infinite Systems of Interacting Chains with Memory of Variable Length—A Stochastic Model for Biological Neural Nets".
last fired. Thus each neuron "forgets" all previous spikes, including its own, whenever it fires. This property is a defining feature of the GL model.
{\displaystyle V_{i}\;\;=\;\;X_{i}\,V_{i}^{\mathsf {R}}\;\;+\;\;(1-X_{i})\,{\biggl (}E_{i}\;+\;\sum _{j\in I\setminus \{i\}}w_{j\to i}\,X_{j}{\biggr )}}
{\displaystyle V_{i}\;\;=\;\;X_{i}\,V_{i}^{\mathsf {R}}\;\;+\;\;(1-X_{i})\,\mu _{i}\,V_{i}\;+\;E_{i}\;+\;\sum _{j\in I\setminus \{i\}}w_{j\to i}\,X_{j}}
that the neuron assumes just after firing, apart from other contributions. The potential evolution formula can therefore also be written as
. Galves and Löcherbach referred to the process that Cessac described as "a version in a finite dimension" of their own probabilistic model.
. For simplicity, let's assume that these times extend to infinity in both directions, implying that the network has existed forever.
variant of the integrate-and-fire GL neuron, which ignores all external and synaptic inputs (except possibly the self-synapse
The GL model has been formalized in several different ways. The notations below are borrowed from several of those sources.
{\displaystyle \mathop {\mathrm {Prob} } {\biggl (}\,X_{i}[t]=1\;{\mathrel {\bigg |}}\;X{\bigl [}\,{\mathrel {:}}\,t-1{\bigr ]}\,{\biggr )}\;\;=\;\;\phi _{i}(V_{i})}
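The firing rule above — each neuron fires with a probability given by a monotone function of its potential — can be sketched as follows. The logistic choice of the firing function and the example potentials are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)

def phi(v):
    """A monotonically non-decreasing firing function into [0, 1] (illustrative choice)."""
    return 1.0 / (1.0 + np.exp(-v))

def sample_spikes(V):
    """Each neuron fires independently with probability phi(V_i), given the history."""
    return (rng.random(V.shape) < phi(V)).astype(int)

V = np.array([-5.0, 0.0, 5.0])       # example potentials (assumed)
X = sample_spikes(V)                 # 0/1 spike indicators for this step
```

Higher potentials make firing more likely, but every spike remains a random event, which is what makes the model inherently stochastic.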
{\displaystyle V_{i}\;\;=\;\;X_{i}\,V_{i}^{\mathsf {R}}\;\;+\;\;E_{i}\;+\;\sum _{j\in I\setminus \{i\}}w_{j\to i}\,X_{j}}
of a biological neuron), and is basically a weighted sum of the past spike indicators, since the last firing of neuron
of indices. The state is defined only at discrete sampling times, represented by integers, with some fixed time step
Plesser, H. E.; Gerstner, W. (2000). "Noise in Integrate-and-Fire Neurons: From Stochastic Input to Escape Rates".
Baccelli, François; Taillefumier, Thibaud (2019). "Replica-mean-field limits for intensity-based neural networks".
Even more specific sub-variants of the integrate-and-fire GL neuron are obtained by setting the recharge factor
. The blue frame encloses all firing events that influence the probability of neuron 3 firing in the step from
De Masi, A.; Galves, A.; Löcherbach, E.; Presutti, E. (2015). "Hydrodynamic limit for interacting neurons".
are approximated in the GL model. The absence of a synapse between those two neurons is modeled by setting
Cessac, B. (2011). "A discrete time neural network model with spiking neurons: II: Dynamics with noise".
4700:
4651:
1147:
8108:
8090:
8070:
8065:
7784:
7616:
7596:
7443:
7386:
7225:
7135:
3119:
2352:
7576:
423:
199:
In specific versions of the GL model, the past network spike history since the last firing of a neuron
3658:{\displaystyle V_{i}\;\;=\;\;(1-X_{i})\,\mu _{i}\,V_{i}\;+\;E_{i}\;+\;\sum _{j\in I}w_{j\to i}\,X_{j}}
1144:
Illustration of the general Galves-Löcherbach model for a neuronal network of 7 neurons, with indices
has no influence outside of the neuron, its zero level can be chosen independently for each neuron.
(blue arrow and empty blue box). The red details indicate the corresponding concepts for neuron 6.
In the most general definition, a GL network consists of a countable number of elements (idealized
is defined to be a decaying weighted sum of the firings of other neurons. Namely, when a neuron
Duarte, A.; Ost, G. (2014). "A model for neural activity in the absence of external stimuli".
"Phase transitions and self-organized criticality in networks of stochastic spiking neurons"
{\displaystyle \tau _{i}=\mathop {\mathrm {max} } \{s<t\;{\mathrel {\big |}}\;X_{i}[s]=1\}.}
) during the time step immediately after its own firing. The equation for this variant is
for each neuron, which can be assumed to be stored in its axon in the form of a traveling
. Apart from those contributions, during each time step, the potential decays by a fixed
, represents some additional contribution to the potential that may arrive between times
fires, its potential is reset to zero. Until its next firing, a spike from any neuron
of the neuron as the reference for potential measurements. Since the potential
denote the matrix whose rows are the histories of all neuron firings from time
Fournier, N.; Löcherbach, E. (2014). "On a toy model of interacting neurons".
's study on the leaky integrate-and-fire model, who himself was influenced by
In a common special case of the GL model, the part of the past firing history
that depends on the history of the firings of all neurons since the last time
However, this apparent discrepancy exists only because it is customary in
The GL network model consists of a countable set of neurons with some set
. That discrepancy disappears if one chooses the baseline potential
from other sources besides the firings of other neurons. The factor
. The blue digit indicates the last firing of neuron 3 before time
) that interact by sporadic nearly-instantaneous discrete events (
be defined similarly, but extending infinitely in the past. Let
This special case results from taking the history weight factor
"Modelos matemáticos do cérebro" [Mathematical models of the brain], Fernanda Teixeira Ribeiro,
In an even more specific case of the GL model, the potential
that modulates the contributions of firings that happened
is summarized by a real-valued internal state variable or
. The matrix of 0s and 1s represents the firing history
to zero. In the resulting neuron model, the potential
to measure electric potentials relative to that of the
Contributions to the model were made, considering the
{\displaystyle X{\bigl [}\,{\mathrel {:}}\,t-1{\bigr ]}}
The Galves–Löcherbach model was a cornerstone to the
{\displaystyle X=((X_{i}[s])_{t'\leq s\leq t})_{i\in I}}
The GL model was defined in 2013 by mathematicians
is a numeric weight that corresponds to the total
In this variant, the evolution of the potential
is a monotonically non-decreasing function from
stochastic chain with memory of variable length
may be summarized by an internal variable, the
Another work that influenced this model was
{\displaystyle V_{i}^{\mathsf {R}}=w_{i\to i}}
for the variant without refractory step, and
. This self-weight therefore represents the
of the general potential-based variant to be
be the time before the last firing of neuron
is the reset potential. Or, more compactly,
whole steps after the last firing of neuron
The evolution equations then simplify to
, which occurred in the time step between
can be expressed by a recurrence formula
{\displaystyle a_{i}\in \{0,1\},i\in K,}
Brochini, Ludmila; et al. (2016).
for the variant with refractory step.
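The two forgetful updates — with and without the refractory step — can be contrasted in a short sketch. All numeric values (weights, inputs, reset potential) are invented for illustration and are not from the source:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
W = rng.normal(0.0, 0.5, (n, n))     # W[j, i] = w_{j->i} (assumed values)
np.fill_diagonal(W, 0.0)             # exclude the self-synapse
E = np.full(n, 0.2)                  # external inputs E_i (assumed)
V_R = 0.0                            # reset potential V_i^R

def step_no_refractory(X):
    """V_i = X_i V^R + E_i + sum_{j != i} w_{j->i} X_j: inputs always counted."""
    return X * V_R + E + X @ W

def step_with_refractory(X):
    """V_i = X_i V^R + (1 - X_i)(E_i + sum_{j != i} w_{j->i} X_j):
    a neuron that just fired ignores all inputs for one step."""
    return X * V_R + (1 - X) * (E + X @ W)

X = np.array([1, 0, 1, 0])           # spike indicators from the previous step
V_no = step_no_refractory(X)
V_ref = step_with_refractory(X)
```

For neurons that did not fire the two rules agree; for neurons that just fired, the refractory rule discards the incoming contributions and leaves only the reset potential.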
Some authors use a slightly different
whole steps before the current time.
or strength of the synapses from the
Then the general GL model says that
{\displaystyle V_{i}^{\mathsf {B}}}
{\displaystyle V_{i}^{\mathsf {B}}}
is negative, each firing of neuron
{\displaystyle I=\{1,2,\ldots ,7\}}
), and there is no external input (
{\displaystyle w_{j\rightarrow i}}
{\displaystyle w_{j\rightarrow i}}
Leaky integrate and fire variants
{\displaystyle t\in \mathbb {Z} }
that is relevant to each neuron
). At each moment, each neuron
Journal of Mathematical Biology
Journal of Statistical Physics
Journal of Statistical Physics
Journal of Statistical Physics
Variant with refractory period
leaky integrate and fire model
to decrease. This is the way
. The rightmost column shows
. Its inspirations included
, on the order of −40 to −80
. It is very similar to the
{\displaystyle \mu _{i}^{s}}
{\displaystyle w_{j\to i}=0}
{\displaystyle \mathbb {R} }
shows the firings of neuron
fires independently, with a
interacting particle system
{\displaystyle \alpha _{i}}
{\displaystyle \tau _{3}+1}
denotes whether the neuron
10.1162/089976600300015835
{\displaystyle w_{i\to i}}
{\displaystyle w_{i\to i}}
), no other neuron fires (
{\displaystyle w_{j\to i}}
{\displaystyle K\subset I}
of that neuron, that is a
10.1007/s10955-016-1490-3
10.1007/s10955-014-1145-1
10.1007/s00285-010-0358-4
10.1007/s10955-013-0733-9
{\displaystyle \phi _{i}}
(that corresponds to the
{\displaystyle \tau _{3}}
{\displaystyle \tau _{i}}
) between sampling times
{\displaystyle \mu _{i}}
{\displaystyle \mu _{i}}
Potential-based variants
leaky integrate-and-fire
{\displaystyle E_{i}=0}
{\displaystyle j\neq i}
{\displaystyle X_{j}=0}
{\displaystyle X_{i}=1}
{\displaystyle \alpha }
by the constant amount
If the synaptic weight
history weight function
{\displaystyle X_{i}=0}
{\displaystyle X_{i}=1}
{\displaystyle \Delta }
Galves–Löcherbach model
, namely the value of
or baseline potential
at each sampling time
{\displaystyle i\in I}
and any configuration
{\displaystyle X_{i}}
neurons is a list of
{\displaystyle V_{i}}
{\displaystyle V_{i}}
{\displaystyle V_{i}}
{\displaystyle V_{i}}
{\displaystyle V_{i}}
{\displaystyle V_{i}}
{\displaystyle V_{i}}
causes the potential
{\displaystyle E_{i}}
{\displaystyle V_{i}}
{\displaystyle X_{i}}
{\displaystyle X_{i}}
or, more compactly,
extracellular medium
Or, more compactly,
{\displaystyle t'+1}
{\displaystyle t+1}
inhibitory synapses
{\displaystyle t+1}
inclusive, that is
{\displaystyle {}t}
{\displaystyle t+1}
Neural Computation
Scientific Reports
Forgetful variants
If, between times
into the interval
{\displaystyle t'}
membrane potential
{\displaystyle t'}
network of neurons
mathematical model
10.1038/srep35831
. article 35831.
{\displaystyle n}
{\displaystyle n}
Resting potential
{\displaystyle i}
{\displaystyle t}
{\displaystyle j}
{\displaystyle i}
{\displaystyle j}
Then one defines
{\displaystyle s}
{\displaystyle i}
{\displaystyle r}
{\displaystyle i}
{\displaystyle j}
In this formula,
{\displaystyle i}
{\displaystyle t}
{\displaystyle t}
{\displaystyle t}
{\displaystyle i}
{\displaystyle i}
{\displaystyle t}
{\displaystyle X}
{\displaystyle t}
{\displaystyle i}
{\displaystyle X}
{\displaystyle X}
{\displaystyle I}
Formal definition
3813:fires (that is,
3812:
3810:
3809:
3804:
3792:
3790:
3789:
3784:
3766:
3764:
3763:
3758:
3734:
3732:
3731:
3726:
3723:
3718:
3702:
3700:
3699:
3694:
3664:
3662:
3661:
3656:
3645:
3644:
3634:
3633:
3617:
3588:
3587:
3564:
3563:
3553:
3552:
3530:
3529:
3489:
3488:
3466:
3464:
3463:
3458:
3447:
3446:
3436:
3435:
3419:
3390:
3389:
3375:
3371:
3352:
3351:
3341:
3322:
3321:
3311:
3310:
3282:
3281:
3271:
3228:
3227:
3208:
3206:
3205:
3200:
3198:
3197:
3178:
3176:
3175:
3170:
3168:
3167:
3148:
3146:
3145:
3140:
3138:
3137:
3115:
3113:
3112:
3107:
3105:
3104:
3088:
3086:
3085:
3080:
3068:
3066:
3065:
3060:
3048:
3046:
3045:
3040:
3038:
3037:
3013:
3011:
3010:
3005:
2997:
2996:
2970:
2968:
2967:
2962:
2960:
2959:
2943:
2941:
2940:
2935:
2923:
2921:
2920:
2915:
2913:
2912:
2887:
2885:
2884:
2881:{\displaystyle }
2879:
2855:
2853:
2852:
2847:
2845:
2833:
2831:
2830:
2825:
2823:
2822:
2803:
2801:
2800:
2795:
2781:
2780:
2768:
2767:
2751:
2750:
2715:
2714:
2713:
2690:
2689:
2679:
2678:
2669:
2668:
2640:
2638:
2637:
2632:
2620:
2618:
2617:
2612:
2600:
2598:
2597:
2592:
2576:
2574:
2573:
2568:
2551:
2550:
2534:
2532:
2531:
2526:
2518:
2503:
2501:
2500:
2495:
2493:
2474:
2472:
2471:
2466:
2461:
2450:
2449:
2433:
2431:
2430:
2425:
2409:
2407:
2406:
2401:
2381:
2379:
2378:
2373:
2371:
2370:
2345:
2343:
2342:
2337:
2335:
2334:
2328:
2296:
2295:
2283:
2275:
2274:
2268:
2267:
2258:
2257:
2248:
2237:
2236:
2226:
2225:
2209:
2188:
2177:
2176:
2167:
2166:
2159:
2148:
2138:
2137:
2125:
2096:
2095:
2076:
2074:
2073:
2068:
2052:
2050:
2049:
2044:
2033:
2032:
2013:
2011:
2010:
2005:
1993:
1991:
1990:
1985:
1967:
1965:
1964:
1959:
1957:
1956:
1940:
1939:
1923:
1922:
1913:
1912:
1888:
1886:
1885:
1880:
1878:
1877:
1870:
1869:
1853:
1852:
1836:
1835:
1826:
1825:
1815:
1814:
1813:
1805:
1804:
1783:
1782:
1772:
1771:
1762:
1761:
1745:
1723:
1722:
1703:
1702:
1682:
1681:
1680:
1672:
1671:
1665:
1664:
1643:
1642:
1633:
1632:
1625:
1609:
1608:
1599:
1598:
1573:
1571:
1570:
1565:
1530:
1529:
1513:
1511:
1510:
1505:
1483:
1481:
1480:
1475:
1457:
1455:
1454:
1449:
1437:
1435:
1434:
1429:
1412:
1411:
1395:
1393:
1392:
1387:
1376:
1375:
1359:
1357:
1356:
1351:
1339:
1337:
1336:
1331:
1314:
1313:
1297:
1295:
1294:
1289:
1277:
1275:
1274:
1269:
1257:
1255:
1254:
1249:
1238:up to some time
1237:
1235:
1234:
1229:
1220:
1219:
1193:
1191:
1190:
1185:
1136:
1134:
1133:
1128:
1126:
1125:
1119:
1118:
1102:
1101:
1085:
1084:
1075:
1074:
1065:
1064:
1058:
1057:
1041:
1040:
1020:
1019:
999:
998:
997:
974:
973:
963:
962:
953:
952:
924:
922:
921:
916:
893:
892:
882:
881:
880:
857:
856:
832:
831:
812:
810:
809:
804:
792:
790:
789:
784:
772:
770:
769:
764:
753:
752:
736:
734:
733:
728:
719:
718:
689:
687:
686:
681:
679:
678:
663:
662:
649:
627:
626:
601:
600:
593:
569:
567:
566:
561:
556:
547:
545:
544:
539:
537:
522:
520:
519:
514:
505:
504:
497:
473:
471:
470:
465:
447:
445:
444:
439:
437:
419:
417:
416:
411:
394:
393:
377:
375:
374:
369:
352:
351:
335:
333:
332:
327:
309:
307:
306:
301:
290:
289:
266:
264:
263:
258:
246:
244:
243:
238:
134:
127:
123:
120:
114:
112:
71:
35:
34:
27:
21:
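The dynamics above can be sketched as a small simulation. This is an illustrative sketch only, not a reference implementation: the logistic firing function `phi`, the random weight matrix `W`, the constant external input `E`, and the geometric recharge factor `alpha` are all hypothetical choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 5                                  # number of neurons (finite truncation of I)
W = rng.normal(0.0, 0.5, size=(n, n))  # W[j, i] plays the role of w_{j -> i}
np.fill_diagonal(W, 0.0)               # no self-synapses in this sketch
E = 0.2                                # constant external input E_i[s] (assumed)
alpha = 0.8                            # geometric recharge factor: alpha_i[k] = alpha**k

def phi(v):
    """Illustrative firing function: increasing in the potential, in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-v))

T = 100
X = np.zeros((T, n), dtype=int)        # firing indicators X_i[t]
last = np.zeros(n, dtype=int)          # tau_i[t]: time of each neuron's last firing

for t in range(1, T):
    # Membrane potential: inputs accumulated since the neuron's last firing,
    # each attenuated by the recharge factor alpha**(t - 1 - s).
    V = np.zeros(n)
    for i in range(n):
        for s in range(last[i], t):
            V[i] += (E + X[s] @ W[:, i]) * alpha ** (t - 1 - s)
    fired = rng.random(n) < phi(V)     # independent firings given the history
    X[t] = fired.astype(int)
    last[fired] = t                    # a firing neuron forgets its past

print(X.sum(axis=0))                   # spike count per neuron
```

Note how the reset of `last` implements the defining property of the model: after a spike, the sum for V_i starts over, so the neuron's firing probability no longer depends on anything before its own last firing.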
Leaky integrate and fire variants

A special case of the GL model is the discrete-time leaky integrate-and-fire neuron with stochastic firing. Between firings, the membrane potential of neuron i decays by a leak factor μ_i at each step while accumulating the external input E_i[t] and the synaptic inputs from the other neurons; when the neuron fires (that is, when X_i[t] = 1), the potential is set to a fixed reset potential V_i^R. The update rule is

    V_i[t+1] = X_i[t] V_i^R + (1 − X_i[t]) ( μ_i V_i[t] + E_i[t] + Σ_{j∈I∖{i}} w_{j→i} X_j[t] ),

and the neuron then fires at time t + 1 with a probability that is an increasing function of V_i[t+1]. Setting μ_i = 0 yields a memoryless sub-variant whose potential is simply the sum of the inputs received in the previous time step:

    V_i[t+1] = X_i[t] V_i^R + (1 − X_i[t]) ( E_i[t] + Σ_{j∈I∖{i}} w_{j→i} X_j[t] ).
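The one-step update of the leaky variant can be sketched as follows. As before this is a hypothetical illustration: the logistic firing probability, the parameter values (`mu`, `V_R`, `E`), and the random weights are assumptions made for the example, not part of the model's definition.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 4
W = rng.normal(0.0, 0.5, size=(n, n))  # W[j, i] plays the role of w_{j -> i}
np.fill_diagonal(W, 0.0)
mu = 0.7                               # leak factor mu_i (assumed constant)
V_R = 0.0                              # reset potential V_i^R

def step(V, X, E):
    """One step: V[t+1] = X V_R + (1 - X)(mu V + E + sum_j w_{j->i} X_j)."""
    syn = X @ W                        # synaptic input received by each neuron
    V_next = X * V_R + (1 - X) * (mu * V + E + syn)
    p = 1.0 / (1.0 + np.exp(-V_next))  # illustrative firing probability
    X_next = (rng.random(n) < p).astype(float)
    return V_next, X_next

V = np.zeros(n)
X = np.zeros(n)
for _ in range(50):
    V, X = step(V, X, E=0.1)
```

With `mu = 0.0` the same code realizes the memoryless sub-variant, since the term `mu * V` vanishes and the potential depends only on the previous step's inputs.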
History

The GL model was defined in 2013 by mathematicians Antonio Galves and Eva Löcherbach. Its inspirations included Frank Spitzer's interacting particle systems and Jorma Rissanen's notion of a stochastic chain with memory of variable length, as well as Bruno Cessac's and Hédi Soula's studies of leaky integrate-and-fire models with stochastic dynamics.
See also

NeuroMat
3650:t
3647:[
3642:j
3638:X
3631:i
3625:j
3621:w
3615:I
3609:j
3600:+
3596:]
3593:t
3590:[
3585:i
3581:E
3576:+
3572:]
3569:t
3566:[
3561:i
3557:V
3550:i
3541:)
3538:]
3535:t
3532:[
3527:i
3523:X
3516:1
3513:(
3508:=
3503:]
3500:1
3497:+
3494:t
3491:[
3486:i
3482:V
3455:]
3452:t
3449:[
3444:j
3440:X
3433:i
3427:j
3423:w
3417:I
3411:j
3402:+
3398:]
3395:t
3392:[
3387:i
3383:E
3378:+
3373:}
3366:0
3363:=
3360:]
3357:t
3354:[
3349:i
3345:X
3339:f
3336:i
3330:]
3327:t
3324:[
3319:i
3315:V
3308:i
3296:1
3293:=
3290:]
3287:t
3284:[
3279:i
3275:X
3269:f
3266:i
3260:0
3253:{
3247:=
3242:]
3239:1
3236:+
3233:t
3230:[
3225:i
3221:V
3195:i
3191:V
3165:i
3135:i
3129:j
3125:w
3102:i
3098:V
3077:j
3057:i
3035:i
3031:V
3002:0
2999:=
2994:i
2988:j
2984:w
2957:i
2953:V
2932:j
2910:i
2904:j
2900:w
2876:]
2873:1
2870:,
2867:0
2864:[
2843:R
2820:i
2792:)
2789:]
2786:t
2783:[
2778:i
2774:V
2770:(
2765:i
2755:=
2748:)
2742:]
2739:1
2733:t
2730::
2721:[
2718:X
2711:|
2704:1
2701:=
2698:]
2695:t
2692:[
2687:i
2683:X
2676:(
2666:b
2663:o
2660:r
2657:P
2629:s
2609:i
2589:r
2565:]
2562:s
2559:,
2556:r
2553:[
2548:i
2523:1
2520:+
2513:t
2488:t
2463:]
2456:t
2452:[
2447:i
2443:E
2422:i
2398:j
2368:i
2362:j
2358:w
2332:]
2323:t
2316:1
2310:t
2307:,
2304:]
2301:t
2298:[
2293:i
2278:t
2272:[
2265:i
2255:)
2250:]
2243:t
2239:[
2234:j
2230:X
2223:i
2217:j
2213:w
2207:I
2201:j
2193:+
2190:]
2183:t
2179:[
2174:i
2170:E
2164:(
2157:1
2151:t
2146:]
2143:t
2140:[
2135:i
2127:=
2120:t
2109:=
2104:]
2101:t
2098:[
2093:i
2089:V
2065:i
2041:]
2038:t
2035:[
2030:i
2026:V
2002:t
1982:I
1976:i
1954:]
1949:1
1943:t
1937::
1931:]
1928:t
1925:[
1920:i
1910:[
1905:X
1875:)
1867:]
1862:1
1856:t
1850::
1844:]
1841:t
1838:[
1833:k
1823:[
1818:X
1811:|
1802:k
1798:a
1794:=
1791:]
1788:t
1785:[
1780:k
1776:X
1769:(
1759:b
1756:o
1753:r
1750:P
1743:K
1737:k
1727:=
1720:)
1715:]
1712:1
1706:t
1700::
1688:[
1685:X
1678:|
1669:}
1662:k
1658:a
1654:=
1651:]
1648:t
1645:[
1640:k
1636:X
1630:{
1623:K
1617:k
1606:(
1596:b
1593:o
1590:r
1587:P
1562:,
1559:K
1553:i
1550:,
1547:}
1544:1
1541:,
1538:0
1535:{
1527:i
1523:a
1502:I
1496:K
1472:1
1469:+
1466:t
1446:t
1426:1
1423:+
1420:]
1417:t
1414:[
1409:3
1384:]
1381:t
1378:[
1373:3
1348:t
1328:]
1325:1
1319:t
1316:[
1311:i
1307:X
1286:i
1266:i
1246:t
1226:]
1223:t
1217::
1205:[
1202:X
1182:}
1179:7
1176:,
1170:,
1167:2
1164:,
1161:1
1158:{
1155:=
1152:I
1123:)
1116:]
1111:1
1105:t
1099::
1093:]
1090:t
1087:[
1082:i
1072:[
1067:X
1062:(
1055:i
1045:=
1038:)
1032:]
1029:1
1023:t
1017::
1005:[
1002:X
995:|
988:1
985:=
982:]
979:t
976:[
971:i
967:X
960:(
950:b
947:o
944:r
941:P
913:.
910:}
907:1
904:=
901:]
898:s
895:[
890:i
886:X
878:|
871:t
865:s
862:{
854:x
851:a
848:m
843:=
840:]
837:t
834:[
829:i
801:t
781:i
761:]
758:t
755:[
750:i
725:]
722:t
716::
704:[
701:X
676:I
670:i
666:)
660:t
654:s
644:t
639:)
635:]
632:s
629:[
624:i
620:X
616:(
613:(
610:=
607:]
604:t
598::
588:t
584:[
581:X
558:t
532:t
511:]
508:t
502::
492:t
488:[
485:X
462:1
459:+
456:t
435:Z
428:t
408:0
405:=
402:]
399:t
396:[
391:i
387:X
366:1
363:=
360:]
357:t
354:[
349:i
345:X
324:I
318:i
298:]
295:t
292:[
287:i
283:X
235:I
201:N
194:N
186:N
132:)
126:(
121:)
117:(
107:·
100:·
93:·
86:·
69:.
47:.
20:)
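The leaky integrate-and-fire dynamics can be sketched in a short simulation. The code below is an illustrative sketch, not a reference implementation: the logistic choice of Φ, the weights w, the decay factors μ_i, and the external inputs E_i are all made-up example values.

```python
import math
import random

def simulate_gl(w, mu, phi, E, steps, rng):
    """Simulate a leaky GL network.

    w[j][i] is the synaptic weight w_{j->i}, mu[i] the decay factor,
    phi maps a membrane potential to a firing probability, and E[i]
    is a constant external input for neuron i.
    """
    n = len(mu)
    V = [0.0] * n                 # membrane potentials V_i[t]
    history = []                  # firing indicators X_i[t], one list per step
    for _ in range(steps):
        # Each neuron fires independently with probability phi(V_i[t]).
        X = [1 if rng.random() < phi(V[i]) else 0 for i in range(n)]
        history.append(X)
        # V_i[t+1] = (1 - X_i[t]) mu_i V_i[t] + E_i[t] + sum_j w_{j->i} X_j[t]
        V = [(1 - X[i]) * mu[i] * V[i] + E[i]
             + sum(w[j][i] * X[j] for j in range(n))
             for i in range(n)]
    return history

# Example run with made-up parameters for a 3-neuron network.
rng = random.Random(0)
phi = lambda v: 1.0 / (1.0 + math.exp(-(v - 1.0)))   # logistic firing probability
w = [[0.0, 0.5, 0.2],
     [0.3, 0.0, 0.4],
     [0.1, 0.6, 0.0]]
hist = simulate_gl(w, mu=[0.8, 0.8, 0.8], phi=phi, E=[0.4, 0.4, 0.4],
                   steps=20, rng=rng)
print(sum(x for step in hist for x in step), "spikes over 20 steps")
```

Because this Φ is strictly between 0 and 1, every neuron fires eventually along the simulated trajectory, so the last-firing time τ_i[t] is well defined after the first spike.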
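As a consistency check, the closed-form sum for V_i[t] under geometric decay μ_i[r, s] = μ_i^{s−r} should agree with the recursive leaky update. The sketch below verifies this for a single neuron with made-up values, where inputs[t] stands for the total input E_i[t] + Σ_j w_{j→i} X_j[t].

```python
# Check that the summation form of the potential matches the recursive
# leaky update for one neuron with decay mu. All numbers are made up.
mu = 0.7
inputs = [0.5, 1.2, 0.0, 0.3, 0.9, 0.1]   # E_i[t] + synaptic input at each t
fired  = [0,   0,   1,   0,   0,   0  ]   # X_i[t] for t = 0..5

# Recursive form: V[t+1] = (1 - X[t]) * mu * V[t] + inputs[t]
V = [0.0]
for t in range(len(inputs)):
    V.append((1 - fired[t]) * mu * V[t] + inputs[t])

# Summation form: V[t] = sum_{t'=tau}^{t-1} inputs[t'] * mu**(t-1-t'),
# where tau is the last firing time before t (0 here if none yet).
def v_sum(t):
    tau = max((s for s in range(t) if fired[s]), default=0)
    return sum(inputs[tp] * mu ** (t - 1 - tp) for tp in range(tau, t))

assert all(abs(V[t] - v_sum(t)) < 1e-12 for t in range(1, len(V)))
print("summation and recursive forms agree")
```

Note that the firing at t = 2 resets the accumulation: both forms give V[3] equal to the input received during the firing step alone.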