
Synaptic weight

Strength or amplitude of a connection between two nodes in neuroscience and computer science

In neuroscience and computer science, synaptic weight refers to the strength or amplitude of a connection between two nodes, corresponding in biology to the amount of influence the firing of one neuron has on another. The term is typically used in artificial and biological neural network research.

Computation

In a computational neural network, a vector or set of inputs x and outputs y, or pre- and post-synaptic neurons respectively, are interconnected with synaptic weights represented by the matrix w, where for a linear neuron

    y_j = \sum_i w_{ij} x_i \qquad \text{or} \qquad \mathbf{y} = w\mathbf{x},

where the rows of the synaptic matrix represent the vector of synaptic weights for the output indexed by j.
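
The matrix form translates directly into array code. The following is a minimal sketch, not taken from the source; NumPy and the example dimensions are assumptions, and each row of the weight matrix w holds the synaptic weights feeding one output neuron.

    import numpy as np

    rng = np.random.default_rng(0)
    n_inputs, n_outputs = 4, 3                   # arbitrary example sizes (assumption)

    x = rng.normal(size=n_inputs)                # pre-synaptic activity: the input vector x
    w = rng.normal(size=(n_outputs, n_inputs))   # w[j, i]: synaptic weight from input i to output j

    # Element-wise form: y_j = sum_i w_ij * x_i
    y_sum = np.array([sum(w[j, i] * x[i] for i in range(n_inputs))
                      for j in range(n_outputs)])

    # Matrix form: y = w x
    y_mat = w @ x

    assert np.allclose(y_sum, y_mat)             # both forms give the same output vector y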

The synaptic weight is changed by using a learning rule, the most basic of which is Hebb's rule, which is usually stated in biological terms as

Neurons that fire together, wire together.

Computationally, this means that if a large signal from one of the input neurons results in a large signal from one of the output neurons, then the synaptic weight between those two neurons will increase. The rule is unstable, however, and is typically modified using such variations as Oja's rule, radial basis functions or the backpropagation algorithm.
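
As a rough illustration rather than a method prescribed by the source, the sketch below applies the Hebbian idea to a single linear output neuron and uses Oja's rule as the stabilized variant; the learning rate, dimensions, and synthetic input data are all assumptions made for the example.

    import numpy as np

    rng = np.random.default_rng(1)
    eta = 0.01                        # learning rate (assumed value)
    w = 0.1 * rng.normal(size=4)      # synaptic weights of one linear output neuron

    for _ in range(1000):
        x = rng.normal(size=4)        # synthetic pre-synaptic activity (assumption)
        y = w @ x                     # post-synaptic activity of the linear neuron

        # Plain Hebbian update, delta_w_i = eta * y * x_i, strengthens a weight whenever
        # input and output are large together, but lets the weights grow without bound.
        # Oja's rule adds a decay term, delta_w_i = eta * y * (x_i - y * w_i),
        # which keeps the weight vector bounded:
        w += eta * y * (x - y * w)

    print("weights:", w, "norm:", np.linalg.norm(w))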

Biology

For biological networks, the effect of synaptic weights is not as simple as for linear neurons or Hebbian learning. However, biophysical models such as BCM theory have seen some success in mathematically describing these networks.

In the mammalian central nervous system, signal transmission is carried out by interconnected networks of nerve cells, or neurons. For the basic pyramidal neuron, the input signal is carried by the axon, which releases neurotransmitter chemicals into the synapse; these are picked up by the dendrites of the next neuron, which can then generate an action potential analogous to the output signal in the computational case.

The synaptic weight in this process is determined by several variable factors:
How well the input signal propagates through the axon (see myelination),
The amount of neurotransmitter released into the synapse and the amount that can be absorbed in the following cell (determined by the number of AMPA and NMDA receptors on the cell membrane and the amount of intracellular calcium and other ions),
The number of such connections made by the axon to the dendrites,
How well the signal propagates and integrates in the postsynaptic cell.

The changes in synaptic weight that occur are known as synaptic plasticity, and the process behind long-term changes (long-term potentiation and depression) is still poorly understood. Hebb's original learning rule was first applied to biological systems, but it has had to undergo many modifications as a number of theoretical and experimental problems came to light.

See also

Neural network
Synaptic plasticity
Hebbian theory

References

Iyer, R; Menon, V; Buice, M; Koch, C; Mihalas, S (2013). "The influence of synaptic weight distribution on neuronal population dynamics". PLOS Computational Biology. 9 (10): e1003248. Bibcode:2013PLSCB...9E3248I. doi:10.1371/journal.pcbi.1003248. PMC 3808453. PMID 24204219.
