
On Intelligence


Author: Jeff Hawkins & Sandra Blakeslee
Language: English
Subject: Psychology
Publisher: Times Books
Publication date: 2004
Publication place: United States
Media type: Paperback
Pages: 272
ISBN: 0-8050-7456-2
OCLC: 55510125
Dewey Decimal: 612.8/2 22
LC Class: QP376 .H294 2004

On Intelligence: How a New Understanding of the Brain will Lead to the Creation of Truly Intelligent Machines is a 2004 book by Jeff Hawkins and Sandra Blakeslee. The book explains Hawkins' memory-prediction framework theory of the brain and describes some of its consequences.

The theory

Hawkins' basic idea is that the brain is a mechanism to predict the future: hierarchical regions of the brain predict their future input sequences, perhaps not always far in the future, but far enough to be of real use to an organism. As such, the brain is a feed forward hierarchical state machine with special properties that enable it to learn. The state machine actually controls the behavior of the organism, and since it is a feed forward state machine, it responds to future events predicted from past data.

The hierarchy is capable of memorizing frequently observed sequences (cognitive modules) of patterns and of developing invariant representations. Higher levels of the cortical hierarchy predict the future on a longer time scale, or over a wider range of sensory input; lower levels interpret or control limited domains of experience, or sensory or effector systems. Connections from the higher-level states predispose some selected transitions in the lower-level state machines.

Hebbian learning is part of the framework: the event of learning physically alters neurons and their connections as learning takes place.

Vernon Mountcastle's formulation of the cortical column is a basic element in the framework. Hawkins places particular emphasis on the role of the interconnections from peer columns and on the activation of columns as a whole, and he strongly implies that a column is the cortex's physical representation of a state in a state machine.

Because Hawkins approaches the problem as an engineer, a specific failure to find a natural occurrence of some process in his framework does not signal a fault in the memory-prediction framework per se; it merely signals that the natural process has performed Hawkins' functional decomposition in a different, unexpected way, since Hawkins' motivation is to create intelligent machines. For example, for the purposes of his framework the nerve impulses can be taken to form a temporal sequence (phase encoding could be one possible implementation of such a sequence; these details are immaterial to the framework).
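The book describes the framework only in prose and contains no algorithms or code. The following Python toy is a minimal sketch, written for this summary, of the hierarchical sequence-memory idea described above; the class SequenceMemoryLevel and its methods are invented for illustration and do not come from Hawkins. Each level memorizes first-order transitions between the symbols it sees, predicts the next input, and passes unanticipated inputs up to a parent level that operates on a coarser stream.

# Illustrative toy only: a hypothetical sketch of a "hierarchy of sequence
# memories", not an implementation of anything specified in the book.
from collections import defaultdict


class SequenceMemoryLevel:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent                 # higher level, longer time scale
        self.transitions = defaultdict(set)  # learned: symbol -> possible next symbols
        self.previous = None

    def learn(self, symbol):
        """Memorize the observed transition (Hebbian-style: seen together, linked)."""
        if self.previous is not None:
            self.transitions[self.previous].add(symbol)
        self.previous = symbol

    def predict(self, symbol):
        """Return the set of next symbols this level anticipates after `symbol`."""
        return self.transitions.get(symbol, set())

    def observe(self, symbol):
        """Compare the new input with the prediction made from the last input."""
        expected = self.predict(self.previous) if self.previous is not None else set()
        surprising = self.previous is not None and symbol not in expected
        self.learn(symbol)
        if surprising and self.parent is not None:
            # Unanticipated input: pass the "exception" up the hierarchy,
            # summarized as a coarser, level-tagged symbol.
            self.parent.observe((self.name, symbol))
        return surprising


if __name__ == "__main__":
    top = SequenceMemoryLevel("top")
    bottom = SequenceMemoryLevel("bottom", parent=top)

    for symbol in "abcabcabc":      # a frequently observed sequence is memorized
        bottom.observe(symbol)
    print(bottom.predict("a"))      # {'b'}  -- anticipated next input
    print(bottom.observe("z"))      # True   -- unanticipated event, reported upward

The sketch keeps only first-order transitions for brevity; the framework as described in the book concerns much richer sequence memories and invariant representations.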
Predictions of the theory of the memory-prediction framework

Hawkins' predictions use the visual system as a prototype for some example predictions, such as Predictions 2, 8, 10, and 11. Other predictions cite the auditory system (Predictions 1, 3, 4, and 7). An Appendix of 11 Testable Predictions, beginning on page 237, states them as follows.

Enhanced neural activity in anticipation of a sensory event

1. In all areas of cortex, Hawkins (2004) predicts "we should find anticipatory cells", cells that fire in anticipation of a sensory event. (Note: as of 2005, mirror neurons had been observed to fire before an anticipated event.)

Spatially specific prediction

2. In primary sensory cortex, Hawkins predicts, for example, "we should find anticipatory cells in or near V1, at a precise location in the visual field (the scene)". It has been experimentally determined, for example, that after mapping the angular positions of some objects in the visual field, there is a one-to-one correspondence of cells in the scene to the angular positions of those objects. Hawkins predicts that when the features of a visual scene are known in a memory, anticipatory cells should fire before the actual objects are seen in the scene.

Prediction should stop propagating in the cortical column at layers 2 and 3

3. In layers 2 and 3, predictive activity (neural firing) should stop propagating at specific cells, corresponding to a specific prediction. Hawkins does not rule out anticipatory cells in layers 4 and 5.

Learned sequences of firings comprise a representation of temporally constant invariants

4. "Name cells" at layers 2 and 3 should preferentially connect to layer 6 cells of cortex. Hawkins calls the cells which fire in this sequence "name cells". He suggests that these name cells are in layer 2, physically adjacent to layer 1, and does not rule out the existence of layer 3 cells with dendrites in layer 1, which might perform as name cells.

"Name cells" should remain ON during a learned sequence

5. By definition, a temporally constant invariant will be active during a learned sequence. Hawkins posits that these cells will remain active for the duration of the learned sequence, even if the remainder of the cortical column is shifting state. Since we do not know the encoding of the sequence, we do not yet know the definition of ON; Hawkins suggests that the ON pattern may be as simple as a simultaneous AND (i.e., the name cells simultaneously "light up") across an array of name cells. See Neural ensemble#Encoding for grandmother neurons, which perform this type of function.

"Exception cells" should remain OFF during a learned sequence

6. Hawkins' novel prediction is that certain cells are inhibited during a learned sequence. A class of cells in layers 2 and 3 should NOT fire during a learned sequence; the axons of these "exception cells" should fire only if a local prediction is failing. This prevents flooding the brain with the usual sensations, leaving only exceptions for post-processing.

"Exception cells" should propagate unanticipated events

7. If an unusual event occurs (the learned sequence fails), the "exception cells" should fire, propagating up the cortical hierarchy to the hippocampus, the repository of new memories. (A toy sketch illustrating predictions 5 through 7 follows this list of predictions.)

"Aha! cells" should trigger predictive activity

8. Hawkins predicts a cascade of predictions, when recognition occurs, propagating down the cortical column (with each saccade of the eye over a learned scene, for example).

Pyramidal cells should detect coincidences of synaptic activity on thin dendrites

9. Pyramidal cells should be capable of detecting coincident events on thin dendrites, even for a neuron with thousands of synapses. Hawkins posits a temporal window (presuming time-encoded firing) which is necessary for his theory to remain viable.

Learned representations move down the cortical hierarchy, with training

10. Hawkins posits, for example, that if the inferotemporal (IT) level has learned a sequence, then eventually cells in V4 will also learn the sequence.

"Name cells" exist in all regions of cortex

11. Hawkins predicts that "name cells" will be found in all regions of the cortex.
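As a further illustration (again, a toy written for this summary rather than anything specified in the book), the sketch below models the complementary readouts described in predictions 5 through 7: a "name" output taken as a simultaneous AND across an array of name-cell activities, which stays on while a learned sequence unfolds, and an "exception" output that fires only when the next input is not among the learned predictions. The function column_step and its arguments are hypothetical names chosen for this example.

# Illustrative toy only: hypothetical "name cell" and "exception cell" readouts
# for a single learned sequence, as a sketch of predictions 5-7 above.

def column_step(learned_next, previous_symbol, symbol, name_cells):
    """One time step of a toy cortical column.

    learned_next: dict mapping a symbol to the set of next symbols learned for it.
    name_cells:   list of booleans, one per name cell; the column's "name" output
                  is their simultaneous AND (prediction 5).
    Returns (name_output, exception_output).
    """
    predicted = learned_next.get(previous_symbol, set())
    sequence_ok = symbol in predicted

    # Prediction 5: name cells stay ON (all active) while the sequence is anticipated.
    name_output = sequence_ok and all(name_cells)

    # Predictions 6-7: exception cells stay OFF during the learned sequence and
    # fire only when the local prediction fails, to be propagated up the hierarchy.
    exception_output = not sequence_ok
    return name_output, exception_output


if __name__ == "__main__":
    learned = {"a": {"b"}, "b": {"c"}, "c": {"a"}}   # a memorized sequence a-b-c-a-...
    names = [True, True, True]                        # all name cells currently active

    print(column_step(learned, "a", "b", names))      # (True, False): anticipated input
    print(column_step(learned, "b", "z", names))      # (False, True): exception propagates up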
523:"Exception cells" should propagate unanticipated events 472:"Name cells" should remain ON during a learned sequence 254: 242: 228: 216: 208: 198: 190: 180: 170: 160: 152: 138: 49:. Unsourced material may be challenged and removed. 642: 402:", cells that fire in anticipation of a sensory 925:Non-fiction books about Artificial intelligence 535:"Aha! cells" should trigger predictive activity 583:10. Hawkins posits, for example, that if the 8: 121: 595:"Name cells" exist in all regions of cortex 439:the actual objects are seen in the scene. 398:, Hawkins (2004) predicts "we should find 311:with special properties that enable it to 127: 120: 816:"Machine Intelligence Meets Neuroscience" 764: 636: 634: 632: 630: 109:Learn how and when to remove this message 289:and describes some of its consequences. 885:"On Intelligence, People and Computers" 626: 900:On Biological and Digital Intelligence 783: 772: 649:(1st ed.). Times Books. pp.  517:only if a local prediction is failing 7: 547:over a learned scene, for example). 505:which perform this type of function. 47:adding citations to reliable sources 795:Saulius Garalevicius' research page 883:Kling, Arnold (22 November 2004). 531:, the repository of new memories. 14: 870:Dill, Franz (October 30, 2004). 23: 872:"Jeff Hawkins: On Intelligence" 381:( Predictions 1, 3, 4, and 7). 34:needs additional citations for 930:Books about human intelligence 591:will also learn the sequence. 458:temporally constant invariants 1: 814:Colwell, Bob (January 2005). 478:temporally constant invariant 423:Spatially specific prediction 281:. The book explains Hawkins' 609:Hierarchical temporal memory 615:Memory-prediction framework 414:have been observed to fire 299:Memory-prediction framework 283:memory-prediction framework 946: 309:hierarchical state machine 296: 126: 499:Neural ensemble#Encoding 703:10.1126/science.1106138 920:2004 non-fiction books 782:Cite journal requires 641:Hawkins, Jeff (2004). 427:2. In primary sensory 418:an anticipated event. 889:Tech Central Station 476:5. By definition, a 345:'s formulation of a 43:improve this article 16:Book by Jeff Hawkins 695:2005Sci...308..662F 503:grandmother neurons 394:1. In all areas of 123: 861:10.1109/MC.2005.24 836:10.1109/MC.2005.24 575:to remain viable. 567:with thousands of 400:anticipatory cells 343:Vernon Mountcastle 273:is a 2004 book by 801:Project Neocortex 689:(5722): 662โ€“667. 410:Note: As of 2005 331:Cognitive modules 266: 265: 191:Publication place 119: 118: 111: 93: 58:"On Intelligence" 937: 896: 891:. Archived from 879: 874:. Archived from 864: 843: 838:. Archived from 791: 785: 780: 778: 770: 768: 755: 754: 752:Official website 737: 736: 734: 733: 727: 721:. Archived from 680: 671: 665: 664: 648: 638: 337:Hebbian learning 279:Sandra Blakeslee 262:QP376 .H294 2004 258: 232: 182:Publication date 147:Sandra Blakeslee 131: 124: 114: 107: 103: 100: 94: 92: 51: 27: 19: 945: 944: 940: 939: 938: 936: 935: 934: 910: 909: 882: 869: 846: 813: 810: 781: 771: 766:10.1.1.132.6744 758: 750: 749: 746: 741: 740: 731: 729: 725: 678: 673: 672: 668: 661: 645:On Intelligence 640: 639: 628: 623: 605: 597: 581: 557:Pyramidal cells 553: 537: 525: 512: 474: 454: 445: 425: 392: 379:auditory system 367: 347:cortical column 301: 295: 247: 199:Media type 183: 134: 115: 104: 98: 95: 52: 50: 40: 28: 17: 12: 11: 5: 943: 941: 933: 932: 927: 922: 912: 911: 908: 907: 897: 895:on 2012-03-05. 880: 878:on 2012-02-05. 867: 866: 865: 842:on 2005-02-04. 
809: 806: 805: 804: 798: 792: 784:|journal= 756: 745: 744:External links 742: 739: 738: 666: 660:978-0805074567 659: 625: 624: 622: 619: 618: 617: 612: 604: 601: 596: 593: 585:inferotemporal 580: 577: 552: 549: 536: 533: 524: 521: 511: 508: 507: 506: 473: 470: 453: 450: 444: 441: 424: 421: 420: 419: 412:mirror neurons 391: 388: 387: 386: 366: 363: 297:Main article: 294: 291: 285:theory of the 264: 263: 260: 252: 251: 248: 243: 240: 239: 234: 226: 225: 220: 214: 213: 210: 206: 205: 200: 196: 195: 192: 188: 187: 184: 181: 178: 177: 172: 168: 167: 162: 158: 157: 154: 150: 149: 140: 136: 135: 132: 117: 116: 31: 29: 22: 15: 13: 10: 9: 6: 4: 3: 2: 942: 931: 928: 926: 923: 921: 918: 917: 915: 905: 901: 898: 894: 890: 886: 881: 877: 873: 868: 862: 858: 854: 850: 845: 844: 841: 837: 833: 829: 825: 821: 817: 812: 811: 807: 802: 799: 796: 793: 789: 776: 767: 762: 757: 753: 748: 747: 743: 728:on 2017-08-09 724: 720: 716: 712: 708: 704: 700: 696: 692: 688: 684: 677: 670: 667: 662: 656: 652: 647: 646: 637: 635: 633: 631: 627: 620: 616: 613: 610: 607: 606: 602: 600: 594: 592: 590: 586: 578: 576: 574: 570: 566: 563:, even for a 562: 558: 550: 548: 546: 542: 534: 532: 530: 522: 520: 518: 509: 504: 500: 496: 495: 494: 492: 488: 484: 479: 471: 469: 467: 463: 459: 451: 449: 442: 440: 438: 434: 430: 422: 417: 413: 409: 408: 407: 405: 401: 397: 389: 384: 383: 382: 380: 376: 375:visual system 372: 364: 362: 360: 356: 351: 348: 344: 340: 338: 334: 332: 327: 325: 321: 320:state machine 316: 314: 310: 307: 300: 292: 290: 288: 284: 280: 276: 272: 271: 261: 259: 257:LC Class 253: 249: 246: 245:Dewey Decimal 241: 238: 235: 233: 227: 224: 223:0-8050-7456-2 221: 219: 215: 211: 207: 204: 201: 197: 194:United States 193: 189: 185: 179: 176: 173: 169: 166: 163: 159: 155: 151: 148: 144: 141: 137: 130: 125: 113: 110: 102: 91: 88: 84: 81: 77: 74: 70: 67: 63: 60: โ€“  59: 55: 54:Find sources: 48: 44: 38: 37: 32:This article 30: 26: 21: 20: 906:(7 Oct 2004) 904:Ben Goertzel 902:A review by 893:the original 888: 876:the original 852: 848: 840:the original 823: 819: 775:cite journal 730:. Retrieved 723:the original 686: 682: 669: 644: 598: 582: 554: 538: 526: 516: 513: 502: 486: 482: 477: 475: 465: 461: 457: 455: 446: 436: 426: 415: 399: 393: 368: 354: 352: 341: 335: 328: 324:feed forward 317: 306:feed forward 302: 275:Jeff Hawkins 269: 268: 267: 143:Jeff Hawkins 105: 96: 86: 79: 72: 65: 53: 41:Please help 36:verification 33: 529:hippocampus 371:predictions 175:Times Books 133:Front cover 914:Categories 732:2006-11-18 621:References 466:name cells 462:name cells 293:The theory 250:612.8/2 22 165:Psychology 99:April 2013 69:newspapers 855:: 12โ€“15. 830:: 12โ€“15. 761:CiteSeerX 561:dendrites 203:Paperback 171:Publisher 849:Computer 820:Computer 711:15860620 603:See also 569:synapses 373:use the 359:machines 237:55510125 153:Language 808:Reviews 719:5720234 691:Bibcode 683:Science 543:of the 541:saccade 161:Subject 156:English 83:scholar 763:  717:  709:  657:  573:theory 565:neuron 487:active 437:before 429:cortex 416:before 396:cortex 355:per se 145:& 139:Author 85:  78:  71:  64:  56:  826:(1). 726:(PDF) 715:S2CID 679:(PDF) 404:event 313:learn 287:brain 209:Pages 90:JSTOR 76:books 828:IEEE 788:help 707:PMID 655:ISBN 501:for 497:See 369:His 318:The 277:and 231:OCLC 218:ISBN 186:2004 62:news 857:doi 832:doi 699:doi 687:308 651:272 555:9. 545:eye 491:AND 485:or 212:272 45:by 916:: 887:. 853:38 851:. 824:38 822:. 818:. 779:: 777:}} 773:{{ 713:. 705:. 697:. 685:. 681:. 653:. 629:^ 589:V4 483:ON 468:. 433:V1 406:. 315:. 863:. 
859:: 834:: 790:) 786:( 769:. 735:. 701:: 693:: 663:. 112:) 106:( 101:) 97:( 87:ยท 80:ยท 73:ยท 66:ยท 39:.
