On Intelligence

On Intelligence: How a New Understanding of the Brain will Lead to the Creation of Truly Intelligent Machines is a 2004 book by Jeff Hawkins and Sandra Blakeslee. The book explains Hawkins' memory-prediction framework theory of the brain and describes some of its consequences.

Author: Jeff Hawkins, Sandra Blakeslee
Language: English
Subject: Psychology
Publisher: Times Books
Publication date: 2004
Publication place: United States
Media type: Paperback
Pages: 272
ISBN: 0-8050-7456-2
OCLC: 55510125
Dewey Decimal: 612.8/2 22
LC Class: QP376 .H294 2004
The theory

Main article: Memory-prediction framework

Hawkins' basic idea is that the brain is a mechanism to predict the future; specifically, hierarchical regions of the brain predict their future input sequences, perhaps not always far in the future, but far enough to be of real use to an organism. As such, the brain is a feed-forward hierarchical state machine with special properties that enable it to learn.

The hierarchical state machine actually controls the behavior of the organism. Since it is a feed-forward state machine, the machine responds to future events predicted from past data.

The hierarchy is capable of memorizing frequently observed sequences (cognitive modules) of patterns and developing invariant representations. Higher levels of the cortical hierarchy predict the future on a longer time scale, or over a wider range of sensory input. Lower levels interpret or control limited domains of experience, or sensory or effector systems. Connections from the higher-level states predispose some selected transitions in the lower-level state machines.
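The idea of a stack of sequence predictors, in which anticipated input is handled locally and only unanticipated input climbs to the next level, can be illustrated with a short sketch. The code below is written for this summary only and is not from the book or from Numenta; the class names, the first-order transition memory, and the two-level stack are simplifying assumptions.

```python
# Illustrative sketch: a toy "region" memorizes which input tends to follow
# which, and regions are stacked so that only unpredicted inputs climb upward.
from collections import defaultdict, Counter

class Region:
    """One level of the hierarchy: a tiny first-order sequence memory."""
    def __init__(self):
        self.transitions = defaultdict(Counter)  # current input -> counts of next inputs
        self.state = None                        # most recent input seen

    def predict(self):
        """Most likely next input given the current state, or None if unknown."""
        if self.state is None or not self.transitions[self.state]:
            return None
        return self.transitions[self.state].most_common(1)[0][0]

    def observe(self, token):
        """Learn the transition state -> token, then move to the new state."""
        if self.state is not None:
            self.transitions[self.state][token] += 1
        self.state = token

class Hierarchy:
    """Stack of regions: the bottom region sees every input, higher regions
    only see what the level below failed to predict, so they change slowly."""
    def __init__(self, levels=2):
        self.regions = [Region() for _ in range(levels)]

    def step(self, token):
        unpredicted = []
        for region in self.regions:
            expected = region.predict()
            region.observe(token)
            if expected == token:       # anticipated input is absorbed at this level
                break
            unpredicted.append(token)   # unanticipated input propagates upward
        return unpredicted

h = Hierarchy(levels=2)
for ch in "abcabcabcabc":
    print(ch, h.step(ch))  # fewer surprises as the sequence is memorized
```

In the framework's terms, input that a low level already predicts is absorbed there, while novel input climbs to higher levels, which therefore operate on a slower, coarser time scale.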
Hebbian learning is part of the framework, in which the event of learning physically alters neurons and connections as learning takes place.
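As a reminder of what a Hebbian rule looks like in its simplest form (a connection strengthens when the cells on both sides are active together), here is a minimal sketch; the learning rate and the binary activity vectors are assumptions chosen for illustration, not anything specified in the book.

```python
# Minimal Hebbian update: the weight between two units grows when they fire
# together. Purely illustrative; rate and representation are assumed.
def hebbian_update(weights, pre, post, rate=0.1):
    """weights[i][j] connects presynaptic unit i to postsynaptic unit j."""
    for i, pre_active in enumerate(pre):
        for j, post_active in enumerate(post):
            weights[i][j] += rate * pre_active * post_active
    return weights

# Two presynaptic and two postsynaptic units, initially unconnected.
w = [[0.0, 0.0], [0.0, 0.0]]
for _ in range(5):  # repeated co-activation of pre unit 0 with post unit 1
    w = hebbian_update(w, pre=[1, 0], post=[0, 1])
print(w)  # [[0.0, 0.5], [0.0, 0.0]] - only the co-active pair strengthened
```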
Vernon Mountcastle's formulation of a cortical column is a basic element in the framework. Hawkins places particular emphasis on the role of the interconnections from peer columns, and the activation of columns as a whole. He strongly implies that a column is the cortex's physical representation of a state in a state machine.

As an engineer, Hawkins holds that any specific failure to find a natural occurrence of some process in his framework does not signal a fault in the memory-prediction framework per se, but merely signals that the natural process has performed Hawkins' functional decomposition in a different, unexpected way, since Hawkins' motivation is to create intelligent machines. For example, for the purposes of his framework, nerve impulses can be taken to form a temporal sequence (phase encoding could be a possible implementation of such a sequence; these details are immaterial for the framework).

Predictions of the theory of the memory-prediction framework

His predictions use the visual system as a prototype for some example predictions, such as Predictions 2, 8, 10, and 11. Other predictions cite the auditory system (Predictions 1, 3, 4, and 7).

An Appendix of 11 Testable Predictions, beginning on page 237:
Enhanced neural activity in anticipation of a sensory event

1. In all areas of cortex, Hawkins (2004) predicts "we should find anticipatory cells", cells that fire in anticipation of a sensory event. Note: as of 2005, mirror neurons have been observed to fire before an anticipated event.

Spatially specific prediction

2. In primary sensory cortex, Hawkins predicts, for example, "we should find anticipatory cells in or near V1, at a precise location in the visual field (the scene)". It has been experimentally determined, for example, that after mapping the angular position of some objects in the visual field, there will be a one-to-one correspondence of cells in the scene to the angular positions of those objects. Hawkins predicts that when the features of a visual scene are known in a memory, anticipatory cells should fire before the actual objects are seen in the scene.

Prediction should stop propagating in the cortical column at layers 2 and 3

3. In layers 2 and 3, predictive activity (neural firing) should stop propagating at specific cells, corresponding to a specific prediction. Hawkins does not rule out anticipatory cells in layers 4 and 5.

"Name cells" at layers 2 and 3 should preferentially connect to layer 6 cells of cortex

4. Learned sequences of firings comprise a representation of temporally constant invariants. Hawkins calls the cells which fire in this sequence "name cells". Hawkins suggests that these name cells are in layer 2, physically adjacent to layer 1. Hawkins does not rule out the existence of layer 3 cells with dendrites in layer 1, which might perform as name cells.

"Name cells" should remain ON during a learned sequence

5. By definition, a temporally constant invariant will be active during a learned sequence. Hawkins posits that these cells will remain active for the duration of the learned sequence, even if the remainder of the cortical column is shifting state. Since we do not know the encoding of the sequence, we do not yet know the definition of ON; Hawkins suggests that the ON pattern may be as simple as a simultaneous AND (i.e., the name cells simultaneously "light up") across an array of name cells. See Neural ensemble#Encoding for grandmother neurons, which perform this type of function.

"Exception cells" should remain OFF during a learned sequence

6. Hawkins' novel prediction is that certain cells are inhibited during a learned sequence. A class of cells in layers 2 and 3 should NOT fire during a learned sequence; the axons of these "exception cells" should fire only if a local prediction is failing. This prevents flooding the brain with the usual sensations, leaving only exceptions for post-processing.

"Exception cells" should propagate unanticipated events

7. If an unusual event occurs (the learned sequence fails), the "exception cells" should fire, propagating up the cortical hierarchy to the hippocampus, the repository of new memories.
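Predictions 4 through 7 describe two complementary populations: "name cells" that stay active as long as a learned sequence unfolds as expected, and "exception cells" that fire only when the sequence is violated, passing the surprise up the hierarchy. The toy sketch below illustrates that division of labor; the data structures and the single stored sequence are assumptions made for this summary, not anything taken from the book.

```python
# Illustrative sketch: a stored sequence with a stable "name" signal while the
# sequence is being followed, and an "exception" signal when it is violated.
class SequenceMemory:
    def __init__(self, name, sequence):
        self.name = name               # the invariant "name cell" label
        self.sequence = list(sequence)
        self.position = 0              # where we are inside the learned sequence

    def step(self, token):
        """Return (name_active, exception_fired) for one incoming token."""
        if token == self.sequence[self.position]:
            # Input matches the learned sequence: the name signal stays ON and
            # nothing propagates upward.
            self.position = (self.position + 1) % len(self.sequence)
            return self.name, None
        # Learned sequence failed: name signal switches OFF and the unexpected
        # token is what gets passed up the hierarchy.
        self.position = 0
        return None, token

melody = SequenceMemory("melody-A", "abc")
for token in "abcabxabc":
    name, exception = melody.step(token)
    print(token, "name:", name, "exception:", exception)
# While "abc" repeats, the name stays on; the stray "x" fires the exception.
```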
"Aha! cells" should trigger predictive activity

8. Hawkins predicts a cascade of predictions, when recognition occurs, propagating down the cortical column (with each saccade of the eye over a learned scene, for example).

Pyramidal cells should detect coincidences of synaptic activity on thin dendrites

9. Pyramidal cells should be capable of detecting coincident events on thin dendrites, even for a neuron with thousands of synapses. Hawkins posits a temporal window (presuming time-encoded firing) which is necessary for his theory to remain viable.
Learned representations move down the cortical hierarchy, with training

10. Hawkins posits, for example, that if the inferotemporal (IT) level has learned a sequence, eventually cells in V4 will also learn the sequence.

"Name cells" exist in all regions of cortex

11. Hawkins predicts that "name cells" will be found in all regions of the cortex.
See also

Hierarchical temporal memory, a technology by Hawkins's startup Numenta Inc. to replicate the properties of the neocortex
Memory-prediction framework

References

Hawkins, Jeff (2004). On Intelligence (1st ed.). Times Books. 272 pp. ISBN 978-0805074567.
Fogassi, Leonardo; Ferrari, Pier Francesco; Gesierich, Benno; Rozzi, Stefano; Chersi, Fabian; Rizzolatti, Giacomo (April 29, 2005). "Parietal lobe: from action organization to intention understanding". Science 308 (5722): 662–667. Bibcode:2005Sci...308..662F. doi:10.1126/science.1106138. PMID 15860620.
George, Dileep; Hawkins, Jeff (2005). "A Hierarchical Bayesian Model of Invariant Pattern Recognition in the Visual Cortex": 1812–1817. CiteSeerX 10.1.1.132.6744.

External links

Official website
Saulius Garalevicius' research page - Research papers and programs presenting experimental results with Bayesian models of the Memory-Prediction Framework
Project Neocortex - An open source project for modeling the Memory-Prediction Framework

Reviews
Colwell, Bob (January 2005). "Machine Intelligence Meets Neuroscience". Computer 38 (1): 12–15. doi:10.1109/MC.2005.24.
Kling, Arnold (22 November 2004). "On Intelligence, People and Computers". Tech Central Station.
Dill, Franz (October 30, 2004). "Jeff Hawkins: On Intelligence".
Goertzel, Ben (7 October 2004). "On Biological and Digital Intelligence".
