
Fisher kernel


In statistical classification, the Fisher kernel, named after Ronald Fisher, is a function that measures the similarity of two objects on the basis of sets of measurements for each object and a statistical model. In a classification procedure, the class for a new object (whose real class is unknown) can be estimated by minimising, across classes, an average of the Fisher kernel distance from the new object to each known member of the given class.

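One way to read this rule (an illustrative sketch, not from the article: the kernel-induced squared distance K(x,x) − 2K(x,y) + K(y,y), and all names below, are assumptions made for the example) is as a nearest-class rule over average distances:

```python
import numpy as np

def kernel_distance_sq(k, x, y):
    """Squared distance induced by a kernel k: K(x,x) - 2*K(x,y) + K(y,y)."""
    return k(x, x) - 2.0 * k(x, y) + k(y, y)

def classify(k, new_object, labelled_examples):
    """Assign the class whose known members have the smallest average
    kernel-induced distance to the new object.
    labelled_examples maps class label -> list of objects."""
    def avg_dist(members):
        return np.mean([kernel_distance_sq(k, new_object, m) for m in members])
    return min(labelled_examples, key=lambda c: avg_dist(labelled_examples[c]))

# Toy usage with a plain dot-product kernel standing in for a Fisher kernel.
linear = lambda x, y: float(np.dot(x, y))
examples = {"a": [np.array([0.0, 1.0]), np.array([0.2, 0.9])],
            "b": [np.array([1.0, 0.0])]}
print(classify(linear, np.array([0.1, 1.1]), examples))
```
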
The Fisher kernel was introduced in 1998. It combines the advantages of generative statistical models (like the hidden Markov model) and those of discriminative methods (like support vector machines):

- generative models can process data of variable length (adding or removing data is well-supported)
- discriminative methods can have flexible criteria and yield better results.

Derivation

Fisher score

The Fisher kernel makes use of the Fisher score, defined as

U_X = \nabla_{\theta} \log P(X|\theta)

with \theta being a set (vector) of parameters. The function taking \theta to \log P(X|\theta) is the log-likelihood of the probabilistic model.

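For a concrete illustration (not from the article), the score can be computed in closed form for a simple model; the univariate Gaussian with parameters \theta = (\mu, \sigma) and the `fisher_score` helper below are assumptions made for this sketch:

```python
import numpy as np

def fisher_score(X, mu, sigma):
    """Fisher score U_X = grad_theta log P(X | theta) for a univariate Gaussian
    with theta = (mu, sigma), using the closed-form derivatives of the log-likelihood."""
    X = np.asarray(X, dtype=float)
    d_mu = np.sum((X - mu) / sigma**2)                       # d/d(mu)    of log P(X | theta)
    d_sigma = np.sum((X - mu)**2 / sigma**3 - 1.0 / sigma)   # d/d(sigma) of log P(X | theta)
    return np.array([d_mu, d_sigma])

# Score of a small sample under theta = (0, 1); entries near zero indicate the
# sample is already well explained by the chosen parameters.
print(fisher_score([0.2, -0.5, 1.3], mu=0.0, sigma=1.0))
```
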
Fisher kernel

The Fisher kernel is defined as

K(X_i, X_j) = U_{X_i}^T \mathcal{I}^{-1} U_{X_j}

with \mathcal{I} being the Fisher information matrix.

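Continuing the Gaussian sketch above (again an illustrative assumption, not part of the article), the kernel can be evaluated by plugging two scores into this formula; here the information matrix is approximated empirically as the average outer product of per-observation scores over a reference sample, although the closed-form matrix can be used directly when it is known:

```python
import numpy as np

def score(x, mu, sigma):
    """Per-observation Fisher score for the univariate Gaussian model."""
    return np.array([(x - mu) / sigma**2, (x - mu)**2 / sigma**3 - 1.0 / sigma])

def fisher_kernel(X_i, X_j, mu, sigma, reference):
    """K(X_i, X_j) = U_{X_i}^T I^{-1} U_{X_j}, with I estimated empirically
    as the average outer product of per-observation scores over `reference`."""
    U_i = sum(score(x, mu, sigma) for x in X_i)
    U_j = sum(score(x, mu, sigma) for x in X_j)
    S = np.stack([score(x, mu, sigma) for x in reference])
    I = S.T @ S / len(reference)            # empirical Fisher information matrix
    return U_i @ np.linalg.solve(I, U_j)

reference = np.random.default_rng(0).normal(0.0, 1.0, 1000)
print(fisher_kernel([0.2, -0.5], [1.3, 0.4], mu=0.0, sigma=1.0, reference=reference))
```
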
Applications

Information retrieval

The Fisher kernel is the kernel for a generative probabilistic model. As such, it constitutes a bridge between generative and discriminative models of documents. Fisher kernels exist for numerous models, notably tf–idf, Naive Bayes and probabilistic latent semantic analysis.

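As a toy illustration of the document case (not from the article: the unigram model, the identity approximation of the information matrix, and all names below are simplifying assumptions), the score of a document under a unigram language model is its vector of term counts re-weighted by inverse word probabilities, which loosely mirrors the inverse-document-frequency intuition; a much more careful derivation appears in the Elkan reference below.

```python
import numpy as np

def unigram_score(counts, p):
    """Fisher score of a document under a toy unigram model P(d|p) ∝ prod_w p_w^{c_w}:
    d(log P)/d(p_w) = c_w / p_w. The sum-to-one constraint on p is ignored for simplicity."""
    return np.asarray(counts, dtype=float) / np.asarray(p, dtype=float)

def document_kernel(counts_i, counts_j, p):
    """Fisher kernel between two documents, with the information matrix
    approximated by the identity (a common simplification in practice)."""
    return unigram_score(counts_i, p) @ unigram_score(counts_j, p)

p = np.array([0.5, 0.3, 0.2])             # corpus-level word probabilities
print(document_kernel([2, 0, 1], [1, 1, 0], p))
```
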
Image classification and retrieval

The Fisher kernel can also be applied to image representation for classification or retrieval problems. Currently, the most popular bag-of-visual-words representation suffers from sparsity and high dimensionality. The Fisher kernel can result in a compact and dense representation, which is more desirable for image classification and retrieval problems.

The Fisher Vector (FV), a special, approximate, and improved case of the general Fisher kernel, is an image representation obtained by pooling local image features. The FV encoding stores the mean and covariance deviation vectors for each component k of the Gaussian Mixture Model (GMM) and each element of the local feature descriptors together. In a systematic comparison, FV outperformed all compared encoding methods (Bag of Visual Words (BoW), Kernel Codebook encoding (KCB), Locality Constrained Linear Coding (LLC), Vector of Locally Aggregated Descriptors (VLAD)), showing that the encoding of second-order information (codeword covariances) indeed benefits classification performance.

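A minimal sketch of the basic FV computation is given below (an assumption-laden illustration, not the reference implementation: it presumes a diagonal-covariance GMM fitted elsewhere and shows only the standard mean- and covariance-deviation blocks, without the power and L2 normalisations of the improved Fisher Vector):

```python
import numpy as np

def fisher_vector(X, weights, means, variances):
    """Basic Fisher Vector of local descriptors X (N x D) under a diagonal-covariance
    GMM with K components (weights (K,), means (K, D), variances (K, D)).
    Returns the concatenated mean- and covariance-deviation blocks, length 2*K*D."""
    X = np.asarray(X, dtype=float)
    N, _ = X.shape
    K = len(weights)
    # Posterior responsibilities gamma[n, k] of each mixture component for each descriptor.
    log_p = np.empty((N, K))
    for k in range(K):
        z = (X - means[k]) / np.sqrt(variances[k])
        log_p[:, k] = (np.log(weights[k])
                       - 0.5 * np.sum(np.log(2.0 * np.pi * variances[k]))
                       - 0.5 * np.sum(z * z, axis=1))
    gamma = np.exp(log_p - log_p.max(axis=1, keepdims=True))
    gamma /= gamma.sum(axis=1, keepdims=True)
    blocks = []
    for k in range(K):
        z = (X - means[k]) / np.sqrt(variances[k])
        g = gamma[:, k:k + 1]
        blocks.append((g * z).sum(axis=0) / (N * np.sqrt(weights[k])))                  # mean deviations
        blocks.append((g * (z * z - 1.0)).sum(axis=0) / (N * np.sqrt(2.0 * weights[k])))  # covariance deviations
    return np.concatenate(blocks)
```

With a GMM fitted beforehand, for example scikit-learn's GaussianMixture(covariance_type="diag"), a call such as fisher_vector(descriptors, gmm.weights_, gmm.means_, gmm.covariances_) would yield a 2·K·D-dimensional representation of an image's local descriptors.
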
See also

Fisher information metric

Notes and references

Tommi Jaakkola and David Haussler (1998), "Exploiting Generative Models in Discriminative Classifiers". In Advances in Neural Information Processing Systems 11, pages 487–493. MIT Press. ISBN 978-0-262-11245-1.
Cyril Goutte, Eric Gaussier, Nicola Cancedda, Hervé Dejean (2004), "Generative vs Discriminative Approaches to Entity Recognition from Label-Deficient Data", JADT 2004, 7èmes journées internationales d'analyse statistique des données textuelles, Louvain-la-Neuve, Belgium, 10–12 March 2004.
Charles Elkan (2005), "Deriving TF-IDF as a Fisher Kernel", SPIRE. Archived from the original on December 20, 2013.
Florent Perronnin and Christopher Dance (2007), "Fisher Kernels on Visual Vocabularies for Image Categorization".
Hervé Jégou et al. (2010), "Aggregating Local Descriptors into a Compact Image Representation".
A. P. Twinanda et al. (2014), "Fisher Kernel Based Task Boundary Retrieval in Laparoscopic Database with Single Video Query".
"VLFeat - Documentation > C API". www.vlfeat.org. Retrieved 2017-03-04.
Seeland, Marco; Rzanny, Michael; Alaqraa, Nedal; Wäldchen, Jana; Mäder, Patrick (2017-02-24). "Plant species classification using flower images—A comparative study of local feature representations". PLOS ONE. 12 (2): e0170629. doi:10.1371/journal.pone.0170629. ISSN 1932-6203. PMC 5325198. PMID 28234999.
Nello Cristianini and John Shawe-Taylor (2000), An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods. Cambridge University Press. ISBN 0-521-78019-5.
