
Rate–distortion theory


Here $h(D)=\tfrac{1}{2}\log_{2}(2\pi eD)$ is the differential entropy of a Gaussian random variable with variance $D$. This lower bound is extensible to sources with memory and other distortion measures. One important feature of the SLB is that it is asymptotically tight in the low-distortion regime for a wide class of sources, and on some occasions it actually coincides with the rate–distortion function. A Shannon lower bound can generally be found if the distortion between any two numbers can be expressed as a function of the difference between the values of those two numbers.
Rate–distortion theory gives an analytical expression for how much compression can be achieved using lossy compression methods. Many of the existing audio, speech, image, and video compression techniques have transforms, quantization, and bit-rate allocation procedures that capitalize on the general shape of rate–distortion functions.
Rate–distortion theory tells us that no compression system exists that performs outside the gray area. The closer a practical compression system comes to the red (lower) bound, the better it performs. As a general rule, this bound can be attained only by increasing the coding block-length parameter.
This rate–distortion function holds only for Gaussian memoryless sources. It is known that the Gaussian source is the most "difficult" source to encode: for a given mean square error, it requires the greatest number of bits. The performance of a practical compression system working on, say, images may well be below the $R(D)$ lower bound shown.
An analytical expression for the rate–distortion function is often difficult to obtain, except in some instances for which we next offer two of the best-known examples. The rate–distortion function of any source is known to obey several fundamental properties, the most important being that it is a continuous, monotonically decreasing convex function of the distortion $D$.
The notion of distortion is a subject of ongoing discussion. In the simplest case (which is actually used in most cases), the distortion is defined as the expected value of the square of the difference between the input and output signals (i.e., the mean squared error).
When working with stationary sources with memory, it is necessary to modify the definition of the rate–distortion function, which must be understood as a limit taken over sequences of increasing length:

$$R(D)=\lim_{n\rightarrow\infty}R_{n}(D)$$

where

$$R_{n}(D)={\frac{1}{n}}\inf_{Q_{Y^{n}\mid X^{n}}\in{\mathcal{Q}}}I(Y^{n},X^{n})$$

and

$$\mathcal{Q}=\{Q_{Y^{n}\mid X^{n}}(Y^{n}\mid X^{n},X_{0}):E\leq D\}.$$
The Blahut–Arimoto algorithm, co-invented by Richard Blahut, is an elegant iterative technique for numerically obtaining rate–distortion functions of arbitrary finite input/output alphabet sources, and much work has been done to extend it to more general problem instances.

Rate–distortion theory addresses the problem of determining the minimal number of bits per symbol, as measured by the rate $R$, that should be communicated over a channel, so that the source (input signal) can be approximately reconstructed at the receiver (output signal) without exceeding an expected distortion $D$.

In audio compression, perceptual models (and therefore perceptual distortion measures) are relatively well developed and routinely used in compression techniques, but are often not easy to include in rate–distortion theory. In image and video compression, the human perception models are less well developed, and their inclusion is mostly limited to perceptual weighting (quantization) matrices.
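To make the Blahut–Arimoto idea concrete, here is a minimal sketch of the alternating update for a finite-alphabet source. This is an illustrative implementation assumed for this article, not reference code: the function name, the parametrization of the rate–distortion curve by a Lagrange multiplier `beta`, and the fixed iteration count are all choices made for the example.

```python
import numpy as np

def blahut_arimoto(p_x, d, beta, n_iter=200):
    """Approximate one point on the rate-distortion curve.

    p_x  : source distribution over the input alphabet (1-D array)
    d    : distortion matrix, d[i, j] = d(x_i, y_j)
    beta : Lagrange multiplier trading rate against distortion
    Returns (rate_in_bits, expected_distortion).
    """
    n_in, n_out = d.shape
    q_y = np.full(n_out, 1.0 / n_out)          # output marginal, refined each pass
    for _ in range(n_iter):
        # Optimal test channel Q(y|x) for the current output marginal
        q_y_given_x = q_y * np.exp(-beta * d)   # broadcasting over rows
        q_y_given_x /= q_y_given_x.sum(axis=1, keepdims=True)
        # New output marginal induced by that test channel
        q_y = p_x @ q_y_given_x
    # Rate = mutual information of the resulting test channel, in bits
    rate = np.sum(p_x[:, None] * q_y_given_x * np.log2(q_y_given_x / q_y))
    distortion = np.sum(p_x[:, None] * q_y_given_x * d)
    return rate, distortion
```

For a uniform binary source with Hamming distortion, the points produced this way can be checked against the closed-form Bernoulli rate–distortion function.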
PyRated is a very simple Python package to do the most basic calculation in rate–distortion theory: the determination of the "codebook" and the transmission rate.
In that case, $H-C$ bits/symbol will be lost when transmitting this information over the given channel. For the user to have any hope of reconstructing the source with a maximum distortion $D$, the information lost in transmission must not exceed the maximum tolerable loss.
Thus the shape of the function in the examples is typical: even measured rate–distortion functions in real life tend to have very similar forms.
At least $R(D)$ bits/symbol of information from the source must reach the user. We also know from Shannon's channel coding theorem that if the source entropy is $H$ bits/symbol and the channel capacity is $C$ (where $C<H$), some information will inevitably be lost in transmission.
Although analytical solutions to this problem are scarce, there are upper and lower bounds to these functions, including the famous Shannon lower bound (SLB), which in the case of squared error and memoryless sources states that for arbitrary sources with finite differential entropy,

$$R(D)\geq h(X)-h(D).$$
The mutual information can be understood as a measure of the 'prior' uncertainty the receiver has about the sender's signal ($H(Y)$), diminished by the uncertainty that is left after receiving information about the sender's signal ($H(Y\mid X)$). Of course, the decrease in uncertainty is due to the communicated amount of information, which is $I(Y;X)$.

As an example, if there is no communication at all, then $H(Y\mid X)=H(Y)$ and $I(Y;X)=0$. Alternatively, if the communication channel is perfect and the received signal $Y$ is identical to the signal $X$ at the sender, then $H(Y\mid X)=0$ and $I(Y;X)=H(X)=H(Y)$.
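The two extreme cases above (no communication versus a perfect channel) can be verified with a small discrete computation of $I(Y;X)=H(Y)-H(Y\mid X)$. The function name and setup are assumptions of this example, not from the article:

```python
import numpy as np

def mutual_information_bits(p_x, q_y_given_x):
    """I(Y;X) = H(Y) - H(Y|X) for a discrete source p_x and test channel Q(y|x)."""
    p_xy = p_x[:, None] * q_y_given_x        # joint distribution P(x, y)
    p_y = p_xy.sum(axis=0)                   # output marginal P(y)
    pos = p_y > 0
    h_y = -np.sum(p_y[pos] * np.log2(p_y[pos]))
    nz = p_xy > 0
    h_y_given_x = -np.sum(p_xy[nz] * np.log2(q_y_given_x[nz]))
    return h_y - h_y_given_x
```

A test channel whose rows are identical conveys nothing ($I=0$), while an identity channel on a uniform binary source conveys the full entropy ($I=1$ bit).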
We must impose the requirement that the information lost in transmission does not exceed the maximum tolerable loss of $H-R(D)$ bits/symbol.

These definitions can be formulated measure-theoretically to account for discrete and mixed random variables as well.
As the above equations show, calculating a rate–distortion function requires the stochastic description of the input
5206: 5044: 5024: 4968: 4626: 4417: 4220: 3260: 894:
The functions that relate the rate and distortion are found as the solution of the following minimization problem:

$$\inf_{Q_{Y\mid X}(y\mid x)}I_{Q}(Y;X)\quad\text{subject to}\quad D_{Q}\leq D^{*}.$$

Here $Q_{Y\mid X}(y\mid x)$, sometimes called a test channel, is the conditional probability density function (PDF) of the communication channel output (compressed signal) $Y$ for a given input (original signal) $X$, and $I_{Q}(Y;X)$ is the mutual information between $Y$ and $X$.
where superscripts denote a complete sequence up to that time and the subscript 0 indicates initial state.
4860: 4611: 4395: 4240: 2726: 2710: 2614: 2231: 1014: 306: 287: 2717: 290:; it addresses the problem of determining the minimal number of bits per symbol, as measured by the rate 5264: 3274: 2690: 3768: 1851: 2069: 1813:)), diminished by the uncertainty that is left after receiving information about the sender's signal ( 4948: 4410: 4372: 4193: 3483: 2694: 1897: 1665: 570: 76: 56: 3307: 1848:). Of course the decrease in uncertainty is due to the communicated amount of information, which is 716: 484: 421: 5179: 5029: 5014: 4983: 4978: 4887: 4794: 4696: 4681: 4464: 3779:
Suppose we want to transmit information about a source to the user with a distortion not exceeding $D$.
3206: 2699: 1341: 61: 3006:{\displaystyle R_{n}(D)={\frac {1}{n}}\inf _{Q_{Y^{n}\mid X^{n}}\in {\mathcal {Q}}}I(Y^{n},X^{n})} 2028: 27:
Rate–distortion theory is a major branch of information theory which provides the theoretical foundations for lossy data compression.
5252: 5222: 5201: 5107: 5039: 4933: 4621: 4437: 4427: 4322: 4302: 4297: 4019: 3215: 2277: 1150: 546: 283: 71: 33: 4833: 1816: 1304: 1109: 640: 5196: 5184: 5166: 5034: 4918: 4855: 4701: 4616: 4572: 4533: 4215: 4107: 4099: 4089: 4060: 4052: 2578: 2285: 1947: 578: 550: 5171: 5127: 5100: 5095: 5070: 4953: 4938: 4848: 4757: 4752: 4727: 4581: 4314: 4292: 4184: 3959: 3883: 3827: 3817: 666:. Typical distortion functions are the Hamming distortion and the Squared-error distortion. 557:, watching pictures and video) the distortion measure should preferably be modeled on human 128: 86: 3741: 3707: 3476:
Such quantizers operate at distances from the rate–distortion function that are practically relevant.
2662: 2164: 2137: 1001:{\displaystyle \inf _{Q_{Y\mid X}(y\mid x)}I_{Q}(Y;X){\text{ subject to }}D_{Q}\leq D^{*}.} 457: 394: 367: 340: 313: 5090: 4904: 4828: 4809: 4779: 4747: 4713: 4272: 4210: 3918: 3786: 3465: 2706: 1275: 582: 3853: 4082: 17: 4882: 4676: 4405: 4400: 4257: 4230: 4202: 4008: 3980: 3265: 3242: 3188: 2802: 2558: 2211: 2191: 2008: 1988: 1177: 1157: 1089: 1069: 620: 526: 5374: 5189: 5137: 4804: 4799: 4774: 4706: 4327: 4225: 3965: 3167:{\displaystyle {\mathcal {Q}}=\{Q_{Y^{n}\mid X^{n}}(Y^{n}\mid X^{n},X_{0}):E\leq D\}} 574: 66: 1792:{\displaystyle \inf _{Q_{Y\mid X}(y\mid x)}E]{\text{ subject to }}I_{Q}(Y;X)\leq R.} 1664:
The problem can also be formulated as a distortion–rate function, where we find the infimum over achievable distortions for a given rate constraint.
41: 5310: 4277: 4252: 1668:
The relevant expression is:

$$\inf_{Q_{Y\mid X}(y\mid x)}E[d(X,Y)]\quad\text{subject to}\quad I_{Q}(Y;X)\leq R.$$
553:
However, since most lossy compression techniques operate on data that will be perceived by human consumers (listening to music, watching pictures and video), the distortion measure should preferably be modeled on human perception and perhaps aesthetics.
91: 1985:. Alternatively, if the communication channel is perfect and the received signal 5269: 5147: 4943: 4819: 4769: 3986: 566: 182: 3915:
This means that the channel capacity must be at least as large as $R(D)$.
1457:{\displaystyle H(Y)=-\int _{-\infty }^{\infty }P_{Y}(y)\log _{2}(P_{Y}(y))\,dy} 5326: 5117: 5112: 4999: 4958: 4764: 4130: 562: 558: 2281: 4153: 4111: 5240: 5085: 4742: 4139:, given a utility function (distortion matrix) and a Lagrange multiplier 3472:
Nevertheless, even at unit blocklengths one can often find good (scalar) quantizers.
3210: 1802:
The two formulations lead to functions which are inverses of each other.
5009: 4483: 4432: 4009:"Rethinking Lossy Compression: The Rate-Distortion-Perception Tradeoff" 3181:
Memoryless (independent) Gaussian source with squared-error distortion

If $X$ is a Gaussian random variable with variance $\sigma_{x}^{2}$, and if successive samples of the signal are stochastically independent (or, equivalently, the source is memoryless), we find the following analytical expression for the rate–distortion function:

$$R(D)=\begin{cases}\frac{1}{2}\log_{2}(\sigma_{x}^{2}/D),&\text{if }0\leq D\leq\sigma_{x}^{2}\\0,&\text{if }D>\sigma_{x}^{2}.\end{cases}$$
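The Gaussian rate–distortion function above is simple enough to evaluate directly. A short sketch (the function name is an assumption of this example):

```python
import math

def gaussian_rate_distortion(sigma2, d):
    """R(D) in bits/sample for a memoryless Gaussian source under squared error.

    sigma2 : source variance sigma_x^2
    d      : allowed mean squared error D (must be positive)
    """
    if d <= 0:
        raise ValueError("distortion must be positive")
    # Above the source variance no bits are needed: output the mean.
    return 0.5 * math.log2(sigma2 / d) if d < sigma2 else 0.0
```

For example, halving the distortion from $\sigma_x^2/2$ to $\sigma_x^2/4$ costs exactly half a bit per sample, reflecting the 6 dB-per-bit rule of thumb for quantization.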
4523: 2276:
Here $D_{Q}$ and $D^{*}$ are the distortion between $X$ and $Y$ and the prescribed maximum distortion, respectively. When we use the mean squared error as the distortion measure, we have (for amplitude-continuous signals):

$$D_{Q}=\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}P_{X,Y}(x,y)(x-y)^{2}\,dx\,dy=\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}Q_{Y\mid X}(y\mid x)P_{X}(x)(x-y)^{2}\,dx\,dy.$$
590: 4024: 5358: 5132: 4963: 4556: 4503: 3968: – Process of reducing correlation within one or more signals 554: 454:. We try to minimize the distortion between the original sequence 305: 4103: 4084:
Berger, Toby (1971). Rate Distortion Theory: A Mathematical Basis for Data Compression. Prentice Hall. ISBN 978-0-13-753103-5.
3515:
Memoryless (independent) Bernoulli source with Hamming distortion

The rate–distortion function of a Bernoulli random variable with Hamming distortion is given by:

$$R(D)=\begin{cases}H_{b}(p)-H_{b}(D),&0\leq D\leq\min(p,1-p)\\0,&D>\min(p,1-p)\end{cases}$$

where $H_{b}$ denotes the binary entropy function.
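This closed form is also easy to evaluate numerically. A minimal sketch, with function names assumed for this example:

```python
import math

def binary_entropy(p):
    """H_b(p) in bits; H_b(0) = H_b(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bernoulli_rate_distortion(p, d):
    """R(D) in bits/symbol for a Bernoulli(p) source under Hamming distortion."""
    if d >= min(p, 1 - p):
        return 0.0  # guessing the more likely symbol already achieves this distortion
    return binary_entropy(p) - binary_entropy(d)
```

For a fair coin ($p=0.5$) and tolerated bit-error rate $D=0.1$, this gives $1-H_{b}(0.1)\approx 0.531$ bits/symbol.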
4513: 4367: 4352: 4342: 598: 594: 4157: 4016:
Blau, Y.; Michaeli, T. (2019). "Rethinking Lossy Compression: The Rate-Distortion-Perception Tradeoff". Proceedings of the International Conference on Machine Learning. PMLR. pp. 675–685.
617:
Distortion functions measure the cost of representing a symbol $x$ by an approximated symbol ${\hat{x}}$. Typical distortion functions are the Hamming distortion and the squared-error distortion.

Hamming distortion:

$$d(x,{\hat{x}})=\begin{cases}0&\text{if }x={\hat{x}}\\1&\text{if }x\neq{\hat{x}}\end{cases}$$

Squared-error distortion:

$$d(x,{\hat{x}})=\left(x-{\hat{x}}\right)^{2}$$
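The two distortion functions translate directly into code. A trivial sketch for single symbols:

```python
def hamming_distortion(x, x_hat):
    """0 if the reproduced symbol matches the source symbol, 1 otherwise."""
    return 0 if x == x_hat else 1

def squared_error_distortion(x, x_hat):
    """(x - x_hat)^2, the usual choice for continuous-amplitude signals."""
    return (x - x_hat) ** 2
```

Averaging either function over source/reproduction pairs yields the expected distortion $D$ used throughout the theory.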
4488: 4454: 1066:(PDF) of the communication channel output (compressed signal) 586: 537: 176: 879:{\displaystyle d(x,{\hat {x}})=\left(x-{\hat {x}}\right)^{2}} 3028: 2964: 3688: 3443: 1344:
Here $I(Y;X)=H(Y)-H(Y\mid X)$, where $H(Y)$ and $H(Y\mid X)$ are the entropy of the output signal $Y$ and the conditional entropy of the output signal given the input signal, respectively:

$$H(Y)=-\int_{-\infty}^{\infty}P_{Y}(y)\log_{2}(P_{Y}(y))\,dy$$

$$H(Y\mid X)=-\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}Q_{Y\mid X}(y\mid x)P_{X}(x)\log_{2}(Q_{Y\mid X}(y\mid x))\,dx\,dy.$$
782: 540:
The rate is usually understood as the number of bits per data sample to be stored or transmitted.
3461:
The following figure shows what this function looks like:
3239:, and if we assume that successive samples of the signal 2872:{\displaystyle R(D)=\lim _{n\rightarrow \infty }R_{n}(D)} 573:, distortion measures can ultimately be identified with 4131:"PyRated: a python package for rate distortion theory" 3553: 3921: 3886: 3856: 3830: 3789: 3775:
Connecting rate-distortion theory to channel capacity
3744: 3710: 3532: 3486: 3286: 3245: 3218: 3191: 3025: 2891: 2818: 2729: 2665: 2617: 2581: 2561: 2297: 2234: 2214: 2194: 2167: 2140: 2072: 2031: 2011: 1991: 1950: 1900: 1854: 1819: 1677: 1473: 1353: 1307: 1278: 1203: 1180: 1160: 1112: 1092: 1072: 1017: 903: 810: 680: 643: 623: 487: 460: 424: 397: 370: 343: 316: 3976:
See also

Blahut–Arimoto algorithm – Class of algorithms in information theory
Data compression – Compact encoding of digital data
Decorrelation – Process of reducing correlation within one or more signals
Rate–distortion optimization – decision algorithm used in video compression
Sphere packing – Geometrical structure
2659:that minimize rate for a given distortion 1059:, sometimes called a test channel, is the 165: 151: 29: 4023: 3920: 3885: 3855: 3829: 3788: 3743: 3738:Plot of the rate-distortion function for 3715: 3709: 3661: 3617: 3582: 3560: 3552: 3531: 3485: 3431: 3426: 3411: 3393: 3388: 3367: 3351: 3345: 3340: 3324: 3310: 3302: 3285: 3244: 3223: 3217: 3190: 3143: 3130: 3102: 3089: 3076: 3061: 3048: 3043: 3027: 3026: 3024: 2994: 2981: 2963: 2962: 2951: 2938: 2933: 2928: 2914: 2896: 2890: 2854: 2838: 2817: 2772: 2728: 2670: 2664: 2622: 2616: 2586: 2580: 2560: 2532: 2525: 2519: 2488: 2457: 2447: 2439: 2429: 2421: 2407: 2400: 2394: 2351: 2341: 2333: 2323: 2315: 2302: 2296: 2239: 2233: 2213: 2193: 2172: 2166: 2145: 2139: 2071: 2030: 2010: 1990: 1949: 1899: 1853: 1818: 1759: 1750: 1726: 1687: 1682: 1676: 1641: 1634: 1604: 1588: 1569: 1538: 1528: 1520: 1510: 1502: 1472: 1447: 1429: 1413: 1394: 1384: 1376: 1352: 1306: 1277: 1258: 1202: 1179: 1159: 1117: 1111: 1091: 1071: 1022: 1016: 989: 976: 967: 946: 913: 908: 902: 870: 854: 853: 824: 823: 809: 768: 767: 756: 736: 735: 724: 711: 694: 693: 679: 645: 644: 642: 622: 501: 490: 489: 486: 465: 459: 438: 427: 426: 423: 402: 396: 375: 369: 348: 342: 321: 315: 267:Learn how and when to remove this message 3962: – Compact encoding of digital data 1262:{\displaystyle I(Y;X)=H(Y)-H(Y\mid X)\,} 3999: 536:is usually understood as the number of 32: 525:Rate–distortion theory was created by 3523:with Hamming distortion is given by: 1336:are the entropy of the output signal 7: 2776:{\displaystyle R(D)\geq h(X)-h(D)\,} 2652:{\displaystyle Q_{Y\mid X}(y\mid x)} 2280:as distortion measure, we have (for 2269:{\displaystyle Q_{Y\mid X}(y\mid x)} 1086:for a given input (original signal) 1052:{\displaystyle Q_{Y\mid X}(y\mid x)} 549:). However, since we know that most 522:shape of rate–distortion functions. 
205:adding citations to reliable sources 97:Limiting density of discrete points 3519:The rate-distortion function of a 3277:for the rate–distortion function: 2845: 2448: 2443: 2430: 2425: 2342: 2337: 2324: 2319: 1529: 1524: 1511: 1506: 1385: 1380: 25: 1880:{\displaystyle I\left(Y;X\right)} 108:Asymptotic equipartition property 5350: 5349: 5340: 5339: 3767: 3464: 3263:(or equivalently, the source is 2124:{\displaystyle I(Y;X)=H(X)=H(Y)} 1890:As an example, in case there is 181: 40: 4007:Blau, Y.; Michaeli, T. (2019). 3504:{\displaystyle R\left(D\right)} 1937:{\displaystyle H(Y\mid X)=H(Y)} 532:In rate–distortion theory, the 481:and the reconstructed sequence 192:needs additional citations for 124:Shannon's source coding theorem 4057:Elements of Information Theory 3931: 3925: 3902: 3896: 3799: 3793: 3680: 3662: 3636: 3618: 3594: 3588: 3572: 3566: 3542: 3536: 3359: 3333: 3296: 3290: 3152: 3149: 3123: 3117: 3108: 3069: 3000: 2974: 2908: 2902: 2866: 2860: 2842: 2828: 2822: 2769: 2763: 2754: 2748: 2739: 2733: 2646: 2634: 2598: 2592: 2516: 2503: 2500: 2494: 2481: 2469: 2391: 2378: 2375: 2363: 2263: 2251: 2118: 2112: 2103: 2097: 2088: 2076: 2047: 2035: 1966: 1954: 1931: 1925: 1916: 1904: 1835: 1823: 1777: 1765: 1747: 1744: 1732: 1719: 1711: 1699: 1631: 1628: 1616: 1597: 1581: 1575: 1562: 1550: 1489: 1477: 1444: 1441: 1435: 1422: 1406: 1400: 1363: 1357: 1323: 1311: 1288: 1282: 1255: 1243: 1234: 1228: 1219: 1207: 1135: 1123: 1046: 1034: 964: 952: 937: 925: 859: 835: 829: 814: 773: 741: 705: 699: 684: 650: 510:{\displaystyle {\hat {X}}^{n}} 495: 447:{\displaystyle {\hat {X}}^{n}} 432: 82:Conditional mutual information 1: 4129:Marzen, Sarah; DeDeo, Simon. 3983: – Geometrical structure 4053:"10. 
Rate Distortion Theory" 3972:Rate–distortion optimization 2059:{\displaystyle H(Y\mid X)=0} 1064:probability density function 134:Noisy-channel coding theorem 3232:{\displaystyle \sigma ^{2}} 2188:are the distortion between 2005:is identical to the signal 1894:communication at all, then 5402: 5231:Compressed data structures 4553:RLE + BWT + MTF + Huffman 4221:Asymmetric numeral systems 4018:. PMLR. pp. 675–685. 3261:stochastically independent 1841:{\displaystyle H(Y\mid X)} 1329:{\displaystyle H(Y\mid X)} 1141:{\displaystyle I_{Q}(Y;X)} 659:{\displaystyle {\hat {x}}} 637:by an approximated symbol 5335: 4590:Discrete cosine transform 4520:LZ77 + Huffman + context 3521:Bernoulli random variable 3273:), we find the following 890:Rate–distortion functions 418:which outputs a sequence 391:is then fed to a decoder 5295:Smallest grammar problem 3954:Blahut–Arimoto algorithm 2799:Blahut–Arimoto algorithm 2704:monotonically decreasing 2604:{\displaystyle P_{X}(x)} 1978:{\displaystyle I(Y;X)=0} 800:Squared-error distortion 565:: much like the use of 216:"Rate–distortion theory" 18:Rate distortion function 5236:Compressed suffix array 4785:Nyquist–Shannon theorem 4059:(2nd ed.). Wiley. 4039:Cover & Thomas 2012 3733:binary entropy function 364:. The encoded sequence 139:Shannon–Hartley theorem 3938: 3909: 3908:{\displaystyle H-R(D)} 3870: 3844: 3843:{\displaystyle C<H} 3806: 3758: 3725: 3695: 3505: 3451: 3253: 3233: 3199: 3168: 3007: 2873: 2777: 2680: 2653: 2605: 2569: 2546: 2270: 2222: 2202: 2182: 2155: 2125: 2060: 2019: 1999: 1979: 1938: 1881: 1842: 1793: 1752: subject to  1655: 1458: 1330: 1295: 1263: 1188: 1168: 1142: 1100: 1080: 1053: 1002: 969: subject to  880: 790: 660: 631: 518: 511: 475: 448: 412: 385: 358: 331: 288:lossy data compression 280:Rate–distortion theory 113:Rate–distortion theory 5265:Kolmogorov complexity 5133:Video characteristics 4510:LZ77 + Huffman + ANS 4080:Berger, Toby (1971). 
3939: 3910: 3871: 3845: 3816:bits/symbol, and the 3807: 3759: 3757:{\displaystyle p=0.5} 3726: 3724:{\displaystyle H_{b}} 3696: 3506: 3452: 3275:analytical expression 3254: 3234: 3209:random variable with 3200: 3169: 3008: 2874: 2778: 2681: 2679:{\displaystyle D^{*}} 2654: 2606: 2570: 2547: 2271: 2223: 2203: 2183: 2181:{\displaystyle D^{*}} 2156: 2154:{\displaystyle D_{Q}} 2126: 2061: 2020: 2000: 1980: 1939: 1882: 1843: 1794: 1656: 1459: 1331: 1296: 1264: 1189: 1169: 1143: 1101: 1081: 1054: 1003: 881: 791: 661: 632: 512: 476: 474:{\displaystyle X^{n}} 449: 413: 411:{\displaystyle g_{n}} 386: 384:{\displaystyle Y^{n}} 359: 357:{\displaystyle X^{n}} 332: 330:{\displaystyle f_{n}} 309: 282:is a major branch of 5355:Compression software 4949:Compression artifact 4905:Psychoacoustic model 3937:{\displaystyle R(D)} 3919: 3884: 3854: 3828: 3805:{\displaystyle R(D)} 3787: 3742: 3708: 3530: 3484: 3284: 3243: 3216: 3189: 3023: 2889: 2816: 2727: 2695:minimization problem 2663: 2615: 2579: 2575:in terms of the PDF 2559: 2295: 2232: 2212: 2192: 2165: 2138: 2070: 2029: 2025:at the sender, then 2009: 1989: 1948: 1898: 1852: 1817: 1675: 1471: 1351: 1305: 1294:{\displaystyle H(Y)} 1276: 1201: 1178: 1158: 1110: 1090: 1070: 1015: 901: 808: 678: 641: 621: 613:Distortion functions 577:as used in Bayesian 571:lossless compression 485: 458: 422: 395: 368: 341: 314: 201:improve this article 77:Directed information 57:Differential entropy 5345:Compression formats 4984:Texture compression 4979:Standard test image 4795:Silence compression 3869:{\displaystyle H-C} 3511:lower bound shown. 
3436: 3398: 3350: 3269:, or the signal is 2718:Shannon lower bound 2452: 2434: 2346: 2328: 1533: 1515: 1389: 1342:conditional entropy 337:encodes a sequence 62:Conditional entropy 5386:Information theory 5253:Information theory 5108:Display resolution 4934:Chroma subsampling 4323:Byte pair encoding 4268:Shannon–Fano–Elias 3934: 3905: 3866: 3840: 3802: 3754: 3721: 3691: 3686: 3501: 3457:    3447: 3442: 3422: 3384: 3336: 3249: 3229: 3195: 3185:If we assume that 3164: 3003: 2970: 2869: 2849: 2773: 2676: 2649: 2601: 2565: 2542: 2435: 2417: 2329: 2311: 2286:continuous signals 2278:mean squared error 2266: 2218: 2198: 2178: 2151: 2121: 2056: 2015: 1995: 1975: 1934: 1877: 1838: 1789: 1715: 1651: 1516: 1498: 1454: 1372: 1326: 1291: 1259: 1184: 1164: 1151:mutual information 1138: 1096: 1076: 1049: 998: 941: 876: 786: 781: 670:Hamming distortion 656: 627: 547:mean squared error 519: 507: 471: 444: 408: 381: 354: 327: 284:information theory 72:Mutual information 34:Information theory 5368: 5367: 5217: 5216: 5167:Deblocking filter 5065: 5064: 4913: 4912: 4722: 4721: 4567: 4566: 4095:978-0-13-753103-5 4088:. Prentice Hall. 
4066:978-1-118-58577-1 3414: 3370: 3318: 3252:{\displaystyle X} 3198:{\displaystyle X} 2924: 2922: 2834: 2801:, co-invented by 2693:solution to this 2568:{\displaystyle X} 2221:{\displaystyle Y} 2201:{\displaystyle X} 2018:{\displaystyle X} 1998:{\displaystyle Y} 1753: 1678: 1187:{\displaystyle X} 1167:{\displaystyle Y} 1099:{\displaystyle X} 1079:{\displaystyle Y} 970: 904: 862: 832: 776: 759: 744: 727: 702: 653: 630:{\displaystyle x} 551:lossy compression 498: 435: 277: 276: 269: 251: 175: 174: 16:(Redirected from 5393: 5381:Data compression 5353: 5352: 5343: 5342: 5172:Lapped transform 5076: 4954:Image resolution 4939:Coding tree unit 4924: 4733: 4578: 4199: 4185:Data compression 4178: 4171: 4164: 4155: 4145: 4116: 4115: 4087: 4077: 4071: 4070: 4048: 4042: 4036: 4030: 4029: 4027: 4013: 4004: 3977: 3960:Data compression 3943: 3941: 3940: 3935: 3914: 3912: 3911: 3906: 3875: 3873: 3872: 3867: 3849: 3847: 3846: 3841: 3818:channel capacity 3811: 3809: 3808: 3803: 3771: 3763: 3761: 3760: 3755: 3730: 3728: 3727: 3722: 3720: 3719: 3700: 3698: 3697: 3692: 3690: 3687: 3683: 3639: 3587: 3586: 3565: 3564: 3510: 3508: 3507: 3502: 3500: 3468: 3456: 3454: 3453: 3448: 3446: 3445: 3435: 3430: 3415: 3412: 3397: 3392: 3371: 3368: 3355: 3349: 3344: 3329: 3328: 3319: 3311: 3258: 3256: 3255: 3250: 3238: 3236: 3235: 3230: 3228: 3227: 3204: 3202: 3201: 3196: 3173: 3171: 3170: 3165: 3148: 3147: 3135: 3134: 3107: 3106: 3094: 3093: 3081: 3080: 3068: 3067: 3066: 3065: 3053: 3052: 3032: 3031: 3012: 3010: 3009: 3004: 2999: 2998: 2986: 2985: 2969: 2968: 2967: 2958: 2957: 2956: 2955: 2943: 2942: 2923: 2915: 2901: 2900: 2878: 2876: 2875: 2870: 2859: 2858: 2848: 2782: 2780: 2779: 2774: 2685: 2683: 2682: 2677: 2675: 2674: 2658: 2656: 2655: 2650: 2633: 2632: 2610: 2608: 2607: 2602: 2591: 2590: 2574: 2572: 2571: 2566: 2551: 2549: 2548: 2543: 2524: 2523: 2493: 2492: 2468: 2467: 2451: 2446: 2433: 2428: 2399: 2398: 2362: 2361: 2345: 2340: 2327: 2322: 2307: 2306: 2275: 2273: 2272: 2267: 2250: 2249: 
2227: 2225: 2224: 2219: 2207: 2205: 2204: 2199: 2187: 2185: 2184: 2179: 2177: 2176: 2160: 2158: 2157: 2152: 2150: 2149: 2130: 2128: 2127: 2122: 2065: 2063: 2062: 2057: 2024: 2022: 2021: 2016: 2004: 2002: 2001: 1996: 1984: 1982: 1981: 1976: 1943: 1941: 1940: 1935: 1886: 1884: 1883: 1878: 1876: 1872: 1847: 1845: 1844: 1839: 1798: 1796: 1795: 1790: 1764: 1763: 1754: 1751: 1731: 1730: 1714: 1698: 1697: 1660: 1658: 1657: 1652: 1615: 1614: 1593: 1592: 1574: 1573: 1549: 1548: 1532: 1527: 1514: 1509: 1463: 1461: 1460: 1455: 1434: 1433: 1418: 1417: 1399: 1398: 1388: 1383: 1335: 1333: 1332: 1327: 1300: 1298: 1297: 1292: 1268: 1266: 1265: 1260: 1193: 1191: 1190: 1185: 1173: 1171: 1170: 1165: 1147: 1145: 1144: 1139: 1122: 1121: 1105: 1103: 1102: 1097: 1085: 1083: 1082: 1077: 1058: 1056: 1055: 1050: 1033: 1032: 1007: 1005: 1004: 999: 994: 993: 981: 980: 971: 968: 951: 950: 940: 924: 923: 885: 883: 882: 877: 875: 874: 869: 865: 864: 863: 855: 834: 833: 825: 795: 793: 792: 787: 785: 784: 778: 777: 769: 760: 757: 746: 745: 737: 728: 725: 704: 703: 695: 665: 663: 662: 657: 655: 654: 646: 636: 634: 633: 628: 516: 514: 513: 508: 506: 505: 500: 499: 491: 480: 478: 477: 472: 470: 469: 453: 451: 450: 445: 443: 442: 437: 436: 428: 417: 415: 414: 409: 407: 406: 390: 388: 387: 382: 380: 379: 363: 361: 360: 355: 353: 352: 336: 334: 333: 328: 326: 325: 272: 265: 261: 258: 252: 250: 209: 185: 177: 167: 160: 153: 129:Channel capacity 87:Relative entropy 44: 30: 21: 5401: 5400: 5396: 5395: 5394: 5392: 5391: 5390: 5371: 5370: 5369: 5364: 5331: 5315: 5299: 5280:Rate–distortion 5213: 5142: 5061: 4988: 4909: 4814: 4810:Sub-band coding 4718: 4643:Predictive type 4638: 4563: 4530:LZSS + Huffman 4480:LZ77 + Huffman 4469: 4379: 4315:Dictionary type 4309: 4211:Adaptive coding 4188: 4182: 4128: 4125: 4120: 4119: 4096: 4079: 4078: 4074: 4067: 4050: 4049: 4045: 4037: 4033: 4011: 4006: 4005: 4001: 3996: 3975: 3950: 3917: 3916: 3882: 3881: 3852: 3851: 3826: 3825: 3785: 3784: 3777: 3740: 3739: 3711: 3706: 
3705: 3685: 3684: 3650: 3641: 3640: 3600: 3578: 3556: 3548: 3528: 3527: 3517: 3490: 3482: 3481: 3441: 3440: 3409: 3400: 3399: 3365: 3320: 3303: 3282: 3281: 3241: 3240: 3219: 3214: 3213: 3187: 3186: 3183: 3139: 3126: 3098: 3085: 3072: 3057: 3044: 3039: 3021: 3020: 2990: 2977: 2947: 2934: 2929: 2892: 2887: 2886: 2850: 2814: 2813: 2725: 2724: 2666: 2661: 2660: 2618: 2613: 2612: 2582: 2577: 2576: 2557: 2556: 2515: 2484: 2453: 2390: 2347: 2298: 2293: 2292: 2235: 2230: 2229: 2210: 2209: 2190: 2189: 2168: 2163: 2162: 2141: 2136: 2135: 2068: 2067: 2027: 2026: 2007: 2006: 1987: 1986: 1946: 1945: 1896: 1895: 1862: 1858: 1850: 1849: 1815: 1814: 1755: 1722: 1683: 1673: 1672: 1600: 1584: 1565: 1534: 1469: 1468: 1425: 1409: 1390: 1349: 1348: 1303: 1302: 1274: 1273: 1199: 1198: 1176: 1175: 1156: 1155: 1113: 1108: 1107: 1088: 1087: 1068: 1067: 1018: 1013: 1012: 985: 972: 942: 909: 899: 898: 892: 846: 842: 841: 806: 805: 802: 780: 779: 754: 748: 747: 722: 712: 676: 675: 672: 639: 638: 619: 618: 615: 583:decision theory 488: 483: 482: 461: 456: 455: 425: 420: 419: 398: 393: 392: 371: 366: 365: 344: 339: 338: 317: 312: 311: 304: 273: 262: 256: 253: 210: 208: 198: 186: 171: 28: 23: 22: 15: 12: 11: 5: 5399: 5397: 5389: 5388: 5383: 5373: 5372: 5366: 5365: 5363: 5362: 5347: 5336: 5333: 5332: 5330: 5329: 5323: 5321: 5317: 5316: 5314: 5313: 5307: 5305: 5301: 5300: 5298: 5297: 5292: 5287: 5282: 5277: 5272: 5267: 5262: 5261: 5260: 5250: 5245: 5244: 5243: 5238: 5227: 5225: 5219: 5218: 5215: 5214: 5212: 5211: 5210: 5209: 5204: 5194: 5193: 5192: 5187: 5182: 5174: 5169: 5164: 5159: 5153: 5151: 5144: 5143: 5141: 5140: 5135: 5130: 5125: 5120: 5115: 5110: 5105: 5104: 5103: 5098: 5093: 5082: 5080: 5073: 5067: 5066: 5063: 5062: 5060: 5059: 5058: 5057: 5052: 5047: 5042: 5032: 5027: 5022: 5017: 5012: 5007: 5002: 4996: 4994: 4990: 4989: 4987: 4986: 4981: 4976: 4971: 4966: 4961: 4956: 4951: 4946: 4941: 4936: 4930: 4928: 4921: 4915: 4914: 4911: 4910: 4908: 4907: 4902: 4897: 4896: 4895: 4890: 4885: 4880: 
4875: 4865: 4864: 4863: 4853: 4852: 4851: 4846: 4836: 4831: 4825: 4823: 4816: 4815: 4813: 4812: 4807: 4802: 4797: 4792: 4787: 4782: 4777: 4772: 4767: 4762: 4761: 4760: 4755: 4750: 4739: 4737: 4730: 4724: 4723: 4720: 4719: 4717: 4716: 4714:Psychoacoustic 4711: 4710: 4709: 4704: 4699: 4691: 4690: 4689: 4684: 4679: 4674: 4669: 4659: 4658: 4657: 4646: 4644: 4640: 4639: 4637: 4636: 4635: 4634: 4629: 4624: 4614: 4609: 4604: 4603: 4602: 4597: 4586: 4584: 4582:Transform type 4575: 4569: 4568: 4565: 4564: 4562: 4561: 4560: 4559: 4551: 4550: 4549: 4546: 4538: 4537: 4536: 4528: 4527: 4526: 4518: 4517: 4516: 4508: 4507: 4506: 4498: 4497: 4496: 4491: 4486: 4477: 4475: 4471: 4470: 4468: 4467: 4462: 4457: 4452: 4447: 4442: 4441: 4440: 4435: 4425: 4420: 4415: 4414: 4413: 4403: 4398: 4393: 4387: 4385: 4381: 4380: 4378: 4377: 4376: 4375: 4370: 4365: 4360: 4355: 4350: 4345: 4340: 4335: 4325: 4319: 4317: 4311: 4310: 4308: 4307: 4306: 4305: 4300: 4295: 4290: 4280: 4275: 4270: 4265: 4260: 4255: 4250: 4249: 4248: 4243: 4238: 4228: 4223: 4218: 4213: 4207: 4205: 4196: 4190: 4189: 4183: 4181: 4180: 4173: 4166: 4158: 4152: 4151: 4146: 4124: 4123:External links 4121: 4118: 4117: 4094: 4072: 4065: 4043: 4031: 3998: 3997: 3995: 3992: 3991: 3990: 3984: 3981:Sphere packing 3978: 3969: 3963: 3957: 3949: 3946: 3933: 3930: 3927: 3924: 3904: 3901: 3898: 3895: 3892: 3889: 3865: 3862: 3859: 3839: 3836: 3833: 3801: 3798: 3795: 3792: 3776: 3773: 3753: 3750: 3747: 3718: 3714: 3702: 3701: 3689: 3682: 3679: 3676: 3673: 3670: 3667: 3664: 3660: 3657: 3654: 3651: 3649: 3646: 3643: 3642: 3638: 3635: 3632: 3629: 3626: 3623: 3620: 3616: 3613: 3610: 3607: 3604: 3601: 3599: 3596: 3593: 3590: 3585: 3581: 3577: 3574: 3571: 3568: 3563: 3559: 3555: 3554: 3551: 3547: 3544: 3541: 3538: 3535: 3516: 3513: 3499: 3496: 3493: 3489: 3459: 3458: 3444: 3439: 3434: 3429: 3425: 3421: 3418: 3410: 3408: 3405: 3402: 3401: 3396: 3391: 3387: 3383: 3380: 3377: 3374: 3366: 3364: 3361: 3358: 3354: 3348: 3343: 3339: 3335: 3332: 3327: 3323: 

