, which in mathematics denotes a difference), but the term is typically only used if both versions are meaningful outside compression and decompression. For example, while the process of compressing the error in the above-mentioned lossless audio compression scheme could be described as delta encoding from the approximated sound wave to the original sound wave, the approximated version of the sound wave is not meaningful in any other context.
, but there are other techniques that do not work for typical text but are useful for some images (particularly simple bitmaps), and other techniques that take advantage of the specific characteristics of images (such as the common phenomenon of contiguous 2-D areas of similar tones, and the fact that color images usually have a preponderance of a limited range of colors out of those representable in the color space).
) are the latest generation of lossless algorithms that compress data (typically sequences of nucleotides) using both conventional compression algorithms and specific algorithms adapted to genetic data. In 2012, a team of scientists from Johns Hopkins University published the first genetic compression algorithm that does not rely on external genetic databases for compression. HAPZIPPER was tailored for
, so winners in these benchmarks may be unsuitable for everyday use due to the slow speed of the top performers. Another drawback of some benchmarks is that their data files are known, so some program writers may optimize their programs for best performance on a particular data set. The winners on these benchmarks often come from the class of
; for example, a compression application may consider files whose names end in ".zip", ".arj" or ".lha" uncompressible without any more sophisticated detection. A common way of handling this situation is quoting input, or uncompressible parts of the input in the output, minimizing the compression overhead. For example, the
N − 1 bits, these kinds of claims can be safely discarded without even looking at any further details regarding the purported compression scheme. Such an algorithm contradicts fundamental laws of mathematics because, if it existed, it could be applied repeatedly to losslessly reduce any file to length 1.
Most practical compression algorithms provide an "escape" facility that can turn off the normal coding for files that would become longer by being encoded. In theory, only a single additional bit is required to tell the decoder that the normal coding has been turned off for the entire input; however,
Lossless data compression algorithms cannot guarantee compression for all input data sets. In other words, for any lossless data compression algorithm, there will be an input data set that does not get smaller when processed by the algorithm, and for any lossless data compression algorithm that makes
Sami Runsas (the author of NanoZip) maintained Compression Ratings, a benchmark similar to Maximum Compression's multiple file test, but with minimum speed requirements. It offered a calculator that allowed the user to weight the importance of speed and compression ratio. The top programs were fairly
model, the data is analyzed and a model is constructed, then this model is stored with the compressed data. This approach is simple and modular, but has the disadvantage that the model itself can be expensive to store, and also that it forces using a single model for all data being compressed, and so
Real compression algorithm designers accept that streams of high information entropy cannot be compressed, and accordingly, include facilities for detecting and handling this condition. An obvious way of detection is applying a raw compression algorithm and testing if its output is smaller than its
The Compression Analysis Tool is a Windows application that enables end users to benchmark the performance characteristics of streaming implementations of LZF4, Deflate, ZLIB, GZIP, BZIP2 and LZMA using their own data. It produces measurements and charts with which users can compare the compression
models dynamically update the model as the data is compressed. Both the encoder and decoder begin with a trivial model, yielding poor compression of initial data, but as they learn more about the data, performance improves. Most popular types of compression used in practice now use adaptive coders.
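The encoder/decoder symmetry that makes adaptive coding work can be sketched in a few lines (a minimal order-0 counting model over an assumed four-symbol alphabet; real coders pair such a model with an arithmetic or range coder):

```python
from collections import Counter

class AdaptiveModel:
    """Order-0 adaptive model: encoder and decoder start from the same
    trivial (uniform) counts and apply identical updates after every
    symbol, so the model itself never has to be transmitted."""
    def __init__(self, alphabet):
        self.counts = Counter({s: 1 for s in alphabet})
    def probability(self, symbol):
        return self.counts[symbol] / sum(self.counts.values())
    def update(self, symbol):
        self.counts[symbol] += 1

encoder, decoder = AdaptiveModel("abcd"), AdaptiveModel("abcd")
for symbol in "aababcabcd":
    # the coder would spend about -log2(p) bits on this symbol
    assert encoder.probability(symbol) == decoder.probability(symbol)
    encoder.update(symbol)
    decoder.update(symbol)
assert encoder.counts == decoder.counts   # the two models never diverge
```

Because both sides apply the same update rule after every symbol, the decoder can always reproduce the probability the encoder used, which is why no model needs to be stored alongside the data.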
than N. So if we know nothing about the properties of the data we are compressing, we might as well not compress it at all. A lossless compression algorithm is useful only when we are more likely to compress certain types of files than others; then the algorithm could be designed to compress those
additionally uses data points from other pairs and multiplication factors to mix them into the difference. These factors must be integers, so that the result is an integer under all circumstances. So the values are increased, increasing file size, but hopefully the distribution of values is more
Mark Nelson, in response to claims of "magic" compression algorithms appearing in comp.compression, constructed a 415,241-byte binary file of highly entropic content, and issued a public challenge of $100 to anyone to write a program that, together with its input, would be smaller than his
These techniques take advantage of the specific characteristics of images such as the common phenomenon of contiguous 2-D areas of similar tones. Every pixel but the first is replaced by the difference to its left neighbor. This leads to small values having a much higher probability than large
On the other hand, it has also been proven that there is no algorithm to determine whether a file is incompressible in the sense of
Kolmogorov complexity. Hence it is possible that any particular file, even if it appears random, may be significantly compressed, even including the size of the
that the algorithm is designed to remove, and thus belong to the subset of files that the algorithm can make shorter, whereas other files would not get compressed or would even get bigger. Algorithms are generally quite specifically tuned to a particular type of file: for example, lossless audio
meaning that they can accept any bitstring) can be used on any type of data, many are unable to achieve significant compression on data that are not of the form for which they were designed to compress. Many of the lossless compression techniques used for text also work reasonably well for
values. This is often also applied to sound files, and can compress files that contain mostly low frequencies and low volumes. For images, this step can be repeated by taking the difference to the top pixel, and then in videos, the difference to the pixel in the next frame can be taken.
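The left-neighbor differencing described above can be sketched as follows (the sample row is made up for illustration):

```python
def delta_encode(samples):
    """Keep the first value; replace every other value by the
    difference to its left neighbor."""
    return [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]

def delta_decode(deltas):
    """Invert exactly by running sums, so the scheme is lossless."""
    out = [deltas[0]]
    for d in deltas[1:]:
        out.append(out[-1] + d)
    return out

row = [100, 101, 101, 102, 100, 99, 99, 100]   # a smooth scanline or waveform
deltas = delta_encode(row)
assert deltas == [100, 1, 0, 1, -2, -1, 0, 1]  # small values dominate
assert delta_decode(deltas) == row
```

The transform does not shrink the data by itself; it reshapes the value distribution so that a subsequent entropy coder can assign short codes to the now-frequent small differences.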
Lossless compression is used in cases where it is important that the original and the decompressed data be identical, or where deviations from the original data would be unfavourable. Common examples are executable programs, text documents, and source code. Some image file formats, like
The adaptive encoding uses the probabilities from the previous sample in sound encoding, from the left and upper pixel in image encoding, and additionally from the previous frame in video encoding. In the wavelet transformation, the probabilities are also passed through the hierarchy.
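One level of the pair sum-and-difference hierarchy mentioned above can be sketched as an integer transform (a minimal illustration; exact inversion works because a pair's sum and difference always share the same parity):

```python
def pair_transform(values):
    """Store per-pair sums and differences; the sums form the
    lower-resolution signal for the next level."""
    sums  = [a + b for a, b in zip(values[0::2], values[1::2])]
    diffs = [a - b for a, b in zip(values[0::2], values[1::2])]
    return sums, diffs

def pair_inverse(sums, diffs):
    out = []
    for s, d in zip(sums, diffs):
        a = (s + d) // 2          # exact: s + d is always even
        out.extend([a, s - a])
    return out

data = [10, 12, 11, 9, 8, 8, 7, 9]
sums, diffs = pair_transform(data)
assert sums == [22, 20, 16, 16] and diffs == [-2, 2, 0, -2]
assert pair_inverse(sums, diffs) == data
sums2, diffs2 = pair_transform(sums)   # continue on the next, coarser level
assert pair_inverse(sums2, diffs2) == sums
```

Repeating the transform on the sums yields the hierarchy of resolutions; the differences at each level tend to be small and are handed to the entropy coder.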
However, many ordinary lossless compression algorithms produce headers, wrappers, tables, or other predictable output that might instead make cryptanalysis easier. Thus, cryptosystems must utilize compression algorithms whose output does not contain these predictable patterns.
In fact, if we consider files of length N, if all files were equally probable, then for any lossless compression that reduces the size of some file, the expected length of a compressed file (averaged over all possible files of length N) must necessarily be
The "trick" that allows lossless compression algorithms, used on the type of data they were designed for, to consistently compress such files to a shorter form is that the files the algorithms are designed to act on all have some form of easily modeled
The Monster of Compression benchmark by Nania Francesco Antonio tested compression on 1 GB of public data with a 40-minute time limit. In December 2009, the top ranked archiver was NanoZip 0.07a and the top ranked single file compressor was
Suppose that there is a compression algorithm that transforms every file into an output file that is no longer than the original file, and that at least one file will be compressed into an output file that is shorter than the original
that is simultaneously the output of the compression function on two different inputs. That file cannot be decompressed reliably (which of the two originals should that yield?), which contradicts the assumption that the algorithm was
of all files that will become usefully shorter. This is the theoretical reason why we need to have different compression algorithms for different kinds of files: there cannot be any algorithm that is good for all kinds of data.
Genomic sequence compression algorithms, also known as DNA sequence compressors, explore the fact that DNA sequences have characteristic properties, such as inverted repeats. The most successful compressors are XM and GeCo. For
For this reason, many different algorithms exist that are designed either with a specific type of input data in mind or with specific assumptions about what kinds of redundancy the uncompressed data are likely to contain.
It is provably impossible to create an algorithm that can losslessly compress any data. While there have been many claims through the years of companies achieving "perfect compression" where an arbitrary number
on sequences (normally of octets). Compression is successful if the resulting sequence is shorter than the original sequence (and the instructions for the decompression map). For a compression algorithm to be
Self-extracting executables contain a compressed application and a decompressor. When executed, the decompressor transparently decompresses and runs the original application. This is used especially often in
for the input data, and the second step uses this model to map input data to bit sequences in such a way that "probable" (i.e. frequently encountered) data will produce shorter output than "improbable" data.
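A minimal sketch of this two-step pipeline, with symbol frequencies as the model and Huffman codes as the mapping (the construction below grows the code bit strings directly instead of building an explicit tree; it is illustrative, not any particular tool's implementation):

```python
import heapq
from collections import Counter

def huffman_codes(data):
    # Step 1: the model -- how often does each symbol occur?
    freq = Counter(data)
    if len(freq) == 1:
        return {next(iter(freq)): "0"}
    # Step 2: repeatedly merge the two lightest subtrees, prefixing
    # '0'/'1', so frequent symbols end up with short codes.
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, i, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, i, merged))
    return heap[0][2]

codes = huffman_codes("aaaabbc")
assert len(codes["a"]) <= len(codes["b"]) <= len(codes["c"])
encoded = "".join(codes[s] for s in "aaaabbc")
assert len(encoded) < 2 * 7   # beats the 2-bits-per-symbol fixed code
```

The "probable data gets shorter output" property is visible directly: the most frequent symbol receives the shortest prefix code.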
dating back to 1987 is no longer widely used due to its small size. Matt Mahoney maintained the Calgary Compression Challenge, created and maintained from May 21, 1996, through May 21, 2016, by Leonid A.
As mentioned previously, lossless sound compression is a somewhat specialized area. Lossless sound compression algorithms can take advantage of the repeating patterns shown by the wave-like nature of the
Different algorithms exist that are designed either with a specific type of input data in mind or with specific assumptions about what kinds of redundancy the uncompressed data are likely to contain.
models to predict the "next" value and encoding the (hopefully small) difference between the expected value and the actual data. If the difference between the predicted and the actual data (called the
A hierarchical version of this technique takes neighboring pairs of data points, stores their difference and sum, and on a higher level with lower resolution continues with the sums. This is called
speed, decompression speed and compression ratio of the different compression methods and to examine how the compression level, buffer size and flushing operations affect the results.
Lossless compression methods may be categorized according to the type of data they are designed to compress. While, in principle, any general-purpose lossless compression algorithm (
compression, and in particular licensing practices by patent holder Unisys that many developers considered abusive, some open source proponents encouraged people to avoid using the
data and achieves over 20-fold compression (95% reduction in file size), providing 2- to 4-fold better compression, and doing so much faster than, leading general-purpose compression utilities.
) tends to be small, then certain difference values (like 0, +1, −1 etc. on sample values) become very frequent, which can be exploited by encoding them in few output bits.
Thus, the main lesson from the argument is not that one risks big losses, but merely that one cannot always win. To choose an algorithm always means implicitly to select a
at least one file smaller, there will be at least one file that it makes larger. This is easily proven with elementary mathematics using a counting argument called the
, which appear random but can be generated by a very small program. However, even though it cannot be determined whether a particular file is incompressible, a
Many of these methods are implemented in open-source and proprietary tools, particularly LZW and its variants. Some algorithms are patented in the
files are typically used on portable players and in other cases where storage space is limited or exact replication of the audio is unnecessary.
provided binary data yet be able to reconstitute it without error. A similar challenge, with $5,000 as reward, was issued by Mike Goldman.
from "plain" to "compressed" bit sequences. The pigeonhole principle prohibits a bijection between the collection of sequences of length
, whereas Huffman compression is simpler and faster but produces poor results for models that deal with symbol probabilities close to 1.
, no lossless compression algorithm can shrink the size of all possible data: some data will get longer by at least one symbol or bit.
shows that over 99% of files of any given length cannot be compressed by more than one byte (including the size of the decompressor).
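The counting behind this claim is short enough to check directly (a simplified version that ignores the decompressor's size): an n-bit file compressed by more than one byte must map to a bit string at least 9 bits shorter, and there are too few of those to go around.

```python
n = 1000                          # any file length, in bits
candidates = 2 ** (n - 8) - 1     # all bit strings of length 0 .. n-9
fraction = candidates / 2 ** n    # at most this share of n-bit files can shrink so much
assert fraction < 0.01            # i.e. over 99% cannot
```

The fraction is about 2^-8 ≈ 0.4% regardless of n, which is why the "over 99%" figure holds for files of any given length.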
XM is slightly better in compression ratio, though for sequences larger than 100 MB its computational requirements are impractical.
Arithmetic coding achieves compression rates close to the best possible for a particular statistical model, which is given by the
We must therefore conclude that our original hypothesis (that the compression function makes no file longer) is necessarily untrue.
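The count that forces this conclusion is easy to verify: the 2^N files of length N would each need a distinct shorter output, but there are only 2^N − 1 bit strings of length less than N.

```python
for n in range(1, 64):
    shorter = sum(2 ** k for k in range(n))  # strings of length 0 .. n-1
    assert shorter == 2 ** n - 1             # geometric-series closed form
    assert shorter < 2 ** n                  # pigeonhole: one short of enough
```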
data cannot be consistently compressed by any conceivable lossless data compression algorithm; indeed, this result is used to
data format specifies the 'compression method' of 'Stored' for input files that have been copied into the archive verbatim.
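This escape can be observed with Python's standard zipfile module (the member names and payload sizes here are arbitrary):

```python
import io
import os
import zipfile

buf = io.BytesIO()
noise = os.urandom(1024)     # incompressible: store it verbatim
text = b"hello " * 1024      # redundant: deflate it
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("noise.bin", noise, compress_type=zipfile.ZIP_STORED)
    zf.writestr("text.txt", text, compress_type=zipfile.ZIP_DEFLATED)
with zipfile.ZipFile(buf) as zf:
    info = {i.filename: i for i in zf.infolist()}
    assert info["noise.bin"].compress_size == len(noise)  # copied verbatim
    assert info["text.txt"].compress_size < len(text)     # deflate helped
    assert zf.read("noise.bin") == noise                  # still lossless
```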
N−1. Therefore, it is not possible to produce a lossless algorithm that reduces the size of every possible input sequence.
and other countries and their legal usage requires licensing by the patent holder. Because of patents on certain kinds of
This type of compression is not strictly limited to binary executables, but can also be applied to scripts, such as
Compression algorithms are usually effective for human- and machine-readable documents and cannot shrink the size of
most encoding algorithms use at least one full byte (and typically more than one) for this purpose. For example,
with a selection of domain-specific prediction filters. However, the patents on LZW expired on June 20, 2003.
The Generic Compression Benchmark, maintained by Matt Mahoney, tests compression of data generated by random
The Compression Ratings website published a chart summary of the "frontier" in compression ratio and time.
that allows the original data to be perfectly reconstructed from the compressed data with no loss of
(RLE) – Simple scheme that provides good compression of data containing many runs of the same value
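A minimal byte-oriented sketch of the idea (the (count, value) pair layout and the 255-run cap are illustrative conventions, not any specific format):

```python
def rle_encode(data: bytes) -> bytes:
    """Emit (count, value) byte pairs, capping runs at 255 so the
    count fits in one byte."""
    runs = []
    for b in data:
        if runs and runs[-1][1] == b and runs[-1][0] < 255:
            runs[-1][0] += 1
        else:
            runs.append([1, b])
    return bytes(x for run in runs for x in run)

def rle_decode(encoded: bytes) -> bytes:
    out = bytearray()
    for count, value in zip(encoded[0::2], encoded[1::2]):
        out.extend([value] * count)
    return bytes(out)

bitmap = b"\x00" * 90 + b"\xff" * 10      # long runs, as in simple bitmaps
packed = rle_encode(bitmap)
assert packed == bytes([90, 0, 10, 255])  # 100 bytes down to 4
assert rle_decode(packed) == bitmap
```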
It is sometimes beneficial to compress only the differences between two versions of a file (or, in
different due to the speed requirement. In January 2010, the top program was NanoZip followed by
It is also often used as a component within lossy data compression technologies (e.g. lossless
Lossless compression algorithms and their implementations are routinely tested in head-to-head
But 2^N is smaller than 2^N + 1, so by the pigeonhole principle there must be some file of length
There are a number of better-known compression benchmarks. Some benchmarks cover only the
encryption for added security. When properly implemented, compression greatly increases the
(LZ77 and LZ78) – Dictionary-based algorithm that forms the basis for many other algorithms
Most lossless compression programs do two things in sequence: the first step generates a
Many of the lossless compression techniques used for text also work reasonably well for
Lossless data compression is used in many applications. For example, it is used in the
– (includes lossless compression method via Le Gall–Tabatabai 5/3 reversible integer
compressed files never need to grow by more than 5 bytes per 65,535 bytes of input.
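The bounded worst-case growth is easy to observe with Python's zlib (zlib's container adds a few header and checksum bytes of its own, and its implementation may emit stored blocks smaller than the format's 65,535-byte maximum, so the bound asserted below is deliberately loose):

```python
import os
import zlib

noise = os.urandom(200_000)              # high-entropy, incompressible input
packed = zlib.compress(noise, 9)
growth = len(packed) - len(noise)
assert growth < 200                      # small, bounded per-block overhead
assert zlib.decompress(packed) == noise  # and still perfectly lossless
```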
coding, where competitions are held for demos with strict size limits, as small as
keeps its size during compression. There are 2^N such files possible. Together with
Assume that each file is represented as a string of bits of some arbitrary length.
formats are most often used for archiving or production purposes, while smaller
Data compression approach allowing perfect reconstruction of the original data
No lossless compression algorithm can efficiently compress all possible data
, this makes 2^N + 1 files that all compress into one of the 2^N files of length
– High Efficiency Image File Format (lossless or lossy compression, using
Some of the most common lossless compression algorithms are listed below.
Lossless compression is possible because most real-world data exhibits
reversible transform for making textual data more compressible, used by
compression programs do not work well on text files, and vice versa.
decompressor. An example is the digits of the mathematical constant
There are two primary ways of constructing statistical models: in a
The primary encoding algorithms used to produce bit sequences are
permits reconstruction only of an approximation of the original
– Portable Document Format (lossless or lossy compression)
, of successive images within a sequence). This is called
performs poorly on files that contain heterogeneous data.
– Combines LZ77 compression with Huffman coding, used by
and any subset of the collection of sequences of length
– (lossless or lossy compression of RGB and RGBA images)
– Tag Image File Format (lossless or lossy compression)
be the length (in bits) of the compressed version of
The Large Text Compression Benchmark and the similar
– Entropy encoding, pairs well with other algorithms
(GIF) for compressing still image files in favor of
, in his February 2010 edition of the free booklet
– (lossless or lossy compression of B&W images)
, use only lossless compression, while others like
Points of application in real compression theory
bits that compresses to something shorter. Let
– (lossless/near-lossless compression standard)
be the least number such that there is a file
(LZMA) – Very high compression ratio, used by
– Lossless compression of 3D triangle meshes
simple theorem about incompressible strings
of random bits can always be compressed to
by removing patterns that might facilitate
may use either lossless or lossy methods.
encoders and other lossy audio encoders).
input. Sometimes, detection is made by
, includes a lossless compression method
List of lossless compression algorithms
, though usually with greatly improved
often compress data (the "plaintext")
(and therefore reduced media sizes).
Lossless Transform Audio Compression
, additionally lists the following:
, the compression map must form an
(ALAC – Apple Lossless Audio Codec)
The Million Random Digit Challenge
Adaptive Transform Acoustic Coding
(PPM) – Optimized for compressing
Universal code (data compression)
– (lossless or lossy compression)
Lempel–Ziv–Markov chain algorithm
Genetics compression algorithms
– (lossless RLE compression of
Prediction by partial matching
the concept of randomness in
list of lossless video codecs
in tandem with Huffman coding
Entropy (information theory)
Comparison of file archivers
– Free Lossless Image Format
– Entropy encoding, used by
– Portable Network Graphics
Lempel–Ziv–Storer–Szymanski
Graphics Interchange Format
Data Compression Explained
(also known as MPEG-4 ALS)
(PNG), which combines the
discrete wavelet transform
(not to be confused with
Discrete Cosine Transform
TTA (True Audio Lossless)
Meridian Lossless Packing
Free Lossless Audio Codec
Burrows–Wheeler transform
Portable Network Graphics
In particular, files of
(Windows Media Lossless)
Mathematical background
– Quite OK Image Format
– AV1 Image File Format
(from the Greek letter
Historical legal issues
file format and in the
types of data better.
compression software.
data compression ratio
Executable compression
Original Sound Quality
(also known as HD-AAC)
Direct Stream Transfer
Lempel-Ziv compression
statistical redundancy
Kolmogorov complexity
compression algorithm
Kolmogorov complexity
Genetics and genomics
Audio Lossless Coding
mid/side joint stereo
data that contain no
pigeonhole principle
(RealAudio Lossless)
(Monkey's Audio APE)
pigeonhole principle
By operation of the
Lossless compression
can be viewed as a
both use a trimmed
DTS-HD Master Audio
Run-length encoding
information entropy
Information theory
Grammar-based code
genetic algorithms
– PiCture eXchange
(WavPack lossless)
images and Unix's
– Entropy encoding
§ Limitations
essentially using
(also used by the
Lossy compression
wavelet transform
(LZSS) – Used by
Arithmetic coding
for more on this)
video compression
deflate algorithm
arithmetic coding
deflate algorithm
statistical model
preprocessing by
compression rates
lossy compression
David A. Huffman
Data compression
unicity distance
– Truevision TGA
(LZW) – Used by
Lempel–Ziv–Welch
. By contrast,
data compression
923:file of length
870:
826:Turing machines
813:Knowledge (XXG)
771:
750:
744:
717:
694:
683:
670:
535:
533:Raster graphics
432:
403:
311:
309:General purpose
292:
282:
249:
245:
243:
205:
180:
167:general-purpose
123:
17:
12:
11:
5:
3352:
3350:
3342:
3341:
3336:
3326:
3325:
3319:
3318:
3316:
3315:
3300:
3289:
3286:
3285:
3283:
3282:
3276:
3274:
3270:
3269:
3267:
3266:
3260:
3258:
3254:
3253:
3251:
3250:
3245:
3240:
3235:
3230:
3225:
3220:
3215:
3214:
3213:
3203:
3198:
3197:
3196:
3191:
3180:
3178:
3172:
3171:
3168:
3167:
3165:
3164:
3163:
3162:
3157:
3147:
3146:
3145:
3140:
3135:
3127:
3122:
3117:
3112:
3106:
3104:
3097:
3096:
3094:
3093:
3088:
3083:
3078:
3073:
3068:
3063:
3058:
3057:
3056:
3051:
3046:
3035:
3033:
3026:
3020:
3019:
3016:
3015:
3013:
3012:
3011:
3010:
3005:
3000:
2995:
2985:
2980:
2975:
2970:
2965:
2960:
2955:
2949:
2947:
2943:
2942:
2940:
2939:
2934:
2929:
2924:
Techniques

Most lossless compression programs do two things in sequence: the first step generates a statistical model for the input data, and the second step uses this model to map input data to bit sequences in such a way that "probable" (i.e. frequently encountered) data will produce shorter output than "improbable" data.

The primary encoding algorithms used to produce bit sequences are Huffman coding (also used by the deflate algorithm) and arithmetic coding. Arithmetic coding achieves compression rates close to the best possible for a particular statistical model, which is given by the information entropy, whereas Huffman compression is simpler and faster but produces poor results for models that deal with symbol probabilities close to 1.

There are two primary ways of constructing statistical models: in a static model, the data is analyzed and a model is constructed, then this model is stored with the compressed data. This approach is simple and modular, but has the disadvantage that the model itself can be expensive to store, and also that it forces using a single model for all data being compressed, and so performs poorly on files that contain heterogeneous data. Adaptive models dynamically update the model as the data is compressed. Both the encoder and decoder begin with a trivial model, yielding poor compression of initial data, but as they learn more about the data, performance improves. Most popular types of compression used in practice now use adaptive coders.

Lossless compression methods may be categorized according to the type of data they are designed to compress. While in principle any general-purpose lossless compression algorithm (general-purpose meaning that it can accept any bitstring) can be used on any type of data, many are unable to achieve significant compression on data that are not of the form for which they were designed. Many of the lossless compression techniques used for text also work reasonably well for indexed images.
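As a toy illustration of the model-then-code pipeline described above, the sketch below builds a static Huffman code from symbol frequencies and uses it to encode and decode a string. This is a minimal from-scratch sketch for illustration only; the function names are invented for this example and do not correspond to any particular library's API.

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Step 1: build a static model (symbol frequencies) and derive a prefix code."""
    freq = Counter(text)
    if len(freq) == 1:                       # degenerate one-symbol input
        return {next(iter(freq)): "0"}
    heap = [(w, i, sym) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:                     # repeatedly merge the two lightest subtrees
        w1, _, left = heapq.heappop(heap)
        w2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, count, (left, right)))
        count += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):          # internal node: recurse into children
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                                # leaf: assign the accumulated codeword
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

def encode(text, codes):
    """Step 2: map symbols to bit sequences using the model."""
    return "".join(codes[c] for c in text)

def decode(bits, codes):
    rev = {v: k for k, v in codes.items()}
    out, cur = [], ""
    for b in bits:
        cur += b
        if cur in rev:                       # prefix property makes this unambiguous
            out.append(rev[cur])
            cur = ""
    return "".join(out)
```

Frequent symbols receive short codewords, so a string such as "abracadabra" (where "a" dominates) encodes in well under the 8 bits per character of its ASCII form, and decoding reproduces the input exactly.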
Multimedia

These techniques take advantage of the specific characteristics of images, such as the common phenomenon of contiguous 2-D areas of similar tones. Every pixel but the first is replaced by the difference to its left neighbor. This leads to small values having a much higher probability than large values. This is often also applied to sound files, and can compress files that contain mostly low frequencies and low volumes. For images, this step can be repeated by taking the difference to the top pixel, and in videos, the difference to the pixel in the next frame can be taken.

A hierarchical version of this technique takes neighboring pairs of data points, stores their difference and sum, and on a higher level with lower resolution continues with the sums. This is called a discrete wavelet transform. JPEG2000 additionally uses data points from other pairs and multiplication factors to mix them into the difference. These factors must be integers, so that the result is an integer under all circumstances. So the values are increased, increasing file size, but the distribution of values could be more peaked.

The adaptive encoding uses the probabilities from the previous sample in sound encoding, and from the left and upper pixels in image encoding. In the wavelet transformation, the probabilities are also passed through the hierarchy.

Historical legal issues

Some algorithms are patented in the United States and other countries, and their legal usage requires licensing by the patent holder. Because of patents on certain kinds of LZW compression, and in particular licensing practices by patent holder Unisys that many developers considered abusive, some open source proponents encouraged people to avoid using the Graphics Interchange Format (GIF) for compressing still image files in favor of Portable Network Graphics (PNG), which combines the LZ77-based deflate algorithm with a selection of domain-specific prediction filters. The patents on LZW have since expired.
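The neighbor-difference step described above can be sketched as plain delta encoding (a toy illustration, not the exact predictor set used by PNG or JPEG 2000):

```python
def delta_encode(samples):
    # Keep the first value; replace each later value by its difference
    # from the previous one. Smooth signals yield mostly small residuals.
    return [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]

def delta_decode(deltas):
    out = [deltas[0]]
    for d in deltas[1:]:
        out.append(out[-1] + d)      # cumulative sum inverts the differences
    return out

wave = [100, 102, 103, 103, 101, 98, 97, 97]   # a smooth "signal"
residuals = delta_encode(wave)                  # [100, 2, 1, 0, -2, -3, -1, 0]
assert delta_decode(residuals) == wave          # perfectly reversible
```

The residuals cluster near zero, which is exactly the kind of peaked distribution an entropy coder can exploit.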
Methods

General purpose

- ANS – entropy encoding, used by LZFSE and Zstandard
- Arithmetic coding – entropy encoding
- Burrows–Wheeler transform – reversible transform for making textual data more compressible, used by bzip2
- Huffman coding – entropy encoding, pairs well with other algorithms
- Lempel–Ziv compression (LZ77 and LZ78) – dictionary-based algorithm that forms the basis for many other algorithms
- Deflate – combines LZSS compression with Huffman coding, used by ZIP, gzip, and PNG images
- Lempel–Ziv–Markov chain algorithm (LZMA) – very high compression ratio, used by 7zip and xz
- Lempel–Ziv–Storer–Szymanski (LZSS) – used by WinRAR in tandem with Huffman coding
- Lempel–Ziv–Welch (LZW) – used by GIF images and Unix's compress utility
- Prediction by partial matching (PPM) – optimized for compressing plain text
- Run-length encoding (RLE) – simple scheme that provides good compression of data containing many runs of the same value

Audio

- Adaptive Transform Acoustic Coding (ATRAC)
- Apple Lossless (ALAC – Apple Lossless Audio Codec)
- Audio Lossless Coding (also known as MPEG-4 ALS)
- Direct Stream Transfer (DST)
- Dolby TrueHD
- DTS-HD Master Audio
- Free Lossless Audio Codec (FLAC)
- Meridian Lossless Packing (MLP)
- Monkey's Audio (APE)
- MPEG-4 SLS
- OptimFROG
- Original Sound Quality (OSQ)
- RealPlayer (RealAudio Lossless)
- Shorten (SHN)
- TTA (True Audio)
- WavPack
- WMA Lossless (Windows Media Lossless)

Raster graphics

- AVIF – AOMedia Video 1 Image File Format
- FLIF – Free Lossless Image Format
- HEIF – High Efficiency Image File Format (lossless or lossy compression, using HEVC)
- ILBM – lossless RLE compression of Amiga IFF images
- JBIG2 – lossless or lossy compression of black-and-white images
- JPEG 2000 – includes a lossless compression method via a reversible integer wavelet transform
- JPEG-LS – lossless/near-lossless compression standard
- JPEG XL – lossless or lossy compression
- JPEG XR – formerly WMPhoto and HD Photo, includes a lossless compression method
- LDCT – Lossless discrete cosine transform
- PCX – PiCture eXchange
- PDF – Portable Document Format (lossless or lossy compression)
- PNG – Portable Network Graphics
- QOI – Quite OK Image Format
- TGA – Truevision TGA
- TIFF – Tag Image File Format (lossless or lossy compression)
- WebP – lossless or lossy compression of RGB and RGBA images

3D graphics

- OpenCTM – lossless compression of 3D triangle meshes

Video

See the list of codecs for lossless video compression formats.
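Deflate, the combination of LZ-style matching and Huffman coding listed above, is exposed by Python's standard zlib module, so losslessness is easy to verify with a round trip:

```python
import zlib

data = b"to be or not to be " * 100          # repetitive input compresses well
packed = zlib.compress(data, level=9)        # Deflate stream with zlib framing
assert zlib.decompress(packed) == data       # bit-exact reconstruction
assert len(packed) < len(data)               # redundancy has been removed
```

The same round-trip property holds for any input; only the achieved ratio varies with the input's redundancy.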
Cryptography

Cryptosystems often compress data (the "plaintext") before encryption for added security. When properly implemented, compression greatly increases the unicity distance by removing patterns that might facilitate cryptanalysis. However, many ordinary lossless compression algorithms produce headers, wrappers, tables, or other predictable output that might instead make cryptanalysis easier. Thus, cryptosystems must utilize compression algorithms whose output does not contain these predictable patterns.

Genetics and genomics

Genetic compression algorithms (not to be confused with genetic algorithms) are the latest generation of lossless algorithms that compress data (typically sequences of nucleotides) using both conventional compression algorithms and specific algorithms adapted to genetic data. In 2012, a team of scientists from Johns Hopkins University published the first genetic compression algorithm that does not rely on external genetic databases for compression. HAPZIPPER was tailored for HapMap data and achieves over 20-fold compression (95% reduction in file size), providing 2- to 4-fold better compression much faster than leading general-purpose compression utilities. Genomic sequence compressors exploit characteristic properties of DNA sequences, such as inverted repeats; for large eukaryotic sequences, however, the computational requirements of the best-performing compressors can become impractical.

Executables

Self-extracting executables contain a compressed application and a decompressor. When executed, the decompressor transparently decompresses and runs the original application. This is especially often used in demo coding, where competitions are held for demos with strict size limits, as small as 1k. This type of compression is not strictly limited to binary executables, but can also be applied to scripts, such as JavaScript.

Benchmarks

Lossless compression algorithms and their implementations are routinely tested in head-to-head benchmarks. There are a number of better-known compression benchmarks. Some benchmarks cover only the data compression ratio, so winners in these benchmarks may be unsuitable for everyday use due to the slow speed of the top performers. Another drawback of some benchmarks is that their data files are known, so some program writers may optimize their programs for best performance on a particular data set. The winners on these benchmarks often come from the class of context-mixing compression software.

Well-known benchmarks include the following:

- The Calgary Corpus, dating back to 1987, is no longer widely used due to its small size. The associated Calgary Compression Challenge was created and maintained by Leonid A. Broukhis.
- The Large Text Compression Benchmark, maintained by Matt Mahoney, and the similar Hutter Prize both use a trimmed Wikipedia XML UTF-8 data set.
- The Generic Compression Benchmark, also maintained by Mahoney, tests compression of data generated by random Turing machines.
- Compression Ratings, maintained by Sami Runsas, was a benchmark similar to Maximum Compression's multiple-file test but with minimum speed requirements; the top programs were fairly different due to the speed requirement. In January 2010 the top program was NanoZip, followed by FreeArc, CCM, flashzip, and 7-Zip.
- The Monster of Compression benchmark tested compression on public data with a time limit; top-ranked programs included NanoZip as an archiver and ccmx 1.30c as a single-file compressor.
Limitations

Lossless data compression algorithms cannot guarantee compression for all input data sets. In other words, for any lossless data compression algorithm, there will be an input data set that does not get smaller when processed by the algorithm, and for any lossless data compression algorithm that makes at least one file smaller, there will be at least one file that it makes larger. This is easily proven with elementary mathematics using a counting argument called the pigeonhole principle, as follows:

- Assume that each file is represented as a string of bits of some arbitrary length.
- Suppose that there is a compression algorithm that transforms every file into an output file that is no longer than the original file, and that at least one file will be compressed into an output file that is shorter than the original file.
- Let M be the least number such that there is a file F with length M bits that compresses to something shorter. Let N be the length (in bits) of the compressed version of F.
- Because N < M, every file of length N keeps its size during compression. There are 2^N such files possible. Together with F, this makes 2^N + 1 files that all compress into one of the 2^N files of length N.
- But 2^N is smaller than 2^N + 1, so by the pigeonhole principle there must be some file of length N that is simultaneously the output of the compression function on two different inputs. That file cannot be decompressed reliably (which of the two originals should that yield?), which contradicts the assumption that the algorithm was lossless.
- We must therefore conclude that our original hypothesis (that the compression function makes no file longer) is necessarily untrue.

Any lossless compression algorithm that makes some files shorter must necessarily make some files longer, but it is not necessary that those files become very much longer. Most practical compression algorithms provide an "escape" facility that can turn off the normal coding for files that would become longer by being encoded directly. In theory, only a single additional bit is required to tell the decoder that the normal coding has been turned off for the entire input; however, most encoding algorithms use at least one full byte (and typically more than one) for this purpose. For example, deflate compressed files never need to grow by more than 5 bytes per 65,535 bytes of input.

In fact, if we consider files of length N, if all files were equally probable, then for any lossless compression that reduces the size of some file, the expected length of a compressed file (averaged over all possible files of length N) must necessarily be greater than N. So if we know nothing about the properties of the data we are compressing, we might as well not compress it at all. A lossless compression algorithm is useful only when we are more likely to compress certain types of files than others; then the algorithm could be designed to compress those types of data better.

Thus, the main lesson from the argument is not that one risks big losses, but merely that one cannot always win. To choose an algorithm always means implicitly to select a subset of all files that will become usefully shorter. This is the theoretical reason why different compression algorithms are needed for different kinds of files: there cannot be any algorithm that is good for all kinds of data.

The "trick" that allows lossless compression algorithms, used on the type of data they were designed for, to consistently compress such files to a shorter form is that the files the algorithms are designed to act on all have some form of easily modeled redundancy that the algorithm is designed to remove. Such files belong to the subset that the algorithm can make shorter, whereas other files would not get compressed or would even get bigger. Algorithms are generally quite specifically tuned to a particular type of file: for example, lossless audio compression programs do not work well on text files, and vice versa. In particular, files of random data cannot be consistently compressed by any conceivable lossless data compression algorithm; indeed, this result is used to define the concept of randomness in Kolmogorov complexity.

It is provably impossible to create an algorithm that can losslessly compress any data. While there have been many claims through the years of companies achieving "perfect compression" where an arbitrary number N of random bits can always be compressed to N − 1 bits, these kinds of claims can be safely discarded without even looking at any further details regarding the purported compression scheme. Such an algorithm contradicts fundamental laws of mathematics because, if it existed, it could be applied repeatedly to losslessly reduce any file to length 1.

Mathematical background

Abstractly, a compression algorithm can be viewed as a function on sequences (normally of octets). Compression is successful if the resulting sequence is shorter than the original sequence (plus the instructions for the decompression map). For a compression algorithm to be lossless, the compression map must form an injection from "plain" to "compressed" bit sequences. The pigeonhole principle prohibits a bijection between the collection of sequences of length N and any subset of the collection of sequences of length N − 1. Therefore, it is not possible to produce a lossless algorithm that reduces the size of every possible input sequence.

Points of application in real compression theory

Real compression algorithm designers accept that streams of high information entropy cannot be compressed, and accordingly include facilities for detecting and handling this condition. An obvious way of detection is applying a raw compression algorithm and testing if its output is smaller than its input. Sometimes, detection is made by heuristics; for example, a compression application may consider files whose names end in ".zip", ".arj" or ".lha" uncompressible without any more sophisticated detection. A common way of handling this situation is quoting input, or uncompressible parts of the input in the output, minimizing the compression overhead. For example, the zip data format specifies the "compression method" of "Stored" for input files that have been copied into the archive verbatim.

The Million Random Digit Challenge

Mark Nelson, in response to claims of "magic" compression algorithms appearing in comp.compression, constructed a 415,241-byte binary file of highly entropic content, and issued a public challenge of $100 to anyone to write a program that, together with its input, would be smaller than his provided binary data yet be able to reconstitute it without error. A similar challenge, with $5,000 as reward, was issued by Mike Goldman.

See also

- Comparison of file archivers
- Data compression
- David A. Huffman
- Entropy (information theory)
- Hutter Prize
- Information theory
- Kolmogorov complexity
- List of codecs
- Lossless Transform Audio Compression (LTAC)
- Lossy compression
- Normal number
- Universal code (data compression)
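The counting argument above can also be checked exhaustively: there are 2^N bit strings of length N but only 2^N − 1 strings of any shorter length (including the empty string), so no injective — that is, lossless — encoder can map every N-bit file to a shorter one. A small sketch:

```python
def strings_of_length(n):
    # Number of distinct bit strings of exactly n bits.
    return 2 ** n

def strings_shorter_than(n):
    # Number of distinct bit strings of length 0..n-1,
    # including the empty string: 2**0 + 2**1 + ... + 2**(n-1) = 2**n - 1.
    return sum(2 ** k for k in range(n))

for n in range(1, 16):
    # An encoder that shortened every n-bit file would need an injection
    # from 2**n inputs into only 2**n - 1 shorter outputs: impossible.
    assert strings_of_length(n) > strings_shorter_than(n)
```

The gap is always exactly one string, which is why "compress everything by at least one bit" fails even in the most generous accounting.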