Slepian–Wolf coding

In information theory and communication, Slepian–Wolf coding, also known as the Slepian–Wolf bound, is a result in distributed source coding discovered by David Slepian and Jack Wolf in 1973. It is a method for the theoretically lossless compression of two correlated sources.

Problem setup

Distributed coding is the coding of two (in this case) or more dependent sources with separate encoders and a joint decoder. Given two statistically dependent, independent and identically distributed, finite-alphabet random sequences X^n and Y^n, the Slepian–Wolf theorem gives a theoretical bound on the lossless coding rates for distributed coding of the two sources.

Theorem

The bounds on the lossless coding rates are as follows:

R_X ≥ H(X|Y),
R_Y ≥ H(Y|X),
R_X + R_Y ≥ H(X,Y).

If both the encoder and the decoder of the two sources are independent, the lowest rates achievable for lossless compression are H(X) and H(Y) for X and Y respectively, where H(X) and H(Y) are the entropies of X and Y. However, with joint decoding, if a vanishing error probability for long sequences is accepted, the Slepian–Wolf theorem shows that a much better compression rate can be achieved: as long as the total rate of X and Y is larger than their joint entropy H(X,Y) and neither source is encoded at a rate smaller than its conditional entropy given the other, distributed coding achieves an arbitrarily small error probability for long sequences.
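
The gap between the two regimes can be made concrete with a small numerical example. The sketch below is an illustration added here rather than part of the original statement: the binary-symmetric correlation model, the 0.11 crossover probability and the variable names are all assumptions chosen for convenience.

# Evaluates the separate-coding cost and the Slepian–Wolf bounds for an
# assumed pair of binary sources: X ~ Bernoulli(1/2) and Y = X XOR N with
# N ~ Bernoulli(0.11), i.e. Y is X observed through a binary symmetric channel.
import itertools
import math

def entropy(dist):
    # Shannon entropy in bits of a distribution given as {outcome: probability}.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

crossover = 0.11  # assumed correlation strength, chosen so that H(X|Y) is about 0.5
joint = {}
for x, n in itertools.product((0, 1), repeat=2):
    joint[(x, x ^ n)] = 0.5 * (crossover if n else 1 - crossover)

p_x = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
p_y = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}

H_xy = entropy(joint)
H_x, H_y = entropy(p_x), entropy(p_y)

print(f"Separate coding needs H(X) + H(Y) = {H_x + H_y:.3f} bits per symbol pair")
print(f"Slepian-Wolf region: R_X >= {H_xy - H_y:.3f}, R_Y >= {H_xy - H_x:.3f}, "
      f"R_X + R_Y >= {H_xy:.3f}")

With this correlation, separate encoding and decoding needs about 2 bits per symbol pair, while the Slepian–Wolf region admits a total rate of roughly 1.5 bits, even though the two encoders never observe each other's input.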

A special case of distributed coding is compression with decoder side information, where source Y is available at the decoder side but not accessible at the encoder side. This can be treated as the condition that a rate R_Y = H(Y) has already been used to encode Y, while we intend to use H(X|Y) to encode X. In other words, two isolated sources can compress data as efficiently as if they were communicating with each other. The whole system operates in an asymmetric way, since the compression rates for the two sources are asymmetric.
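
Practical schemes usually realize this special case with syndrome-based binning. The sketch below is an illustration of that idea rather than a construction from the article: it assumes a toy correlation in which the source word x and the side information y are 7-bit strings that differ in at most one position, so the 3-bit syndrome of a (7,4) Hamming code is enough for the decoder to recover x.

# Syndrome-based compression with decoder side information (toy assumption:
# x and y differ in at most one bit). The encoder sends 3 bits instead of 7.
import numpy as np

# Parity-check matrix of the (7,4) Hamming code; column i is the binary
# representation of i + 1, so a single differing bit is named by its syndrome.
H = np.array([[(i + 1) >> b & 1 for i in range(7)] for b in (2, 1, 0)])

def encode(x):
    # Encoder: sees only x and transmits its 3-bit syndrome.
    return H @ x % 2

def decode(s, y):
    # Decoder: combines the received syndrome with the side information y.
    t = (H @ y + s) % 2                      # equals H @ (x XOR y)
    if not t.any():
        return y.copy()                      # x and y agree
    pos = int("".join(map(str, t)), 2) - 1   # syndrome names the differing bit
    e = np.zeros(7, dtype=int)
    e[pos] = 1
    return (y + e) % 2                       # x = y XOR e

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 7)
y = (x + np.eye(7, dtype=int)[rng.integers(0, 7)]) % 2   # differ in one position
assert np.array_equal(decode(encode(x), y), x)

The encoder never sees y, yet it conveys seven source bits with three transmitted bits; richer correlation models and longer codes push the rate toward the H(X|Y) corner of the Slepian–Wolf region, which is the idea behind schemes such as DISCUS.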

This bound has been extended to the case of more than two correlated sources by Thomas M. Cover in 1975, and similar results were obtained in 1976 by Aaron D. Wyner and Jacob Ziv with regard to lossy coding of jointly Gaussian sources.

See also

Data compression
Data synchronization
Synchronization (computer science)
DISCUS
Timeline of information theory

References

Slepian & Wolf 1973, pp. 471–480.
Cover 1975, pp. 226–228.
Wyner & Ziv 1976, pp. 1–10.

Sources

Cover, Thomas M. (March 1975). "A proof of the data compression theorem of Slepian and Wolf for ergodic sources". IEEE Transactions on Information Theory. 21 (2): 226–228. doi:10.1109/TIT.1975.1055356. ISSN 0018-9448.
Slepian, David S.; Wolf, Jack K. (July 1973). "Noiseless coding of correlated information sources". IEEE Transactions on Information Theory. 19 (4): 471–480. doi:10.1109/TIT.1973.1055037. ISSN 0018-9448.
Wyner, Aaron D.; Ziv, Jacob (January 1976). "The rate-distortion function for source coding with side information at the decoder". IEEE Transactions on Information Theory. 22 (1): 1–10. CiteSeerX 10.1.1.137.494. doi:10.1109/TIT.1976.1055508. ISSN 0018-9448.

External links

Wyner-Ziv Coding of Video – an algorithm for video compression that performs close to the Slepian–Wolf bound (with links to source code).
