Mid-range

In statistics, the mid-range or mid-extreme is a measure of central tendency of a sample, defined as the arithmetic mean of the maximum and minimum values of the data set:

    M = (max x + min x) / 2.

The mid-range is closely related to the range, a measure of statistical dispersion defined as the difference between the maximum and minimum values. The two measures are complementary in the sense that if one knows the mid-range and the range, one can recover the sample maximum and minimum values.
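To make the definition and this complementary relationship concrete, here is a minimal Python sketch (the helper names are our own, not taken from any statistics library):

    def mid_range(xs):
        """Arithmetic mean of the sample maximum and minimum."""
        return (max(xs) + min(xs)) / 2

    def sample_range(xs):
        """Difference between the sample maximum and minimum."""
        return max(xs) - min(xs)

    data = [2.0, 3.5, 7.0, 1.5, 6.0]
    m = mid_range(data)      # (7.0 + 1.5) / 2 = 4.25
    r = sample_range(data)   # 7.0 - 1.5 = 5.5

    # Knowing both the mid-range and the range recovers the extremes:
    assert max(data) == m + r / 2   # 4.25 + 2.75 = 7.0
    assert min(data) == m - r / 2   # 4.25 - 2.75 = 1.5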
The mid-range is rarely used in practical statistical analysis, as it lacks efficiency as an estimator for most distributions of interest, because it ignores all intermediate points, and lacks robustness, as outliers change it significantly. Indeed, for many distributions it is one of the least efficient and least robust statistics. However, it finds some use in special cases: it is the maximally efficient estimator for the center of a uniform distribution, trimmed mid-ranges address robustness, and as an L-estimator, it is simple to understand and compute.

Robustness

The midrange is highly sensitive to outliers and ignores all but two data points. It is therefore a very non-robust statistic, having a breakdown point of 0, meaning that a single observation can change it arbitrarily. Further, it is highly influenced by outliers: increasing the sample maximum or decreasing the sample minimum by x changes the mid-range by x/2, while it changes the sample mean, which also has a breakdown point of 0, by only x/n. It is thus of little use in practical statistics, unless outliers are already handled.
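The following toy Python sketch (illustrative only; the data and perturbation are arbitrary) shows the contrast numerically: adding x to the sample maximum shifts the mid-range by x/2 but shifts the mean by only x/n.

    data = [1.0, 2.0, 3.0, 4.0, 5.0]        # n = 5
    x = 100.0                                # amount added to the sample maximum

    def midrange(xs):
        return (max(xs) + min(xs)) / 2

    def mean(xs):
        return sum(xs) / len(xs)

    perturbed = data[:-1] + [data[-1] + x]   # increase the maximum by x

    print(midrange(perturbed) - midrange(data))  # 50.0 = x / 2
    print(mean(perturbed) - mean(data))          # 20.0 = x / n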
A trimmed midrange is known as a midsummary: the n% trimmed midrange is the average of the n% and (100 − n)% percentiles, and is more robust, having a breakdown point of n%. In the middle of these is the midhinge, which is the 25% midsummary. The median can be interpreted as the fully trimmed (50%) mid-range; this accords with the convention that the median of an even number of points is the mean of the two middle points.

These trimmed midranges are also of interest as descriptive statistics or as L-estimators of central location or skewness: differences of midsummaries, such as the midhinge minus the median, give measures of skewness at different points in the tail.
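A hedged sketch of the midsummary in Python, assuming NumPy is available (numpy.percentile is a standard NumPy routine; the helper name midsummary is our own):

    import numpy as np

    def midsummary(xs, p):
        """The p% trimmed midrange: average of the p-th and (100 - p)-th percentiles."""
        lo, hi = np.percentile(xs, [p, 100 - p])
        return (lo + hi) / 2

    xs = np.random.default_rng(0).normal(size=1000)

    mid_range = midsummary(xs, 0)    # 0% trim: the ordinary mid-range
    midhinge  = midsummary(xs, 25)   # 25% trim: the midhinge
    median    = midsummary(xs, 50)   # 50% trim: the median

    # Differences of midsummaries, e.g. midhinge - median, probe skewness in the tails.
    print(mid_range, midhinge, median, midhinge - median)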
Efficiency

Despite its drawbacks, in some cases the mid-range is useful: it is a highly efficient estimator of μ, given a small sample of a sufficiently platykurtic distribution, but it is inefficient for mesokurtic distributions, such as the normal.

For example, for a continuous uniform distribution with unknown maximum and minimum, the mid-range is the uniformly minimum-variance unbiased (UMVU) estimator of the mean. The sample maximum and sample minimum, together with the sample size, are a sufficient statistic for the population maximum and minimum: the distribution of the other sample points, conditional on a given maximum and minimum, is just the uniform distribution between those two values and thus adds no information. See the German tank problem for further discussion. Thus the mid-range, which is an unbiased and sufficient estimator of the population mean, is in fact the UMVU estimator: using the sample mean just adds noise based on the uninformative distribution of points within this range.

Conversely, for the normal distribution, the sample mean is the UMVU estimator of the mean. Thus for platykurtic distributions, which can often be thought of as between a uniform distribution and a normal distribution, the informativeness of the middle sample points versus the extreme values varies from "equal" for normal to "uninformative" for uniform, and for different distributions, one or the other (or some combination thereof) may be most efficient. A robust analog is the trimean, which averages the midhinge (25% trimmed mid-range) and the median.
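A small Monte Carlo sketch of this contrast, assuming NumPy is available (the sample size and replication count are arbitrary choices): the mid-range shows the smaller variance for uniform samples and a much larger variance for normal samples.

    import numpy as np

    rng = np.random.default_rng(42)
    n, reps = 20, 100_000

    def estimator_variances(draw):
        samples = draw((reps, n))
        midranges = (samples.max(axis=1) + samples.min(axis=1)) / 2
        means = samples.mean(axis=1)
        return midranges.var(), means.var()

    # Uniform on (-1, 1): the mid-range is the more efficient estimator of the center.
    print(estimator_variances(lambda shape: rng.uniform(-1, 1, shape)))

    # Standard normal: the sample mean is far more efficient.
    print(estimator_variances(lambda shape: rng.standard_normal(shape)))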
Small samples

For small sample sizes (n from 4 to 20) drawn from a sufficiently platykurtic distribution (negative excess kurtosis, defined as γ2 = μ4/(μ2)² − 3), the mid-range is an efficient estimator of the mean μ. The following table summarizes empirical data comparing three estimators of the mean for distributions of varied kurtosis; the modified mean is the truncated mean, where the maximum and minimum are eliminated.

    Excess kurtosis (γ2)    Most efficient estimator of μ
    −1.2 to −0.8            Midrange
    −0.8 to 2.0             Mean
    2.0 to 6.0              Modified mean
For n = 1 or 2, the midrange and the mean are equal (and coincide with the median), and are most efficient for all distributions. For n = 3, the modified mean is the median, and instead the mean is the most efficient measure of central tendency for values of γ2 from 2.0 to 6.0 as well as from −0.8 to 2.0.
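For reference, the modified mean used in the table is the mean computed after discarding one maximum and one minimum; a minimal sketch (illustrative only):

    def modified_mean(xs):
        """Mean of the sample after removing a single maximum and a single minimum."""
        if len(xs) <= 2:
            raise ValueError("need more than two observations")
        trimmed = sorted(xs)[1:-1]
        return sum(trimmed) / len(trimmed)

    print(modified_mean([1, 2, 3, 4, 100]))  # 3.0 (the outlying 100 is dropped)
    print(modified_mean([1, 5, 9]))          # 5.0 -- for n = 3 this is just the median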
Sampling properties

For a sample of size n from the standard normal distribution, the mid-range M is unbiased and has a variance given by:

    var(M) = π² / (24 ln n).

For a sample of size n from the standard Laplace distribution, the mid-range M is unbiased and has a variance given by:

    var(M) = π² / 12,

and, in particular, the variance does not decrease to zero as the sample size grows. For a sample of size n from a zero-centred uniform distribution, the mid-range M is unbiased, and nM has an asymptotic distribution which is a Laplace distribution.
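A quick simulation sketch of the normal case, assuming NumPy is available (the constants are arbitrary, and since the formula is asymptotic the agreement is only approximate):

    import numpy as np

    rng = np.random.default_rng(1)
    n, reps = 1000, 5_000

    samples = rng.standard_normal((reps, n))
    midranges = (samples.max(axis=1) + samples.min(axis=1)) / 2

    empirical = midranges.var()
    asymptotic = np.pi**2 / (24 * np.log(n))

    print(empirical, asymptotic)   # both come out near 0.06 for n = 1000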
Deviation

While the mean of a set of values minimizes the sum of squares of deviations and the median minimizes the average absolute deviation, the midrange minimizes the maximum deviation (defined as max |x_i − m|): it is a solution to a variational problem.
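A brief plain-Python sketch (a toy grid search, purely illustrative) of this minimax property on a small data set:

    data = [1.0, 2.0, 7.0, 9.0]

    def max_deviation(m, xs):
        return max(abs(x - m) for x in xs)

    midrange = (max(data) + min(data)) / 2   # 5.0
    mean = sum(data) / len(data)             # 4.75

    # Scan candidate centers: the smallest worst-case deviation occurs at the midrange.
    candidates = [i / 100 for i in range(0, 1001)]   # 0.00, 0.01, ..., 10.00
    best = min(candidates, key=lambda m: max_deviation(m, data))

    print(best, max_deviation(best, data))                            # 5.0 4.0
    print(max_deviation(midrange, data), max_deviation(mean, data))   # 4.0 4.25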
See also

Range (statistics)
Midhinge

References

Cowden, Dudley Johnstone (1957). Statistical Methods in Quality Control. Prentice-Hall. pp. 67–68.
Dodge, Y. (2003). The Oxford Dictionary of Statistical Terms. Oxford University Press. ISBN 0-19-920613-9.
Kendall, M. G.; Stuart, A. (1969). The Advanced Theory of Statistics, Volume 1. Griffin. ISBN 0-85264-141-9.
Velleman, P. F.; Hoaglin, D. C. (1981). Applications, Basics and Computing of Exploratory Data Analysis. Duxbury Press. ISBN 0-87150-409-X.
Vinson, William Daniel (1951). An Investigation of Measures of Central Tendency Used in Quality Control (Master's thesis). University of North Carolina at Chapel Hill. Table (4.1), pp. 32–34.
