Asymptotically optimal algorithm

In computer science, an algorithm is said to be asymptotically optimal if, roughly speaking, for large inputs it performs at worst a constant factor (independent of the input size) worse than the best possible algorithm. The term is commonly encountered in computer science research as a result of the widespread use of big-O notation.

More formally, an algorithm is asymptotically optimal with respect to a particular resource if the problem has been proven to require Ω(f(n)) of that resource, and the algorithm has been proven to use only O(f(n)). Such proofs require an assumption of a particular model of computation, i.e., certain restrictions on the operations allowable with the input data.

As a simple example, it is known that all comparison sorts require at least Ω(n log n) comparisons in the average and worst cases. Mergesort and heapsort are comparison sorts which perform O(n log n) comparisons, so they are asymptotically optimal in this sense.

If the input data have some a priori properties which can be exploited in the construction of algorithms, in addition to comparisons, then asymptotically faster algorithms may be possible. For example, if it is known that the N objects are integers from the range [1, N], then they may be sorted in O(N) time, e.g., by bucket sort.
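To make the contrast concrete, here is a minimal sketch (in Python, with illustrative names chosen for this article, not taken from any particular library) of sorting integers from a known bounded range in linear time by counting occurrences rather than comparing elements, so the Ω(n log n) comparison-sort bound does not apply:

```python
def counting_sort(values, max_value):
    """Sort non-negative integers in [0, max_value] in O(N + max_value) time.

    No element-to-element comparisons are performed, so the Omega(n log n)
    lower bound for comparison sorts does not constrain this algorithm.
    """
    counts = [0] * (max_value + 1)
    for v in values:          # one pass to tally each value
        counts[v] += 1
    result = []
    for v, c in enumerate(counts):  # emit each value as many times as seen
        result.extend([v] * c)
    return result

print(counting_sort([3, 1, 4, 1, 5, 9, 2, 6], 9))
# -> [1, 1, 2, 3, 4, 5, 6, 9]
```

The linear bound holds only because the range is known in advance; this is exactly the kind of a priori property the lower-bound theorem for comparison sorts does not assume.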
A consequence of an algorithm being asymptotically optimal is that, for large enough inputs, no algorithm can outperform it by more than a constant factor. For this reason, asymptotically optimal algorithms are often seen as the "end of the line" in research: a result that cannot be dramatically improved upon. Conversely, if an algorithm is not asymptotically optimal, then as the input grows in size, it performs increasingly worse than the best possible algorithm.

In practice it is useful to find algorithms that perform better even if they do not enjoy any asymptotic advantage. New algorithms may also present advantages such as better performance on specific inputs, decreased use of resources, or being simpler to describe and implement. Thus asymptotically optimal algorithms are not always the "end of the line".
Although asymptotically optimal algorithms are important theoretical results, an asymptotically optimal algorithm might not be used in a number of practical situations:

- It only outperforms more commonly used methods for n beyond the range of practical input sizes, such as inputs with more bits than could fit in any computer storage system.
- It is too complex, so that the difficulty of comprehending and implementing it correctly outweighs its potential benefit in the range of input sizes under consideration.
- The inputs encountered in practice fall into special cases that have more efficient algorithms, or that heuristic algorithms with bad worst-case times can nevertheless solve efficiently.
- On modern computers, hardware optimizations such as the memory cache and parallel processing may be "broken" by an asymptotically optimal algorithm (assuming the analysis did not take these hardware optimizations into account). In this case, there could be sub-optimal algorithms that make better use of these features and outperform an optimal algorithm on realistic data.

An example of an asymptotically optimal algorithm not used in practice is Bernard Chazelle's linear-time algorithm for triangulation of a simple polygon. Another is the resizable array data structure published in "Resizable Arrays in Optimal Time and Space" (Brodnik et al. 1999), which can index in constant time but on many machines carries a heavy practical penalty compared to ordinary array indexing.
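To illustrate where the practical penalty of such a structure comes from, here is a simplified sketch of a block-based resizable array. It is not the structure from Brodnik et al. (which uses a more intricate block scheme to achieve O(√n) extra space); it only shows the general idea that growing never copies elements, at the cost of extra index arithmetic on every access, compared to a single flat-array offset:

```python
class BlockedArray:
    """Simplified block-based resizable array (illustrative, not the
    Brodnik et al. structure). Block k holds 2**k slots, so appending
    never relocates existing elements; each access pays for a small
    amount of index arithmetic instead of one flat offset."""

    def __init__(self):
        self.blocks = []
        self.size = 0
        self.capacity = 0

    def _locate(self, i):
        # Block k covers logical indices [2**k - 1, 2**(k+1) - 2].
        k = (i + 1).bit_length() - 1
        return k, i + 1 - (1 << k)

    def append(self, x):
        if self.size == self.capacity:
            self.blocks.append([None] * (1 << len(self.blocks)))
            self.capacity += len(self.blocks[-1])
        k, off = self._locate(self.size)
        self.blocks[k][off] = x
        self.size += 1

    def __getitem__(self, i):
        k, off = self._locate(i)
        return self.blocks[k][off]
```

On most machines the per-access arithmetic (a bit-length computation and a subtraction) is the "heavy practical penalty" relative to ordinary array indexing, even though both are O(1).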
Formal definitions

Formally, suppose that we have a lower-bound theorem showing that a problem requires Ω(f(n)) time to solve for an instance (input) of size n (see Big O notation § Big Omega notation for the definition of Ω). Then, an algorithm which solves the problem in O(f(n)) time is said to be asymptotically optimal. This can also be expressed using limits: suppose that b(n) is a lower bound on the running time, and a given algorithm takes time t(n). Then the algorithm is asymptotically optimal if:

    lim_{n→∞} t(n)/b(n) < ∞

This limit, if it exists, is always at least 1, since t(n) ≥ b(n).

Although usually applied to time efficiency, an algorithm can be said to use asymptotically optimal space, random bits, number of processors, or any other resource commonly measured using big-O notation.

Sometimes vague or implicit assumptions can make it unclear whether an algorithm is asymptotically optimal. For example, a lower bound theorem might assume a particular abstract machine model, as in the case of comparison sorts, or a particular organization of memory. By violating these assumptions, a new algorithm could potentially asymptotically outperform the lower bound and the "asymptotically optimal" algorithms.
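The limit criterion can be checked numerically for hypothetical cost functions. In this sketch (the cost functions are assumptions chosen for illustration, not measured data), a mergesort-like cost t(n) = 2·n·log₂ n is compared against the comparison-sort lower bound b(n) = n·log₂ n, alongside a quadratic cost t(n) = n²:

```python
import math

def ratio(t, b, n):
    """Evaluate t(n)/b(n), the quantity whose limit decides optimality."""
    return t(n) / b(n)

b = lambda n: n * math.log2(n)            # comparison-sort lower bound
t_merge = lambda n: 2 * n * math.log2(n)  # hypothetical mergesort cost
t_quad = lambda n: n * n                  # hypothetical quadratic-sort cost

for n in (10**3, 10**6, 10**9):
    print(n, ratio(t_merge, b, n), ratio(t_quad, b, n))
# t_merge/b is the constant 2 for every n -> asymptotically optimal
# t_quad/b grows without bound            -> not asymptotically optimal
```

The first ratio staying bounded (here, exactly 2) is what the limit condition requires; the second ratio diverging shows a failure of asymptotic optimality.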
Speedup

The nonexistence of an asymptotically optimal algorithm for a problem is called speedup. Blum's speedup theorem shows that there exist artificially constructed problems with speedup. However, it is an open problem whether many of the most well-known algorithms today are asymptotically optimal or not. For example, there is an O(n α(n)) algorithm for finding minimum spanning trees, where α(n) is the very slowly growing inverse of the Ackermann function, but the best known lower bound is the trivial Ω(n). Whether this algorithm is asymptotically optimal is unknown, and a resolution either way would likely be hailed as a significant result. Coppersmith and Winograd (1982) proved that matrix multiplication has a weak form of speed-up among a restricted class of algorithms (Strassen-type bilinear identities with lambda-computation).
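The inverse Ackermann function α(n) also appears as the amortized per-operation cost of a disjoint-set (union-find) structure with union by rank and path compression, a standard building block of minimum-spanning-tree algorithms. A minimal sketch (this is the classical textbook structure, not the MST algorithm itself):

```python
class DisjointSet:
    """Union-find with union by rank and path compression.

    The amortized cost per operation is O(alpha(n)), where alpha is the
    inverse Ackermann function -- effectively constant in practice.
    """

    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        root = x
        while self.parent[root] != root:   # walk to the representative
            root = self.parent[root]
        while self.parent[x] != root:      # path compression pass
            self.parent[x], x = root, self.parent[x]
        return root

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False
        if self.rank[ra] < self.rank[rb]:  # union by rank
            ra, rb = rb, ra
        self.parent[rb] = ra
        if self.rank[ra] == self.rank[rb]:
            self.rank[ra] += 1
        return True
```

Even here, α(n) governs only the upper bound; whether any comparison-based union-find-style bound is tight for the MST problem is part of the open question above.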
See also

- Element uniqueness problem
- Asymptotic computational complexity

References

- Brodnik, Andrej; Carlsson, Svante; Sedgewick, Robert; Munro, JI; Demaine, ED (1999), "Resizable Arrays in Optimal Time and Space" (technical report), Department of Computer Science, University of Waterloo.