T5 (language model)

Series of large language models developed by Google AI

Original author(s): Google AI
Initial release: 23 October 2019
Repository: https://github.com/google-research/text-to-text-transfer-transformer
Type: Large language model
License: Apache-2.0
T5 (Text-to-Text Transfer Transformer) is a series of large language models developed by Google AI. Introduced in 2019, T5 models are trained on a massive dataset of text and code using a text-to-text framework. The T5 models are capable of performing the text-based tasks that they were pretrained for. They can also be finetuned to perform other tasks. They have been employed in various applications, including chatbots, machine translation systems, text summarization tools, code generation, and robotics.

Like the original Transformer model, T5 models are encoder-decoder Transformers, where the encoder processes the input text and the decoder generates the output text.
Training

The original T5 models are pre-trained on the Colossal Clean Crawled Corpus (C4), containing text and code scraped from the internet. This pre-training process enables the models to learn general language understanding and generation abilities. T5 models can then be fine-tuned on specific downstream tasks, adapting their knowledge to perform well in various applications.

The T5 models were pretrained on many tasks, all in the format of <input text> -> <output text>. Some examples are:

- Restoring corrupted text: "Thank you <X> me to your party <Y> week." -> "<X> for inviting <Y> last <Z>", where the <Z> means "end of output", and the <X> and <Y> denote blanks to be filled, called "sentinels" in the original report.
- Translation: "translate English to German: That is good." -> "Das ist gut."
- Judging the grammatical acceptability of a sentence (CoLA): "The course is jumping well." -> "not acceptable".
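At inference time the same text-to-text interface is used: a task prefix is prepended to the input and the model generates the answer as plain text. A minimal sketch of this usage, assuming the Hugging Face transformers library (not part of the original T5 release, which is a TensorFlow codebase) and an example checkpoint name:

    # Text-to-text inference sketch (assumes the Hugging Face transformers API).
    from transformers import AutoTokenizer, T5ForConditionalGeneration

    tokenizer = AutoTokenizer.from_pretrained("t5-small")            # example checkpoint
    model = T5ForConditionalGeneration.from_pretrained("t5-small")

    # The task is selected purely by the text prefix of the input.
    inputs = tokenizer("translate English to German: That is good.", return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=20)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))  # expected: "Das ist gut."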
[Figure: How a T5 can be finetuned for a summarization task.]
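Fine-tuning keeps the same format: each training example is a prefixed input text paired with a target text, and the decoder is trained against the target with teacher forcing. A minimal single-step sketch, again assuming the Hugging Face transformers API rather than the original training code:

    import torch
    from transformers import AutoTokenizer, T5ForConditionalGeneration

    tokenizer = AutoTokenizer.from_pretrained("t5-small")
    model = T5ForConditionalGeneration.from_pretrained("t5-small")
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    # One illustrative (input, target) pair; real fine-tuning iterates over a dataset.
    document = "summarize: The quick brown fox jumped over the lazy dog near the river bank."
    summary = "A fox jumped over a dog."

    enc = tokenizer(document, return_tensors="pt")
    labels = tokenizer(summary, return_tensors="pt").input_ids

    # The model shifts the labels internally and returns the decoder cross-entropy loss.
    loss = model(input_ids=enc.input_ids, attention_mask=enc.attention_mask, labels=labels).loss
    loss.backward()
    optimizer.step()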
Architecture

[Figure: T5 encoder-decoder structure, showing the attention structure. In the encoder self-attention (lower square), all input tokens attend to each other; in the encoder–decoder cross-attention (upper rectangle), each target token attends to all input tokens; in the decoder self-attention (upper triangle), each target token attends to present and past target tokens only (causal).]

The T5 series encompasses several models with varying sizes and capabilities, all encoder-decoder Transformers, where the encoder processes the input text and the decoder generates the output text. Compared to the original Transformer, T5 uses a few minor modifications: layer normalization with no additive bias, placing the layer normalization outside the residual path, and relative positional embedding.

These models are often distinguished by their parameter count, which indicates the complexity and potential capacity of the model. The original paper reported the following 5 models:

- Small: about 60M parameters; n_layer = 6, d_model = 512, d_ff = 2048, d_kv = 64, n_head = 8
- Base: about 220M parameters; n_layer = 12, d_model = 768, d_ff = 3072, d_kv = 64, n_head = 12
- Large: about 770M parameters; n_layer = 24, d_model = 1024, d_ff = 4096, d_kv = 64, n_head = 16
- 3B: n_layer = 24, d_model = 1024, d_ff = 16384, d_kv = 128, n_head = 32
- 11B: n_layer = 24, d_model = 1024, d_ff = 65536, d_kv = 128, n_head = 128

In the above:

- n_layer: number of layers in the encoder; also, number of layers in the decoder. They always have the same number of layers.
- n_head: number of attention heads in each attention block.
- d_model: dimension of the embedding vectors.
- d_ff: dimension of the feedforward network within each encoder and decoder layer.
- d_kv: dimension of the key and value vectors used in the self-attention mechanism.

Note that unlike typical Transformers, the 3B and 11B models do not satisfy d_model = d_kv * n_head.
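These shape hyperparameters are recorded in the published model configuration files cited in the references, so the relation can be checked directly. A small sketch, assuming the Hugging Face T5Config attribute names (num_layers, num_heads, d_model, d_kv, d_ff):

    from transformers import T5Config

    # Published configuration of the original 11B checkpoint.
    cfg = T5Config.from_pretrained("t5-11b")
    print(cfg.num_layers, cfg.num_heads, cfg.d_model, cfg.d_kv, cfg.d_ff)

    # Unlike typical Transformers, d_model != d_kv * n_head for the 3B and 11B models.
    print(cfg.d_model == cfg.d_kv * cfg.num_heads)  # False for t5-11b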
Variants

Several subsequent models used the T5 architecture, with non-standardized naming conventions used to differentiate them. This section attempts to collect the main ones. An exhaustive list of the variants released by Google Brain is on the GitHub repo for T5X.

Some models are trained from scratch while others are trained by starting with a previously trained model. By default, each model is trained from scratch, except otherwise noted.

- T5 small, base, large, 3B, 11B (2019): The original models.
- T5 v1.1 small, base, large, XL, XXL: Improved versions of the original T5 series. These have roughly equal parameters. The 3B and the 11B were changed to "XL" and "XXL", and their shapes are changed.
- LM-adapted T5 (2021): a series of models (from small to XXL) that started from checkpoints of the T5 series, but were trained further on 100B additional tokens from C4.
- Switch Transformer (2021): a mixture-of-experts variant of T5, obtained by replacing the feedforward layers in the encoder and decoder blocks with mixture-of-experts feedforward layers.
- T0 3B, 11B (2021): a series of models that started from checkpoints of LM-adapted T5, and were further trained to perform tasks based only on task instruction (zero-shot). Different entries in the series use different finetuning data.
- ByT5 (2021): a byte-level version of T5, trained on the mC4 (multilingual C4) dataset. It operates on text encoded as UTF-8 bytes, without tokenizers.
- Flan-T5-XL (2022): a model that started with a checkpoint of T5 XL, and was instruction-finetuned on the FLAN dataset.
- T5X (2022): a JAX-based re-implementation of the original T5 codebase. It is not a model; the original T5 codebase was implemented in TensorFlow.
- UL2 20B (2022): a model with the same architecture as the T5 series, but scaled up to 20B, and trained with a "mixture of denoisers" objective on the C4.
- Pile-T5 (2024): has the same architecture as T5, except it uses the Llama tokenizer. It was trained on The Pile. It came in sizes of base, large, XL, XXL.
Applications

The T5 model itself is an encoder-decoder model, allowing it to be used for instruction following. The encoder encodes the instruction, and the decoder autoregressively generates the reply.
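Concretely, the instruction is passed to the encoder as ordinary input text, and the reply is decoded token by token. A brief sketch with an instruction-finetuned checkpoint (google/flan-t5-xl, one of the variants listed above), again assuming the Hugging Face transformers API:

    from transformers import AutoTokenizer, T5ForConditionalGeneration

    tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-xl")
    model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-xl")

    # The encoder reads the whole instruction; the decoder generates the reply
    # autoregressively, one token at a time.
    instruction = "Answer the following question: what is the capital of France?"
    inputs = tokenizer(instruction, return_tensors="pt")
    reply_ids = model.generate(**inputs, max_new_tokens=10)
    print(tokenizer.decode(reply_ids[0], skip_special_tokens=True))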
The T5 encoder can be used as a text encoder, much like BERT. It encodes a text into a sequence of real-number vectors, which can be used for downstream applications. For example, Google Imagen uses T5-XXL as text encoder, and the encoded text vectors are used as conditioning on a diffusion model. As another example, the AuraFlow diffusion model uses Pile-T5-XL.
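Using only the encoder in this way amounts to running the input through the encoder stack and keeping its final hidden states as the text embedding, one d_model-dimensional vector per token. A sketch, assuming the encoder-only wrapper (T5EncoderModel) provided by the Hugging Face transformers library and an example checkpoint:

    import torch
    from transformers import AutoTokenizer, T5EncoderModel

    tokenizer = AutoTokenizer.from_pretrained("t5-small")   # example checkpoint
    encoder = T5EncoderModel.from_pretrained("t5-small")

    inputs = tokenizer("A photograph of an astronaut riding a horse.", return_tensors="pt")
    with torch.no_grad():
        # One real-valued vector of size d_model per input token.
        embeddings = encoder(**inputs).last_hidden_state

    print(embeddings.shape)  # (batch, sequence_length, d_model); d_model is 512 for t5-small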
References

- Raffel, Colin; Shazeer, Noam; Roberts, Adam; Lee, Katherine; Narang, Sharan; Matena, Michael; Zhou, Yanqi; Li, Wei; Liu, Peter J. (2020). "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer". Journal of Machine Learning Research.
- Vaswani, Ashish; Shazeer, Noam; Parmar, Niki; Uszkoreit, Jakob; Jones, Llion; Gomez, Aidan N.; Kaiser, Ɓukasz; Polosukhin, Illia (2017). "Attention is All you Need". Advances in Neural Information Processing Systems. Curran Associates, Inc.
- Shaw, Peter; Uszkoreit, Jakob; Vaswani, Ashish (2018-04-12). Self-Attention with Relative Position Representations.
- Zhang, Aston; Lipton, Zachary; Li, Mu; Smola, Alexander J. (2024). "11.9. Large-Scale Pretraining with Transformers". Dive into deep learning. Cambridge University Press. ISBN 978-1-009-38943-3.
- Lester, Brian; Al-Rfou, Rami; Constant, Noah (2021-09-02). The Power of Scale for Parameter-Efficient Prompt Tuning.
- Sanh, Victor; Webson, Albert; Raffel, Colin; Bach, Stephen H.; Sutawika, Lintang; Alyafeai, Zaid; Chaffin, Antoine; Stiegler, Arnaud; Scao, Teven Le (2022-03-17). Multitask Prompted Training Enables Zero-Shot Task Generalization.
- Xue, Linting; Barua, Aditya; Constant, Noah; Al-Rfou, Rami; Narang, Sharan; Kale, Mihir; Roberts, Adam; Raffel, Colin (2022-03-25). "ByT5: Towards a Token-Free Future with Pre-trained Byte-to-Byte Models". Transactions of the Association for Computational Linguistics. doi:10.1162/tacl_a_00461.
- Fedus, William; Zoph, Barret; Shazeer, Noam (2022-06-16). Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity.
- Chung, Hyung Won; Hou, Le; Longpre, Shayne; Zoph, Barret; Tay, Yi; Fedus, William; Li, Yunxuan; Wang, Xuezhi; Dehghani, Mostafa; Brahma, Siddhartha; Webson, Albert; Gu, Shixiang Shane; Dai, Zhuyun; Suzgun, Mirac; Chen, Xinyun (2024). "Scaling Instruction-Finetuned Language Models". Journal of Machine Learning Research.
- Longpre, Shayne; Hou, Le; Vu, Tu; Webson, Albert; Chung, Hyung Won; Tay, Yi; Zhou, Denny; Le, Quoc V.; Zoph, Barret; Wei, Jason; Roberts, Adam (2023-07-03). "The Flan Collection: Designing Data and Methods for Effective Instruction Tuning". Proceedings of the 40th International Conference on Machine Learning. PMLR: 22631–22648.
- Roberts, Adam; Chung, Hyung Won; Mishra, Gaurav; Levskaya, Anselm; Bradbury, James; Andor, Daniel; Narang, Sharan; Lester, Brian; Gaffney, Colin; Mohiuddin, Afroz; Hawthorne, Curtis; Lewkowycz, Aitor; Salcianu, Alex; van Zee, Marc; Austin, Jacob (2023). "Scaling Up Models and Data with t5x and seqio". Journal of Machine Learning Research.
- Tay, Yi; Dehghani, Mostafa; Tran, Vinh Q.; Garcia, Xavier; Wei, Jason; Wang, Xuezhi; Chung, Hyung Won; Shakeri, Siamak; Bahri, Dara (2023-02-28). UL2: Unifying Language Learning Paradigms.
- Sutawika, Lintang; Komatsuzaki, Aran; Raffel, Colin (2024-04-15). "Pile-T5". EleutherAI Blog.
- Jiang, Yunfan; Gupta, Agrim; Zhang, Zichen; Wang, Guanzhi; Dou, Yongqiang; Chen, Yanjun; Fei-Fei, Li; Anandkumar, Anima; Zhu, Yuke (2022-10-06). "VIMA: General Robot Manipulation with Multimodal Prompts".
- "Imagen: Text-to-Image Diffusion Models". imagen.research.google.
- "config.json · google-t5/t5-11b at main". huggingface.co.
- "config.json · google/t5-v1_1-xl at main". huggingface.co.
- "config.json · google/t5-v1_1-xxl at main". huggingface.co.
- "t5x/docs/models.md at main · google-research/t5x". GitHub.
- "SwitchTransformers". huggingface.co.
- "bigscience/T0 · Hugging Face". huggingface.co.
- "google/flan-t5-xl · Hugging Face". huggingface.co.
- "google-research/FLAN". Google Research. 2024-08-03.
- google-research/text-to-text-transfer-transformer. Google Research. 2024-08-21.
3567: 3564: 3562: 3559: 3558: 3556: 3554: 3551: 3547: 3544: 3542: 3539: 3538: 3536: 3534: 3533: 3529: 3527: 3524: 3522: 3521: 3517: 3515: 3512: 3510: 3507: 3505: 3502: 3500: 3497: 3495: 3492: 3490: 3489: 3485: 3483: 3480: 3478: 3475: 3473: 3472: 3468: 3466: 3463: 3461: 3458: 3456: 3455:Liquid Galaxy 3453: 3451: 3448: 3446: 3443: 3441: 3438: 3436: 3433: 3431: 3430: 3426: 3424: 3423: 3419: 3417: 3416: 3412: 3410: 3407: 3405: 3402: 3400: 3397: 3395: 3392: 3390: 3387: 3385: 3382: 3380: 3379: 3375: 3373: 3370: 3368: 3365: 3361: 3358: 3356: 3353: 3352: 3351: 3348: 3346: 3343: 3342: 3340: 3334: 3324: 3321: 3319: 3316: 3314: 3311: 3309: 3306: 3304: 3301: 3299: 3296: 3294: 3291: 3290: 3288: 3284: 3278: 3275: 3273: 3270: 3268: 3265: 3263: 3262: 3258: 3256: 3255: 3251: 3249: 3248: 3244: 3242: 3239: 3237: 3234: 3232: 3231: 3227: 3225: 3224:Doodle4Google 3222: 3220: 3217: 3215: 3214: 3213:Developer Day 3210: 3208: 3205: 3203: 3200: 3196: 3195:Developer Lab 3193: 3191: 3190:Developer Day 3188: 3186: 3183: 3182: 3180: 3179: 3176: 3173: 3169: 3163: 3160: 3154: 3153: 3149: 3147: 3146: 3142: 3141: 3140: 3137: 3136: 3135: 3132: 3128: 3125: 3123: 3120: 3118: 3115: 3113: 3110: 3109: 3107: 3106: 3104: 3100: 3094: 3091: 3089: 3088:YouTube Space 3086: 3084: 3081: 3079: 3076: 3074: 3071: 3069: 3068:Mayfield Mall 3066: 3064: 3061: 3059: 3056: 3054: 3053: 3049: 3047: 3044: 3042: 3039: 3037: 3034: 3032: 3031: 3027: 3025: 3024: 3020: 3018: 3015: 3014: 3012: 3008: 2998: 2995: 2993: 2990: 2988: 2985: 2983: 2980: 2978: 2975: 2973: 2970: 2968: 2965: 2962: 2959: 2957: 2956:Paul Otellini 2954: 2952: 2949: 2947: 2944: 2942: 2939: 2937: 2934: 2932: 2929: 2927: 2924: 2921: 2918: 2916: 2913: 2912: 2910: 2906: 2900: 2897: 2895: 2892: 2890: 2887: 2884: 2881: 2878: 2877:Sundar Pichai 2875: 2873: 2872:Rick Osterloh 2870: 2868: 2865: 2863: 2860: 2858: 2855: 2853: 2850: 2848: 2845: 2843: 2840: 2838: 2835: 2833: 2830: 2828: 2825: 2823: 2820: 2818: 2815: 2813: 2810: 2809: 2807: 2803: 2800: 2798: 2794: 2786: 2783: 2781: 2780:Social impact 2778: 2775: 2774:Me at the zoo 2771: 2769: 2766: 2765: 2764: 2761: 2759: 2756: 2754: 2753:Sidewalk Labs 2751: 2747: 2744: 2743: 2742: 2739: 2737: 2734: 2732: 2729: 2727: 2726: 2722: 2718: 2715: 2713: 2712:Public Alerts 2710: 2708: 2705: 2704: 2703: 2700: 2698: 2695: 2693: 2690: 2688: 2685: 2681: 2680: 2676: 2675: 2674: 2671: 2667: 2664: 2662: 2659: 2657: 2656:recovery mode 2654: 2652: 2649: 2648: 2647: 2644: 2640: 2637: 2635: 2634: 2630: 2629: 2628: 2625: 2623: 2620: 2619: 2617: 2613: 2609: 2602: 2598: 2592: 2589: 2585: 2582: 2581: 2580: 2577: 2575: 2572: 2570: 2567: 2565: 2564:Alphabet Inc. 
2562: 2561: 2558: 2554: 2547: 2542: 2540: 2535: 2533: 2528: 2527: 2524: 2512: 2504: 2502: 2494: 2493: 2490: 2484: 2481: 2479: 2476: 2474: 2471: 2469: 2466: 2464: 2461: 2458: 2454: 2453: 2451: 2447: 2436: 2433: 2432: 2430: 2426: 2419: 2416: 2413: 2410: 2407: 2404: 2401: 2398: 2395: 2392: 2389: 2386: 2385: 2383: 2379: 2372: 2369: 2366: 2363: 2360: 2357: 2356: 2354: 2350: 2347: 2345:Generative AI 2343: 2333: 2330: 2328: 2325: 2323: 2320: 2319: 2317: 2313: 2306: 2303: 2300: 2297: 2294: 2291: 2290: 2288: 2284: 2281: 2277: 2266: 2265:AlphaGeometry 2263: 2260: 2257: 2254: 2251: 2248: 2245: 2244: 2242: 2238: 2227: 2226: 2222: 2219: 2218: 2214: 2213: 2211: 2207: 2200: 2197: 2194: 2191: 2188: 2185: 2184: 2182: 2178: 2171: 2168: 2165: 2162: 2159: 2156: 2153: 2150: 2147: 2144: 2143: 2141: 2137: 2134: 2130: 2127: 2123: 2117: 2114: 2112: 2109: 2107: 2104: 2103: 2100: 2096: 2089: 2084: 2082: 2077: 2075: 2070: 2069: 2066: 2062: 2047: 2043: 2037: 2034: 2023: 2019: 2013: 2010: 1999: 1995: 1988: 1985: 1974: 1969: 1965: 1964: 1956: 1954: 1950: 1945: 1941: 1937: 1933: 1929: 1921: 1918: 1906: 1902: 1896: 1893: 1882: 1881: 1874: 1872: 1868: 1863: 1859: 1852: 1849: 1844: 1840: 1836: 1832: 1828: 1820: 1817: 1812: 1808: 1804: 1800: 1795: 1790: 1786: 1782: 1778: 1771: 1768: 1756: 1752: 1746: 1743: 1732: 1727: 1723: 1722: 1714: 1711: 1700: 1696: 1690: 1687: 1676: 1671: 1667: 1666: 1658: 1655: 1644: 1639: 1635: 1634: 1626: 1623: 1611: 1607: 1601: 1598: 1586: 1582: 1576: 1573: 1562: 1558: 1552: 1550: 1546: 1535: 1530: 1526: 1525: 1517: 1514: 1502: 1498: 1492: 1489: 1484: 1478: 1474: 1470: 1463: 1461: 1457: 1452: 1448: 1444: 1437: 1434: 1428: 1423: 1415: 1412: 1401: 1400: 1393: 1391: 1387: 1382: 1378: 1375:(140): 1–67. 1374: 1370: 1366: 1359: 1357: 1355: 1351: 1345: 1343: 1341: 1337: 1331: 1325: 1320: 1316: 1312: 1308: 1305: 1302: 1298: 1294: 1291: 1288: 1284: 1281: 1278: 1274: 1270: 1266: 1262: 1259: 1256: 1252: 1248: 1245: 1242: 1238: 1235: 1232: 1228: 1227:LM-adapted T5 1224: 1221: 1218: 1214: 1211: 1207: 1206:LM-adapted T5 1204: 1203: 1197: 1194: 1191: 1188: 1185: 1182: 1179: 1178: 1174: 1171: 1168: 1165: 1162: 1159: 1156: 1155: 1151: 1148: 1145: 1142: 1139: 1136: 1133: 1132: 1128: 1125: 1122: 1119: 1116: 1113: 1110: 1109: 1105: 1102: 1099: 1096: 1093: 1090: 1087: 1086: 1069: 1066: 1063: 1060: 1056: 1048: 1032: 1029: 1025: 1017: 1001: 998: 994: 986: 970: 967: 964: 961: 958: 954: 946: 930: 927: 924: 921: 918: 914: 906: 903: 900: 899: 893: 890: 887: 884: 883: 882: 879: 872: 870: 866: 850: 847: 844: 841: 837: 831: 828: 824: 820: 815: 812: 809: 806: 803: 799: 786: 771: 768: 764: 755: 738: 735: 731: 723: 706: 703: 700: 697: 694: 690: 682: 665: 662: 659: 656: 652: 644: 627: 624: 621: 618: 615: 611: 603: 602: 601: 598: 591: 588: 585: 582: 579: 576: 573: 572: 568: 565: 562: 559: 556: 553: 550: 549: 545: 542: 539: 536: 533: 530: 527: 526: 522: 519: 516: 513: 510: 507: 504: 503: 499: 496: 493: 490: 487: 484: 481: 480: 463: 460: 457: 454: 450: 442: 426: 423: 419: 411: 395: 392: 388: 380: 364: 361: 358: 355: 352: 348: 340: 324: 321: 318: 315: 312: 308: 300: 297: 294: 293: 290: 287: 285: 275: 268: 255: 251: 241:translation: 240: 217: 216: 215: 208: 204: 193: 191: 187: 179: 177: 175: 170: 168: 165:developed by 164: 160: 151: 135: 131: 128: 125: 123: 119: 113: 110: 108: 105: 104: 102: 100: 96: 93: 90: 88: 84: 80: 75: 63: 61: 57: 53: 39: 35: 32: 29: 27: 23: 9573:Hugging Face 9537:David Silver 9185:Audio–visual 9039:Applications 9018:Augmentation 8863: 8712:Concordancer 8108:Bag-of-words 8001: 7903: 7896: 7889: 7882: 7875: 7868: 
7851: 7843: 7825: 7818: 7811: 7804: 7797: 7790: 7783: 7776: 7771:Google Hacks 7769: 7749: 7742: 7735: 7728: 7721: 7704:YouTube poop 7672:Googlization 7556: 7548: 7540: 7532: 7513: 7505: 7497: 7489: 7475: 7467: 7459: 7435: 7427: 7419: 7411: 7403: 7394: 7375: 7367: 7359: 7351: 7321: 7313: 7305: 7297: 7289: 7205: 7198: 7191: 7157: 7148: 7143:Nexus Player 7141: 7134: 7124: 7089: 7084:Contact Lens 7082: 7075: 7069:(unreleased) 7067:Project Iris 7002:Pixelbook Go 6952: 6929: 6924:Play Edition 6922: 6789:Galaxy Nexus 6771: 6759: 6714: 6705: 6698: 6686: 6679:Nik Software 6677: 6627: 6610: 6590: 6578: 6574:Voice Access 6566: 6559: 6542: 6525: 6518: 6506: 6497:Question Hub 6489: 6472: 6462:Nearby Share 6460: 6453: 6441: 6434: 6427: 6420: 6406:Google Fonts 6383: 6366: 6354: 6347: 6340: 6333: 6313:Android Beam 6311: 6307:Android Auto 6235: 6211:Web Designer 6200:Page Creator 6198: 6191: 6179: 6160: 6117: 6095:Docs Editors 6082: 6070: 6058: 6051: 6034: 6027: 6020: 6008: 5991: 5987:Cloud Search 5976:Browser Sync 5974: 5967: 5955:Organization 5938: 5931: 5919: 5910: 5903: 5886: 5877: 5856:Invite Media 5848: 5841: 5834: 5827: 5820: 5736: 5729: 5722: 5712:ImageAmerica 5710: 5682: 5678:Voice Search 5668: 5639: 5630: 5623: 5611: 5594: 5582: 5578:People Cards 5563: 5546: 5537: 5530: 5518: 5509: 5502: 5485: 5473: 5466: 5459: 5445:Data Commons 5437: 5431:Ngram Viewer 5418: 5411: 5404: 5397: 5385: 5365: 5358: 5344: 5332: 5325: 5318: 5311: 5302: 5297:Quest Visual 5295: 5288: 5281: 5274: 5267: 5250: 5243: 5236: 5227: 5210: 5203: 5184: 5167: 5160: 5153: 5141: 5134: 5127: 5110: 5103: 5096: 5054: 5047: 5006:BrandConnect 4998: 4968: 4961: 4926: 4907: 4900: 4893: 4886: 4877:Quick, Draw! 4869: 4857: 4840: 4745: 4722:File formats 4700: 4686: 4664: 4624: 4509: 4472: 4450: 4438: 4388:Chrome Frame 4386: 4364: 4347: 4315: 4298: 4266: 4242: 4227: 4215: 4191:ITA Software 4183: 4164: 4138: 4121: 4089: 4062: 4045: 4028: 3961: 3892: 3857: 3816: 3809: 3733: 3624: 3546:Grace Hopper 3530: 3518: 3486: 3469: 3427: 3420: 3413: 3376: 3336:Projects and 3313:Music Awards 3267:Science Fair 3259: 3254:Lunar XPRIZE 3252: 3245: 3228: 3211: 3150: 3143: 3122:Product Sans 3058:Data centers 3050: 3028: 3021: 2982:Amit Singhal 2972:Eric Schmidt 2946:Timnit Gebru 2941:Alan Eustace 2867:Alan Mulally 2857:Ray Kurzweil 2723: 2677: 2631: 2478:Google Pixel 2223: 2215: 2180:Competitions 2158:AlphaGo Zero 2111:Google Brain 2060: 2049:. Retrieved 2045: 2036: 2025:. Retrieved 2021: 2012: 2001:. Retrieved 1997: 1987: 1977:, retrieved 1962: 1938:(377): 1–8. 1935: 1931: 1920: 1909:. Retrieved 1907:. 2024-01-04 1904: 1895: 1885:, retrieved 1879: 1861: 1851: 1837:(70): 1–53. 1834: 1830: 1819: 1784: 1780: 1770: 1759:. Retrieved 1757:. 2024-03-04 1754: 1745: 1735:, retrieved 1720: 1713: 1702:. Retrieved 1698: 1689: 1679:, retrieved 1664: 1657: 1647:, retrieved 1632: 1625: 1614:. Retrieved 1612:. 2020-11-19 1609: 1600: 1589:. Retrieved 1587:. 2020-11-19 1584: 1575: 1564:. Retrieved 1560: 1538:, retrieved 1523: 1516: 1505:. Retrieved 1503:. 2020-04-24 1500: 1491: 1472: 1450: 1446: 1436: 1414: 1404:, retrieved 1398: 1372: 1368: 1339: 1332: 1329: 1326:Applications 1310: 1306: 1296: 1295:20B (2022): 1292: 1286: 1282: 1279:with MeshTF. 1272: 1268: 1260: 1250: 1246: 1236: 1226: 1222: 1209: 1205: 891: 885: 880: 876: 867: 790: 756: 599: 597: 288: 281: 269:Architecture 247:Das ist gut. 
226:, where the 213: 194: 183: 171: 158: 157: 9721:Categories 9669:Autoencoder 9624:Transformer 9492:Alex Graves 9440:OpenAI Five 9344:IBM Watsonx 8966:Convolution 8944:Overfitting 8669:Topic model 8549:Text corpus 8395:Statistical 8262:Text mining 8103:AI-complete 8047:WikiProject 7976:Sensorvault 7845:Google Feud 7813:In the Plex 7667:Googlewhack 7657:Googleshare 7647:Googlefight 7283:Advertising 7052:Pixel Watch 6931:Project Ara 6767:Android One 6753:Smartphones 6656:photography 6629:GreenBorder 6502:Quick Share 6391:Family Link 6385:Expeditions 6374:Crowdsource 6368:Cloud Print 6268:Marketplace 6237:Grasshopper 6053:Quickoffice 5843:DoubleClick 5836:Contributor 5816:Attribution 5784:and finance 5749:Street View 5511:Image Swirl 5439:Code Search 5420:Blog Search 5149:Fi Wireless 4696:Transformer 4553:Hummingbird 4537:Web Toolkit 4522:Trendalyzer 4317:Stackdriver 4211:Project IDX 4158:Crashlytics 4129:Apps Script 4047:Reqwireless 3952:FlatBuffers 3947:File System 3767:Development 3652:Street View 3520:Solve for X 3477:Nightingale 3345:20% project 3338:initiatives 3303:Comedy Week 3052:Chrome Zone 3023:Androidland 3010:Real estate 2977:Ram Shriram 2920:Sergey Brin 2889:Rajen Sheth 2473:Google Labs 2299:Transformer 1787:: 291–306. 256:sentence): 9747:Categories 9710:Technology 9563:EleutherAI 9522:Fei-Fei Li 9517:Yann LeCun 9430:Q-learning 9413:Decisional 9339:IBM Watson 9247:Midjourney 9139:TensorFlow 8986:Activation 8939:Regression 8934:Clustering 8390:Rule-based 8272:Truecasing 8140:Stop words 7827:The MANIAC 7662:Google tax 7276:Litigation 7180:Thermostat 7136:Chromecast 7047:Pixel Buds 6980:Comparison 6948:Chromebook 6916:Comparison 6819:Comparison 6707:Web Albums 6654:Images and 6561:Tilt Brush 6527:SlickLogin 6253:Read Along 6187:FeedBurner 6154:Publishing 5791:Ad Manager 5738:Navigation 5694:Navigation 5613:Searchwiki 5602:SafeSearch 5475:Flu Trends 5455:Dictionary 5056:RightsFlow 5011:Content ID 4713:Web Server 4495:Schema.org 4485:Public DNS 4458:OpenRefine 4429:Lighthouse 4255:VirusTotal 4196:Kubernetes 4117:App Engine 4070:TensorFlow 4024:OpenSocial 3927:Dialogflow 3871:frameworks 3869:Libraries/ 3841:ChromiumOS 3790:Automotive 3697:Censorship 3642:Litigation 3632:FairSearch 3615:Censorship 3504:RechargeIT 3488:PowerMeter 3063:Googleplex 2961:Larry Page 2931:Matt Cutts 2899:Neal Mohan 2894:Hal Varian 2883:Ruth Porat 2862:Ann Mather 2847:Urs Hölzle 2827:John Doerr 2717:RechargeIT 2702:Google.org 2400:Chinchilla 2327:TensorFlow 2225:The MANIAC 2051:2024-08-23 2042:"AuraFlow" 2027:2024-08-23 2003:2024-05-05 1979:2024-08-05 1973:2205.05131 1911:2024-08-05 1887:2024-08-05 1794:2105.13626 1761:2024-08-21 1737:2024-08-05 1731:2110.08207 1704:2024-08-05 1681:2024-08-05 1675:2101.03961 1649:2024-08-21 1643:2104.08691 1616:2024-09-17 1591:2024-09-17 1566:2024-08-05 1540:2024-09-17 1534:1803.02155 1507:2024-09-17 1427:2210.03094 1406:2024-08-21 1346:References 1340:Pile-T5-XL 1277:TensorFlow 1263:(2022): a 1247:Flan-T5-XL 904:Parameters 298:Parameters 127:Apache-2.0 87:Repository 9593:MIT CSAIL 9558:Anthropic 9527:Andrew Ng 9425:AlphaZero 9269:VideoPoet 9232:AlphaFold 9169:MindSpore 9123:SpiNNaker 9118:Memristor 9025:Diffusion 9001:Rectifier 8981:Batchnorm 8961:Attention 8956:Adversary 8699:reviewing 8497:standards 8495:Types and 8004:indicate 7852:Google Me 7339:Antitrust 7330:Jedi Blue 7120:Chromebox 7115:Chromebit 7077:Cardboard 7028:Wearables 6997:Pixelbook 6779:Nexus One 6688:Panoramio 6640:Web Store 6586:Web Light 6429:Impermium 6342:BufferBox 6324:Assistant 6295:Dashboard 
6263:Workspace 6248:Photomath 6231:Classroom 6224:Education 6173:Pyra Labs 5969:Bookmarks 5868:Analytics 5731:Map Maker 5493:Google.by 5468:Fast Flip 5461:Directory 5340:Translate 5304:Word Lens 5269:Moderator 5245:Marratech 5192:Interface 5136:Dodgeball 4970:Newsstand 4637:Googlebot 4590:RankBrain 4490:reCAPTCHA 4468:PageSpeed 4366:App Maker 4300:Messaging 4279:Datastore 4250:Chronicle 4106:Platforms 4004:MapReduce 3937:Fast Pair 3917:Chart API 3846:Neverware 3626:Dragonfly 3588:Criticism 3429:Free Zone 3415:Dragonfly 3318:Space Lab 2963:(Founder) 2922:(Founder) 2822:Jeff Dean 2817:Vint Cerf 2615:Divisions 2418:VideoPoet 2359:Assistant 2253:AlphaStar 2247:AlphaFold 2193:Lee Sedol 2164:AlphaZero 2095:Google AI 1994:"Pile-T5" 1944:1533-7928 1843:1533-7928 1811:2307-387X 1381:1533-7928 1231:zero-shot 236:<Y> 232:<X> 228:<Z> 167:Google AI 140:.research 31:Google AI 9701:Portals 9460:Auto-GPT 9292:Word2vec 9096:Hardware 9013:Datasets 8915:Concepts 8615:Wikidata 8595:FrameNet 8580:BabelNet 8559:Treebank 8529:PropBank 8474:Word2vec 8439:fastText 8320:Stemming 8017:Category 7939:Registry 7694:Sitelink 7620:Gayglers 7578:Category 7091:Daydream 6965:7 (2013) 6960:7 (2012) 6743:Hardware 6673:Snapseed 6618:Chromium 6555:TalkBack 6520:Sidewiki 6455:MyTracks 6243:Socratic 6193:One Pass 6108:Drawings 6036:Notebook 6029:Jamboard 6004:Etherpad 5982:Calendar 5946:Widevine 5933:Softcard 5921:PostRank 5829:Checkout 5782:Business 5754:Coverage 5724:Latitude 5625:Catalogs 5619:Shopping 5548:Like.com 5532:Freebase 5387:Aardvark 5263:Messages 5223:Japanese 5212:Helpouts 5205:Hangouts 5123:Contacts 5000:BandPage 4981:Services 4871:Podcasts 4819:Products 4615:BigQuery 4558:PageRank 4505:Sitemaps 4463:OR-Tools 4452:Optimize 4377:AppSheet 4306:Orbitera 4294:Mandiant 4274:Dataflow 4238:Bigtable 4148:Firebase 3836:ChromeOS 3811:Goobuntu 3795:Glass OS 3712:Elsagate 3620:DeGoogle 3557:YouTube 3526:Starline 3471:News Lab 3350:Area 120 3261:Mapathon 3207:Code Jam 3181:Android 3112:Croscore 2785:YouTuber 2746:Timeline 2639:DeepMind 2501:Category 2449:See also 2352:Chatbots 2259:AlphaDev 2139:Versions 1319:The Pile 1293:Flan-UL2 873:Variants 180:Training 9583:Meta AI 9420:AlphaGo 9404:PanGu-ÎŁ 9374:ChatGPT 9349:Granite 9297:Seq2seq 9276:Whisper 9197:WaveNet 9192:AlexNet 9164:Flux.jl 9144:PyTorch 8996:Sigmoid 8991:Softmax 8856:General 8786:Related 8752:Chatbot 8610:WordNet 8590:DBpedia 8464:Seq2seq 8208:Parsing 8123:Trigram 8037:Outline 8027:Commons 8002:Italics 7959:.google 7723:AlphaGo 7684:Rooting 7596:Related 7453:Privacy 7159:Dropcam 7150:Nexus Q 6906:Pixel 9 6544:Station 6356:BumpTop 6300:Takeout 6290:Account 6181:Domains 6168:Blogger 6084:Toolbar 6066:Surveys 6022:iGoogle 5993:Desktop 5822:BebaPay 5811:AdSense 5806:Adscape 5759:Trusted 5706:Endoxon 5657:Flights 5641:Squared 5632:Express 5607:Scholar 5573:Patents 5565:Weather 5559:Archive 5539:Metaweb 5487:Goggles 5481:Finance 5399:Answers 5327:Sparrow 5313:Schemer 5290:Postini 5180:History 5169:Google+ 5036:Premium 5016:Instant 4991:YouTube 4702:Viewdle 4688:Sawzall 4626:Flutter 4580:Penguin 4409:Gadgets 4399:Cpplint 4323:Storage 4268:Connect 4201:LevelDB 4080:WaveNet 4036:Polymer 3989:Guetzli 3957:Flutter 3912:Blockly 3888:Angular 3853:Fuchsia 3829:Wear OS 3785:Android 3684:YouTube 3553:Sunroof 3286:YouTube 3202:Code-in 3073:Pier 57 2837:Al Gore 2805:Current 2768:History 2763:YouTube 2646:Android 2605:Company 2569:History 2511:Commons 2365:Sparrow 2293:WaveNet 2217:AlphaGo 2187:Fan Hui 2146:AlphaGo 2132:AlphaGo 1307:Pile-T5 1253:, then 142:.google 133:Website 
122:License 45: ( 9598:Huawei 9578:OpenAI 9480:People 9450:MuZero 9312:Gemini 9307:Claude 9242:DALL-E 9154:Theano 8759:(c.f. 8417:models 8405:Neural 8118:Bigram 8113:n-gram 7929:elgooG 7915:Others 7854:(film) 7561:(2022) 7553:(2021) 7545:(2020) 7537:(2015) 7518:(2019) 7510:(2014) 7502:(2013) 7494:(2013) 7486:(2013) 7480:(2012) 7472:(2010) 7464:(2009) 7440:(2021) 7432:(2016) 7424:(2015) 7416:(2015) 7408:(2010) 7400:(2007) 7380:(2023) 7372:(2020) 7364:(2019) 7356:(2011) 7326:(2017) 7318:(2012) 7310:(2009) 7302:(2009) 7294:(2007) 7218:Tensor 7108:Others 7035:Fitbit 7017:Tablet 6716:Picnik 6700:Picasa 6694:Photos 6663:Camera 6603:Chrome 6508:Reader 6474:Offers 6443:Lively 6416:Gemini 6411:Gboard 6279:Others 6162:Apture 6135:Slides 6130:Sheets 6047:Photos 6010:fflick 5927:Primer 5894:Wallet 5879:Urchin 5664:Trends 5652:Travel 5520:Kaltix 5498:Images 5393:Alerts 5378:Search 5320:Spaces 5229:Pinyin 5199:Groups 5162:Gizmo5 5072:Studio 5067:Shorts 5062:Select 4895:Stadia 4888:Songza 4859:Oyster 4768:libvpx 4672:Carbon 4632:Gemini 4599:Others 4585:Pigeon 4568:Matrix 4527:VisBug 4511:Swiffy 4414:Gerrit 4289:Looker 4284:Kaggle 4244:Bitium 4229:Apigee 4217:SageTV 4140:Anvato 4123:AppJet 4097:WebRTC 4009:Matter 3999:gVisor 3922:Charts 3902:ARCore 3818:Things 3805:gLinux 3541:Dunant 3509:Shield 3360:Tables 3293:Awards 3241:Jigsaw 3171:Events 3139:Doodle 3127:Roboto 3108:Fonts 3102:Design 3030:Barges 2908:Former 2797:People 2741:Search 2725:Health 2687:Chrome 2679:Goojje 2553:Google 2437:(2024) 2420:(2024) 2414:(2023) 2412:Gemini 2408:(2022) 2402:(2022) 2396:(2021) 2390:(2018) 2373:(2023) 2371:Gemini 2367:(2022) 2361:(2016) 2307:(2022) 2301:(2017) 2295:(2016) 2267:(2024) 2261:(2023) 2255:(2019) 2249:(2018) 2228:(2023) 2220:(2017) 2201:(2017) 2199:Ke Jie 2195:(2016) 2189:(2015) 2172:(2019) 2170:MuZero 2166:(2017) 2160:(2017) 2154:(2016) 2152:Master 2148:(2015) 2106:Google 1942:  1841:  1809:  1561:GitHub 1479:  1379:  892:T5 1.1 260:-> 245:-> 222:-> 199:-> 68:github 9664:Mamba 9435:SARSA 9399:LLaMA 9394:BLOOM 9379:GPT-J 9369:GPT-4 9364:GPT-3 9359:GPT-2 9354:GPT-1 9317:LaMDA 9149:Keras 8808:spaCy 8453:large 8444:GloVe 7762:Books 7526:Other 7193:OnHub 7126:Clips 7097:Glass 7012:Slate 6987:Pixel 6954:Nexus 6831:Pixel 6826:Pixel 6773:Nexus 6580:Wavii 6550:Store 6206:Sites 6140:Sites 6113:Forms 6078:Tasks 6016:Files 5999:Drive 5796:AdMob 5701:Earth 5647:Tenor 5426:Books 5354:Voice 5283:Orkut 5252:Meebo 5238:Jaiku 5186:Inbox 5175:Gmail 5026:Music 4963:Music 4952:Games 4947:Books 4928:Video 4902:games 4792:WOFF2 4647:LaMDA 4575:Panda 4424:Kythe 4382:Bazel 4336:Tools 4311:Shell 4206:Neatx 4085:Weave 4064:Tango 4053:Shell 3984:Guice 3979:Guava 3963:Gears 3465:Māori 3355:Reply 3230:G-Day 2885:(CFO) 2879:(CEO) 2736:Pixel 2697:Glass 2692:Cloud 2673:China 2633:Brain 2428:Other 2394:LaMDA 2315:Other 2240:Other 1968:arXiv 1789:arXiv 1726:arXiv 1670:arXiv 1638:arXiv 1529:arXiv 1422:arXiv 1315:Llama 1251:T5 XL 1241:UTF-8 1192:10240 1134:Large 1088:Small 901:Model 586:65536 563:16384 528:Large 482:Small 295:Model 234:and 150:.html 144:/2020 9588:Mila 9389:PaLM 9322:Bard 9302:BERT 9285:Text 9264:Sora 8573:Data 8424:BERT 7954:g.co 7949:.dev 7185:Wifi 7170:Nest 6891:Fold 6668:Lens 6612:Apps 6592:WiFi 6491:Poly 6436:Knol 6362:Cast 6335:Body 6145:Vids 6125:Keep 6103:Docs 6072:Sync 5905:Send 5768:Waze 5718:Maps 5684:WDYL 5554:News 5406:Base 5367:Wave 5334:Talk 5258:Meet 5118:Chat 5112:Buzz 5105:Bump 5098:Allo 5021:Kids 4976:Pass 4939:Play 4922:Vevo 4787:WebP 4782:WebM 4677:Dart 4666:Caja 4657:PaLM 4610:BERT 4075:Test 4030:Pack 
3974:Gson 3969:gRPC 3907:APIs 3878:ALTS 3578:Zero 3450:Labs 3367:ATAP 3308:Live 3134:Logo 3117:Noto 2731:Maps 2435:Vids 2406:PaLM 2388:BERT 2305:Gato 1940:ISSN 1839:ISSN 1807:ISSN 1477:ISBN 1377:ISSN 1299:20B 1237:ByT5 1189:4096 1169:5120 1166:2048 1146:2816 1143:1024 1137:770M 1123:2048 1114:220M 1111:Base 1100:1024 592:128 583:1024 560:1024 540:4096 537:1024 531:770M 517:3072 508:220M 505:Base 494:2048 254:CoLA 138:blog 99:Type 74:/t5x 70:.com 65:T5X 9329:NMT 9212:OCR 9207:HWR 9159:JAX 9113:VPU 9108:TPU 9103:IPU 8927:SGD 8605:UBY 6468:Now 6401:Fit 6042:One 5912:Tez 5801:Ads 5744:Pin 5218:IME 5143:Duo 4777:VP9 4763:VP8 4758:VP6 4753:VP3 4739:AV1 4734:APK 4729:AAB 4419:GYP 3994:JAX 3883:AMP 3236:I/O 2622:Ads 1799:doi 1297:UL2 1283:UL2 1273:not 1265:JAX 1261:T5X 1198:64 1183:11B 1180:XXL 1175:32 1152:16 1129:12 1120:768 1097:512 1091:60M 589:128 577:11B 574:11B 569:32 566:128 546:16 523:12 514:768 491:512 485:60M 146:/02 9749:: 6970:10 6901:8a 6886:7a 6876:6a 6866:5a 6856:4a 6846:3a 6814:6P 6809:5X 6645:V8 5077:TV 4917:TV 4682:Go 3894:JS 3859:TV 3824:TV 3800:Go 2627:AI 2044:. 2020:. 1996:. 1966:, 1952:^ 1936:24 1934:. 1930:. 1903:. 1870:^ 1860:. 1835:25 1833:. 1829:. 1805:. 1797:. 1785:10 1783:. 1779:. 1753:. 1724:, 1697:. 1668:, 1636:, 1608:. 1583:. 1559:. 1548:^ 1527:, 1499:. 1471:. 1459:^ 1451:30 1449:. 1445:. 1389:^ 1373:21 1371:. 1367:. 1353:^ 1342:. 1311:T5 1287:T5 1269:T5 1223:T0 1210:T5 1195:64 1186:24 1172:64 1163:24 1160:3B 1157:XL 1149:64 1140:24 1126:64 1117:12 1106:6 1103:64 886:T5 865:. 580:24 557:24 554:3B 551:3B 543:64 534:24 520:64 511:12 500:8 497:64 203:. 8842:e 8835:t 8828:v 8763:) 8486:, 8455:) 8451:( 8081:e 8074:t 8067:v 8008:. 7925:" 7921:" 7680:" 7676:" 7616:" 7612:" 7268:e 7261:t 7254:v 7007:C 6975:9 6896:8 6881:7 6871:6 6861:5 6851:4 6841:3 6836:2 6804:6 6799:5 6794:4 6784:S 2776:" 2772:" 2545:e 2538:t 2531:v 2459:" 2455:" 2087:e 2080:t 2073:v 2054:. 2030:. 2006:. 1970:: 1946:. 1914:. 1845:. 1813:. 1801:: 1791:: 1764:. 1728:: 1707:. 1672:: 1640:: 1619:. 1594:. 1569:. 1531:: 1510:. 1485:. 1430:. 1424:: 1383:. 1094:8 1070:d 1067:a 1064:e 1061:h 1057:n 1033:v 1030:k 1026:d 1002:f 999:f 995:d 971:l 968:e 965:d 962:o 959:m 955:d 931:r 928:e 925:y 922:a 919:l 915:n 851:d 848:a 845:e 842:h 838:n 832:v 829:k 825:d 821:= 816:l 813:e 810:d 807:o 804:m 800:d 772:v 769:k 765:d 739:f 736:f 732:d 707:l 704:e 701:d 698:o 695:m 691:d 666:d 663:a 660:e 657:h 653:n 628:r 625:e 622:y 619:a 616:l 612:n 488:6 464:d 461:a 458:e 455:h 451:n 427:v 424:k 420:d 396:f 393:f 389:d 365:l 362:e 359:d 356:o 353:m 349:d 325:r 322:e 319:y 316:a 313:l 309:n 264:. 249:. 49:)

Index

Original author(s): Google AI
Stable release: github.com/google-research/t5x
Repository: https://github.com/google-research/text-to-text-transfer-transformer
Type: Large language model; Transformer (deep learning architecture)
License: Apache-2.0
Website: blog.research.google/2020/02/exploring-transfer-learning-with-t5.html
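
As an illustration only, a minimal sketch of running a pretrained T5 checkpoint in its text-to-text mode is given below. It assumes the community-maintained Hugging Face transformers port and the public "t5-small" checkpoint; the repositories listed above contain Google's original TensorFlow/JAX (T5X) implementations, not this code.

# Minimal sketch (assumptions: Hugging Face "transformers" port of T5 and the
# public "t5-small" checkpoint; not taken from the official repositories above).
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# T5 casts every task as text-to-text, so the task is named in a text prefix.
inputs = tokenizer("translate English to German: That is good.", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))  # expected: "Das ist gut."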