because everybody can edit Wikipedia), and that the species really exists and is valid. 2) This robot properly adds the very complicated taxonomy of gastropods (there is a very high danger of error when articles are started by a non-expert or inexperienced Wikipedia editor), and it also properly adds the authority of the species (this can also be a problem for a non-expert). 3) It works systematically and efficiently. Also note that gastropods are generally underestimated in coverage, not only on Wikipedia but also in the scientific world. There are fewer project members for larger tasks than in any other wikiproject. In some cases we have no time even to use images from Commons. There are, for example, fewer Wikipedia articles for
Conidae than related categories on Commons(!). Help from a robot would be very welcome. There were some robots in the past that were considered unhelpful and some that were considered helpful. Are you able to learn from these lessons from the past, or will you refuse all (as recommended by The Earwig, who starts the same(!) as articles
. These are unique to each article and are used in the stub to create a reference. Additional text has been added to the lede by the project members to ensure the stub is not a one-liner. It has proper stub templates, a taxobox and categories. Doing this manually is a lot of work, hence the bot request. Regards,
. We, the WikiProject Gastropods members, are starting stubs exactly like this manually. This approval is only a formal check that this bot task will not go against Wikipedia guidelines. The task has already been approved by WikiProject Gastropods. This bot task is fine and can be approved without delay. --
language-independent, so placing it in the
English Knowledge (XXG) (for example) would be inappropriate. Wikispecies also has different software requirements from those of Knowledge (XXG). The project is not intended to reduce the scope of Knowledge (XXG), or take the place of the in-depth biology articles therein.
. I could understand finding and adding individual species articles, if there's something to actually write, but... do we really need thousands of new stubs here that (apparently) consist of nothing more than some genetic information, when there is a specialized wiki that specifically does want those sorts of articles?
while there have been runs of adding information to
Knowledge (XXG) like this in the past, that shouldn't be understood to imply carte blanche approval for continuing to do so. Knowledge (XXG) has grown a lot over the years, to the point where I'm not sure that these automated article additions are that helpful.
This robot can start an article that is much better in content, and more compatible with standards, than any other commonly manually started gastropod-related article. 1) This robot does things that humans usually do not. It uses inline references, which allow everybody to verify easily (that is important,
A couple of quick points: in direct reply to yourself, Ganeshk, The Earwig was making a general comparison in response to the "it's been done before" statement made by Tim1357, above. He did say that he didn't expect this to run into the same issues that Anybot's edits did. The point, though, is that
On the other hand, to be sure that we will check all of the bot-generated articles (especially if
Ganeshbot will be approved for other tasks), you could create some solution that will flag all bot-generated articles that have not been edited by any Wikipedian (or project member?) for a long time since
But yes, the accuracy of all generated species articles will be verified against the related manually created genus article. And every article that is edited by any user who adds an additional reference will be verified in that way. This cannot be done immediately, but the intention is not to
in eight unnecessary edits what
Ganeshbot can easily do in one) even if the robot will not break any Wikipedia policies and guidelines? This start of 650 stubs is really a very small part of the work that needs to be done. Maybe there are some fields on Wikipedia that really do not need any help
Just brainstorming here, one possible way forward is for yourself and the rest of the project to manually create about a dozen of these articles. If they all survive for, say, 6 months, then that will go far in showing "hey, no one seems to have an issue with these articles on
Knowledge (XXG)", if
Let me talk to the project about removing those empty sections. On your second note, the article content has been verified by the project members. The data was pulled from a credible site, WoRMS. There was no new bot written for this task. CSVLoader and AWB are just software tools that facilitate
I'm just passing by here, and I want to make it clear that I don't feel strongly about this either way. I'd just like to say that MZMcBride makes a point. I understand the argument that there are two audiences that the wikis cater to and all, but really, by this point I would question the mass
The primary reason that
Wikispecies is not part of Knowledge (XXG) is that the two reference works serve different purposes and audiences. The needs of a general-purpose, general-audience encyclopedia differ from those of a professional reference work. Furthermore, much of Wikispecies is
As a way forward, can the BAG let me create the first 50 articles? The Gastro project can check and confirm that they are good. Once they are, I can get bot approval to complete the rest. I am hoping that will take care of the concerns here. Please advise. Thanks.
automated stub creation by bots. If over five hundred pages are simple enough and repetitive enough to be created by bots, I'm not entirely sure they should be on
Knowledge (XXG) in the first place. What's the point of having 500+ articles that are essentially the same?
If you think that we should verify that
Ganeshbot will not make an error, then it is the same as with any other bot. (Who verifies whether some bot is adding a proper DOI? Such a bot either works correctly or it does not.) Any error of function will stop this
leave the article in the state of "this species exists", but to expand it. I expect (based on my experience) that the usual expansion will be: adding distribution, adding at least the dimensions of the shells, and adding a wikilink to the author. --
, for example). This information can be verified in the article's reference. Shouldn't it be present in the article's taxobox as well? If not, then Project Gastropods members would have to add this information manually later, I presume.--
that shows the data in wiki syntax. You can check the accuracy of the data using the species links. The bot generated content will be as good as the data that is hosted on WoRMS site (as of the download date). Regards,
Ohms, I understand your concern. There was a lot of pre-work done before this bot approval request. The stubs contain more than just the genetic information. The data for the stubs was retrieved using web services.
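For readers curious what "retrieved using web services" looks like in practice, here is a minimal sketch of pulling a species record from WoRMS. This is an assumption about the general approach, not the bot's actual code: the original 2010 bot used whatever WoRMS service existed at the time, while the REST endpoint and field names below reflect the current public WoRMS Aphia API.

```python
import json
from urllib.parse import quote
from urllib.request import urlopen

def pick_fields(record):
    """Keep only the fields a stub needs from one WoRMS Aphia record."""
    return {
        "name": record.get("scientificname"),
        "authority": record.get("authority"),
        "aphia_id": record.get("AphiaID"),
        "status": record.get("status"),
    }

def fetch_species(name):
    """Query WoRMS for exact matches on a species name (network call)."""
    url = ("https://www.marinespecies.org/rest/AphiaRecordsByName/"
           + quote(name) + "?like=false&marine_only=true")
    with urlopen(url) as resp:
        return [pick_fields(r) for r in json.load(resp)]
```

The download-once design the operator describes (data "as of the download date") would call `fetch_species` for each name in the genus list, save the results to a file, and have the article-creation step read only that file.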
(edit conflict) Yes, yes it does. We could bulk import a lot of things into Knowledge (XXG) (case law, dictionary definitions, images, etc.) but we don't, because we have other projects for such things (
If I may add my comment as a member of WikiProject Gastropods: All the information present in each of these 20 bot-created stubs seems precise, according to the cited reference. The taxonomy follows
). We've had specific issues in the past with bulk creation of articles, and with bulk creation of species articles in particular. This is going to need some broader discussion and consideration. --
Thanks Ohms for looking for a way forward. Similar articles have been created manually and no one had any issue with them. Please browse through articles in the sub-categories of this category,
This task is nowhere close to what AnyBot did. I had clearly mentioned that about 580 stubs will be created (nothing more). The articles that will be created are already listed in the
Ok Ganesh. A few of us in the BAG were concerned that the bot would finish making all these articles and the Wikiproject would be swamped with these new stubs. We want humans to be at least
should not be published. Most of the content is unreferenced, and there's really no point in adding multiple empty section headers with ugly cleanup tags. If and when they get expanded,
Thanks for your comments. The empty sections will be helpful for the person who expands the article next. How about a compromise? Two empty sections instead of three. See
to show how the finished article will look. It was checked and verified by all the Gastropod project members. The page also served as a central place for discussions.
create 100 articles per month. This gives your wikiproject some time to look over the stubs. Hopefully this is acceptable for you, and if not, please leave a message on
automation. CSVLoader reads each line from the CSV file and replaces the column names in the stub template with data from the file, and creates the article. Regards,
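The substitution step described above can be sketched as follows. The template text, the `$COLUMN$` placeholder syntax, and the column names are illustrative assumptions; CSVLoader's real stub template lived on-wiki and its exact placeholder format is not reproduced here.

```python
import csv
import io

# Hypothetical stub template with $COLUMN$ placeholders (an assumption,
# not CSVLoader's actual on-wiki template).
TEMPLATE = (
    "'''''Conus $SPECIES$''''' is a species of sea snail in the genus "
    "''Conus'', described by $AUTHORITY$.<ref>$REFERENCE$</ref>\n"
)

def build_article(template, row):
    """Replace each $COLUMN$ placeholder with that column's value for one CSV row."""
    text = template
    for column, value in row.items():
        text = text.replace("$" + column + "$", value)
    return text

# One illustrative data row, standing in for the real Conus data file.
csv_file = io.StringIO(
    "SPECIES,AUTHORITY,REFERENCE\n"
    'abbas,"Hwass in Bruguière, 1792",WoRMS record\n'
)
articles = [build_article(TEMPLATE, row) for row in csv.DictReader(csv_file)]
```

One pass over the CSV file thus yields one finished wikitext page per data row, which matches the "one edit per article" behaviour discussed elsewhere on this page.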
similar to this anymore, but marine organisms are among those that, without bots, cannot keep up with professional encyclopedias and stay updated. --
the sections can be added. Moreover, I'm always a bit wary of bot-created content, and I'd like to see some evidence that it will be verified and accurate. –
: The threshold for inclusion in Knowledge (XXG) is verifiability, not truth. Despite this, there was an effort, with results, in identifying errors in
. Not that this will be anything like that, of course, but when you say "this has been done before", you don't settle my concerns about this task. I
Here I am trying to get this task approved, and another user picks up from where I left off and creates 85 new articles. Please check
The CSVLoader plugin will use the approved article template and the CSV data to create the finished articles. A reference to the
Approved for trial (20 new pages). Please provide a link to the relevant contributions and/or diffs when the trial is complete.
Daniel, I was able to pull the synonyms using web services. I ran the bot again to update the articles with the synonyms (See
I am quoting from the meta page. The final sentence is important: we still need an encyclopedia for the general audience.
(first column). The point of creating these articles is so that the Gastro team can come in and expand them. Regards,
). I hope the Gastro team has some explanation for this. I will ask the Gastro team to verify the results. Regards,
I dunno... I wouldn't trust a "professional" encyclopedia if it were written by unmonitored automated accounts... –
Otherwise, the article structure seems fine in practice, although I have yet to verify the actual data. Best. —
Sure, but how about 20. Additionally, I want to see some input from members of the Gastropod wiki-project.
Whether or not the information presented is as factually correct as a human-generated piece of content. –
as it should. One thing comes to mind though... Several species in these articles have synonyms (
for accuracy. They found all of them correct. Can the BAG approve this task, please? Thanks.
is looking to use automation to create about 600 new species stub articles under the genus
to check the WoRMS site and pull information like Authority and Aphia ID and create the
would like to use automation to create about 600 new species articles under the genus,
The bot will only use data that was already downloaded from WoRMS. I have created,
. CSVLoader allows creating articles using data stored in delimited text files.
Way to go, Ganesh! Now everything seems to be in order, as far as I can tell.--
. I have requested that these new articles be deleted quickly. Regards,
I think it sounds right in the intro. Let us leave it at that. Thanks.
The category and stub template are in the wrong place. Should be like
, then no. We will not verify it immediately after article creation.
MZMcBride, Meta has a whole page devoted to answering your question,
If you think that we should verify that there is not a mistake in
The above discussion is preserved as an archive of the debate.
To request review of this BRFA, please start a new section at
Is every article going to be manually verified for accuracy? –
To request review of this BRFA, please start a new section at
Knowledge (XXG):Articles for deletion/Anybot's algae articles
Doesn't mean we can't use the information. Looks pretty cool!
). Can you please give the articles one more check? Thanks.
. This task has been verified and approved by the project.
The "predatory" sentence can be in the Ecology section, like this
at the new articles. With that in mind, I'll give you
Knowledge (XXG):Stub#How to mark an article as a stub
Thanks Tim. I have completed the trial. Here are the
I have fixed the category and stub structure on the
. Newer articles will have these changes. Regards,
Approved Knowledge (XXG) bot requests for approval
Two project members have checked and verified the
Links to relevant discussions (where appropriate):
shows up fine when I queried using web services,
. The only worrisome thing that I found was that
The following discussion is an archived debate.
Excellent. I think there should be <br/>
and its incompatibilities with other databases.
Please, please, please, please... remove the
tags and empty sections. They look horrible.
But that is not the case here. It is monitored. --
. But when searched online, it shows up as
The data file for the task is available at
Thanks. Okay. I will add the <br/>
. The bot skipped two existing articles,
WikiProject Gastropods#Conus sample stub
Two sample articles are in my sandbox:
Yes, but it has also been done before
User:Ganeshk/sandbox/Conus_abbreviatus
User:Ganeshk/sandbox/Conus anabathrum
The articles have been deleted now.
will be placed in each new article.
Estimated number of pages affected:
addition of most sets of articles
Screenshot of the CSVLoader plugin
, but with a limitation. You can
User:Anna Frodesiak/Green sandbox
The result of the discussion was
World Register of Marine Species
World Register of Marine Species
. I still have nightmares about
World Register of Marine Species
User:Anna Frodesiak/Gold sandbox
User:Anna Frodesiak/Gold sandbox
World Register of Marine Species
User:Ganeshbot/Conus data table
A stub template was created at
Automatic or Manually assisted:
Knowledge (XXG):Verifiability
What "accuracy" do you mean?
you see what I'm getting at.
First of all, articles like
on the next runs. Regards,
Category:Gastropod families
04:12, 25 March 2010 (UTC)
15:38, 21 March 2010 (UTC)
13:39, 21 March 2010 (UTC)
12:50, 21 March 2010 (UTC)
, but both ways are OK. --
04:06, 21 March 2010 (UTC)
03:48, 21 March 2010 (UTC)
13:39, 21 March 2010 (UTC)
12:50, 21 March 2010 (UTC)
00:06, 21 March 2010 (UTC)
20:41, 20 March 2010 (UTC)
06:09, 20 March 2010 (UTC)
18:19, 20 March 2010 (UTC)
04:59, 20 March 2010 (UTC)
; you can see what I mean
04:26, 20 March 2010 (UTC)
in quarantaine, unverified
03:27, 20 March 2010 (UTC)
03:27, 20 March 2010 (UTC)
01:55, 19 March 2010 (UTC)
17:12, 18 March 2010 (UTC)
17:12, 18 March 2010 (UTC)
03:11, 20 March 2010 (UTC)
00:48, 19 March 2010 (UTC)
15:39, 18 March 2010 (UTC)
14:49, 18 March 2010 (UTC)
14:17, 18 March 2010 (UTC)
12:17, 18 March 2010 (UTC)
03:34, 18 March 2010 (UTC)
03:21, 18 March 2010 (UTC)
01:45, 18 March 2010 (UTC)
01:33, 18 March 2010 (UTC)
00:58, 18 March 2010 (UTC)
22:06, 17 March 2010 (UTC)
19:31, 17 March 2010 (UTC)
12:47, 17 March 2010 (UTC)
10:30, 17 March 2010 (UTC)
06:01, 17 March 2010 (UTC)
03:17, 17 March 2010 (UTC)
03:14, 17 March 2010 (UTC)
03:13, 17 March 2010 (UTC)
03:11, 17 March 2010 (UTC)
03:07, 17 March 2010 (UTC)
Bouchet & Rocroi 2005
Well, this kind of thing
. The stubs will use the
Please do not modify it.
580 new pages (approx.)
Please do not modify it.
with disastrous results
This task will use the
instead of <br>
for example. Regards,
Yes, we need them for
Already has a bot flag
Requests for approval
had been done before
into Knowledge (XXG)
Programming language
Knowledge (XXG):Bots
devoted to this? --
Exclusion compliant
Category:Acmaeidae
Function overview:
comment added by
20 trial articles
further expansion
Conus abbreviatus
Gastropod project
Function details:
Gastropod project
Daniel Cavallari
Daniel Cavallari
Conus amphiurgus
Conus alconnelli
Conus anabathrum
its creation. --
Trial complete.
Conus ammiralis
Conus africanus
20 new articles
Wikispecies FAQ
Isn't there an
AutoWikiBrowser
Edit period(s):
AutoWikiBrowser
expand section
as reference.
Approved BRFAs
One time run
ANI incident
. Like this
for example.
Two things:
Conus amadis
Juliancolton
Juliancolton
Juliancolton
Juliancolton
plugin with
—Preceding
. Regards,
entire wiki
Conus abbas
Source code
Ganeshbot 4
. Thanks!
Next steps
screenshot
AWB plugin
I wrote a
Wiktionary
Wikisource
Discussion
available:
Automatic
rights log
page moves
Approved.
New ideas
data page
data file
Regards,
MZMcBride
MZMcBride
CSVLoader
CSVLoader
Operator:
block log
Category
contribs
Tim1357
unsigned
glancing
template
Contribs
Contribs
contribs
contribs
Approved
WT:BRFA
Ganeshk
Ganeshk
Ganeshk
Ganeshk
Ganeshk
Ganeshk
Ganeshk
Ganeshk
Tim1357
Tim1357
Ganeshk
Ganeshk
Ganeshk
Ganeshk
Tim1357
Ganeshk
Ganeshk
Ganeshk
Commons
Ganeshk
Tim1357
Ganeshk
WT:BRFA
WT:BAG
Snek01
Snek01
Earwig
. See
Snek01
Snek01
Snek01
Snek01
V = IR
Earwig
oppose
Snek01
V = IR
Conus
Conus
(Y/N)
(Y/N)
Conus
count
talk
only
talk
talk
talk
talk
talk
talk
talk
. --
talk
talk
talk
talk
The
here
this
talk
and
talk
talk
talk
talk
talk
bot.
talk
talk
talk
talk
Talk
talk
The
talk
talk
then
talk
talk
Talk
talk
talk
talk
talk
talk
N/A
Yes
and
(s):
talk
flag
logs
talk
BRFA
ANI
in
all
.
SUL