Knowledge (XXG)

:Bot requests/Archive 6 - Knowledge (XXG)


3127:
that have articles now, and if you look at either of the 2 lists I mentioned for any of the states, you can see only about 10-20% of high schools have articles. It just seems as if there should be more high schools represented, and as it stands now, some are extremely well represented (have long articles), some have a little info, and some have absolutely no info. The types of articles that high schools fall into are different than other types of articles in that many people attended (and therefore theoretically have intimate knowledge of) said high schools, plus the types of people that will be searching for high schools will be much more likely to expand existing articles than even know how to go about creating a brand new article. This may not be the greatest idea, I admit, but the reasons you gave don't seem to apply to this request. If the concept was misunderstood, I can attempt to restate it in a different way. (
2180:
few random 5-10 word phrases from the article, no punctuation, and search on Google, Altavista, etc. for the exact phrase. The bot would make a list of any positive results. Of course it would have to ignore wikipedia mirrors. The odds of it listing a copyvio of something that is actually PD/GPL are low; in my experience, people are more interested in copying and pasting press releases, corporate bios, etc. than Project Gutenberg kind of stuff. But even still... that's where the human factor comes in.
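The phrase-sampling and mirror-filtering steps described above are easy to sketch. Below is a rough, hypothetical Python illustration (not any existing bot): the search backend is left out entirely, and `MIRROR_HOSTS` is a placeholder for whatever mirror list the operator maintains.

```python
import random
import re
from urllib.parse import urlparse

def sample_phrases(text, n_phrases=3, min_words=5, max_words=10, seed=None):
    """Pick a few random 5-10 word phrases from article text,
    stripped of punctuation, suitable for an exact-phrase search."""
    rng = random.Random(seed)
    words = re.sub(r"[^\w\s]", " ", text).split()
    phrases = []
    for _ in range(n_phrases):
        if len(words) < min_words:
            break
        size = rng.randint(min_words, min(max_words, len(words)))
        start = rng.randint(0, len(words) - size)
        phrases.append(" ".join(words[start:start + size]))
    return phrases

# Hypothetical mirror list; a real run would need a maintained one.
MIRROR_HOSTS = {"en.wikipedia.org", "answers.com"}

def filter_mirrors(result_urls):
    """Drop hits on known Wikipedia mirrors before listing them for review."""
    return [u for u in result_urls if urlparse(u).netloc not in MIRROR_HOSTS]
```

The bot would then write the surviving URLs to a report page for a human to check, never touching the articles themselves.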
1450:?) and editing the list of duplications would probably have to be limited to admins. By having this bot, a person could add the lowest level category that applies and the category would also end up in the higher level categories. The bot would have to look at each article in the category and see if the higher level categorization exists, if it does not, the categorization would be added. For categories of people, the piping should be copied so that the article is alphabetized correctly. 161:
verifiably, is what the situation was six years ago; c) it's simply bad grammar to say, "As of the census of 2000, there are 1,270 people... residing in the town"; d) these articles are currently inconsistent, as the intro is already in the past tense. Note that I am not suggesting that the "geography" section (also based on the census bureau figures) should be changed to past tense, as the area of the towns and the portion covered by water is unlikely to have changed. --
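A sketch of the section-limited tense change proposed here, assuming standard `==Demographics==` headings; this is an illustration of the approach, not a tested bot, and a real run would still want human review of each diff.

```python
import re

def past_tense_demographics(wikitext):
    """Switch is/are/have to past tense, but only inside the Demographics
    section; the geography figures stay in the present tense."""
    def fix(match):
        section = match.group(0)
        for old, new in ((r"\bis\b", "was"), (r"\bare\b", "were"), (r"\bhave\b", "had")):
            section = re.sub(old, new, section)
        return section
    # the section runs from its heading to the next heading or end of page
    return re.sub(r"==\s*Demographics\s*==.*?(?=\n==[^=]|\Z)", fix, wikitext, flags=re.S)
```

Note the word-boundary anchors: without them, words like "island" or "this" would be mangled.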
3086: The beginning article could be a stub that said only: "XXXXXXXXXX High School is a secondary education school located in CITY, STATE.", or something of that nature. It is my opinion that a lot of people that attend a high school, or alumni of a high school, would be much more apt to edit an existing article on their high school than create a brand new article on their high school (much more so than other articles on wikipedia). (

3165: (but far from perfect) degree of accuracy, and enumerate them in the form of a bulleted list on the stub's talk page, including also a link to the page(s) from which the information was derived. A human could follow the links, verify the factoids, hopefully also locate additional information while visiting the various sites (some of them might be obscure news articles, for example), and then add the information to the article.

919:
probably have no idea where the dot is in relation to the USA as a whole. In general, the system for American towns/cities/etc seems a bit American-centric. Maybe I'm off base. I wouldn't mind a discussion, as I feel I can learn quite a bit more about British towns in general than the American counterparts by the general info displayed.
3078:"Knowledge (XXG) Project Missing encyclopedic articles/High Schools/US/XXXXX", again XXXXX being the name of the state. The bot could (for instance) make an article with the name of the school and then the name of the city and state in parentheses (since a high school is often named the same as other high schools, ex: 638:
administrators, I expected to find an actual bot to help them out, but I can't find it. If I'm blind, could someone point me in the right direction? If not, would someone be willing and able to write such a bot? The task seems pretty straightforward; it's just an awful lot of clicking when done by hand.
2814:
Once you cull a list of articles (which would be the hard part) it would take a simple regex and ignore to do that. The regex would have it be put at the top of the articles and the ignore would prevent double placement on articles that already have it. An alternative to doing an ignore would be to
2584:
I've run through the entire links-here list and got no hits for the image using the find-replace you specified, so somebody must have beaten me to it, though I have no idea why anything is registering as linking to the image when I cannot find any links that actually point to it (other than this page and
2338:
I don't mean every word on the page. All terms would be defined in a separate page - a dictionary page. Each term on this page would have an associated code - a simple "1" or "0", for example. This code could be listed anywhere in the definition, hidden or not. What this code would do is tell the
1123:
would it be possible for a bot to remove the data and insert it perhaps as a section just before 'see also' with a heading such as 'corporate branding'? (The first would be very useful, the second is still subject to the discussion outcome - just trying to get a feel for what can be done) Many thanks
889:
Does anyone have a robot that they could run which could change all occurrences of "id=toc" into "class=toccolours"? They both look the same to most folk, but id=toc hides the division from folk who have preferences for "contents turned off". And 99% of these are not tables of contents but are related
3268:
placing the "featured article star" on the inbound "in other languages" links from other language wikipedias? Currently it is necessary to manually go through each linked language and add {{Link FA|en}} to the code, surely this could be done with a bot or something. Furthermore, the stars need to be
3126:
Perhaps I do not understand, 1) How does someone using several IPs to edit have something to do with this topic? and 2) How would the articles be impossible to expand? I have created articles in the past as stubs and they have been expanded. I was just thinking that there are only a few high schools
2799:
etc. (see template page for specifics). On those pages it can be seen in use. This template could be used as the standard format for navigating through such articles, however it would be tedious to add them by hand. Hence I suggest a semi-automatic bot to add this template appropriately in the dates
2395:
I've noticed by browsing some of the Knowledge (XXG) Chinese articles that they will often link to an English page which has no corresponding link back to the Chinese version, and even one or two Chinese pages with no link to the relevant English article. I came across this actually several times in
2179:
I find a lot of copyvios that have been dormant for months... and it seems like there are probably tons out there, if I can just check a few random articles and find one pretty fast. It seems like a bot with an organized approach would uncover thousands, and with an easy methodology... just select a
1445:
I am wondering if a bot could be created to run frequently (once a day?) which would go through a list of categories that should be duplicated in other categories and check to see if the duplications exist. If they do not, they would be added. I suspect that there will need to be a page created to
3077:
I don't know if this would be feasible, but I thought I would put it out there for discussion. It might be helpful (possible?) to create a bot that creates articles for all high schools. Just about every state has an article named "List of High Schools in XXXXX", XXXXX being the state, as well as a
2399:
It would be helpful if someone could create a bot to scan pages and follow the links to different language versions, and make sure that all of the different translations are linked up. (i.e. that if a page exists in 15 different languages on any given topic, that each of those 15 versions has links
2298:
I am able to generate a list of all the terms, and also to create definitions (manually), but I need to go back through the wiki site and create links from those terms to the dictionary. Further, some of the terms are too common to automatically replace using a bot, so they need to be removed from
2187:
There are over a million articles now so it would be a lot of work and time... but afterwards it could perhaps monitor new articles (though that might be more difficult to implement). Also since it's not live, I'm not even sure it would need to be flagged as a bot... all it would do is upload a list
1403:
Prometheuspan 00:33, 17 March 2006 (UTC)huh. You'd think they would have like an RSF or some such thing set up to link a new wiki to its parent networks like that. I have asked at the bot request wikipedia zone. Is there somebody else or someplace else to go look? Prometheuspan 00:33, 17 March 2006
1294:
This suggestion is about a tool and not a bot, but I didn't know where to put it. I'm suggesting a tool that searches through a page's history and lists all the images that have been used, even if they were removed. This way, we could retrieve images that were replaced by better ones in the article,
1369:
is there a bot that can perform search functions to link Knowledge (XXG) articles that are relevant to a Wikibook? Also, and, more generally, Wikicities, other areas of WikiBooks, Wiktionary, (And, I know, getting less likely the heuristic would be able to discern what was relevant, but i'd rather
798:
All namespaces. I created the template to be used on all namespaces except article, but since interwiki redirects shouldn't be done on articles, the category associated with the template will allow us to find and fix them. Also, the pseudo-namespace WP: (which is also in the article namespace) has
1761:
Several of AllyUnion's bots appear to have gone offline several days ago. It's only when the automated tasks that you are used to being done don't get done that you realize how much you depend on a bot. And this is currently the case. From AllyUnion's user page, it appears that he is mostly on
1122:
there is an emerging consensus that the slogan field should be removed from the infobox, but we want to hang on to the data in the field. There are two things which you might be able to help us with, firstly could a bot create a list of the pages which contain infoboxes with slogans, and secondly
3410:
I think that when a page or more of text is deleted from a single article, a bot should undo the change. Since the standards of information that are put into the article are fairly high, anyone deleting a lot of this information is likely a vandal. For someone who is actually doing work, the bot
2746:
Obviously the titles are fictitious, but the scenario is real. Each of these two links would then be created as a redirect to the article about the album on which the song was released. I'm thinking the above should only apply in the case of unrelated songs. Cover versions should, in my opinion,
2191:
Anyway, I'm not a programmer... so I have no idea how hard this would be to implement. But given that it's not live, it could presumably be written in any language, up to Visual Basic. I've been thinking about this for a while though, and I think it would make a very positive impact on Knowledge
1553:
This is a common response that I've heard for about a year and a half. I am not convinced that this will ever happen, and I'm not sure it needs to happen. Usually, the higher level categories only have subcategories. Having them populated adds the flexibility of seeing the larger sets and the
1437:
it was necessary to break large categories into smaller subcategories, and there is a value in having these smaller categories. However, categories also serve as the master index of subjects and it is often frustrating to have to look in several subcategories to browse through the articles in a
3437:
There seem to be about 145 articles (Google search terms USS + Splashed) on US Navy ships of the Second World War that use the terms "splash", "splashing" or "splashed" as euphemisms for the shooting down by US Forces of aircraft flown specifically by Japanese pilots, or crashes involving such
2619:
on them - which is what Mdd4696 wanted replaced. Keep in mind that the "Whatlinkshere" doesn't work for image uses, and also any capitalisation or underscore issues your regex might have. Also keep in mind that the flag really doesn't need to be substituted, seeing the red image links it's that
2183:
The bot would just create a simple list of possible copyvios (with URLs), so it would be 100% non-invasive... humans (me, for example) would go through the list and handle as appropriate. The list could be stored in the bots userspace or wherever... I imagine it wouldn't be hard to drum up some
918:
It might be nice if American cities/towns were displayed as nicely as British towns are. E.g. on the right side are all the vital/geo stats, and the article explains the town in question. American towns currently get a red dot on a map of the state they are in, but the non-American readers
2105:
Agreed, it would be too hard to perform automatically. I guess I wasn't considering its implementation as a stand-alone bot but perhaps as an add-on to an existing spelling-type bot that editors could use when browsing articles. I did that exact Google search and others before putting this note
3168:
If somebody wanted to run a bot like that, and a critical mass of other people were willing to follow up on each result posting, this idea could be a successful operation, improve Knowledge (XXG)'s coverage of non-notable schools, make the existing stubs worth keeping, and help reduce the
160:
describing what the situation was six years ago, at the time of the last census. They should be changed because a) except for the very smallest of communities, these figures are unlikely to still be accurate; b) even where the figures haven't changed, we don't know that - all that we know,
1874:
I have been trying to clean up the cocktails articles, I have tagged about 90 articles for "move to wikibooks". Anyone have a bot that could transwiki them? They all have cocktail recipes in them, the majority are nothing but recipe. They'd need to end up in the wikibook Bartending.
637:
Since fy: has been getting a higher load of real vandalism lately (as opposed to the occasional graffiti-editor), I'm looking for a way to revert the contributions of a specified anonymous user in a somewhat automated way. As the bot bit does have a function in this process, as done by
1484:
What do you mean by the AI? Are you talking about the piping? The bot can look at the categorization for the subcategory and use the same piping when adding the categorization in the parent category. As a test, would it be possible to duplicate the categorization of the articles in
2143:"(Note: The U.S. Census Bureau counts township populations in the Connecticut Western Reserve as distinct from any municipalities located within the township. For populations of any municipalities within the township, please read the corresponding articles for those municipalities.)" 2080:
I can envision the development of a bot that searches for phrases such as "is a great", "is a fantastic", "is a terrific", "is an awful" ..etc. that could indicate strong POV within the article text. If the phrase appears within quoted text, i.e. as dialog, then it would be
1099:
I don't think it's a good idea for a bot to count votes based on human edits. If people use a different format, they accidentally (or purposely) sign twice, etc etc... What might be able to be done is that the winning articlename be put in a protected page, and the bot read
3140:
It appears Knowledge (XXG) already has around 50,000 high school related pages, and a significant number of them are starved of content. I have a more useful suggestion, but it would require more effort to set up. A bot could scan the categories populated by the various
124:
It's pretty straightforward to do - within the demographics section, replace "is" with "was" (10 instances); replace "are" with "were" (12 instances); replace "have" with "had" (4 instances). I've been doing this manually when I come across them (see, for example,
2091:
Barring exceptionally brilliant AI, the bot would not be able to conclude from the context whether the statement is truly POV or not. It would still have to be reviewed by a human, and you can already achieve this functionality by doing a google search for
458:
I've been removing referral IDs from outgoing links I've been able to find every now and then. I frankly don't like the idea that someone could make money from Knowledge (XXG) by sneaking these links into places where they even could be considered legit.
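The referral-stripping described here can be sketched with a single pattern. This is only an illustration under assumptions: it uses the Amazon product-URL shape seen in the example elsewhere on this page (`/gp/product/ASIN` followed by session and referral path segments), and a real bot would need patterns per site.

```python
import re

def strip_amazon_referral(url):
    """Keep only the stable /gp/product/ASIN part of an Amazon link,
    dropping the session and ref= components that can carry referral IDs.
    The URL shape is an assumption based on links found in articles."""
    m = re.match(r"(https?://www\.amazon\.com/gp/product/[A-Z0-9]{10})", url)
    return m.group(1) if m else url
```

Non-Amazon links are returned untouched, so the function is safe to run over every external link a page contains.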
1392:
to the appropriate section of that wikibook? This may need to be written for each book separately, but of particular interest to me would be the Jewish/Christian Bible and Islamic Qur'an. If this would interest anyone, please contact me on my talk page!
2216:
I've found that a simple "-wikipedia" (or equivalent depending on search engine) in the search cuts a lot of them out... to the point where you tend to just be left with copyvios, if there are any. Another option is whitelisting the domains listed at
1472:
Well, if you just want the bot to add a category (the higher-up one) to every page in a list, it's trivial; any pywikipedia bot (including Tawkerbot) can do it. If you want the AI, that's getting into fuzzy logic and might be a little trickier --
613:
OK I've finished with all the Amazon URLs I could find, and have moved on to allmusic.com. These are rather impressive; they can have a 10 (or so) character uid= component, along with a 128-character token=. There's about 3600 of them in enwiki.
1542:
It seems to me that a better solution would be to get the MediaWiki software to display all subcat articles of a particular category. If this feature is introduced in the future, carrying out the category population with a bot would have been a
2117:
That is going to be a nightmare to implement reliably; there is no way it would be autorevert like Tawkerbot2, it would have to compile lists and post them to a page somewhere. It's food for thought, I'll throw it out and we shall see --
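The list-and-post approach discussed in this thread could look something like the sketch below. The phrase list comes from the proposal above; the quote handling is a deliberately crude assumption (strip anything inside double quotes), since the point is to generate a report for humans, not to edit anything.

```python
import re

POV_PHRASES = ["is a great", "is a fantastic", "is a terrific", "is an awful"]

def pov_hits(title, text):
    """Report-only check: list (title, phrase) pairs for matches that are
    not inside double quotes, for a human to review on a posted page."""
    hits = []
    # crude quote stripping; good enough for a first-pass report
    unquoted = re.sub(r'"[^"]*"', '""', text)
    for phrase in POV_PHRASES:
        if phrase in unquoted:
            hits.append((title, phrase))
    return hits
```

Anything returned here would go on a report page; as noted above, auto-reverting on this signal would be far too error-prone.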
764:
some time ago, and it seems to have been well received. It is supposed to be used instead of interwiki redirects, which do not work. However, it's hard to find the interwiki redirects without a bot. I'd like to ask for a bot to convert
3485:
I also oppose this per freakofnurture due to POV issues; also, even if a consensus is reached regarding this, it's a job that would be better served being done by hand, where a person can review each edit for accuracy, rather than by a bot.
1012:
I've added it to my bot's queue; give it a few days and I should be able to run the job. I think I can run it fairly quickly as it's only 100 pages (12 min on the bot's clock). I might just use AWB because it's faster for the small jobs.
660:
is getting out of control, with 1.3% of Knowledge (XXG) currently tagged for cleanup. In order to speed the cleanup process, I propose a janitor bot to move "cleanup" pages that belong in other maintenance departments elsewhere.
1976:
We have lots of wars and we name them XX Years' War... which is close to XX Years War. I think the apostrophe is the more common way to do it... and the proper... but they are both used in some settings. Should this be bot-ted?
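The fix itself is a one-line substitution; a sketch of what such a bot would apply to each title (moving pages and fixing links would be the real work):

```python
import re

def fix_years_war(title):
    """Insert the missing apostrophe in titles like "Thirty Years War".
    Only the bare "Years War" form is touched; "Years' War" already has
    the apostrophe and so never matches."""
    return re.sub(r"\bYears War\b", "Years' War", title)
```

Because the apostrophised form contains no bare "Years War" substring, running the function twice is harmless.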
556:
I've updated my scripts and I'm working my way through the Amazon links I've found. Hopefully it'll only take a few days. If you do find any other prominent sites with referrals, I'd be interested to hear about them. Cheers,
2294:
I am trying to find out if there is a bot created already that establishes a dictionary of terms found in a wiki site. Currently, there is no listing of definitions, and the terms end up being fairly convoluted at times.
3031:
I just wanna make my page or article "Don't Wanna Lose You" so that it gets opened when someone writes "DON'T WANNA LOSE YOU", "Don't Wanna Lose You", or "don't wanna lose you". I mean with different letters to the letters I've
2403:
Aside from just making it easier to find content in multiple languages, this may also encourage users to contribute in more than one language if they know the article exists in a second language they are familiar with.
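The consistency check behind this request is a small graph-closure computation. A hedged sketch, assuming the bot has already scraped each language version's interlanguage links into a mapping (language codes here are just examples):

```python
def missing_interwiki(links):
    """links maps each language edition of an article to the set of other
    editions it currently points at. Returns, per language, the links a
    bot would add so that every version points at every other one."""
    # collect the full set of known versions, reachable from any direction
    all_versions = set(links)
    for targets in links.values():
        all_versions |= targets
    missing = {}
    for lang in all_versions:
        have = links.get(lang, set())
        need = all_versions - {lang} - have
        if need:
            missing[lang] = need
    return missing
```

A real bot would also need to handle conflicting links (two different articles claiming to be the same topic), which is why interwiki bots usually ask a human in that case.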
2750:
In the event that the title does refer only to a song, and equally notable versions of the same song have been released by more than one artist, it should probably be created as a distinct article explaining this. —
703:, More info will be posted on the bot page. The bot can do that now, but is not completely tested and some of the code is lacking. :-). P.S. the Gnome bot is a C++/CLI bot. If interested contact me on my talk page :-) 2747:
redirect to the original release, or be combined on the same line of the disambiguation page, if "Foo at Tiffany's" also refers to something else, such as a film. This would most likely require manual intervention.
1728:
However, it's a lot of work to do this. Not all of these tags can be automated, but it seems to me that at least a couple could be. For instance, #4 might be. And it would be very useful if we could automate #6.
2230:
If it's a legit M/F it would have a GFDL-compliant notice and you'd see content from Knowledge (XXG) in it. That might be our saviour, though this bot would just list on a page, no way would I want it auto blanking --
1255:
Currently the dinosaur pages on WP are in bad shape. There are several hundred categorized dinosaur stubs that could use an infobox, but manually adding them might take some time. Can't a bot do all that work
502:
That's a good idea. I did a quick search over the enwiki dump from 20060125 and there are about 1500 amazon links with /ref in them. Note that you can reduce the URLs even further. Your example can be reduced
2265:
Yeah, the whole point is doing a comprehensive job of pointing human copyvio hunters to all the probable needles in the haystack, so to speak. A bot shouldn't actually directly do anything with the articles.
3455:
I think there is a general consensus to avoid using bots for POV related issues. Even if the code being used had a 0% rate of error (which seems improbable in this case), I'd still recommend manual edits. —
2206:
We were talking about this idea for Tawkerbot2 (as another feature) but we've run into one big snag. There are thousands of WP mirrors out there and every one of them would screw up automated detection --
1765:
So the next question becomes, how long do we wait until the bots are declared out of service, and how then can we get some other bots to pick up the duties. Specific bots that appear to be down include:
1456:
I am just wondering if this is possible. There would have to be quite a bit of discussion about whether this should happen and how it will happen. First I want to know what is possible. Thanks. --
1725:
So as you can see, we use between 4 and 6 tags for each recording. They all serve a good purpose: they promote the project, help organize our work, and make sure that people can find our recordings.
1381:
I will look that up. As a side note, I had envisioned linking to as many other wikibooks as were relevant, and to as many wikipedia articles as possible. Prometheuspan 22:26, 28 February 2006 (UTC)
682:
does not show the capability to determine percentage of wiki links, but this is probably not a difficult task, as word counting and a repeated regexp search for /]/ should be all that's required.
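The word-counting and wikilink-counting step just described can be sketched as follows; the 0.1% threshold is the MINWIKILINK value proposed above, and the naive word split is an assumption (a real implementation would first strip markup).

```python
import re

MINWIKILINK = 0.001  # the proposed 0.1%

def wikilink_fraction(text):
    """Fraction of words that sit inside [[...]] wikilinks, used to decide
    whether {{cleanup}} should become {{wikify}}."""
    total = len(text.split())
    if total == 0:
        return 0.0
    linked = sum(len(m.split()) for m in re.findall(r"\[\[([^\]]*)\]\]", text))
    return linked / total

def needs_wikify(text):
    return wikilink_fraction(text) < MINWIKILINK
```

Counting closing brackets alone (the `/]/` search mentioned above) would work too, but counting whole `[[...]]` spans avoids miscounting stray brackets.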
728:
has been deemed capable of doing the task... It is currently undergoing modifications to be able to perform the task. (Bot was in existence before I mentioned it) Now Alba and I are working on it.
2730:
Note to whoever undertakes this task: it would be advisable to create disambiguation pages in many cases. Thus, if the bot detects that the page it is about to create already exists as an {{
1593:
that lists new articles that have been recorded. That way, project members and casual listeners can find our new content easily. It would be great if we had a way to automate this, to save
3411:
should send them their alterations in its response (so if it's valid they don't have to retype it) and give a link to a human editor they can appeal to if their editing wasn't vandalism.
3161:
Further, with a little bit of fuzzy logic, it may be possible to ascertain some vital statistics (e.g. enrollment, year of establishment, mascot, name of current principal...) with a
2955:
What do you mean by "not formatted correctly"? if you're talking about articles with bad titles, you can just manually specify the second parameter rather than using subst:PAGENAME.--
527:
Someone should get on the job of finding the top sites with referral programs and make an algorithm to remove those as well, but I guess Amazon would be the major culprit, though.
3442:. It could be argued that this terminology makes light of the deaths of the young men involved. If this argument is accepted, then is this something a bot could be used to fix? 1823:
NekoDaemon being out of service is what brought this all to my attention, as CFD is one of my normal home playgrounds. But AFD bot appears to have an even more critical role. -
1510:
By "AI" he means that the bot will run on its own with artificial intelligence. A pywikipediabot can't run on its own, as the human must tell it what to do, where to do it, etc.
3101: Horrible idea, I'm afraid. "Articles" (sub-stubs that is), especially those which are impossible to expand and impermissible to delete, should be avoided at all costs. — 2302:
Please note that this is for a MediaWiki site. Is there a way that MediaWiki can be set up to do this? (I have only glanced at the software docs, as I am not the admin).
819:
appears to be populated by all the stock exchanges, most of which fall into the geographical categories of ...in Europe, ...in North America, ...in Asia, etc. According to
2464:
take its place (along with some extra functions), but neither of their operators have been around recently. What's needed is three edits a day, shortly after 0000 UTC:
2992:
It seems like the problem was just in April. Is it only me, or does the page May 4, 2005 give a terrible typo in the apache? (SpecialPage.ohp instead of SpecialPage.php).
2339:
bot, "Ok, this term is something to go through the wiki site and make into a link back here." or else would say, "Ok, don't make links of this term on the wiki site."
931:
needs categorising better, but it's kind of a bit hard to do alone manually. I'm wondering if a bot could do it better. Here's what essentially needs to be done:
674:
If less than MINWIKILINK % of the text contains hyperlinks within Knowledge (XXG), replace {{cleanup}} with {{wikify}}. A proposed value for MINWIKILINK is 0.1%.
3158:, and post it to the article's talk page, to assist anybody who may have a keen interest in improving the article, but no idea where and how to find the info. 1762:
wikibreak. I tried emailing him, but his email does not work. So I've left a message on his talk page. But if he's on break, who knows when he will see it.
2673:
Personally I would recommend just removing it rather than replacing, but due to a history of problems with that user, I shall not be accepting this task. —
1442: The proposal is to keep the subcategories, but also have articles duplicated in parent (or grand-parent) categories up to the level of topic articles. 1893:
a bot to make the following fixes to template could do much, much good to wikipedia (it'd avoid manual fixing of these things, at the very least lol):
789:
That can be done, but are you looking for interwiki redirects in all namespaces, just the User namespace, Knowledge (XXG), Help, or the main namespace?--
678:
The first two tasks appear to be within the capabilities of current bots, and should be easy to accomplish. Going through the capabilities of current
1554:
smaller ones. If having a large number of articles makes it difficult to see the set of subcategories, it is possible to split the category such as
1453:
Another bot might scan through the higher level lists and collate a list of articles that have not been put in any of the lower level subcategories.
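The upward-propagation half of this category proposal reduces to walking a parent map. A hypothetical sketch (the category names are invented; a real bot would read the parent relations and piping from the category pages themselves):

```python
def ancestors(cat, parent_of):
    """All higher-level categories reachable from cat via the parent map."""
    seen = set()
    stack = [cat]
    while stack:
        c = stack.pop()
        for p in parent_of.get(c, ()):
            if p not in seen:
                seen.add(p)
                stack.append(p)
    return seen

def missing_parent_cats(article_cats, parent_of):
    """Categories the bot would add: every ancestor of an article's
    categories that the article is not already filed under."""
    have = set(article_cats)
    want = set()
    for c in have:
        want |= ancestors(c, parent_of)
    return want - have
```

When adding each category the bot would copy the sort-key piping from the existing categorisation, as requested above, so people stay alphabetised by surname.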
2299:
the "link creation" process, yet still remain in the dictionary. Is there something like this I can use as a bot base? Or is it simple to write?
1581:
would like to explore using a bot to help with our work. Here are a couple of things that have come up in discussions with other project members:
1295:
but still good for usage. The only problem is that Knowledge (XXG) doesn't categorise images; that's why I think this tool would be useful.
2424:, though they do not run on all wikipedias, I suspect that not many run on the Chinese wiki. Details of the bot they use can be found at 1039:
Bots that can find articles containing in-line interlanguage links (] ]). There are too many of these links in Thai WP such as this page
823:, articles shouldn't be a broad category that is covered by a sharper category. What bots, if any, do mass recategorization like this? -- 3048: 1046:
Then it might be good to get all the links in the articles that someone else can create articles (or even stubs) similar to Wanted Pages
462:
What I'd like this bot to do is to find links that contain a referral ID, strip it off and post a normal one that works just as well.
3060:
which is the standard form for this. As far as I can tell, the system now works to direct any form of capitalisation to that page
143:
No, I'm pretty sure these should remain present tense. They are still describing the current world, just somewhat inaccurately.
1423:
about repopulating some categories that had previously been depopulated after being divided into subcategories. One example is
3173:
project as a whole. If somebody wants to give this a serious attempt, I might be able to provide a bit of technical support. —
121:, the population is... the average income is... the majority of families have..." etc. These should be changed to past tense. 3445:
PS My comment implies no endorsement of the Japanese military during WW2. I had a grand uncle who served in the Pacific. --
174:
I had been changing these when I met them; recently I have changed a whole bunch and now started serious automated testing.
2641:
I just didn't want to seem like an ass, deleting Germen's sig image without fixing it again. Thanks for your help guys. ~
129:), but that'll take a while across 30,000 articles and it seems to me like a task ideally suited to a bot. Any offers? -- 2016: 936: 3269:
removed when an article gets de-listed from FA status; this should also be done automatically. Is this even possible?
471:
http://www.amazon.com/gp/product/B000000W5L/sr=1-1/qid=1138522986/ref=pd_bbs_1/103-0299503-7272610?%5Fencoding=UTF8
1732:
Again, if this idea is something you'd like to pursue, I can re-explain all of this and/or provide more details.
1501:. I've been doing this with AWB and it takes quite a bit of work and there are hundreds of articles. Thanks. -- 1310: 1271: 657: 649: 430: 198: 179: 47: 17: 2815:
use whatlinkshere to remove from the culled list all instances that already have it before running the regex.
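The regex-plus-ignore idea described in this exchange amounts to a guarded prepend. A minimal sketch, with the template name as a placeholder:

```python
import re

def prepend_template(wikitext, template="{{SomeHeader}}"):
    """Put the template at the top of the article unless it is already
    present; the presence check plays the role of the "ignore" rule, so
    re-running the bot never produces double placement."""
    if re.search(re.escape(template), wikitext):
        return wikitext  # already placed; skip
    return template + "\n" + wikitext
```

Pre-filtering the culled list with whatlinkshere, as suggested above, just moves the same check earlier so the bot loads fewer pages.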
How can I add those tags to articles that are not formatted correctly? AWB seems to have stopped at April...
I'm wondering if I can find any of these bots somewhere. Right now I'm using pyWikipedia with an XML dump file.
}}, it could determine the artist/band and the year associated with both songs and create a dab page, e.g.:
if the clock reaches 18:00 Sunday, so people can have more time to work on expanding the collaborations. --
requires human interaction; but at least we can figure out some of the human interaction that's necessary.
to ] (just the article, not the talk). The date should be that of the day that's beginning, not ending.
Cool, I thought it would require a lot more bureaucracy than that to get a thing like this done. Thanks!
Yes, yes, yes. Please have a bot do this. It hurts me physically to witness the lack of apostrophes. —
Could someone make a bot that would add school information? Two websites I would like you to use:
was created to allow easy navigation through articles for individual days given in the format below:
You mean a list of every word on a page? I don't know how you would distinguish terms from other words --
Unfortunately, I haven't heard of anything like this. --Derbeth talk 23:40, 16 March 2006 (UTC)
still can be found in many wiki-articles. IMHO it will be a good task for a bot - to remove "the".
It should count the number of votes, remove failed nominations, and do everything listed under
Similar to the above request, can I get the wanted pages for specific categories (including sub-categories)?
Bots that can get page titles from a dump file if there is no interlanguage link in the article.
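A rough sketch of that check, run over (title, wikitext) pairs pulled from the dump; the interlanguage-prefix regex is a heuristic of mine and will miss or over-match some edge cases:

```python
import re

# Interlanguage links look like [[fr:Title]]; prefixes are lowercase,
# two or three letters, optionally with a dash variant like zh-min-nan.
INTERWIKI = re.compile(r"\[\[\s*[a-z]{2,3}(?:-[a-z-]+)?\s*:")

def titles_without_interlanguage(pages):
    """Yield the title of every page whose wikitext contains no
    [[xx:...]]-style interlanguage link."""
    for title, text in pages:
        if not INTERWIKI.search(text):
            yield title
```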
I'd like to see a bot that would delete the paragraph from all pages on which it is found. --
I'm not sure if this is worth the time and effort, but what about a bot that would parse the
and create redirects to the album article? For example, the bot would create redirects like
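For instance, a sketch like the following could generate the redirect pages (the {{R from song}} tag is the one mentioned in this thread; the lowercase-variant handling is my assumption, since MediaWiki titles are case-insensitive only in their first character):

```python
def song_redirects(album: str, tracks):
    """Map each track title (plus a lowercased variant, when it differs)
    to redirect wikitext pointing at the album article."""
    target = f"#REDIRECT [[{album}]] {{{{R from song}}}}"
    pages = {}
    for track in tracks:
        pages[track] = target
        variant = track[:1] + track[1:].lower()
        if variant != track:
            pages[variant] = target
    return pages
```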
Thanks! It's much easier to consider the problem now that we know how big it is... thank you
and add the day's archive to the top of the page. Monthly archiving can be left to humans.
2. We are discussing two tags that project members can add to the article's talk page:
1559: 1389: 1226: 1198: 1190: 999: 729: 704: 588: 539: 492: 395: 144: 3327: 3323: 1332:
Formerly the names "Crimea" and "Ukraine" were used with the definite article. Today only
3061: 2805: 2796: 2792: 2421: 2411: 2159: 1833:
I'll try to examine the exact behavior and write up a clone. Bear with me on this. —
The following appears to be a duplicate or very similar request to the one above. —
Is there a bot for adding an infobox to all articles in a category? I checked the
So how do we go about doing it? What has to happen to have this function enacted?
It would also create redirects for lowercase variations of song titles. Thanks,
Someone has put the following paragraph in a bunch of articles on Ohio townships:
If you wish to start a new discussion or revive an old one, please do so on the
page that alerts people that the article is available in an audio format, too.
This is inaccurate. Some municipalities in Western Reserve townships, such as
1056: 1628:
a related tag to use when a project member decides to record an article, but
3250: 894: 690: 3199:
Excellent suggestions! I wholeheartedly agree with your recommendations. (
(and their derivatives) in albums to plain text. Here is the discussion.
The catch-all regex you'd want for this might look something like this:
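The regex quoted elsewhere in this thread appears garbled; the following is a plausible reconstruction, assuming the target is the Image:Nl small.gif signature image discussed here and the replacement is the suggested Image:Flag of the Netherlands.svg (both pattern and replacement are my best guess):

```python
import re

# Match [[Image:Nl_small.gif]] tolerant of whitespace, the case of the
# leading letters, space-vs-underscore, and an optional |20px-style size.
PATTERN = re.compile(
    r"\[\[\s*[Ii]mage\s*:\s*[Nn]l[ _]small\.gif\s*(?:\|\s*\d+px)?\s*\]\]"
)

def replace_deleted_flag(wikitext: str) -> str:
    return PATTERN.sub("[[Image:Flag of the Netherlands.svg|20px]]", wikitext)
```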
2396:
a short period of time, and would guess it to be not all that uncommon.
Here's an easy one: phd, Phd, PhD and Ph.D should be changed into Ph.D.
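A word-boundary regex keeps this from touching letters inside other words, though a human pass over the diff is still wise; a sketch:

```python
import re

# \b guards both ends, so strings like "phds" or "aphd" are left alone;
# the optional dot covers the Ph.D / PhD spellings in one branch.
PHD = re.compile(r"\b[Pp]h\.?[Dd]\b")

def normalize_phd(text: str) -> str:
    return PHD.sub("Ph.D", text)
```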
If {{cleanup}} is found on a disambig, replace it with disambig-cleanup.
aircraft. The wording appears to have been taken verbatim from the
--it provides a link to the old version that the recording is based on.
items link boxes. Editing the 1% by hand would be easier than the 99%.
203: 2610: 3240: 2613:) is accurate, all of those talk pages (about 50-100) appear to have 1427:. There is a good deal of support for doing this. Before there was 1040: 3379:
I see what you mean. It's a big task, but hopefully a feasible one.
2883:
is doing it correctly, but I'll run it manually for the time being.
But of course, there's a slight variation to the tag if the article
Once transwikied, I could then clean up the Wikipedia articles. --
3. We have a couple of tags that go on the article's page itself:
http://www.amazon.com/gp/product/B000000W5L/sr=1-1/qid=1138522986/
2479:, remove the redirect, and replace it with a generic header like 2452:
I am hoping to recruit a bot to help with the daily archiving at
2305:
Admins, you may email me with any responses. Thanks in advance!
Putting the infobox in would be a cinch, populating it less so.
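The "cinch" half really can be this small; the populating half is where the work hides (function name and top-of-page placement are assumptions):

```python
def add_infobox(wikitext: str, skeleton: str) -> str:
    """Prepend an empty infobox skeleton unless the article already
    carries one; field values are left for editors to fill in."""
    if "{{Infobox" in wikitext:
        return wikitext
    return skeleton + "\n" + wikitext
```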
If {{cleanup}} is found on a list, replace it with cleanup-list.
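Both cleanup-swap rules proposed in this thread fit in one helper; deciding whether a page actually is a disambig or a list (by template or category membership) is the part a real bot would need to get right:

```python
def retag_cleanup(wikitext: str, is_disambig: bool, is_list: bool) -> str:
    """Swap a bare {{cleanup}} for the more specific tag on disambigs
    and lists; leave every other page untouched."""
    if "{{cleanup}}" not in wikitext:
        return wikitext
    if is_disambig:
        return wikitext.replace("{{cleanup}}", "{{disambig-cleanup}}")
    if is_list:
        return wikitext.replace("{{cleanup}}", "{{cleanup-list}}")
    return wikitext
```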
They are absolutely not describing the current world. They are
3235: 2151:, are independent from surrounding townships. Others, such as 1877:
http://en.wikibooks.org/Category:Bartending_pages_needing_work
page, but I didn't see a bot that would do what I'm thinking.
A blacklist of the vandal's favourite pix might be useful...
685:
These are the tasks that seem obviously automatable. Much of
2540:
in his signature, which was deleted as a redundant image of
1954:
Talk:Voivodes of the Polish-Lithuanian Commonwealth#Bot help
1597:
the work. Let me know if you're interested in that project.
1370:
delete bad links than put in all of my own.) Google? Yahoo?
113:
The majority of the US location articles that were added by
1625:
when a project member has volunteered to record the article
2829:
When adding this template to pages, the best wikitext is:
2620:
distracting and many of the uses are in archives anyway.--
2352:
Does this make sense? I am not trying to glean all terms
2176:
of course, but that doesn't seem to have ever taken off.)
1340:(without article) are considered to be correct forms. But 3097:
Something similar to this has been attempted before, see
2192:(XXG), and our goal of creating a truly 💕. Thoughts? -- 993:
Category:Countries in the Junior Eurovision Song Contest
3473: 3326:. If you know someone who has the time to run it, read 3222: 3190: 3118: 2768: 2690: 2511:
Knowledge (XXG):Bots/Requests for approvals#Jitse's bot
2480: 1858: 1655:
long recordings that have been split into several parts
1590: 1232: 422: 3149:, determine a pair, form a search engine query, e.g. 3145:
templates, and for the articles we already have, e.g.
1669:
4. In addition to that, we add a tag to the article's
1133:
Working on it (just the list, not actually editing). —
1069:
There should be a bot for automating the processes of
2240:
I agree. We wouldn't want automated blanking by now.
1493:. In this case the aphabetizing of each article in 1083:
Knowledge (XXG):Collaboration of the week/Maintenance
664:
The bot would have the following proposed behaviors:
1907:
remove trailing/empty rows: many templates end with
1244:
bot for adding an infobox to an entire category of articles
961:
Category:Years in the Junior Eurovision Song Contest
2358:the pages, but rather apply the dictionary terms 977:Category:Countries in the Eurovision Song Contest 799:interwiki redirects which should be converted. -- 2668:\mage\s*\:\s*l+small\.gif\s*(?:\|\s*+px)?\s*\]\] 2042:. I'll report here. Consider it done for now. -- 1695:5. Additionally, we have a tag that goes on the 1605:Currently, we use several tags for our project: 2420:There are many bots that do this task, such as 1497:would be the same when copying the piping from 1409:http://en.wikibooks.org/User_talk:Prometheuspan 228:Wikipedia_talk:WikiProject_Albums#Stars_to_text 3099:Knowledge (XXG):Long term abuse/B-Movie Bandit 3279:Pywikipedia has a function for it, I think.-- 985:Belgium in the Junior Eurovision Song Contest 945:Category:Years in the Eurovision Song Contest 870:Robot wanted "id=toc" into "class=toccolours" 240:Here are the conversions that need to happen: 8: 3084:Southside High School (Fort Smith, Arkansas) 2509:I wrote a script, and requested approval at 1114:Moving slogans from infobox company template 983:Country in Junior Eurovision articles (like 3440:Dictionary of American Naval Fighting Ships 2477:Knowledge (XXG):Articles for creation/Today 2470:Knowledge (XXG):Articles for creation/Today 750:Convert interwiki redirects to softredirect 699:The 3rd task may be within the capacity of 509:http://www.amazon.com/gp/product/B000000W5L 222:Was wondering if a bot could convert these 2488:Knowledge (XXG):Articles for creation/List 1638:Once an article is recorded and uploaded: 1388:What about a bot that links references to 2568:I'll do that. It's a simple task to do. 1948:Bot needed for search and replace mission 1837: 1716:was featured at the time of the recording 1211: 1207:Sounds like fun, I'll get right on it. — 117:are written in the present tense. "As of 3080:Southside High School (Gadsden, Alabama) 2605:The list of file links at the bottom of 2188:eventually, or in installments perhaps. 855:can correct spelling errors already. 
-- 3147:Herbert Hoover High School (Des Moines) 1992:Special:Whatlinkshere/Hundred_Years_War 3217: 2585:the tasks page for my bot of course). 1987:Special:Whatlinkshere/Thirty_Years_War 1589:Right now, we have a manually-updated 969:Andorra in the Eurovision Song Contest 951:Junior Eurovision year articles (like 425:for the reasons this didn't go ahead. 398:to do this once I recieve a bot flag. 44:Do not edit the contents of this page. 2447:Knowledge (XXG):Articles for Creation 1982:Special:Whatlinkshere/Seven_Years_War 1958:Piotr Konieczny aka Prokonsul Piotrus 1630:nobody has requested it for recording 1448:Knowledge (XXG):Duplicated Categories 1146:Template talk:Infobox Company/Slogans 967:Country in Eurovision articles (like 769:interwiki redirects into uses of the 7: 3419:- it already does it and more :) -- 2879:Thanks for the list, Martin! I hope 2738:'''Foo at Tiffany's''' can refer to: 1799:) - Performs various archival tasks. 1438:subject. A good example of this is 3368:.. would be a useful spelling fix. 1685:And we have a version that is used 1421:Knowledge (XXG) talk:Categorization 953:Junior Eurovision Song Contest 2005 901:which has had this change done. -- 2831:{{daybar|{{subst:PAGENAME}}|xxxx}} 2718:Operation: Mindcrime#Track listing 2094:site:en.wikipedia.org "is a great" 1911:,which makes no sense, or include 1662:page summarised in another article 1612:to have read aloud and recorded. 1446:discuss this duplication process ( 24: 3241:http://www.publicschoolreview.com 3169:community's animosity toward the 2542:Image:Flag of the Netherlands.svg 2529:Replace an image on several pages 2219:Knowledge (XXG):Mirrors and forks 2038:I'm looking into doing that with 1930:Replace <br clear="all" /: --> 1697:Knowledge (XXG):Featured articles 1608:1. 
We have a tag that people use 1290:List of images used in an article 109:Rambot demographics to past tense 2614: 2460:, and there were some plans for 1926:Remove trailing </center: --> 1708:We add a tag for the article on 1028:Iinterwiki and getlinks requests 989:Category:Eurovision Song Contest 973:Category:Eurovision Song Contest 957:Category:Eurovision Song Contest 941:Category:Eurovision Song Contest 929:Category:Eurovision Song Contest 29: 2426:m:Using the python wikipediabot 935:Eurovision year articles (like 899:Template:Finnishmobileoperators 633:Ladnav bot, to reverse a vandal 3430:Innapropriate terminology bot? 2456:. The task used to be done by 1915:, which is equally nonsensical 1682:(stable versions, for example) 1680:where the article is unchanged 1579:Spoken Knowledge (XXG) Project 1573:Spoken Knowledge (XXG) Project 876:This was originally posted on 1: 2742:*], from the 2007 album '']'' 2740:*], from the 1988 album '']'' 1499:Category:American film actors 1487:Category:American film actors 1419:There has been discussion at 1128:22:09, 27 February 2006 (UTC) 1120:Template talk:Infobox Company 1105:15:31, 26 February 2006 (UTC) 1094:00:05, 25 February 2006 (UTC) 1060:20:56, 24 February 2006 (UTC) 1003:00:18, 23 February 2006 (UTC) 906:21:17, 18 February 2006 (UTC) 864:00:06, 25 February 2006 (UTC) 842:05:36, 17 February 2006 (UTC) 828:18:03, 11 February 2006 (UTC) 804:18:01, 11 February 2006 (UTC) 794:15:49, 11 February 2006 (UTC) 784:01:25, 11 February 2006 (UTC) 717:22:08, 27 February 2006 (UTC) 403:02:26, 14 February 2006 (UTC) 170:11:07, 20 February 2006 (UTC) 152:09:38, 20 February 2006 (UTC) 1870:Anyone have a Transwiki bot? 
1687:when the article has changed 1660:and one for recordings of a 1647:one for recordings that are 1610:when they request an article 987:) need re-categorising from 971:) need re-categorising from 955:) need re-categorising from 939:) need re-categorising from 937:Eurovision Song Contest 2005 694:00:39, 8 February 2006 (UTC) 643:21:03, 7 February 2006 (UTC) 619:13:25, 9 February 2006 (UTC) 593:07:58, 7 February 2006 (UTC) 562:23:59, 6 February 2006 (UTC) 544:16:54, 6 February 2006 (UTC) 519:14:52, 6 February 2006 (UTC) 497:23:11, 5 February 2006 (UTC) 384:06:20, 5 February 2006 (UTC) 235:06:14, 5 February 2006 (UTC) 218:Convert star ratings to text 138:10:07, 3 February 2006 (UTC) 18:Knowledge (XXG):Bot requests 3260:Interlanguage link FA stars 3236:http://www.greatschools.net 2862:. hope that helps someone. 1353:_Crimea,_the_Ukraine_-: --> 998:Thanks if anyone can help. 454:Referral ID spam remove bot 3512: 3450:16:19, 28 April 2006 (UTC) 3424:20:58, 26 April 2006 (UTC) 3400:00:24, 25 April 2006 (UTC) 3374:22:47, 24 April 2006 (UTC) 3351:23:02, 26 April 2006 (UTC) 3318:07:18, 25 April 2006 (UTC) 3309:00:17, 25 April 2006 (UTC) 3284:05:09, 24 April 2006 (UTC) 3274:04:18, 24 April 2006 (UTC) 3132:02:06, 24 April 2006 (UTC) 3091:20:12, 23 April 2006 (UTC) 3065:16:07, 23 April 2006 (UTC) 3013:20:00, 23 April 2006 (UTC) 2972:17:29, 23 April 2006 (UTC) 2939:14:13, 23 April 2006 (UTC) 2904:14:26, 22 April 2006 (UTC) 2867:14:00, 22 April 2006 (UTC) 2850:13:42, 22 April 2006 (UTC) 2823:06:03, 22 April 2006 (UTC) 2809:16:29, 21 April 2006 (UTC) 2726:22:26, 20 April 2006 (UTC) 2652:15:42, 22 April 2006 (UTC) 2625:07:50, 22 April 2006 (UTC) 2593:07:27, 22 April 2006 (UTC) 2576:06:06, 22 April 2006 (UTC) 2563:21:38, 20 April 2006 (UTC) 2504:18:13, 15 April 2006 (UTC) 2184:people to go through it. 
2106:here...lots of POV hits.-- 1943:20:07, 31 March 2006 (UTC) 1884:09:36, 26 March 2006 (UTC) 1828:15:00, 21 March 2006 (UTC) 1783:) - Does various AFD tasks 1740:02:41, 20 March 2006 (UTC) 1567:21:43, 20 March 2006 (UTC) 1548:12:26, 20 March 2006 (UTC) 1531:12:05, 20 March 2006 (UTC) 1506:09:17, 20 March 2006 (UTC) 1478:18:35, 19 March 2006 (UTC) 1461:10:21, 19 March 2006 (UTC) 1398:20:50, 24 March 2006 (UTC) 1375:18:18, 16 March 2006 (UTC) 1355:09:07, 12 March 2006 (UTC) 1327:Crimea, the Ukraine -: --> 1300:21:22, 10 March 2006 (UTC) 1043:(containing - en, fr, ja) 742:21:44, 21 March 2006 (UTC) 3322:You can read more on the 2858:I've made a list of days 2433:19:42, 6 April 2006 (UTC) 2415:19:35, 6 April 2006 (UTC) 2381:21:52, 6 April 2006 (UTC) 2326:06:04, 6 April 2006 (UTC) 2312:04:37, 6 April 2006 (UTC) 2271:14:10, 6 April 2006 (UTC) 2261:13:33, 6 April 2006 (UTC) 2236:06:22, 6 April 2006 (UTC) 2226:06:10, 6 April 2006 (UTC) 2212:06:03, 6 April 2006 (UTC) 2197:22:37, 5 April 2006 (UTC) 2163:02:24, 4 April 2006 (UTC) 2123:06:02, 6 April 2006 (UTC) 2111:05:51, 5 April 2006 (UTC) 2101:05:06, 5 April 2006 (UTC) 2086:21:04, 2 April 2006 (UTC) 2070:12:22, 7 April 2006 (UTC) 2047:15:12, 5 April 2006 (UTC) 2032:13:17, 5 April 2006 (UTC) 2005:02:35, 2 April 2006 (UTC) 1963:03:27, 1 April 2006 (UTC) 1678:A version for recordings 1489:so that they are also in 1261:02:47, 6 March 2006 (UTC) 1202:22:06, 3 March 2006 (UTC) 1187:Knowledge (XXG):Userboxes 1177:We need a bot to replace 1173:Userbox and Userboxes bot 1165:22:38, 3 March 2006 (UTC) 1156:18:02, 2 March 2006 (UTC) 1141:14:25, 2 March 2006 (UTC) 1018:08:01, 2 March 2006 (UTC) 658:Category:Cleanup by month 650:Category:Cleanup by month 3254:18:57, 5 June 2006 (UTC) 2523:14:26, 24 May 2006 (UTC) 1752:21:35, 21 May 2006 (UTC) 1710:Category:Spoken articles 1623:replaces the request tag 1495:Category:American actors 1491:Category:American actors 1425:Category:American actors 1364:wikibooks: THINKSTARSHIP 
817:Category:Stock exchanges 812:Category:Stock exchanges 3494:22:27, 1 May 2006 (UTC) 3204:14:19, 6 May 2006 (UTC) 3056:I've moved the page to 2053:Done for the redirects 1918:replace <center: --> 1577:Hi there! Those on the 1440:Category:Film directors 1065:Bot for COTW, AID, etc. 3406:General Vandalism bot. 3153:, produce a URL, e.g. 2725:_Album_redirects": --> 2391:Language-links Checker 1956:for details. Thanks!-- 1919:with align="center" + 1657:for faster downloading 652:by type of edit needed 648:Janitor bot: classify 300:Image:0hv out of 5.png 288:Image:1hv out of 5.png 276:Image:2hv out of 5.png 264:Image:3hv out of 5.png 252:Image:4hv out of 5.png 3373:_arrondissement": --> 3363:arrondissment --: --> 2708:sections of pages at 914:American Cities/Towns 42:of past discussions. 3058:Don't wanna lose you 2781:Recently a template 2096:. Good idea though! 1889:minor template fixes 1756: 1250:Knowledge (XXG):bots 1079:other collaborations 893:For an example, see 680:Knowledge (XXG):Bots 379:Hope that helps. -- 306:Image:0 out of 5.png 294:Image:1 out of 5.png 282:Image:2 out of 5.png 270:Image:3 out of 5.png 258:Image:4 out of 5.png 246:Image:5 out of 5.png 2458:User:Uncle G's 'bot 2400:to all 14 others). 2135:Western Reserve bot 3330:and give it a go! 3264:Is there a way of 2716:for the tracks at 2607:Image:Nl_small.gif 2538:Image:Nl small.gif 2153:Newton Falls, Ohio 1903:class="toccolours" 1118:In disscussion at 884:referred me here. 167:Talk to the driver 135:Talk to the driver 3491: 3459: 3213: 3176: 3143:{{*-school-stub}} 3104: 3053: 3039:comment added by 2820: 2754: 2676: 2590: 2573: 2494:Can anyone help? 2063:Hundred Years War 1935:style declaration 1923:style declaration 1836: 1415:Is this possible? 
1326:the Crimea -: --> 1210: 886: 726:User: Gnome (Bot) 103: 102: 54: 53: 48:current main page 3503: 3487: 3478: 3457: 3396: 3391: 3384: 3347: 3342: 3335: 3305: 3300: 3293: 3288:I am certain :) 3211: 3201:Cardsplayer4life 3195: 3174: 3157: 3129:Cardsplayer4life 3123: 3102: 3088:Cardsplayer4life 3052: 3033: 3009: 3004: 2997: 2969: 2961: 2935: 2930: 2923: 2900: 2895: 2888: 2847: 2839: 2832: 2816: 2790: 2784: 2777:Daybar inclusion 2773: 2752: 2695: 2674: 2618: 2586: 2569: 2548:(articles) with 2257: 2252: 2245: 2076:POV detector bot 2059:Thirty Years War 2024: 2019: 2014: 1934: 1922: 1914: 1910: 1904: 1900: 1865: 1861: 1834: 1757:AllyUnion's Bots 1585:RSS Feed Updater 1527: 1522: 1515: 1436: 1430: 1407:Retrieved from " 1239: 1235: 1208: 1196: 874: 778: 772: 763: 757: 701:User:Gnome (Bot) 582: 533: 486: 366:Image:0hvof5.png 354:Image:1hvof5.png 342:Image:2hvof5.png 330:Image:3hvof5.png 318:Image:4hvof5.png 127:Jasper, New York 81: 56: 55: 33: 32: 26: 3511: 3510: 3506: 3505: 3504: 3502: 3501: 3500: 3467: 3460: 3432: 3408: 3394: 3387: 3382: 3366: 3345: 3338: 3333: 3303: 3296: 3291: 3262: 3184: 3177: 3154: 3112: 3105: 3072: 3034: 3029: 3007: 3000: 2995: 2965: 2957: 2933: 2926: 2921: 2898: 2891: 2886: 2843: 2835: 2830: 2802:January 1, 2003 2788: 2782: 2779: 2762: 2755: 2702: 2700:Album redirects 2684: 2677: 2622:Commander Keane 2531: 2450: 2393: 2289: 2255: 2248: 2243: 2170: 2137: 2078: 2055:Seven Years War 2022: 2017: 2012: 1974: 1950: 1932: 1920: 1912: 1908: 1902: 1898: 1891: 1872: 1859: 1850: 1759: 1744:Nothing? D'oh! 1603: 1587: 1575: 1556:Category:Operas 1545:Commander Keane 1525: 1518: 1513: 1434: 1428: 1417: 1390:religious texts 1367: 1330: 1292: 1246: 1233: 1224: 1194: 1189:. Thank you. 
-- 1175: 1116: 1067: 1030: 925: 916: 872: 853:Oleg Alexandrov 835: 814: 776: 770: 761: 755: 752: 656:The backlog on 654: 635: 580: 531: 484: 456: 220: 111: 77: 30: 22: 21: 20: 12: 11: 5: 3509: 3507: 3499: 3498: 3497: 3496: 3480: 3479: 3465: 3431: 3428: 3427: 3426: 3407: 3404: 3403: 3402: 3370:Colonies Chris 3365: 3364:arrondissement 3361: 3360: 3359: 3358: 3357: 3356: 3355: 3354: 3353: 3328:the bot policy 3324:Pywikipediabot 3261: 3258: 3257: 3256: 3246: 3245: 3244: 3243: 3238: 3228: 3208: 3207: 3182: 3152: 3144: 3138: 3137: 3136: 3135: 3110: 3071: 3068: 3028: 3025: 3024: 3023: 3022: 3021: 3020: 3019: 3018: 3017: 3016: 3015: 2981: 2980: 2979: 2978: 2977: 2976: 2975: 2974: 2946: 2945: 2944: 2943: 2942: 2941: 2911: 2910: 2909: 2908: 2907: 2906: 2872: 2871: 2870: 2869: 2853: 2852: 2826: 2825: 2778: 2775: 2760: 2743: 2741: 2739: 2710:List of albums 2701: 2697: 2682: 2671: 2670: 2669: 2661: 2660: 2659: 2658: 2657: 2656: 2655: 2654: 2632: 2631: 2630: 2629: 2628: 2627: 2598: 2597: 2596: 2595: 2579: 2578: 2551: 2547: 2530: 2527: 2526: 2525: 2492: 2491: 2484: 2473: 2462:User:ShinmaBot 2449: 2443: 2441: 2438: 2436: 2435: 2407:Any thoughts? 2392: 2389: 2388: 2387: 2386: 2385: 2384: 2383: 2370: 2369: 2368: 2367: 2366: 2365: 2345: 2344: 2343: 2342: 2341: 2340: 2336: 2330: 2329: 2328: 2288: 2287:Dictionary bot 2285: 2284: 2283: 2282: 2281: 2280: 2279: 2278: 2277: 2276: 2275: 2274: 2273: 2263: 2169: 2166: 2149:Cortland, Ohio 2145: 2144: 2136: 2133: 2132: 2131: 2130: 2129: 2128: 2127: 2126: 2125: 2077: 2074: 2073: 2072: 2050: 2049: 2035: 2034: 1995: 1994: 1989: 1984: 1973: 1970: 1969: 1967: 1949: 1946: 1937: 1936: 1928: 1924: 1921:margin:0 auto; 1916: 1905: 1890: 1887: 1871: 1868: 1867: 1866: 1848: 1821: 1820: 1800: 1784: 1758: 1755: 1723: 1722: 1721: 1720: 1719: 1712: 1693: 1692: 1691: 1690: 1683: 1667: 1666: 1665: 1664: 1658: 1651: 1649:a single file. 
1636: 1635: 1634: 1633: 1626: 1602: 1599: 1586: 1583: 1574: 1571: 1570: 1569: 1564:Samuel Wantman 1560:Category:Opera 1540: 1539: 1538: 1537: 1536: 1535: 1534: 1533: 1503:Samuel Wantman 1482: 1481: 1480: 1458:Samuel Wantman 1416: 1413: 1401: 1387: 1379: 1378: 1366: 1361: 1359: 1354:_Ukraine": --> 1350:Don Alessandro 1329: 1324: 1323: 1322: 1291: 1288: 1286: 1284: 1283: 1245: 1242: 1241: 1240: 1222: 1174: 1171: 1170: 1169: 1168: 1167: 1143: 1115: 1112: 1110: 1108: 1107: 1087:King of Hearts 1066: 1063: 1053: 1052: 1049: 1048: 1047: 1037: 1029: 1026: 1025: 1024: 1023: 1022: 1021: 1020: 996: 995: 980: 979: 964: 963: 948: 947: 924: 923:Recategorising 921: 915: 912: 910: 871: 868: 867: 866: 857:King of Hearts 834: 831: 825:Christopherlin 813: 810: 809: 808: 807: 806: 751: 748: 747: 746: 745: 744: 720: 719: 676: 675: 672: 669: 653: 646: 634: 631: 630: 629: 628: 627: 626: 625: 624: 623: 622: 621: 602: 601: 600: 599: 598: 597: 596: 595: 569: 568: 567: 566: 565: 564: 549: 548: 547: 546: 528: 522: 521: 512: 511: 505: 504: 455: 452: 451: 450: 449: 448: 447: 446: 445: 444: 443: 442: 410: 409: 408: 407: 406: 405: 396:User:Tawkerbot 387: 386: 376: 375: 372:Image:0of5.png 369: 363: 360:Image:1of5.png 357: 351: 348:Image:2of5.png 345: 339: 336:Image:3of5.png 333: 327: 324:Image:4of5.png 321: 315: 312:Image:5of5.png 309: 303: 297: 291: 285: 279: 273: 267: 261: 255: 249: 242: 241: 219: 216: 215: 214: 213: 212: 211: 210: 193:BTW all done. 
172: 110: 107: 105: 101: 100: 95: 92: 87: 82: 75: 70: 65: 62: 52: 51: 34: 23: 15: 14: 13: 10: 9: 6: 4: 3: 2: 3508: 3495: 3490: 3484: 3483: 3482: 3481: 3477: 3475: 3472: 3469: 3454: 3453: 3452: 3451: 3448: 3443: 3441: 3435: 3429: 3425: 3422: 3418: 3414: 3413: 3412: 3405: 3401: 3398: 3397: 3392: 3390: 3385: 3378: 3377: 3376: 3375: 3371: 3362: 3352: 3349: 3348: 3343: 3341: 3336: 3329: 3325: 3321: 3320: 3319: 3316: 3312: 3311: 3310: 3307: 3306: 3301: 3299: 3294: 3287: 3286: 3285: 3282: 3278: 3277: 3276: 3275: 3272: 3267: 3266:automatically 3259: 3255: 3252: 3248: 3247: 3242: 3239: 3237: 3234: 3233: 3231: 3230: 3229: 3227: 3225: 3224: 3220: 3214: 3205: 3202: 3198: 3197: 3196: 3194: 3192: 3189: 3186: 3172: 3166: 3164: 3159: 3156: 3150: 3148: 3142: 3133: 3130: 3125: 3124: 3122: 3120: 3117: 3114: 3100: 3096: 3095: 3094: 3092: 3089: 3085: 3081: 3075: 3069: 3067: 3066: 3063: 3059: 3054: 3050: 3046: 3042: 3041:Charlie White 3038: 3026: 3014: 3011: 3010: 3005: 3003: 2998: 2991: 2990: 2989: 2988: 2987: 2986: 2985: 2984: 2983: 2982: 2973: 2970: 2968: 2962: 2960: 2954: 2953: 2952: 2951: 2950: 2949: 2948: 2947: 2940: 2937: 2936: 2931: 2929: 2924: 2917: 2916: 2915: 2914: 2913: 2912: 2905: 2902: 2901: 2896: 2894: 2889: 2882: 2878: 2877: 2876: 2875: 2874: 2873: 2868: 2865: 2861: 2857: 2856: 2855: 2854: 2851: 2848: 2846: 2840: 2838: 2828: 2827: 2824: 2819: 2813: 2812: 2811: 2810: 2807: 2803: 2798: 2797:June 11, 2004 2794: 2793:June 10, 2004 2787: 2776: 2774: 2772: 2770: 2767: 2764: 2748: 2744: 2737: 2735: 2733: 2728: 2727: 2723: 2722:TheJabberwock 2719: 2715: 2711: 2707: 2706:Track Listing 2698: 2696: 2694: 2692: 2689: 2686: 2667: 2666: 2665: 2664: 2653: 2650: 2647: 2644: 2640: 2639: 2638: 2637: 2636: 2635: 2634: 2633: 2626: 2623: 2617: 2612: 2608: 2604: 2603: 2602: 2601: 2600: 2599: 2594: 2589: 2583: 2582: 2581: 2580: 2577: 2572: 2567: 2566: 2565: 2564: 2561: 2558: 2555: 2549: 2545: 2543: 2539: 2535: 2528: 2524: 2520: 2516: 2512: 2508: 2507: 2506: 2505: 2502: 2499: 2498: 2489: 2485: 
2482: 2478: 2474: 2471: 2467: 2466: 2465: 2463: 2459: 2455: 2448: 2444: 2442: 2439: 2434: 2431: 2427: 2423: 2422:User:YurikBot 2419: 2418: 2417: 2416: 2413: 2408: 2405: 2401: 2397: 2390: 2382: 2379: 2376: 2375: 2374: 2373: 2372: 2371: 2363: 2362: 2357: 2356: 2351: 2350: 2349: 2348: 2347: 2346: 2337: 2334: 2333: 2331: 2327: 2324: 2320: 2319: 2318: 2317: 2316: 2315: 2314: 2313: 2310: 2306: 2303: 2300: 2296: 2292: 2286: 2272: 2269: 2264: 2262: 2259: 2258: 2253: 2251: 2246: 2239: 2238: 2237: 2234: 2229: 2228: 2227: 2224: 2220: 2215: 2214: 2213: 2210: 2205: 2204: 2203: 2202: 2201: 2200: 2199: 2198: 2195: 2189: 2185: 2181: 2177: 2175: 2167: 2165: 2164: 2161: 2156: 2154: 2150: 2142: 2141: 2140: 2134: 2124: 2121: 2116: 2115: 2114: 2113: 2112: 2109: 2104: 2103: 2102: 2099: 2095: 2090: 2089: 2088: 2087: 2084: 2075: 2071: 2068: 2064: 2060: 2056: 2052: 2051: 2048: 2045: 2041: 2037: 2036: 2033: 2030: 2029: 2025: 2020: 2015: 2009: 2008: 2007: 2006: 2004:_Years'": --> 2002: 1999: 1993: 1990: 1988: 1985: 1983: 1980: 1979: 1978: 1968: 1965: 1964: 1961: 1959: 1955: 1947: 1945: 1944: 1941: 1929: 1925: 1917: 1906: 1896: 1895: 1894: 1888: 1886: 1885: 1882: 1878: 1869: 1864: 1862: 1857: 1854: 1851: 1845: 1842: 1832: 1831: 1830: 1829: 1826: 1818: 1814: 1811: 1808: 1804: 1801: 1798: 1795: 1792: 1788: 1785: 1782: 1779: 1776: 1772: 1769: 1768: 1767: 1763: 1754: 1753: 1750: 1747: 1742: 1741: 1738: 1735: 1730: 1726: 1717: 1713: 1711: 1707: 1706: 1705: 1704: 1703: 1700: 1698: 1688: 1684: 1681: 1677: 1676: 1675: 1674: 1673: 1672: 1663: 1659: 1656: 1652: 1650: 1646: 1645: 1644: 1643: 1642: 1639: 1631: 1627: 1624: 1620: 1619: 1618: 1617: 1616: 1613: 1611: 1606: 1600: 1598: 1596: 1592: 1584: 1582: 1580: 1572: 1568: 1565: 1561: 1557: 1552: 1551: 1550: 1549: 1546: 1532: 1529: 1528: 1523: 1521: 1516: 1509: 1508: 1507: 1504: 1500: 1496: 1492: 1488: 1483: 1479: 1476: 1471: 1470: 1469: 1468: 1467: 1466: 1465: 1464: 1463: 1462: 1459: 1454: 1451: 1449: 1443: 1441: 1433: 1426: 1422: 1414: 1412: 1410: 1405: 


Text is available under the Creative Commons Attribution-ShareAlike License. Additional terms may apply.