Wikipedia talk:Bots/Archive 12


Welcome bot

A bot that welcomes new users, provided they aren't vandals and don't have offensive usernames. The bot uses the new user list combined with a "memory", then waits a specified amount of time (e.g. 30 minutes) before welcoming them. If the user account is blocked before the time limit, the account won't be welcomed.

67.60.52.155 14:34, 20 October 2005 (UTC)

I think this is a bad idea. I believe 5% or fewer of the users who create an account actually stay, so it would be a resource hog to continuously make new user talk pages. Plus, I think the Welcoming committee does a better job, even if it's slow sometimes; I know I would prefer to be welcomed by a human. Each user gets a welcome message that is unique, and you can reply to the welcomer and ask questions; it's part of being in the community. IMHO, I prefer there not be one. «»Who?¿?meta 15:15, 20 October 2005 (UTC)
Thank you for your feedback - all opinions are welcome. With regard to the "resource hog" issue, the robot could be made a little more sophisticated by only welcoming users who have made edits, using the User Contributions feature. While I agree that it is better to be welcomed by a human, there is no reason why the message could not indicate a person to contact on the welcoming committee, designated on a rolling basis. 67.60.52.155 16:30, 20 October 2005 (UTC)
I have put a note on the welcome committee page asking for any feedback they may have. 67.60.52.155 16:35, 20 October 2005 (UTC)
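
To make the mechanics concrete, here is a minimal sketch of the gate being proposed, in modern Python - the inputs are hypothetical stand-ins for what a real bot would pull from the wiki (block status, edit count, registration time), and the actual posting would go through a framework like pywikipedia:

    # Gate for the proposed welcome bot: wait a set period after account
    # creation, then welcome only accounts that are unblocked, have made
    # edits, and haven't been welcomed before (the bot's "memory").
    from datetime import timedelta

    WAIT = timedelta(minutes=30)   # the "specified amount of time" above

    def should_welcome(now, registered_at, is_blocked, edit_count, already_welcomed):
        if already_welcomed:               # remembered from an earlier run
            return False
        if now - registered_at < WAIT:     # still inside the waiting period
            return False
        if is_blocked:                     # blocked before the time limit
            return False
        return edit_count > 0              # only welcome users who have edited
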
If we want to do this (and I'm not sure we do), why not just make it the default content for the talk page of newly created users? Should be a trivial mediawiki patch. --fvw* 18:42, 20 October 2005 (UTC)
Actually I was thinking the same thing, having a default message for the talk page. Wouldn't be a bad idea, and there are plenty of users already on WC that could probably come up with something decent. «»Who?¿?meta 18:57, 20 October 2005 (UTC)
The point of a greeting isn't to give information to newcomers. The point is to welcome them into the Wikipedia community. A bot can't do that. Isomorphic 02:57, 21 October 2005 (UTC)

Personally, welcome messages, whether from bots or humans, annoy me. If people need information, we should make that information easily accessible from the Main Page, Community Portal, or other obvious location. -- Beland 22:54, 22 October 2005 (UTC)

I think users should be welcomed by humans, not bots. The main reason I have for this is that many users who are welcomed go immediately to the welcomer's talk page and leave them a message and/or question. If someone is welcomed by a bot, they can't do this. Also, welcomes are much more personal when left by a human; it makes the user think, "Wow, there are people out there who care, and one of them noticed me." A bot makes someone think, "Wow, they have a bot that welcomes people...kind of cool, but if this is a community, couldn't someone have taken 60 seconds of their day to welcome me personally?" EWS23 | (Leave me a message!) 04:12, 24 October 2005 (UTC)

I had the same thought months ago. We have bots to show people the door; we have humans welcome them to a community of humans. --AllyUnion (talk) 11:52, 25 October 2005 (UTC)

When I welcome people, I like to comment on some of their edits and point them to WikiProjects or other project pages related to the articles they've edited. A bot can't do that (without complications, anyway). --TantalumTelluride 05:21, 2 December 2005 (UTC)

I'd like permission to use a bot (…) to carry out renamings from WP:SFD. It would use the pywikipedia bot framework: template.py for template renames and touch.py for category renames (I don't know if that needs approval, as it's not actually changing anything, but it's better to be safe). --Mairi 22:51, 20 October 2005 (UTC)

If it is making edits, then yes, it would need approval - unless you feel that its edits need the watch of human editors on RC patrol. --AllyUnion (talk) 11:54, 25 October 2005 (UTC)
I figured that, but touch.py makes null edits, so they wouldn't actually show up anywhere regardless. But either way, I'd need approval for the template renaming... --Mairi 17:13, 25 October 2005 (UTC)
  • I assume the pywikipedia framework respects our link ordering convention - i.e. at the end of the page, category links come first, followed by interwiki links, followed by stub notices? I've had to add support to Pearle (another category bot) to do that, and anything her parser can't handle gets dumped in Category:Articles to check for link ordering. Human editors don't always follow the convention, but I'm thinking that if there are multiple bots rearranging things, they should converge on the same ordering. (Whobot also does category work, and is actually based on Pearle.) -- Beland 07:31, 26 October 2005 (UTC)
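
For illustration, a rough sketch of that normalization in modern Python - the three regexes are simplified assumptions about what category, interwiki, and stub lines look like, not Pearle's actual parser:

    # Bucket the trailing lines of an article and re-emit them in the
    # convention order: categories, then interwikis, then stub notices.
    # Anything unrecognized is handed back for a human to check, much as
    # Pearle dumps hard cases into a cleanup category.
    import re

    CAT = re.compile(r'^\[\[Category:[^\]]+\]\]$')
    INTERWIKI = re.compile(r'^\[\[[a-z][a-z-]*:[^\]]+\]\]$')   # e.g. [[de:Titel]]
    STUB = re.compile(r'^\{\{[^{}]*stub[^{}]*\}\}$', re.IGNORECASE)

    def reorder_tail(lines):
        cats, interwikis, stubs, unknown = [], [], [], []
        for line in lines:
            s = line.strip()
            if not s:
                continue
            elif CAT.match(s):
                cats.append(s)
            elif STUB.match(s):
                stubs.append(s)
            elif INTERWIKI.match(s):
                interwikis.append(s)
            else:
                unknown.append(s)
        return cats + interwikis + stubs, unknown
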
I'm not sure about the other parts of pywikipedia, but template.py replaces any occurrences of the existing template with the new template, regardless of where they are (it isn't specific to stub templates, even though that's what I intend to use it for). So the ordering wouldn't matter, and the existing order would be preserved, I believe.
Incidentally, I wasn't aware that was our link ordering convention. I generally put stub notices before category links, and I think that's where I usually see them... --Mairi 07:56, 26 October 2005 (UTC)
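
As a rough modern-Python illustration of what such a rename does (not template.py's actual code - MediaWiki's template syntax has more corner cases than one regex covers, and the example names at the bottom are hypothetical):

    # Replace every use of {{Old-name}} or {{Old-name|...}} with the new
    # name, leaving the template's position in the page untouched.
    import re

    def rename_template(text, old, new):
        # MediaWiki treats the first letter case-insensitively.
        first = '[' + re.escape(old[0].upper()) + re.escape(old[0].lower()) + ']'
        pattern = re.compile(r'\{\{\s*' + first + re.escape(old[1:]) + r'\s*([|}])')
        return pattern.sub(lambda m: '{{' + new + m.group(1), text)

    # e.g. rename_template('...{{US-bio-stub}}...', 'US-bio-stub', 'UnitedStates-bio-stub')
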

It looks like I'll occasionally have to use replace.py when renaming redirects, because MediaWiki now apparently considers using Template:A (which redirects to Template:B) as only a use of, and link to, Template:B. --Mairi 05:31, 8 November 2005 (UTC)

I'd like permission to run […]. Cryptic (talk) 20:37, 21 October 2005 (UTC)

You could easily generate a list of stats... actually, I've been asked to have the AFD Bot generate a summary list, but I haven't yet figured out how that could be done. Well, I just had a small thought, but it would assume everyone's sig ends with "(UTC)". --AllyUnion (talk) 11:56, 25 October 2005 (UTC)
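
A sketch of that thought in modern Python; it leans entirely on the default signature format, so any customized sig that doesn't end with the "(UTC)" stamp would slip through:

    # Pull every "HH:MM, D Month YYYY (UTC)" signature timestamp out of a
    # block of discussion wikitext.
    import re
    from datetime import datetime

    SIG_TS = re.compile(r'(\d{1,2}:\d{2}, \d{1,2} \w+ \d{4}) \(UTC\)')

    def signature_timestamps(wikitext):
        return [datetime.strptime(m, '%H:%M, %d %B %Y')
                for m in SIG_TS.findall(wikitext)]
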
I actually already use a script to generate a list of pages that are in […]. Cryptic (talk) 06:02, 26 October 2005 (UTC)

Expansion for archival duties

As requested at […], I'd like to expand Crypticbot's duties to cover archiving these pages (unless the folks at WP:VP, or those here, object). The way this will work is (a code sketch follows the list):

  1. Fetch the current contents of the page.
  2. Split it into separate sections (using ^==[^=].*==\s+$ as a delimiter) (for those reading this who don't speak regexp, that matches any line that starts with exactly two equals signs, has at least one more character of which the first isn't an equals sign, and ends with two equals signs plus trailing whitespace, the line break included).
  3. Find the latest timestamp in the section's text.
  4. Optionally look for comments like <!-- don't touch this section --> or <!-- don't copy this section to the archive when it's removed -->, if folks think such would be useful.
  5. Add any section whose latest timestamp is more than seven days in the past to the respective archive page. (In the case of Wikipedia:Village pump (technical)/Archive and the other village pump archives, the sections are instead just removed.)
  6. Replace the current contents of the original page with all the remaining sections.
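
Here is the sketch mentioned above: steps 2, 3, 5 and 6 in modern Python, with the page fetching and saving left out, and the section delimiter applied line by line:

    import re
    from datetime import datetime, timedelta

    HEADING = re.compile(r'^==[^=].*==\s+$')       # step 2's delimiter
    SIG_TS = re.compile(r'\d{1,2}:\d{2}, \d{1,2} \w+ \d{4}(?= \(UTC\))')

    def split_sections(text):
        """Step 2: a preamble plus one chunk per == level-two heading ==."""
        preamble, sections = [], []
        for line in text.splitlines(keepends=True):
            if HEADING.match(line):
                sections.append([line])
            elif sections:
                sections[-1].append(line)
            else:
                preamble.append(line)
        return ''.join(preamble), [''.join(s) for s in sections]

    def latest_timestamp(section):
        """Step 3: the newest signature timestamp in the section, if any."""
        stamps = [datetime.strptime(m, '%H:%M, %d %B %Y')
                  for m in SIG_TS.findall(section)]
        return max(stamps) if stamps else None

    def archive_pass(text, now, max_age=timedelta(days=7)):
        """Steps 5 and 6: return the page with stale sections removed,
        plus the stale sections destined for the archive."""
        preamble, sections = split_sections(text)
        stale = [s for s in sections
                 if (ts := latest_timestamp(s)) is not None and now - ts > max_age]
        fresh = [s for s in sections if s not in stale]
        return preamble + ''.join(fresh), stale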

Like Crypticbot's current AFD-orphan task, this will be run once a day; unlike the AFDs, I won't be vetting the edits beforehand. Not sure if it merits a bot flag or not; if used on all the village pump pages (six, and their archives) and administrators' noticeboard pages (three, and their archives), it'll be at most eighteen edits a day. Perhaps one more to update […]. Cryptic (talk) 20:02, 16 November 2005 (UTC)

RussBot: proposed expansion

I am requesting permission to expand the operations of RussBot to include a regularly scheduled count of links to disambiguation pages. I have a script that performs the link count and formats the output. For sample output, please see […]. --Russ Blau (talk) 11:16, 23 October 2005 (UTC)

So it would count the links for every disambig it needs to, every time it runs? Or only for new links? What happens when the list is excessive; is there any way for you to control the number of pages the bot needs to count links for? --AllyUnion (talk) 11:59, 25 October 2005 (UTC)
The bot would count all the links to pages that are listed on Wikipedia:Disambiguation pages maintenance. There are about 365 pages listed currently. For each of those articles, the bot retrieves Special:Whatlinkshere/Article title and counts the number of links there; it does not retrieve all the referencing pages. For pages that have more than 500 inbound links (currently only two of the 365 fall into this category), it makes one call to [[Special:Whatlinkshere/...]] for every 500 references. So the bot would send about 370 HTTP requests to the server. It uses the pywikipedia bot framework, so the throttle spaces these requests out several seconds apart. What type of controls do you think would be appropriate? --Russ Blau (talk) 13:26, 25 October 2005 (UTC)
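
For illustration, a modern-day equivalent of that counting strategy - the 2005 bot scraped Special:Whatlinkshere through the pywikipedia framework, while this sketch uses today's MediaWiki API, which likewise paginates backlinks in batches of up to 500:

    # Count inbound links to a page, one request per batch of up to 500,
    # with a pause between requests standing in for the framework's throttle.
    import json
    import time
    from urllib.parse import urlencode
    from urllib.request import Request, urlopen

    API = 'https://en.wikipedia.org/w/api.php'

    def count_inbound_links(title, throttle=5):
        total, cont = 0, {}
        while True:
            params = {'action': 'query', 'list': 'backlinks', 'bltitle': title,
                      'bllimit': 500, 'format': 'json', **cont}
            req = Request(API + '?' + urlencode(params),
                          headers={'User-Agent': 'linkcount-sketch/0.1'})
            with urlopen(req) as resp:
                data = json.load(resp)
            total += len(data['query']['backlinks'])
            cont = data.get('continue', {})
            if not cont:
                return total
            time.sleep(throttle)   # space the requests out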