Wikispecies:Bots/Requests for approval/KoumzBot

From Wikispecies

KoumzBot

  1. Operator: Koumz
  2. Automatic or Manually Assisted: Fast Assisted Editing Only
  3. Programming Language(s): N/A
  4. Function Summary: Various projects, including cleanup, template conversions and automatic page creation
  5. Edit period(s) (e.g. Continuous, daily, one time run): Sporadic, depending on project
  6. Edit rate requested: N/A, manual
  7. Already has a bot flag (Y/N): N
  8. Function Details: This is just a procedural flood flag for me when I do large numbers of assisted edits in a short period. Nothing fully automated, just AWB with me at the controls. For an example, see any of the strings of cleanup edits I have done recently (including 500 author link disambiguations in half an hour, at about 04:00 UTC on 20 July). For a second example, see the 50 new basic Onthophagus pages I just created in 2 minutes a little earlier today. The first 30-40 edits of any project will always be made under my main account to allow for proper scrutiny by others. Koumz (talk) 06:38, 17 July 2011 (UTC)
I have supervised this, and it is very good indeed... Stho002 (talk) 06:45, 17 July 2011 (UTC)
This sounds too good to be true. Can someone explain where the data comes from that then appears to be ported to wiki format and uploaded so quickly? It would seem that this has to be just a simple copy of information held elsewhere and not actually validated in the transfer process. Thanks in advance for the explanation. Accassidy (talk) 22:04, 23 July 2011 (UTC)
The answers about the details on validation will have to come from Stephen, as he provided the list I am working from. Once the name and author/year information (from any source, including a direct collection of all the information from papers) has been organized into a properly formatted list (and I can perform the formatting on almost any kind of list given to me electronically with relative ease), it is fairly easy to create the basic page structure en masse by author/year combination. His notes on the page for Onthophagus itself shed some light, I think. His idea (I hope I am stating it accurately) is to create the pages as frames first; he then fills them in with further details, which has to be done individually. Perhaps I shouldn't have used that as an example for the automated work; I meant it as an example of the advantages of automation over editing by hand using the same data. For example, if one had a single reference describing, say, 10 species (along with a typed list of those species), it would be possible to mass-create the 10 pages containing the names, author, year AND the reference information for that reference in less than a minute (after a few minutes of setup in AWB). Koumz (talk) 22:39, 23 July 2011 (UTC)
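The mass-creation step described above can be sketched in plain Python. This is not AWB itself, just an illustration of the same idea: given a typed list of names with author/year and one shared reference, page frames fall out mechanically. The species names, the reference text, and the exact section layout are all illustrative assumptions, not the actual templates used.

```python
# Illustrative sketch of mass-generating page stubs ("frames") from a
# typed species list plus one shared reference. Names and layout are
# hypothetical; a real run would use the project's own templates.

species = [
    ("Onthophagus exampleus", "Smith", 1901),
    ("Onthophagus fictus", "Smith", 1901),
]
reference = "{{aut|Smith, J.}} 1901: Hypothetical revision. ''Some Journal'' 1: 1-10."

def make_stub(name, author, year):
    """Build the wikitext frame for one species page."""
    genus, _epithet = name.split(" ", 1)
    return "\n".join([
        "=={{int:Taxonavigation}}==",
        "{{" + genus + "}}",              # genus navigation template
        "",
        "=={{int:Name}}==",
        f"''{name}'' {author}, {year}",   # name with author/year
        "",
        "=={{int:References}}==",
        "* " + reference,                 # the one shared reference
    ])

for name, author, year in species:
    print(f"--- {name} ---")
    print(make_stub(name, author, year))
```

With the list already formatted, a loop like this (or AWB's equivalent find-and-replace rules) produces all the stubs in one pass; only the later detail-filling has to be done page by page.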
Alan: I harvested the basic name/author/year data from CoL via another website which had formatted it in such a way that it was easy to do a bit more formatting and give Koumz an Excel spreadsheet, which he could then use to create the pages with AWB. No "magic", just a small dollop of cunning on my part! So, for any big genus, if you can find a webpage with all the names/authors/years ON ONE PAGE, then we can harvest that info to create basic pages like Koumz is doing, and later go back to add more detail manually. The advantage is really just to get more pages with some content, as some of our competitors boast more pages even though many of them contain nothing more than name/author/date. It is also better for third-party sites to link to these minimal pages than to blind links. This is in part why BHL links to EoL (very few blind links) but will not link to us (too many blind links) ... Stho002 (talk) 00:05, 24 July 2011 (UTC)
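The harvesting step described here, pulling name/author/year triples off a single checklist page into a spreadsheet-ready table, can be sketched as below. The HTML snippet and the regular expression are hypothetical; any real checklist page would need its own pattern, and a parenthesized author (changed combination) is kept as-is for later manual review.

```python
import csv
import io
import re

# Hypothetical checklist markup; real pages (e.g. CoL mirrors) differ.
html = """
<li><i>Onthophagus exampleus</i> Smith, 1901</li>
<li><i>Onthophagus fictus</i> (Jones, 1888)</li>
"""

# Capture binomial name, author surname, and year; parentheses optional.
pattern = re.compile(
    r"<i>([A-Z][a-z]+ [a-z]+)</i>\s*\(?([A-Z][a-z]+), (\d{4})\)?"
)

# Write the triples as CSV, which opens directly in Excel for AWB use.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["name", "author", "year"])
for name, author, year in pattern.findall(html):
    writer.writerow([name, author, year])

print(buf.getvalue())
```

The point is only that once everything is on one page, a single pattern turns it into the formatted list that the stub-creation step consumes.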
Thanks, I get the picture. Accassidy (talk) 09:33, 24 July 2011 (UTC)

 Support Open2universe | Talk 11:49, 20 July 2011 (UTC)

Everything looks like it is in proper order. Approved. OhanaUnitedTalk page 19:54, 24 July 2011 (UTC)