pywikipedia is a set of Python tools used to maintain MediaWiki sites.
You can write batch scripts, or work from an interactive Python shell, to perform maintenance tasks. It has been used in many ways to help migrate this site, for example by cleaning up Special:Wantedpages.
This library supports scanning all pages, reading their wikitext, and making updates. It also has a built-in throttling mechanism so that bot operations do not overload the wiki's web server.
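The idea behind such a write throttle can be sketched in a few lines of plain Python. This is an illustrative sketch only: the `Throttle` class and `delay_seconds` parameter below are invented for this example and are not pywikipedia's actual API (pywikipedia's own throttle is configured through its user-config.py file).

```python
import time

class Throttle:
    """Illustrative client-side throttle: spaces out write requests
    so a bot does not hammer the wiki's web server. Not pywikipedia's
    real implementation."""

    def __init__(self, delay_seconds=10):
        self.delay = delay_seconds
        self.last_request = 0.0

    def wait(self):
        # Sleep just long enough so consecutive calls are at least
        # `delay` seconds apart, then record the new request time.
        remaining = self.delay - (time.time() - self.last_request)
        if remaining > 0:
            time.sleep(remaining)
        self.last_request = time.time()
```

A bot loop would call `throttle.wait()` before each page save, so a run over hundreds of pages spreads its edits out over time instead of issuing them as fast as the network allows.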
The following section lists various samples of how pywikipedia has been used to maintain this site.
The following batch job was run to migrate every page that had Category Homepage into Category:MythPeople, thereby retiring the wanted article Category Homepage.
% python
>>> import re
>>> from pywikipedia import wikipedia
>>> site = wikipedia.getSite()
>>> page = wikipedia.Page(site=site, title='Category Homepage')
>>> links = page.getReferences()
Getting references to [[Category Homepage]]
>>> for eachPage in links:
...     wikitext = eachPage.get()
...     newtext = re.compile("Category Homepage").sub("Category:MythPeople", wikitext)
...     eachPage.put(newtext=newtext,
...                  comment='[[pywikipedia]] assisted cleanup -> Moving Category Homepage to [[:Category:MythPeople]]',
...                  minorEdit=True)
The following batch job was run to edit every article that linked to Mail To, replacing the link with mailto:.
>>> page = wikipedia.Page(site=site, title='Mail To')
>>> links = page.getReferences()
>>> for eachPage in links:
...     wikitext = eachPage.get()
...     newtext = re.compile(r"\[\[Mail To\]\]").sub("mailto:", wikitext)
...     eachPage.put(newtext=newtext,
...                  comment='pywikipedia update -> Replacing Mail To with mailto:',
...                  minorEdit=True)
Several scripts have been written to support this site. They share a common naming pattern, ending in .py to mark them as Python scripts, and each is listed in the Bot category.