This fork of the repository is for Outreachy Round 22: Synchronising Wikidata and Wikipedias using pywikibot.
The Pywikibot framework is a Python library that interfaces with the MediaWiki API version 1.23 or higher.
Also included are various general function scripts that can be adapted for different tasks.
For further information about the library (excluding scripts), see the full code documentation.
To install Pywikibot together with its scripts from the Git repository:

    pip install requests
    git clone https://gerrit.wikimedia.org/r/pywikibot/core.git
    cd core
    git submodule update --init
    python pwb.py script_name
Or, to install using PyPI (excluding scripts):
    pip install -U setuptools
    pip install pywikibot
In addition, a MediaWiki markup parser is required. Please install one of the following:
    pip install mwparserfromhell
or
    pip install wikitextparser
Our installation guide has more details for advanced usage.
If you wish to write your own script, it's very easy to get started:
    import pywikibot

    site = pywikibot.Site('en', 'wikipedia')  # The site we want to run our bot on
    page = pywikibot.Page(site, 'Wikipedia:Sandbox')
    page.text = page.text.replace('foo', 'bar')  # Edit the wikitext of the page
    page.save('Replacing "foo" with "bar"')  # Saves the page with an edit summary
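The page.text attribute used above is raw wikitext, so a simple string replace works for small edits; for more structured edits, one of the parsers installed earlier can be used. A minimal sketch using mwparserfromhell (the page title here is just an example):

    import mwparserfromhell
    import pywikibot

    site = pywikibot.Site('en', 'wikipedia')
    page = pywikibot.Page(site, 'Wikipedia:Sandbox')

    # Parse the raw wikitext into a node tree instead of handling it as a plain string.
    wikicode = mwparserfromhell.parse(page.text)

    # Print the name of every template used on the page.
    for template in wikicode.filter_templates():
        print(template.name)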
Wikibase is flexible knowledge base software that drives Wikidata. A sample Pywikibot script for getting data from Wikibase:
    import pywikibot

    site = pywikibot.Site('wikipedia:en')
    repo = site.data_repository()  # the Wikibase repository for the given site
    page = repo.page_from_repository('Q91')  # create a local page for the given item
    item = pywikibot.ItemPage(repo, 'Q91')  # a repository item
    data = item.get()  # get all item data from the repository for this item
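Once item.get() has been called, the item's labels, descriptions and claims are available on the object. A minimal sketch continuing the example above (the property P31, "instance of", is used only for illustration):

    import pywikibot

    site = pywikibot.Site('wikipedia:en')
    repo = site.data_repository()
    item = pywikibot.ItemPage(repo, 'Q91')
    item.get()  # loads labels, descriptions, aliases, claims and sitelinks

    print(item.labels.get('en'))        # English label of the item
    print(item.descriptions.get('en'))  # English description of the item

    # Claims are grouped by property ID; P31 ("instance of") is used as an example.
    for claim in item.claims.get('P31', []):
        print(claim.getTarget())  # the target value, e.g. another ItemPage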
For more documentation on Pywikibot, see our docs.
Pywikibot may require the following external programs to function properly:
- 7za: To extract 7z files
For the release history, see https://github.com/wikimedia/pywikibot/blob/stable/HISTORY.rst
Our code is maintained on Wikimedia's Gerrit installation; learn how to get started.