Tag: wikipedia

Showing 21–35 of 35 datasets
  • Wikipedia³ - Conversion of Wikipedia into RDF

    Offsite — Wikipedia³ is a conversion of the English Wikipedia into RDF. It’s a monthly updated dataset containing around 47 million triples. The creation of the dataset is motivated by several factors, one being the desire to have more real-world RDF datasets of reasonable size. Wikipedia assembles a wealth of information created and maintained by people all over the globe – ...
  • Wikilocation Geolocation API

    Offsite — An API for Wikipedia that lets you search for entries by geolocation. Wikilocation is a full RESTful web API for developers wishing to search for Wikipedia articles by location. The data is gathered by downloading the Wikipedia database on a weekly basis and then parsing all of the geocoded entries. This data is then stored in a ...
  • Freebase Data Dump

    Offsite — Freebase data dumps provide all of the current facts and assertions within the Freebase system. The data dumps are complete, general-purpose extracts of the Freebase data in a variety of formats. Freebase releases a fresh data dump every three months. Freebase is an open database of the world’s information, covering millions of topics across hundreds of categories. ...
  • List of films: A - Wikipedia, the free encyclopedia

    Offsite — This is an alphabetical list of film articles (or sections within articles about films), beginning at A. It includes made-for-television films.
  • DBpedia Dataset

    Offsite — A large multi-domain ontology derived from Wikipedia. GNU Free Documentation License. N3 and CSV formats. An N3-loading sketch appears after this listing.
  • Freebase API

    Offsite — Query the Freebase shared database of the world’s knowledge via a JSON-based query language. See also: Freebase Data Dumps and Freebase Acre. A sketch of the query shape appears after this listing.
  • Freebase Data Dumps

    Offsite — Full data dumps of the Freebase shared database of the world’s knowledge. Available in tab-separated values format and a low-level link export suitable for converting into RDF or XML. Creative Commons Attribution license. See also: Freebase API and Freebase Wikipedia Extraction (WEX).
  • FUTEF Wikipedia API

    Offsite — Search API for accessing Wikipedia content. Available for non-commercial use.
  • MediaWiki API

    Offsite — Programmatically access websites running on MediaWiki wiki software, including Wikimedia Foundation sites like Wikipedia, which are licensed under the GNU Free Documentation License. In active development. A small search-query sketch appears after this listing.
  • Freebase Wikipedia Extraction (WEX)

    Offsite — “A processed dump of the English-language Wikipedia. The wiki markup for each article is transformed into machine-readable XML, and common relational features such as templates, infoboxes, categories, article sections and redirects are extracted in tabular form.” TSV format for PostgreSQL. GNU Free Documentation License.
  • Wikipedia Page Traffic Statistics

    Offsite — A 320 GB sample of the data used to power Trending Topics, containing 7 months of hourly page traffic statistics for over 2.5 million Wikipedia articles, along with the associated Wikipedia content, link graph and metadata. Compiled by Peter Skomoroch. All text content, statistics and link data are licensed under the GNU Free Documentation License (GFDL).
  • Hex color codes to RGB values and color names

    Free Download — A simple mapping from hex color codes to color names and RGB values. For example:

    color,hex,r,g,b
    Almond,#EFDECD,239,222,205
    Dodger blue,#1E90FF,30,144,255
    Meat brown,#E5B73B,229,183,59
    Scarlet,#FF2000,255,32,0
    Tiffany Blue,#0ABAB5,10,186,181
    Violet (color wheel),#7F00FF,127,0,255

    Source: http://en.wikipedia.org/wiki/List_of_colors. A parsing sketch appears after this listing.
  • language family - iso639-N

    Free Download — http://github.com/korczis/smartdata/tree/master/data/ (source: http://en.wikipedia.org/wiki/ISO_639)
  • Entropy per revision of Wikipedia pages beginning with M

    Free Download — See http://slightlynew.blogspot.com/2011/05/who-writes-wikipedia-information.html and especially https://github.com/lsb/ugc-contributors for more information.
  • Complete and Latest English Wikipedia raw dump with edit history

    Offsite — This is a direct link to the raw Wikipedia data dump, roughly 7 TB uncompressed. The data is compressed with bz2, gz, or 7z and is in XML format. A higher-level view of the data is available at http://dumps.wikimedia.org/. As explained at http://en.wikipedia.org/wiki/Wikipedia:Database_download, downloading data of this size uses a lot of bandwidth, which ... A streaming-parse sketch appears after this listing.
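
DBpedia Dataset: a minimal sketch of loading one of the N3 files with the rdflib library. The filename below is a placeholder, not an actual DBpedia artifact name, and rdflib is assumed to be installed.

# Minimal sketch: load a locally downloaded DBpedia N3 file with rdflib.
# "dbpedia_sample.n3" is a placeholder filename, not a real DBpedia artifact.
from rdflib import Graph

g = Graph()
g.parse("dbpedia_sample.n3", format="n3")

# Print the first ten (subject, predicate, object) triples.
for i, (s, p, o) in enumerate(g):
    print(s, p, o)
    if i >= 9:
        break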
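Freebase API: queries were written in MQL, a JSON query-by-example language. The Freebase service has since been retired, so this is only a sketch of the query shape; the mqlread endpoint URL is recalled from the old documentation and should be treated as an assumption.

# Sketch of an MQL query against the (now-retired) Freebase API.
# The mqlread endpoint is an assumption; the point is MQL's JSON
# query-by-example shape: null/empty fields are filled in by the service.
import json
import urllib.parse
import urllib.request

query = [{
    "type": "/film/film",
    "name": "Blade Runner",
    "directed_by": [],   # to be filled in with the director name(s)
}]

params = urllib.parse.urlencode({"query": json.dumps({"query": query})})
url = "http://api.freebase.com/api/service/mqlread?" + params  # assumed endpoint

with urllib.request.urlopen(url) as resp:  # will no longer resolve today
    print(json.loads(resp.read())["result"])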
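MediaWiki API: a minimal sketch of a full-text search request against English Wikipedia's api.php endpoint (action=query, list=search). The User-Agent string is an arbitrary example.

# Minimal sketch: full-text search on English Wikipedia via the MediaWiki API.
import json
import urllib.parse
import urllib.request

params = urllib.parse.urlencode({
    "action": "query",
    "list": "search",
    "srsearch": "RDF",
    "format": "json",
})
url = "https://en.wikipedia.org/w/api.php?" + params

req = urllib.request.Request(url, headers={"User-Agent": "example-script/0.1"})
with urllib.request.urlopen(req) as resp:
    data = json.loads(resp.read())

for hit in data["query"]["search"]:
    print(hit["title"])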
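Hex color codes to RGB values: the mapping is plain CSV, so the standard csv module is enough to read it. The rows below are the sample rows quoted in the dataset description.

# Sketch: parse the hex/RGB color mapping with the csv module, using the
# sample rows from the dataset description.
import csv
import io

sample = """color,hex,r,g,b
Almond,#EFDECD,239,222,205
Dodger blue,#1E90FF,30,144,255
Meat brown,#E5B73B,229,183,59
"""

colors = {
    row["color"]: (row["hex"], (int(row["r"]), int(row["g"]), int(row["b"])))
    for row in csv.DictReader(io.StringIO(sample))
}

print(colors["Dodger blue"])  # ('#1E90FF', (30, 144, 255))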
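Complete English Wikipedia dump: a sketch of streaming page titles out of a bzip2-compressed dump with xml.etree.ElementTree.iterparse, so nothing has to fit in memory. The filename is a placeholder; real dump files follow the enwiki-<date>-pages-articles naming used at dumps.wikimedia.org.

# Sketch: stream page titles out of a bzip2-compressed Wikipedia XML dump
# without loading it into memory. "enwiki-pages-articles.xml.bz2" is a
# placeholder filename.
import bz2
import xml.etree.ElementTree as ET

def iter_titles(path):
    with bz2.open(path, "rb") as f:
        for _, elem in ET.iterparse(f):
            # Dump elements are namespaced, so match on the local tag name.
            if elem.tag.rsplit("}", 1)[-1] == "title":
                yield elem.text
            elem.clear()  # free memory as elements are consumed

for i, title in enumerate(iter_titles("enwiki-pages-articles.xml.bz2")):
    print(title)
    if i >= 9:
        break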