User:Triciaburmeister/Sandbox/Data platform/Discover data
This page's contents have been moved to the mainspace at Data_Platform. See project history in phab:T350911.
This page provides links to data documentation for private and public Wikimedia data sources. Its primary audience is WMF data analysts, product teams, and researchers who have an official non-disclosure agreement with the Wikimedia Foundation.
- Private data requires production data access. It includes datasets in WMF's Data Lake: a large, analytics-oriented repository of data about Wikimedia projects.
- A selection of public data sources is linked here, but public Wikimedia data is described more fully at meta:Research:Data.
Traffic data
Analytics data about wiki pageviews and site usage.
Most Data Lake traffic datasets are updated at hourly granularity, with a 2-3 hour lag behind real time. For the full dataset list, see Data Lake/Traffic.
View datasets tagged with "traffic" in DataHub (requires a developer account)
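As a quick illustration, the following sketch pulls daily per-article pageview counts from the public Pageviews REST API, which exposes this traffic data without production access. The project, article, and date range are arbitrary example values.

```python
# Sketch: fetch daily pageview counts for one article from the public
# Pageviews REST API. The project, article, and date range below are
# arbitrary examples; no production data access is needed.
import requests

HEADERS = {"User-Agent": "data-discovery-example/0.1 (analytics@example.org)"}

url = (
    "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
    "en.wikipedia/all-access/all-agents/Earth/daily/20240101/20240107"
)

resp = requests.get(url, headers=HEADERS, timeout=30)
resp.raise_for_status()

# Each item holds one day's view count for the requested article.
for item in resp.json()["items"]:
    print(item["timestamp"], item["views"])
```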
Content data
Datasets that contain full content of revisions for Wikimedia wikis.
- mediawiki_wikitext_current: the wikitext of the latest revision of each page on Wikimedia wikis.
- mediawiki_wikitext_history: full content of all revisions, past and present, from Wikimedia wikis.
- wikidata_entity: a conversion of the Wikidata entity JSON dumps to Parquet.
- wikidata_item_page_link: links between a Wikidata item and its related Wikipedia pages in various languages.
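For readers with production access, a minimal PySpark sketch along these lines can read one page's current wikitext. The table and column names (wmf.mediawiki_wikitext_current with snapshot, wiki_db, page_title, and revision_text) and the snapshot value are assumptions to verify in DataHub.

```python
# Sketch: read the current wikitext of one page from the Data Lake.
# Assumes a Spark session on an analytics client; table and column names
# should be verified against the schema published in DataHub.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("wikitext-example")
    .enableHiveSupport()
    .getOrCreate()
)

df = spark.sql("""
    SELECT page_title, revision_text
    FROM wmf.mediawiki_wikitext_current
    WHERE snapshot = '2024-01'   -- monthly snapshot partition (assumed value)
      AND wiki_db = 'enwiki'
      AND page_title = 'Earth'
""")
df.show(1, truncate=80)
```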
Contributing and edits data
Data about wiki revisions, pages, and users. Includes data about editors and their characteristics.
Edits datasets are generated as monthly snapshots, not continuously updated. This data includes:
- MediaWiki_history: a fully denormalized dataset with user, page, and revision data.
- Raw, unprocessed copies of MediaWiki database tables, bundled to facilitate cross-wiki queries.
Full dataset list at Data Lake/Edits.
Private datasets about contributors or editors include:
- Geoeditors: counts of editors by project and country
View datasets tagged with "editors" in DataHub (requires a developer account)
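As an example of working with the monthly snapshots, here is a hedged PySpark sketch that counts revisions per month on one wiki from MediaWiki history. The snapshot value and column names are assumptions; check the published schema and substitute the latest available snapshot.

```python
# Sketch: monthly revision counts on one wiki from the denormalized
# mediawiki_history dataset. The snapshot value is an assumption; use the
# latest available monthly snapshot.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("edits-example")
    .enableHiveSupport()
    .getOrCreate()
)

monthly_edits = spark.sql("""
    SELECT substr(event_timestamp, 1, 7) AS month,
           count(*)                      AS edits
    FROM wmf.mediawiki_history
    WHERE snapshot = '2024-01'           -- monthly snapshot partition (assumed)
      AND wiki_db = 'enwiki'
      AND event_entity = 'revision'
      AND event_type = 'create'
      AND event_timestamp >= '2023-01-01'
    GROUP BY substr(event_timestamp, 1, 7)
    ORDER BY month
""")
monthly_edits.show()
```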
Instrumentation and events data
Through the Event Platform and Metrics Platform, you can create and deploy your own instruments to collect event data.
Events are ingested into the event and event_sanitized databases in the Data Lake.
- The Hive table name is a normalized version of the stream name.
- The event database stores original (unsanitized) events for a 90-day retention period.
- The event_sanitized database is an archive of sanitized events retained beyond the 90-day retention period.
- Sanitized event data is processed per WMF's Privacy Policy and Data Retention Guidelines.
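For example, under this normalization a stream named mediawiki.page-create would land in the Hive table event.mediawiki_page_create. The sketch below counts one day of those events per wiki; the table name and the year/month/day/hour partition layout are assumptions to verify in DataHub.

```python
# Sketch: count one day of page-creation events per wiki, assuming the
# stream mediawiki.page-create is ingested as event.mediawiki_page_create
# and partitioned by year/month/day/hour (verify the schema in DataHub).
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("events-example")
    .enableHiveSupport()
    .getOrCreate()
)

counts = spark.sql("""
    SELECT `database`, count(*) AS page_creations
    FROM event.mediawiki_page_create
    WHERE year = 2024 AND month = 1 AND day = 15
    GROUP BY `database`
    ORDER BY page_creations DESC
    LIMIT 10
""")
counts.show()
```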
After the data becomes available, you can access it with standard query tools (as sketched above) and create dashboards from it. See the Instrumentation tutorial for how to consume events directly from Kafka or through the internal EventStreams instance.
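As a concrete starting point, the following sketch tails the public EventStreams recentchange stream over Server-Sent Events using only the requests library; the internal EventStreams instance works the same way at a different host.

```python
# Sketch: tail the public EventStreams recentchange stream over
# Server-Sent Events, using only the requests library. Stops after five
# events for the sake of the demo.
import json
import requests

URL = "https://stream.wikimedia.org/v2/stream/recentchange"
HEADERS = {"User-Agent": "data-discovery-example/0.1 (analytics@example.org)"}

seen = 0
with requests.get(URL, headers=HEADERS, stream=True, timeout=60) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines():
        # Each SSE "data:" line carries one JSON-encoded event.
        if line and line.startswith(b"data: "):
            change = json.loads(line[len(b"data: "):])
            print(change["wiki"], change["type"], change.get("title"))
            seen += 1
            if seen >= 5:
                break
```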
How to query private data
Visit Analyze data to learn how to run queries and generate visualizations using WMF private datasets and analysis tools.