How Much is Wikipedia Lagging Behind News


How Much is Wikipedia Lagging Behind News - a scientific work related to Wikipedia quality, published in 2015 and written by Besnik Fetahu, Abhijit Anand and Avishek Anand.

Overview

Wikipedia, rich in entities and events, is an invaluable resource for various knowledge harvesting, extraction and mining tasks. Numerous resources like DBpedia, YAGO and other knowledge bases are built by extracting entity- and event-based knowledge from it. Online news, on the other hand, is an authoritative and rich source for emerging entities, events and facts relating to existing entities. In this work, authors study the creation of entities in Wikipedia with respect to news by examining how entity- and event-based information flows from news to Wikipedia. Authors analyze the lag of Wikipedia (based on the revision history of the English Wikipedia) relative to 20 years of The New York Times (NYT) data. Authors model and analyze the lag of entities and events, namely the difference between their first appearance in NYT and their first appearance in Wikipedia.

In extensive experimental analysis, authors find that almost 20% of the external references in entity pages are news articles, underscoring the importance of news to Wikipedia. Second, authors observe that the entity-based lag follows a normal distribution with a high standard deviation, whereas the lag for news-based events is typically very low. Finally, authors find that events are responsible for the creation of emergent entities, with as many as 12% of the entities mentioned in an event page being created after the event page itself.
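The lag measure studied in the paper can be illustrated with a minimal sketch. The Python snippet below is not the authors' code; the entity names and dates are invented for illustration. It computes the per-entity lag as the number of days between an entity's first NYT mention and the creation of its Wikipedia page, then summarizes the distribution of those lags:

```python
from datetime import date
from statistics import mean, stdev

# Hypothetical first-appearance dates (illustrative only, not from the paper):
# entity -> (first NYT mention, Wikipedia page creation)
first_seen = {
    "Entity A": (date(2004, 3, 1), date(2004, 3, 15)),
    "Entity B": (date(2008, 7, 2), date(2008, 7, 2)),
    "Entity C": (date(2011, 1, 20), date(2012, 5, 4)),
}

# Lag in days: positive means Wikipedia trails the news;
# negative would mean the Wikipedia page predates the first NYT mention.
lags = [(wiki - nyt).days for nyt, wiki in first_seen.values()]

print(f"mean lag: {mean(lags):.1f} days, std dev: {stdev(lags):.1f} days")
```

With such per-entity lags in hand, the paper's distributional observation corresponds to the mean and standard deviation reported by the sketch; a high standard deviation relative to the mean indicates that entity creation times vary widely around the first news mention.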