'''Semantic Stability in Wikipedia''' - a scientific work related to [[Wikipedia quality]], published in 2016 and written by [[Darko Stanisavljevic]], [[Ilire Hasani-Mavriqi]], [[Elisabeth Lex]], [[Markus Strohmaier]] and [[Denis Helic]].

== Overview ==
In this paper, the authors assess the semantic stability of [[Wikipedia]] by investigating the dynamics of Wikipedia articles' revisions over time. In a semantically stable system, articles are infrequently edited, whereas in unstable systems, article content changes more frequently. In other words, in a stable system the [[Wikipedia community]] has reached consensus on the majority of articles. In this work, the authors measure semantic stability using the Rank Biased Overlap (RBO) method. To that end, they preprocess Wikipedia dumps to obtain a sequence of plain-text article revisions, where each revision is represented as a TF-IDF vector. To measure the similarity between consecutive article revisions, they calculate Rank Biased Overlap on the corresponding term vectors. The authors evaluate the approach on 10 Wikipedia language editions, including the five largest language editions as well as five randomly selected small language editions. Their experimental results reveal that even in policy-driven collaboration networks such as Wikipedia, semantic stability can be achieved. However, there are differences in the velocity of the semantic stability process between small and large Wikipedia editions. Small editions exhibit faster and higher semantic stability than large ones. In particular, in large Wikipedia editions a higher number of successive revisions is needed to reach a certain semantic stability level, whereas in small Wikipedia editions the number of successive revisions needed for the same level of semantic stability is much lower.
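
Rank Biased Overlap itself is a standard top-weighted similarity measure for ranked lists, due to Webber, Moffat and Zobel (2010). For two rankings <math>S</math> and <math>T</math> and a persistence parameter <math>p \in (0,1)</math> it is defined as

:<math>\mathrm{RBO}(S, T, p) = (1 - p) \sum_{d=1}^{\infty} p^{d-1} A_d, \qquad A_d = \frac{|S_{1:d} \cap T_{1:d}|}{d},</math>

where <math>A_d</math> is the fraction of terms shared by the top-<math>d</math> prefixes of the two rankings; larger <math>p</math> spreads the weight deeper into the lists, smaller <math>p</math> concentrates it at the top ranks.

The sketch below illustrates the pipeline described above: consecutive plain-text revisions are turned into TF-IDF vectors, each vector's terms are ranked by weight, and the truncated form of RBO is computed on the two rankings. It is a minimal illustration under stated assumptions, not the authors' implementation; scikit-learn, the toy revisions, the depth and <math>p</math> parameters, and all function names are choices made here.

<syntaxhighlight lang="python">
# Minimal sketch of the revision-similarity pipeline (not the authors'
# code): one TF-IDF vector per revision, terms ranked by weight, and
# truncated RBO on the rankings of consecutive revisions. scikit-learn,
# the toy revisions and the depth/p parameters are illustrative choices.
from sklearn.feature_extraction.text import TfidfVectorizer


def ranked_terms(tfidf_row, vocabulary):
    """Terms of one revision, sorted by descending TF-IDF weight."""
    weights = tfidf_row.toarray().ravel()
    order = weights.argsort()[::-1]
    return [vocabulary[i] for i in order if weights[i] > 0]


def rbo(s, t, p=0.9, depth=100):
    """Truncated Rank Biased Overlap between two ranked lists.

    A_d is the overlap of the top-d prefixes divided by d; each depth
    is weighted by p**(d-1), so the top ranks dominate the score.
    """
    seen_s, seen_t = set(), set()
    score = 0.0
    for d in range(1, min(depth, max(len(s), len(t))) + 1):
        if d <= len(s):
            seen_s.add(s[d - 1])
        if d <= len(t):
            seen_t.add(t[d - 1])
        score += p ** (d - 1) * len(seen_s & seen_t) / d
    return (1 - p) * score


# Toy stand-ins for two consecutive plain-text revisions of one article.
revisions = [
    "semantic stability of wikipedia articles over time",
    "semantic stability of wikipedia article revisions over time",
]
vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(revisions)
vocab = vectorizer.get_feature_names_out()

current = ranked_terms(matrix[0], vocab)
successor = ranked_terms(matrix[1], vocab)
print(f"RBO between consecutive revisions: {rbo(current, successor):.3f}")
</syntaxhighlight>

An RBO value close to 1 means that consecutive revisions rank essentially the same terms at the top, the signature of a semantically stable article; persistently low values indicate that the article's vocabulary is still being substantively rewritten.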
