The Use of Software Tools and Autonomous Bots Against Vandalism: Eroding Wikipedia's Moral Order?

From Wikipedia Quality
'''The Use of Software Tools and Autonomous Bots Against Vandalism: Eroding Wikipedia's Moral Order?''' - scientific work related to [[Wikipedia quality]], published in 2015 and written by [[Paul B. de Laat]].
== Overview ==
English-language [[Wikipedia]] is constantly plagued by vandalistic contributions on a massive scale. To fight them, its volunteer contributors deploy an array of software tools and autonomous bots. After an analysis of their functioning and the 'coactivity' in use between humans and bots, this research 'discloses' the moral issues that emerge from their combined patrolling. Administrators provide the stronger tools only to trusted users, thereby creating a new hierarchical layer. Further, surveillance exhibits several troubling [[features]]: questionable profiling practices (concerning anonymous users in particular), the use of the controversial measure of [[reputation]] (under consideration), 'oversurveillance' in which quantity trumps quality, and a prospective loss of the required moral skills whenever bots take over from humans. The most troubling aspect, though, is that Wikipedia has become a Janus-faced institution. One face is the basic platform of [[MediaWiki]] software, transparent to all. Its other face is the anti-vandalism system, which, in contrast, is opaque to the average user, in particular as a result of the algorithms and neural networks in use. Finally, it is argued that this secrecy prevents a much-needed discussion from unfolding: a discussion that should focus on a 'rebalancing' of the anti-vandalism system and the development of more ethical information practices towards contributors.

Revision as of 12:07, 20 June 2019
