{{Infobox work
| title = Detecting Biased Statements in Wikipedia
| date = 2018
| authors = [[Christoph Hube]]<br />[[Besnik Fetahu]]
| doi = 10.1145/3184558.3191640
| link = https://dl.acm.org/citation.cfm?doid=3184558.3191640
}}
'''Detecting Biased Statements in Wikipedia''' is a scientific work related to [[Wikipedia quality]], published in 2018 and written by [[Christoph Hube]] and [[Besnik Fetahu]].

== Overview ==
Quality in [[Wikipedia]] is enforced through a set of editing policies and guidelines recommended for [[Wikipedia editors]]. Neutral point of view (NPOV) is one of the main principles of Wikipedia: it requires that, for controversial information, all possible points of view are represented proportionally. Furthermore, the language used in Wikipedia should be neutral and not opinionated. However, because of the large number of Wikipedia articles and the fact that Wikipedia is written by volunteer editors, quality assurance and Wikipedia guidelines cannot always be enforced. Currently, more than 40,000 articles are flagged with NPOV or similar quality tags. These represent only the portion of articles for which such quality issues are explicitly flagged by Wikipedia editors; the real number may be higher, considering that only a small percentage of articles are categorized by Wikipedia as good or featured.

In this work, the authors focus on language bias at the sentence level in Wikipedia. Language bias is a hard problem: it is a subjective task, and the linguistic cues are usually subtle and can be determined only through their context. The authors propose a supervised classification approach that relies on an automatically created lexicon of bias words together with other syntactic and semantic characteristics of biased statements. They evaluate the approach experimentally on a dataset of biased and unbiased statements, and show that it detects biased statements with an accuracy of 74%. Furthermore, they show that competing approaches that determine bias words are not suitable for detecting biased statements; the proposed approach outperforms them with a relative improvement of over 20%.
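
The following Python snippet is a minimal sketch of such a feature-based classifier, not the authors' implementation: it combines simple bag-of-words features with a count of hits from a small, hypothetical bias-word lexicon and trains a logistic regression on a handful of toy labelled statements. In the paper the lexicon is created automatically and further syntactic and semantic features are used; the sketch only illustrates how a bias lexicon can be plugged into a supervised classifier.

<code>
<nowiki>
from scipy.sparse import hstack, csr_matrix
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical bias lexicon; in the paper the lexicon is built automatically.
BIAS_LEXICON = {"notorious", "regrettably", "so-called", "infamous", "finest"}

# Toy labelled statements (illustrative only): 1 = biased, 0 = neutral.
statements = [
    "The senator gave a speech on the new budget.",
    "The notorious senator regrettably pushed his so-called reform.",
    "The film was released in 1999.",
    "It is clearly the finest film ever made.",
]
labels = [0, 1, 0, 1]

def lexicon_counts(texts):
    """Count how many bias-lexicon words occur in each statement."""
    return csr_matrix(
        [[float(sum(tok.strip(".,").lower() in BIAS_LEXICON for tok in t.split()))]
         for t in texts]
    )

# Combine bag-of-words (TF-IDF) features with the lexicon-count feature.
vectorizer = TfidfVectorizer()
X = hstack([vectorizer.fit_transform(statements), lexicon_counts(statements)])

clf = LogisticRegression()
clf.fit(X, labels)

# Classify a new statement; lexicon hits should push it towards "biased" (1).
new = ["The mayor's so-called plan is regrettably short-sighted."]
X_new = hstack([vectorizer.transform(new), lexicon_counts(new)])
print(clf.predict(X_new))
</nowiki>
</code>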

== Embed ==
=== Wikipedia Quality ===
<code>
<nowiki>
Hube, Christoph; Fetahu, Besnik. (2018). "[[Detecting Biased Statements in Wikipedia]]". International World Wide Web Conferences Steering Committee. DOI: 10.1145/3184558.3191640.
</nowiki>
</code>

=== English Wikipedia ===
<code>
<nowiki>
{{cite journal |last1=Hube |first1=Christoph |last2=Fetahu |first2=Besnik |title=Detecting Biased Statements in Wikipedia |date=2018 |doi=10.1145/3184558.3191640 |url=https://wikipediaquality.com/wiki/Detecting_Biased_Statements_in_Wikipedia |journal=International World Wide Web Conferences Steering Committee}}
</nowiki>
</code>

=== HTML ===
<code>
<nowiki>
Hube, Christoph; Fetahu, Besnik. (2018). &quot;<a href="https://wikipediaquality.com/wiki/Detecting_Biased_Statements_in_Wikipedia">Detecting Biased Statements in Wikipedia</a>&quot;. International World Wide Web Conferences Steering Committee. DOI: 10.1145/3184558.3191640.
</nowiki>
</code>

[[Category:Scientific works]]
