{{Infobox work
| title = Menta: Inducing Multilingual Taxonomies from Wikipedia
| date = 2010
| authors = [[Gerard de Melo]]<br />[[Gerhard Weikum]]
| doi = 10.1145/1871437.1871577
| link = http://dl.acm.org/citation.cfm?id=1871577
| plink = https://www.researchgate.net/profile/Gerard_De_Melo/publication/221614201_MENTA_inducing_multilingual_taxonomies_from_Wikipedia/links/548701750cf268d28f06fd70.pdf
}}
'''Menta: Inducing Multilingual Taxonomies from Wikipedia''' - scientific work related to [[Wikipedia quality]] published in 2010, written by [[Gerard de Melo]] and [[Gerhard Weikum]].
== Overview ==
In recent years, a number of projects have turned to [[Wikipedia]] to establish large-scale taxonomies that describe orders of magnitude more entities than traditional manually built knowledge bases. So far, however, the [[multilingual]] nature of Wikipedia has largely been neglected. This paper investigates how entities from all editions of Wikipedia as well as [[WordNet]] can be integrated into a single coherent taxonomic class hierarchy. The authors rely on linking heuristics to discover potential taxonomic relationships, graph partitioning to form consistent equivalence classes of entities, and a Markov chain-based ranking approach to construct the final taxonomy. This results in MENTA (Multilingual Entity Taxonomy), a resource that describes 5.4 million entities and is presumably the largest multilingual lexical knowledge base currently available.
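The graph-partitioning step described above can be illustrated with a minimal sketch (hypothetical data and function names, not the authors' actual implementation): cross-lingual links between Wikipedia editions are treated as undirected edges, and connected components of the resulting graph yield candidate equivalence classes of entities.

```python
from collections import defaultdict

def equivalence_classes(links):
    """Group entities into candidate equivalence classes by treating
    cross-lingual links as undirected edges and taking connected
    components (a simplification of the paper's graph partitioning,
    which additionally resolves inconsistent link sets)."""
    graph = defaultdict(set)
    for a, b in links:
        graph[a].add(b)
        graph[b].add(a)
    seen, classes = set(), []
    for node in graph:
        if node in seen:
            continue
        # depth-first traversal to collect one connected component
        component, stack = set(), [node]
        while stack:
            cur = stack.pop()
            if cur in component:
                continue
            component.add(cur)
            stack.extend(graph[cur] - component)
        seen |= component
        classes.append(component)
    return classes

# Hypothetical interwiki links between language editions
links = [
    ("en:Apple", "de:Apfel"),
    ("de:Apfel", "fr:Pomme"),
    ("en:Pear", "es:Pera"),
]
print(equivalence_classes(links))
# two classes: {en:Apple, de:Apfel, fr:Pomme} and {en:Pear, es:Pera}
```

In the real system, naive connected components can over-merge distinct entities when individual interwiki links are wrong, which is why the paper applies partitioning rather than taking raw components as-is.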