Inducing Conceptual Embedding Spaces from Wikipedia



Inducing Conceptual Embedding Spaces from Wikipedia
Authors: Gerard de Melo
Publication date: 2017
DOI: 10.1145/3041021.3054144
Links: Original

Inducing Conceptual Embedding Spaces from Wikipedia - a scientific work related to Wikipedia quality, published in 2017 and written by Gerard de Melo.

Overview

The word2vec word vector representations are among the best-known semantic resources to have appeared in recent years. While large sets of pre-trained vectors are available, they focus on frequent words and multi-word expressions and lack sufficient coverage of named entities. Moreover, Google only released pre-trained vectors for English. In this paper, the author explores an automatic expansion of Google's pre-trained vectors using Wikipedia, adding millions of concepts and named entities in over 270 languages. The method enables all of these to reside in the same vector space, thus flexibly facilitating cross-lingual semantic applications.
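To make the idea of placing new Wikipedia concepts into an existing pre-trained space concrete, the sketch below uses a simple centroid heuristic: a new entity vector is induced by averaging the pre-trained vectors of words drawn from its label or surrounding context. This is only an illustrative assumption, not the paper's actual procedure; the toy vectors and the entity name "Louis_Armstrong" are hypothetical stand-ins.

import numpy as np

# Toy stand-in for pre-trained word2vec vectors (the real GoogleNews
# vectors are 300-dimensional; 4 dimensions keep the sketch readable).
pretrained = {
    "jazz":     np.array([0.9, 0.1, 0.0, 0.2]),
    "musician": np.array([0.8, 0.2, 0.1, 0.3]),
    "trumpet":  np.array([0.7, 0.1, 0.2, 0.1]),
}

def induce_concept_vector(context_words, vectors):
    # Average the vectors of known context words. This centroid
    # heuristic is an assumption used here only to illustrate how a
    # new concept can share the space of the pre-trained vectors.
    known = [vectors[w] for w in context_words if w in vectors]
    if not known:
        raise ValueError("no known context words to induce from")
    return np.mean(known, axis=0)

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical Wikipedia entity, induced from a few context words.
entity_vec = induce_concept_vector(["jazz", "trumpet", "musician"], pretrained)
pretrained["Louis_Armstrong"] = entity_vec  # the entity now lives in the same space

print(cosine(pretrained["Louis_Armstrong"], pretrained["jazz"]))

Because the induced entity vector lives in the same space as the pre-trained word vectors, it can be compared to them directly; in the paper, the shared space additionally spans concepts from many Wikipedia language editions, which this small sketch does not attempt to show.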

Embed

Wikipedia Quality

Melo, Gerard de. (2017). "[[Inducing Conceptual Embedding Spaces from Wikipedia]]". International World Wide Web Conferences Steering Committee. DOI: 10.1145/3041021.3054144.

English Wikipedia

{{cite journal |last1=Melo |first1=Gerard de |title=Inducing Conceptual Embedding Spaces from Wikipedia |date=2017 |doi=10.1145/3041021.3054144 |url=https://wikipediaquality.com/wiki/Inducing_Conceptual_Embedding_Spaces_from_Wikipedia |journal=International World Wide Web Conferences Steering Committee}}

HTML

Melo, Gerard de. (2017). &quot;<a href="https://wikipediaquality.com/wiki/Inducing_Conceptual_Embedding_Spaces_from_Wikipedia">Inducing Conceptual Embedding Spaces from Wikipedia</a>&quot;. International World Wide Web Conferences Steering Committee. DOI: 10.1145/3041021.3054144.