Articles with "word embeddings" as a keyword




Exploring Implicit Semantic Constraints for Bilingual Word Embeddings

Published in 2017 at "Neural Processing Letters"

DOI: 10.1007/s11063-017-9762-8

Abstract: Bilingual word embeddings (BWEs) have proven to be useful in many cross-lingual natural language processing tasks. Previous studies often require bilingual texts or dictionaries that are scarce resources. As a result, in these studies, the…

Keywords: word; word embeddings; semantic constraints; implicit semantic ...

ReMemNN: A novel memory neural network for powerful interaction in aspect-based sentiment analysis

Published in 2020 at "Neurocomputing"

DOI: 10.1016/j.neucom.2020.02.018

Abstract: Deep neural networks have been employed to analyze the sentiment of text sequences and have achieved significant results. However, these models still face the issues of the weakness of pre-trained word embeddings and weak interaction between…

Keywords: memory neural; interaction; word embeddings; sentiment ...

Imparting interpretability to word embeddings while preserving semantic structure

Published in 2020 at "Natural Language Engineering"

DOI: 10.1017/s1351324920000315

Abstract: As a ubiquitous method in natural language processing, word embeddings are extensively employed to map semantic properties of words into a dense vector representation. They capture semantic and syntactic relations among words, but the…

Keywords: imparting interpretability; interpretability word; embeddings preserving; interpretability ...

Historical representations of social groups across 200 years of word embeddings from Google Books

Published in 2022 at "Proceedings of the National Academy of Sciences of the United States of America"

DOI: 10.1073/pnas.2121798119

Abstract: How did societies of the past represent the various social groups of their world? Here, we address this question using word embeddings from 850 billion words of English-language books (from 1800 to 1999) to…

Keywords: representations social; across 200; word embeddings; historical representations ...

More than Bags of Words: Sentiment Analysis with Word Embeddings

Published in 2018 at "Communication Methods and Measures"

DOI: 10.1080/19312458.2018.1455817

Abstract: Moving beyond the dominant bag-of-words approach to sentiment analysis, we introduce an alternative procedure based on distributed word embeddings. The strength of word embeddings is their ability to capture similarities in word meaning. We…

Keywords: word; bags words; word embeddings; words sentiment ...

Gender-sensitive word embeddings for healthcare

Published in 2022 at "Journal of the American Medical Informatics Association : JAMIA"

DOI: 10.1093/jamia/ocab279

Abstract: OBJECTIVE To analyze gender bias in clinical trials, to design an algorithm that mitigates the effects of biases in gender representation on natural language processing (NLP) systems trained on text drawn from clinical trials, and to evaluate…

Keywords: clinical trials; gender sensitive; prediction; word embeddings ...

Learning Chinese Word Embeddings With Words and Subcharacter N-Grams

Published in 2019 at "IEEE Access"

DOI: 10.1109/access.2019.2908014

Abstract: Co-occurrence information between words is the basis for training word embeddings. In addition, Chinese characters are composed of subcharacters; words made up of the same characters or subcharacters usually have similar semantics, but this internal substructure…

Keywords: chinese word; learning chinese; word embeddings; subcharacter grams ...

Comparative Analysis of Word Embeddings in Assessing Semantic Similarity of Complex Sentences

Published in 2021 at "IEEE Access"

DOI: 10.1109/access.2021.3135807

Abstract: Semantic textual similarity is one of the open research challenges in the field of Natural Language Processing. Extensive research has been carried out in this field and near-perfect results are achieved by recent transformer-based models…

Keywords: similarity; comparative analysis; semantic similarity; word embeddings ...

Query Expansion With Local Conceptual Word Embeddings in Microblog Retrieval

Published in 2021 at "IEEE Transactions on Knowledge and Data Engineering"

DOI: 10.1109/tkde.2019.2945764

Abstract: Since the length of microblog texts, such as tweets, is strictly limited to 140 characters, traditional Information Retrieval techniques suffer severely from the vocabulary mismatch problem and cannot yield good performance in the context of…

Keywords: math; word embeddings; microblog; local conceptual ...

Fair Is Better than Sensational: Man Is to Doctor as Woman Is to Doctor

Published in 2020 at "Computational Linguistics"

DOI: 10.1162/coli_a_00379

Abstract: Analogies such as man is to king as woman is to X are often used to illustrate the amazing power of word embeddings. Concurrently, they have also been used to expose how strongly human biases…

Keywords: woman; word embeddings; man; doctor ...

Exploring the Privacy-Preserving Properties of Word Embeddings: Algorithmic Validation Study

Published in 2020 at "Journal of Medical Internet Research"

DOI: 10.2196/18055

Abstract: Background Word embeddings are dense numeric vectors used to represent language in neural networks. Until recently, there had been no publicly released embeddings trained on clinical data. Our work is the first to study the…

Keywords: embeddings created; privacy preserving; word; word embeddings ...