Published in 2017 in "Neural Processing Letters"
DOI: 10.1007/s11063-017-9762-8
Abstract: Bilingual word embeddings (BWEs) have proven to be useful in many cross-lingual natural language processing tasks. Previous studies often require bilingual texts or dictionaries that are scarce resources. As a result, in these studies, the…
Keywords: word; word embeddings; semantic constraints; implicit semantic; …
Published in 2020 in "Neurocomputing"
DOI: 10.1016/j.neucom.2020.02.018
Abstract: Deep neural networks have been employed to analyze the sentiment of text sequences and have achieved significant results. However, these models still face the issues of weak pre-trained word embeddings and weak interaction between…
Keywords: memory neural; interaction; word embeddings; sentiment; …
Published in 2024 in "Political Analysis"
DOI: 10.1017/pan.2024.17
Abstract: Word embeddings are now a vital resource for social science research. However, obtaining high-quality training data for non-English languages can be difficult, and fitting embeddings therein may be computationally expensive. In addition, social scientists typically…
Keywords: word embeddings; social scientists; embeddings social; multilanguage; …
Published in 2020 in "Natural Language Engineering"
DOI: 10.1017/s1351324920000315
Abstract: As a ubiquitous method in natural language processing, word embeddings are extensively employed to map semantic properties of words into a dense vector representation. They capture semantic and syntactic relations among words, but the…
Keywords: imparting interpretability; interpretability word; embeddings preserving; interpretability; …
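Several of the entries above describe word embeddings as dense vectors whose geometry encodes semantic relations among words. A minimal sketch of that idea, using invented toy 4-dimensional vectors rather than the output of any trained model (real embeddings typically have hundreds of dimensions), measures relatedness with cosine similarity:

```python
import math

# Toy "embeddings" -- illustrative values only, not from a trained model.
embeddings = {
    "king":  [0.8, 0.7, 0.1, 0.0],
    "queen": [0.8, 0.6, 0.2, 0.1],
    "apple": [0.0, 0.1, 0.9, 0.8],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Semantically related words should score higher than unrelated ones.
related = cosine_similarity(embeddings["king"], embeddings["queen"])
unrelated = cosine_similarity(embeddings["king"], embeddings["apple"])
```

With these toy values, `related` comes out much higher than `unrelated`, which is the property the papers above rely on.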
Published in 2024 in "Scientific Reports"
DOI: 10.1038/s41598-024-72144-1
Abstract: Word embeddings provide an unsupervised way to understand differences in word usage between discursive communities. A number of papers have focused on identifying words that are used differently by two or more communities. But word…
Keywords: words used; word embeddings; word; discursive communities; …
Published in 2022 in "Proceedings of the National Academy of Sciences of the United States of America"
DOI: 10.1073/pnas.2121798119
Abstract: How did societies of the past represent the various social groups of their world? Here, we address this question using word embeddings from 850 billion words of English-language books (from 1800 to 1999) to…
Keywords: representations social; across 200; word embeddings; historical representations; …
Published in 2018 in "Communication Methods and Measures"
DOI: 10.1080/19312458.2018.1455817
Abstract: Moving beyond the dominant bag-of-words approach to sentiment analysis, we introduce an alternative procedure based on distributed word embeddings. The strength of word embeddings is the ability to capture similarities in word meaning. We…
Keywords: word; bags words; word embeddings; words sentiment; …
Published in 2022 in "Journal of the American Medical Informatics Association: JAMIA"
DOI: 10.1093/jamia/ocab279
Abstract: Objective: To analyze gender bias in clinical trials, to design an algorithm that mitigates the effects of biases of gender representation on natural language processing (NLP) systems trained on text drawn from clinical trials, and to evaluate…
Keywords: clinical trials; gender sensitive; prediction; word embeddings; …
Published in 2019 in "IEEE Access"
DOI: 10.1109/access.2019.2908014
Abstract: Co-occurrence information between words is the basis for training word embeddings. In addition, Chinese characters are composed of subcharacters, and words made up of the same characters or subcharacters usually have similar semantics, but this internal substructure…
Keywords: chinese word; learning chinese; word embeddings; subcharacter grams; …
Published in 2021 in "IEEE Access"
DOI: 10.1109/access.2021.3135807
Abstract: Semantic textual similarity is one of the open research challenges in the field of Natural Language Processing. Extensive research has been carried out in this field and near-perfect results are achieved by recent transformer-based models…
Keywords: similarity; comparative analysis; semantic similarity; word embeddings; …
Published in 2024 in "IEEE Access"
DOI: 10.1109/access.2024.3367246
Abstract: Text categorization remains a formidable challenge in information retrieval, requiring effective strategies, especially when applied to low-resource languages such as Italian. This paper delves into the intricacies of categorizing Italian news articles, addressing the complexities…
Keywords: word embeddings; categorization; news; language; …