Articles with "self attention" as a keyword
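All of the entries below build on the scaled dot-product self-attention operation. As background, here is a minimal NumPy sketch of that operation — toy dimensions, random weights, and the function name `self_attention` are illustrative assumptions, not taken from any of the listed papers:

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a sequence x of shape (seq_len, d_model)."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v             # project inputs to queries, keys, values
    d_k = k.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                 # pairwise similarity between positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over each row
    return weights @ v                              # attention-weighted sum of values

# Tiny example: a sequence of 4 positions with model width 8
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
w_q, w_k, w_v = (rng.standard_normal((8, 8)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (4, 8)
```

Each output position is a weighted mixture of all value vectors, which is what lets the papers below model long-range dependencies within a sequence or image.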




Nested-block self-attention multiple resolution residual network for multi-organ segmentation from CT.

Published in 2022 at "Medical physics"

DOI: 10.1002/mp.15765

Abstract: BACKGROUND Fast and accurate multi-organ segmentation from CT scans is essential for radiation treatment planning. Self-attention-based deep learning methods provide higher accuracy than standard methods but require memory- and computation-intensive calculations, which restricts…

Keywords: self attention; multi organ; attention; nested block; …

SARU: A self attention ResUnet to generate synthetic CT images for MR-only BNCT treatment planning.

Published in 2022 at "Medical physics"

DOI: 10.1002/mp.15986

Abstract: BACKGROUND Despite the significant physical differences between MRI and CT, the high entropy of MRI data indicates the existence of a surjective transformation from MRI to CT images. However, there is no specific optimization of…

Keywords: network; self attention; treatment planning; treatment; …

Leveraging contextual embeddings and self-attention neural networks with bi-attention for sentiment analysis

Published in 2021 at "Journal of Intelligent Information Systems"

DOI: 10.1007/s10844-021-00664-7

Abstract: People express their opinions and views in different and often ambiguous ways; hence, the meaning of their words is often not explicitly stated and frequently depends on the context. Therefore, it is difficult for machines…

Keywords: contextual embeddings; self attention; sentiment; embeddings self; …

DNN-based speech enhancement with self-attention on feature dimension

Published in 2020 at "Multimedia Tools and Applications"

DOI: 10.1007/s11042-020-09345-z

Abstract: To make full use of the key information in frame-level features, a DNN-based model for speech enhancement is proposed using self-attention on the feature dimension. Two improvement strategies are adopted to strengthen the attention of…

Keywords: information; feature; speech; self attention; …

An explicit self-attention-based multimodality CNN in-loop filter for versatile video coding

Published in 2021 at "Multimedia Tools and Applications"

DOI: 10.1007/s11042-021-11214-2

Abstract: The newest video coding standard, Versatile Video Coding (VVC), was published recently. While it greatly improves performance over the previous High Efficiency Video Coding (HEVC) standard, there are still blocking artifacts under…

Keywords: video; attention; video coding; self attention; …

Dual Attention with the Self-Attention Alignment for Efficient Video Super-resolution

Published in 2021 at "Cognitive Computation"

DOI: 10.1007/s12559-021-09874-1

Abstract: By selectively enhancing the features extracted from convolution networks, the attention mechanism has shown its effectiveness for low-level visual tasks, especially for image super-resolution (SR). However, due to the spatiotemporal continuity of video sequences, simply…

Keywords: video; attention; self attention; attention alignment; …

Voice gender recognition under unconstrained environments using self-attention

Published in 2021 at "Applied Acoustics"

DOI: 10.1016/j.apacoust.2020.107823

Abstract: Voice gender recognition is a non-trivial task that has been extensively studied in the literature; however, it becomes more challenging when the voice is surrounded by noise in unconstrained environments. This paper presents two…

Keywords: unconstrained environments; gender recognition; self attention; voice gender; …

Classifying cancer pathology reports with hierarchical self-attention networks

Published in 2019 at "Artificial intelligence in medicine"

DOI: 10.1016/j.artmed.2019.101726

Abstract: We introduce a deep learning architecture, hierarchical self-attention networks (HiSANs), designed for classifying pathology reports and show how its unique architecture leads to a new state-of-the-art in accuracy, faster training, and clear interpretability. We evaluate…

Keywords: pathology reports; self attention; pathology; attention networks; …

Super-resolution of Pneumocystis carinii pneumonia CT via self-attention GAN.

Published in 2021 at "Computer methods and programs in biomedicine"

DOI: 10.1016/j.cmpb.2021.106467

Abstract: BACKGROUND AND OBJECTIVE Computed tomography (CT) examination plays an important role in screening suspected and confirmed patients with Pneumocystis carinii pneumonia (PCP), and the efficient acquisition of high-quality medical CT images is essential for the…

Keywords: resolution; resolution image; super resolution; self attention; …

Long- and short-term self-attention network for sequential recommendation

Published in 2021 at "Neurocomputing"

DOI: 10.1016/j.neucom.2020.10.066

Abstract: With great value in real applications, sequential recommendation aims to recommend personalized sequential actions to users. To achieve better performance, it is essential to consider both long-term preferences and sequential patterns (i.e.,…

Keywords: network; term; sequential recommendation; self attention; …

Context-aware Self-Attention Networks for Natural Language Processing

Published in 2021 at "Neurocomputing"

DOI: 10.1016/j.neucom.2021.06.009

Abstract: Recently, Self-Attention Networks (SANs) have shown their flexibility in parallel computation and effectiveness in modeling both short- and long-term dependencies. However, SANs face two problems: 1) the weighted averaging inhibits relations among neighboring words (i.e.,…

Keywords: context; attention; natural language; self attention; …